Sharon Worcester is an award-winning medical journalist for MDedge News. She has been with the company since 1996, first as the Southeast Bureau Chief (1996-2009), when the company was known as International Medical News Group, then as a freelance writer (2010-2015) before returning as a reporter in 2015. She previously worked as a daily newspaper reporter covering health and local government. Sharon currently reports primarily on oncology and hematology. She has a BA from Eckerd College and an MA in Mass Communication/Print Journalism from the University of Florida. Connect with her via LinkedIn and follow her on Twitter @SW_MedReporter.

Hybrid colorectal cancer screening model reduced cancer rate

A hybrid colorectal cancer screening strategy that incorporates annual fecal immunochemical testing beginning at age 50 years and a single colonoscopy at age 66 years proved both clinically effective and cost-effective in a simulation model.

Using the Archimedes Model – a "large-scale integrated simulation of human physiology, diseases, and health care systems" – Tuan Dinh, Ph.D., of Archimedes Inc., San Francisco, and colleagues found that compared with no screening, the hybrid strategy with annual fecal immunochemical testing (FIT) reduced colorectal cancer incidence by 73%, gained 11,200 quality-adjusted life years (QALYs), and saved $126.8 million for every 100,000 people screened during a 30-year period.

Without screening, a cohort of 100,000 members of Kaiser Permanente Northern California who were included in the virtual study experienced 6,004 colorectal cancers and 1,837 colorectal cancer deaths. All methods of screening that were evaluated in the model – including annual FIT, colonoscopy at 10-year intervals, sigmoidoscopy at 5-year intervals, both FIT and sigmoidoscopy, and both FIT and colonoscopy – substantially reduced colorectal cancer incidence, by 53%-76%, and added a significant number of QALYs, compared with no screening, the investigators reported online March 28 in Clinical Gastroenterology and Hepatology.

Colonoscopy as a single-modality screening strategy was most effective for colorectal cancer reduction (76%), and FIT alone was the least costly approach (with savings of $142.6 million per 100,000 persons, compared with no screening), but FIT plus colonoscopy came close: The hybrid strategy reduced colorectal cancer incidence by 73% and, compared with FIT alone, gained 1,400 QALYs/100,000 at an incremental cost of $9,700 per QALY gained. Colonoscopy gained 500 QALYs/100,000 more than the hybrid strategy, at an incremental cost of $35,100 per QALY gained.
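
For context, the incremental cost figures above follow the standard definition of an incremental cost-effectiveness ratio (ICER) – the general formula used in cost-effectiveness analyses, not one quoted from the paper:

\[
\mathrm{ICER} = \frac{C_{\text{strategy}} - C_{\text{comparator}}}{\mathrm{QALY}_{\text{strategy}} - \mathrm{QALY}_{\text{comparator}}}
\]

Rearranged, the reported $9,700 per QALY gained across a gain of 1,400 QALYs implies roughly $13.6 million in additional cost for the hybrid strategy versus FIT alone per 100,000 people screened.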

Furthermore, the hybrid strategy required 55% fewer FITs and 41% more colonoscopies than FIT alone, and required 2.1-2.3 fewer colonoscopies per person during 30 years than screening by colonoscopy alone, they reported (Clin. Gastroenterol. Hepatol. 2013 March 28 [doi: 10.1016/j.cgh.2013.03.013]).

On sensitivity analysis, a hybrid approach using biennial FIT was also cost-effective, compared with either FIT or colonoscopy alone.

The core of the Archimedes Model is "a set of equations that represent physiological pathways at the clinical level (i.e., at the level of detail of basic medical tests, clinical trials, and patient charts)." The colorectal cancer submodel, which was derived from public databases, published epidemiologic studies, and clinical trials, was developed in collaboration with the American Cancer Society, the authors explained.

The simulated population included a cross section of 2008 Kaiser Permanente members who were aged 50-75 years at the start of the virtual trial comparing the screening strategies.

The findings are important, given that colorectal cancer is the second-leading cause of cancer deaths among adults in the United States, and although colonoscopy is the recommended approach for primary screening in most U.S. guidelines, it is the most invasive, risky, and costly screening modality, the investigators noted.

Conversely, stool tests with follow-up colonoscopy for positive results are the least expensive. In the past, stool test strategies have been hampered by low sensitivity for adenomas and low specificity, but recent improvements in the sensitivity and specificity of FIT have renewed interest in the use of stool tests, they said.

Though the study is limited by several factors – for example, the accuracy of any simulation model depends on assumptions about test performance and adherence, which may vary – the findings suggest the hybrid strategy could improve outcomes while lowering costs.

"The simulation results indicated that [the hybrid strategy] required 37% fewer colonoscopies than [colonoscopy alone], while delivering only slightly inferior health benefits," the investigators said. These results demonstrate that "it is possible to design hybrid colorectal cancer screening strategies that can deliver health benefits and cost-effectiveness that are comparable to those of single-modality strategies, with a favorable impact on resource demand," they noted.

Future clinical studies should address whether hybrid strategies have the additional advantage of increasing screening adherence, they concluded.

This study was carried out by Archimedes under a contract with The Permanente Medical Group (TPMG). One author, Dr. Theodore R. Levin, is a TPMG shareholder, and another, Cindy Caldwell, is a TPMG employee. The authors reported having no other conflicts of interest.

An important step toward hybrid screening

Screening for colorectal cancer (CRC) is currently based on strategies employing single tests, with the exception of the sigmoidoscopy/fecal occult blood test combination. In the United States, colonoscopy has emerged as a dominant CRC screening modality, given its effectiveness for CRC prevention. Drawbacks include increased risk for complications, especially in older patients, and higher cost. Fecal immunochemical testing (FIT) outperforms the older-generation guaiac-based stool tests and has emerged as the prime noninvasive CRC screening option.

Ongoing randomized controlled trials are focused on head-to-head comparisons of colonoscopy versus FIT (or usual care); however, colonoscopy and FIT have complementary strengths and limitations, which make hybrid screening approaches logical and attractive from the clinical and economic standpoints. For example, in the Spanish ColonPrev study, subjects randomized to the FIT group were more likely to participate in screening; however, subjects in the colonoscopy group had more adenomas detected.

A hybrid strategy could capitalize on colonoscopy's higher effectiveness and FIT's lower cost and better adherence, while attenuating the drawbacks of colonoscopy's invasiveness and FIT's lower sensitivity for adenoma detection.

In the present simulation model, a hybrid strategy based on annual or biennial FIT starting at age 50, followed by a single colonoscopy at age 66, resulted in decreases in CRC incidence and mortality, gains in quality-adjusted life-years, and cost reductions comparable to those of single-test strategies.

The study findings, as with any simulation exercise, depend largely upon the baseline assumptions, notably regarding test sensitivity and patient adherence. However, Dinh et al.'s study is an important first step to determine the viability of hybrid screening approaches, and paves the way for future clinical studies.

Dr. Charles Kahi is associate professor of clinical medicine at the Indiana University School of Medicine, and gastroenterology section chief at Roudebush VA Medical Center, both in Indianapolis. He had no relevant financial disclosures.

Article Source

FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY

Vitals

Major finding: A hybrid screening strategy reduced colorectal cancer incidence by 73%.

Data source: A cost-effectiveness analysis using a simulation model in 100,000 subjects.

Disclosures: This study was carried out by Archimedes under a contract with The Permanente Medical Group (TPMG). One author, Dr. Theodore R. Levin, is a TPMG shareholder, and another, Cindy Caldwell, is a TPMG employee. The authors reported having no other conflicts of interest.

Carbonation affects brain processing of sweet stimuli

Carbonation produces a decrease in the neural processing of sweetness-related signals, particularly those from sucrose, a small functional neuroimaging study shows.

The findings, which suggest that the combination of CO2 and sucrose might increase consumption of sucrose, could have implications for dietary interventions designed to regulate caloric intake, according to Dr. Francesco Di Salle of Salerno (Italy) University and his colleagues.

To assess the interference between CO2 and perception of sweetness, as well as the differential effects of CO2 on sucrose and aspartame-acesulfame (As-Ac, an artificial sweetener combination commonly used in diet beverages), the investigators performed two functional magnetic resonance imaging (fMRI) experiments to evaluate changes in regional brain activity.

The first experiment, performed in nine volunteers, analyzed the effect of carbonation in four sweet Sprite-based solutions, including one carbonated and sweetened with sucrose, one noncarbonated and sweetened with sucrose, one carbonated and sweetened with As-Ac, and one noncarbonated and sweetened with As-Ac. The second experiment evaluated the spatial location of the strongest neural effects of sour taste and CO2 within the insular cortex of eight subjects.

On fMRI, the presence of carbonation in sweet solutions "independently of the sweetening agent, reduced neural activity in the anterior insula (AI), orbitofrontal cortex (OFC), and posterior pons ... the effect of carbonation on sucrose was much higher than on perception of As-Ac," they noted, explaining that "at the perceptual level ... carbonation reduced the perception of sweetness and the differences between the sensory profiles of sucrose and As-Ac."

This effect may increase sucrose intake, but it also favors diet beverage formulations being perceived as similar to regular formulations, the investigators reported online May 28 ahead of print in Gastroenterology.

"It is also coherent with a process of prioritization among perceptual inputs (chemesthetic and gustatory information) deriving from the same body topography and converging to the same cortical regions (AI, OFC), they said (Gastroenterology 2013 [doi:10.1053/j.gastro.2013.05.041]).

To correlate neuroimaging with behavioral data, the ability of carbonation to modulate perception of sweetness was assessed in 14 subjects, who scored the level of perceived sweetness of the solutions on a visual analog scale ranging from 0 to 100 mm. The effect of 1,585 ppm of CO2 added to a 10% glucose solution on the perception of sweetness was also tested in seven subjects.

CO2 significantly reduced sweetness perception as assessed by the volunteers' visual analog scale ratings: The perception of Sprite-associated sweetness was significantly reduced by CO2 (48 vs. 63 and 48 vs. 55 for As-Ac and sucrose, respectively).

"Similarly, in the presence of carbonation, sweet-induced perception of a 10% glucose solution was significantly reduced (36 vs. 53), the investigators said.

Given the widespread use of CO2 in sweet beverages, the modulation of sweet perception by CO2 is of interest, they noted.

The finding that CO2 modulates the perception of sweetness – reducing the global neural processing of sweetness, affecting the processing of sucrose more than that of As-Ac, and narrowing the processing difference between the sweetening agents – is "of utmost importance for designing carbonated beverages and is relevant to the regulation of caloric intake," they said.

"This effect is driven by the integration of information on gastric fullness and on nutrient depletion, conveyed to a brain network where the autonomic brainstem circuitry and tractus solitarius neurons play a critical role in homeostatic functions," they added.

It may be that taste and CO2-related information influence food choices and intake through integration in the tractus solitarius with input from the gastrointestinal tract, they suggested, explaining that "the reduced discrimination between sucrose and As-Ac induced by CO2 would promote the consumption of low-calorie beverages and would converge with CO2-induced gastric distention in limiting caloric intake."

This study was supported in part by the Coca-Cola Company. One author, Dr. Rosario Cuomo, was sponsored by the Coca-Cola Company. The remaining authors reported having no disclosures.

Article Source

FROM GASTROENTEROLOGY

Vitals

Major finding: The presence of CO2 produces an overall decrease in neural processing of sweetness-related signals.

Data source: A small brain neuroimaging study.

Disclosures: This study was supported in part by the Coca-Cola Company. One author, Dr. Rosario Cuomo, was sponsored by the Coca-Cola Company. The remaining authors reported having no disclosures.

Endoscopy, surgery for pancreatic pseudocysts show equal efficacy

Endoscopic cystogastrostomy was as effective as surgical cystogastrostomy for pancreatic pseudocyst drainage in a randomized trial comparing the two approaches.

None of the 20 patients randomized to undergo endoscopic treatment, and 1 of 20 patients randomized to undergo surgery, experienced pseudocyst recurrence within 24 months of follow-up, Dr. Shyam Varadarajulu of the University of Alabama at Birmingham and his colleagues reported online May 31, ahead of print in Gastroenterology.

Moreover, those in the endoscopy group had a shorter hospital length of stay than did the patients in the surgery group (median of 2 vs. 6 days) and a lower mean cost of care ($7,011 vs. $15,052), the investigators reported (Gastroenterology 2013 May 31 [doi: 10.1053/j.gastro.2013.05.046]).

Patients included in the study were adults with intrapancreatic or extrapancreatic pseudocysts who were enrolled between Jan. 20 and Dec. 28, 2009, following evaluation by a gastroenterologist or surgeon in an outpatient clinic or inpatient setting.

The 20 patients in the endoscopy group underwent cystogastrostomy using endoscopic ultrasound guidance and fluoroscopy while they were under conscious sedation.

"Once the pseudocyst was identified, it was accessed using a 19-gauge needle, and the gastric wall was dilated up to 15 mm using a wire-guided balloon. Two plastic stents then were deployed to facilitate the drainage of pseudocyst contents into the stomach," the investigators explained, noting that endoscopy patients were discharged following the procedure.

No procedural complications occurred in any of the 20 patients. However, one patient presented to the hospital 13 days later with persistent abdominal pain; a computed tomography scan showed a residual 7-cm pseudocyst, which was successfully treated by deployment of additional stents. At 8-week follow-up, abdominal CT scans showed pseudocyst resolution in all 20 patients.

Endoscopic retrograde cholangiopancreatography (ERCP), which was performed in all of the endoscopy patients to assess and treat any pancreatic duct leaks, was successful in 18 of the 20 patients. Magnetic resonance cholangiopancreatography (MRCP), performed in the two patients in whom ERCP was unsuccessful, showed a normal pancreatic duct in one and a disconnected duct in the other, the investigators said.

The 20 patients in the surgery group were all treated by the same pancreatic surgeon, who used an endovascular stapler to create at least a 6-cm cystogastrostomy after obtaining entry to the pseudocyst.

"A nasogastric tube then was left in the stomach and passed into the pseudocyst cavity to allow for intermittent irrigation until postoperative day 1 ... the nasogastric tube was removed on postoperative day 1 and clear liquids were started on day 2," they said.

Patients were discharged once a soft diet was tolerated and pain adequately controlled.

One patient with ongoing alcohol consumption developed pseudocyst recurrence at 4 months and was managed by endoscopic cystogastrostomy.

Two surgery patients experienced complications, including a wound infection treated by local debridement and antibiotics in one patient, and a case of hematemesis in one patient who was on anticoagulation and who was readmitted 9 days after discharge. "At endoscopy, a visible clot was noted at the site of surgical anastomosis, and hemostasis was achieved by application of electrocautery," the investigators said.

Two other patients were not able to tolerate oral intake postoperatively; one of them was managed conservatively, and one required surgical placement of a temporary enteral feeding tube. In addition, one patient presented at 6 months with abdominal pain and was found on ERCP to have a stricture in the pancreatic tail that required management by distal pancreatectomy.

Overall, there were no differences in the rates of treatment success, treatment failure, complications, or reinterventions between the endoscopy and surgery groups.

However, in addition to the shorter hospital stay and lower costs in the endoscopy group, patients in that group had significantly greater improvement over time in physical and mental health component scores on the Medical Outcomes Study 36-Item Short-Form General Survey. Although scores improved in both cohorts, they were 4.48 points and 4.41 points lower, respectively, in the surgery group than in the endoscopy group, the investigators said.

The findings are of note because although endoscopic drainage of pancreatic pseudocysts is increasingly performed, surgical cystogastrostomy is still considered the gold standard for treatment, as randomized trials comparing the two approaches had not previously been performed.

"The clinical relevance of this study is substantial because it shows that endoscopically managed patients can be discharged home earlier with a better health-related quality of life, and treatment can be delivered at a lower cost," the investigators said.

The authors reported having no disclosures.

A paradigm shift in clinical practice

There has been marked evolution in the understanding and management of acute and chronic pancreatitis over the last decade. Walled-off necroses and pseudocysts are consequences of pancreatitis that may be intrapancreatic, extrapancreatic, or both. These two entities are often confused. Fortunately, a recent international consensus has clarified that pseudocysts are liquid-filled, are almost always extrapancreatic, and rarely occur as the consequence of severe pancreatitis or involve "disconnected duct" (Gut 2013;62:102-11).

Walled-off necroses may be intra- or extrapancreatic and almost always contain solid material. Regardless of their name, encapsulated collections in or around the pancreas have traditionally been treated by surgical drainage or debridement. There is now international consensus based on prospective randomized trials that for walled-off necroses, whether infected or sterile, minimally invasive approaches including a minimally invasive step-up approach and/or endoscopic necrosectomy are superior to open surgery (Pancreas 2012;8:1176-94).  Although pseudocysts are much easier to manage endoscopically than are walled-off necroses, there has not previously been a randomized trial comparing treatment strategies.

Dr. Varadarajulu and his colleagues are to be congratulated for performing a landmark study comparing surgery and endoscopy for internal drainage of pseudocysts (Gastroenterology 2013 May 31 [doi: 10.1053/j.gastro.2013.05.046]). They covered all the bases for an outstanding efficacy trial, including performance by experts at a tertiary center and careful definitions of endpoints. Although the title of the paper is "Equal efficacy … [of the two approaches]," based on the primary endpoint of recurrence at 24 months, they addressed cost, hospital stay, and quality-of-life measures, all increasingly important in the current health care environment. In the latter regard, endoscopic ultrasound-guided cystogastrostomy emerged as clearly superior to open surgery. If patients with more comorbidity, such as portal hypertension, were included, the differences would likely have been even more striking.

Thus, for pseudocysts, as for walled-off necroses, the picture is becoming increasingly clear: Minimally invasive and in particular endoscopic techniques are superior to open surgical approaches. This represents a paradigm shift in clinical practice. However, to be effective and safe in widespread applicability, it is incumbent that endoscopists attempting to manage these conditions have highly specialized expertise in pancreatic diseases and techniques, and manage these complex patients in close collaboration with their colleagues in surgery and interventional radiology.  

Dr. Martin L. Freeman, FACG, FASGE, is professor of medicine at the University of Minnesota, Minneapolis. He disclosed receiving speaking honoraria from Boston Scientific and Cook, and consulting for Boston Scientific.
Author and Disclosure Information

Publications
Topics
Legacy Keywords
Endoscopic cystogastrostomy, surgical cystogastrostomy, pancreatic pseudocyst drainage, surgery, Dr. Shyam Varadarajulu, Gastroenterology
Sections
Author and Disclosure Information

Author and Disclosure Information

Body

There has been marked evolution in the understanding and management of acute and chronic pancreatitis over the last decade. Walled-off necroses and pseudocysts are consequences of pancreatitis that may be intrapancreatic, extrapancreatic, or both. These two entities are often confused. Fortunately, a recent international consensus has clarified that pseudocysts are liquid-filled, are almost always extrapancreatic, and rarely occur as the consequence of severe pancreatitis or involve "disconnected duct" (Gut 2013;62:102-11).

Dr. Martin L. Freeman
Walled-off necroses may be intra- or extrapancreatic and almost always contain solid material. Regardless of their name, encapsulated collections in or around the pancreas have traditionally been treated by surgical drainage or debridement. There is now international consensus based on prospective randomized trials that for walled-off necroses, whether infected or sterile, minimally invasive approaches including a minimally invasive step-up approach and/or endoscopic necrosectomy are superior to open surgery (Pancreas 2012;8:1176-94).  Although pseudocysts are much easier to manage endoscopically than are walled-off necroses, there has not previously been a randomized trial comparing treatment strategies.

Dr. Varadarajulu and his colleagues are to be congratulated for performing a landmark study comparing surgery and endoscopy for internal drainage of pseudocysts (Gastroenterology 2013 May 31 [doi: 10.1053/j.gastro.2013.05.046]). They covered all the bases for an outstanding efficacy trial, including performance by experts at a tertiary center, and careful definitions of endpoints. Although the title of the paper is "Equal efficacy … [of the two approaches]," based on the primary endpoint of recurrence at 24 months, they addressed cost, hospital stay, and quality of life measures, all increasingly important in the current health care environment. In the latter regard, endoscopic ultrasound-guided cystgastrostomy emerged to be clearly superior to open surgery. If patients with more comorbidity such as portal hypertension were included, the differences would likely have been even more striking.

Thus, for pseudocysts, as for walled-off necroses, the picture is becoming increasingly clear: Minimally invasive and in particular endoscopic techniques are superior to open surgical approaches. This represents a paradigm shift in clinical practice. However, to be effective and safe in widespread applicability, it is incumbent that endoscopists attempting to manage these conditions have highly specialized expertise in pancreatic diseases and techniques, and manage these complex patients in close collaboration with their colleagues in surgery and interventional radiology.  

Dr. Martin L. Freeman, FACG, FASGE, is professor of medicine at the University of Minnesota, Minneapolis. He disclosed receiving speaking honoraria from Boston Scientific and Cook, and consulting for Boston Scientific.
Body

There has been marked evolution in the understanding and management of acute and chronic pancreatitis over the last decade. Walled-off necroses and pseudocysts are consequences of pancreatitis that may be intrapancreatic, extrapancreatic, or both. These two entities are often confused. Fortunately, a recent international consensus has clarified that pseudocysts are liquid-filled, are almost always extrapancreatic, and rarely occur as the consequence of severe pancreatitis or involve "disconnected duct" (Gut 2013;62:102-11).

Dr. Martin L. Freeman
Walled-off necroses may be intra- or extrapancreatic and almost always contain solid material. Regardless of their name, encapsulated collections in or around the pancreas have traditionally been treated by surgical drainage or debridement. There is now international consensus based on prospective randomized trials that for walled-off necroses, whether infected or sterile, minimally invasive approaches including a minimally invasive step-up approach and/or endoscopic necrosectomy are superior to open surgery (Pancreas 2012;8:1176-94).  Although pseudocysts are much easier to manage endoscopically than are walled-off necroses, there has not previously been a randomized trial comparing treatment strategies.

Dr. Varadarajulu and his colleagues are to be congratulated for performing a landmark study comparing surgery and endoscopy for internal drainage of pseudocysts (Gastroenterology 2013 May 31 [doi: 10.1053/j.gastro.2013.05.046]). They covered all the bases for an outstanding efficacy trial, including performance by experts at a tertiary center, and careful definitions of endpoints. Although the title of the paper is "Equal efficacy … [of the two approaches]," based on the primary endpoint of recurrence at 24 months, they addressed cost, hospital stay, and quality of life measures, all increasingly important in the current health care environment. In the latter regard, endoscopic ultrasound-guided cystgastrostomy emerged to be clearly superior to open surgery. If patients with more comorbidity such as portal hypertension were included, the differences would likely have been even more striking.

Thus, for pseudocysts, as for walled-off necroses, the picture is becoming increasingly clear: Minimally invasive and in particular endoscopic techniques are superior to open surgical approaches. This represents a paradigm shift in clinical practice. However, to be effective and safe in widespread applicability, it is incumbent that endoscopists attempting to manage these conditions have highly specialized expertise in pancreatic diseases and techniques, and manage these complex patients in close collaboration with their colleagues in surgery and interventional radiology.  

Dr. Martin L. Freeman, FACG, FASGE, is professor of medicine at the University of Minnesota, Minneapolis. He disclosed receiving speaking honoraria from Boston Scientific and Cook, and consulting for Boston Scientific.
Title
A paradigm shift in clinical practice
A paradigm shift in clinical practice

Endoscopic cystogastrostomy was as effective as surgical cystogastrostomy for pancreatic pseudocyst drainage in a randomized trial comparing the two approaches.

None of the 20 patients randomized to undergo endoscopic treatment, and 1 of 20 patients randomized to undergo surgery, experienced pseudocyst recurrence within 24 months of follow-up, Dr. Shyam Varadarajulu of the University of Alabama at Birmingham and his colleagues reported online May 31, ahead of print in Gastroenterology.

Source: American Gastroenterological Association

Moreover, those in the endoscopy group had a shorter hospital length of stay than did the patients in the surgery group (median of 2 vs. 6 days) and a lower mean cost of care ($7,011 vs. $15,052), the investigators reported (Gastroenterology 2013 May 31 [doi: 10.1053/j.gastro.2013.05.046]).

Patients included in the study were adults with intrapancreatic or extrapancreatic pseudocysts who were enrolled between Jan. 20 and Dec. 28, 2009, following evaluation by a gastroenterologist or surgeon in an outpatient clinic or inpatient setting.

The 20 patients in the endoscopy group underwent cystogastrostomy using endoscopic ultrasound guidance and fluoroscopy while they were under conscious sedation.

"Once the pseudocyst was identified, it was accessed using a 19-gauge needle, and the gastric wall was dilated up to 15 mm using a wire-guided balloon. Two plastic stents then were deployed to facilitate the drainage of pseudocyst contents into the stomach," the investigators explained, noting that endoscopy patients were discharged following the procedure.

No procedural complications occurred in any of the 20 patients. However, one patient presented to the hospital 13 days later with persistent abdominal pain; a computed tomography scan showed a residual 7-cm pseudocyst, which was successfully treated by deployment of additional stents. At 8-week follow-up, abdominal CT scans showed pseudocyst resolution in all 20 patients.

Endoscopic retrograde cholangiopancreatography (ERCP), which was performed in all of the endoscopy patients to assess and treat any pancreatic duct leaks, was successful in 18 of the 20 patients. Magnetic resonance cholangiopancreatography (MRCP), performed in those two patients, showed a normal pancreatic duct in one and a disconnected duct in the other, the investigators said.

The 20 patients in the surgery group were all treated by the same pancreatic surgeon, who used an endovascular stapler to create at least a 6-cm cystogastrostomy after obtaining entry to the pseudocyst.

"A nasogastric tube then was left in the stomach and passed into the pseudocyst cavity to allow for intermittent irrigation until postoperative day 1 ... the nasogastric tube was removed on postoperative day 1 and clear liquids were started on day 2," they said.

Patients were discharged once a soft diet was tolerated and pain adequately controlled.

One patient with ongoing alcohol consumption developed pseudocyst recurrence at 4 months and was managed by endoscopic cystogastrostomy.

Two surgery patients experienced complications, including a wound infection treated by local debridement and antibiotics in one patient, and a case of hematemesis in one patient who was on anticoagulation and who was readmitted 9 days after discharge. "At endoscopy, a visible clot was noted at the site of surgical anastomosis, and hemostasis was achieved by application of electrocautery," the investigators said.

Two other patients were not able to tolerate oral intake postoperatively; one of them was managed conservatively, and one required surgical placement of a temporary enteral feeding tube. In addition, one patient presented at 6 months with abdominal pain and was found on ERCP to have a stricture in the pancreatic tail that required management by distal pancreatectomy.

Overall, there were no differences in the rates of treatment success, treatment failure, complications, or reinterventions between the endoscopy and surgery groups.

However, in addition to the shorter hospital stay and lower costs in the endoscopy group, patients in that group had significantly greater improvement over time in physical and mental health component scores on the Medical Outcomes Study 36-Item Short-Form General Survey. Although scores improved in both cohorts, the gains in the physical and mental component scores were 4.48 points and 4.41 points smaller, respectively, in the surgery group than in the endoscopy group, the investigators said.

The findings are notable because, although endoscopic drainage of pancreatic pseudocysts is increasingly performed, surgical cystogastrostomy has remained the gold standard for treatment, in part because the two approaches had not previously been compared in a randomized trial.

"The clinical relevance of this study is substantial because it shows that endoscopically managed patients can be discharged home earlier with a better health-related quality of life, and treatment can be delivered at a lower cost," the investigators said.

The authors reported having no disclosures.

ginews@gastro.org

FROM GASTROENTEROLOGY

Vitals

Major finding: Pseudocysts recurred in 0 of 20 endoscopy patients and in 1 of 20 surgery patients.

Data source: An open-label randomized trial involving 40 patients.

Disclosures: The authors reported having no disclosures.

LV fibrosis predicts mortality in atrial fib patients

Findings raise cause/effect questions
Article Type
Changed
Fri, 12/07/2018 - 15:44
Display Headline
LV fibrosis predicts mortality in atrial fib patients

Left ventricular late gadolinium enhancement – a marker of myocardial fibrosis – occurs commonly and may predict mortality in patients with atrial fibrillation, a prospective observational study has shown.

Of 664 consecutive patients with atrial fibrillation and no known prior myocardial infarction who were referred for radiofrequency ablation of the pulmonary veins between September 2006 and June 2011, 88 (13%) had unanticipated left ventricular late gadolinium enhancement (LV LGE) identified via cardiac magnetic resonance imaging, and 68 died over a median follow-up of 42 months. The mortality rate was 8.1% among those with LV LGE, compared with 2.3% among those without LV LGE, Dr. Thomas G. Neilan of Brigham and Women’s Hospital and Massachusetts General Hospital, Boston, and his colleagues reported online Aug. 28 in the Journal of the American College of Cardiology.

After adjustment for key variables, including sex, diabetes, and heart failure, the presence of LV LGE was significantly linked with mortality. Age and the extent of LGE were the strongest independent predictors of mortality (hazard ratios, 1.05 and 1.15, respectively), the investigators said (J. Am. Coll. Cardiol. 2013 Aug. 28 [doi: 10.1016/j.jacc.2013.07.067]).

The presence of LV LGE provided strong prognostic information, they said, noting that each 1% increase in LGE was associated with a 15% increased risk of death.

The findings were similar when an additional 56 patients with a history of myocardial infarction were included in the analysis, they noted.

Patients in the study had an average age of 56 years. Most (73%) were men.

The pattern of LV LGE was ischemic in 59% and nonischemic in 41%; among those with no history of myocardial infarction, the pattern was ischemic in 50% and nonischemic in 50%.

The findings provide needed information about the presence, pattern, and prognostic significance of left ventricular myocardial fibrosis in patients with atrial fibrillation.

The presence of LV LGE provides "strong and complementary" prognostic information in patients with several conditions, such as congenital heart disease, myocardial infarction, and myocarditis, but limited data are available regarding the presence and prognostic significance of LV LGE in patients with atrial fibrillation, the investigators said.

The current findings highlight the frequency of LV LGE in this population and its strong association with mortality, and support "the robust and additive prognostic information" provided by cardiac magnetic resonance imaging in patients being referred for pulmonary vein isolation, they concluded, adding that the findings also may support further study in this high-risk cohort.

These findings by Dr. Neilan and his colleagues suggest that the presence of late gadolinium enhancement on cardiac magnetic resonance has important consequences, but the causal link between atrial fibrillation and left ventricular fibrosis is difficult to ascertain, Dr. Zhiyu Ling and Dr. Harikrishna Tandri wrote in an editorial.

The findings, which are the result of "an attempt to connect the dots between AF and all cause mortality," also suggest that cardiac magnetic resonance may be the preferred imaging modality for certain higher-risk patients undergoing catheter ablation for AF, they said (J. Am. Coll. Cardiol. 2013 Aug. 28 [doi: 10.1016/j.jacc.2013.08.692]).

"The finding of LGE in the LV might identify patients with occult coronary disease or a cardiomyopathy, or trigger aggressive risk factor management of modifiable risks such as sleep apnea and hypertension. Whether this strategy will better risk stratify patients at risk of mortality and thus change the observed outcomes needs to be tested in prospective studies designed specifically to answer this question," they wrote.

Additional study is also warranted to investigate the prevalence of LV LGE and its association with cardiovascular mortality in patients with persistent AF and severe comorbidities and to "establish whether LGE is a major independent predictive factor of cardiovascular mortality in patients with AF," they said.

Dr. Ling and Dr. Tandri are with Johns Hopkins Hospital, Baltimore. They reported having no relevant financial disclosures.

FROM JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY

Vitals

Major finding: Mortality was 8.1% in patients with LV LGE identified via cardiac magnetic resonance imaging, vs. 2.3% in those without LV LGE.

Data source: A prospective cohort study in 720 AF patients.

Disclosures: This study was supported by a Fellow to Faculty grant from the American Heart Association, and by project grants from the National Institutes of Health to study coauthors M. Jerosch-Herold, Ph.D., and Dr. Raymond Y. Kwong. The remaining authors reported having no relevant financial disclosures.

Vitamin C protects kidneys against angiography contrast

Ascorbic acid findings represent advancement
Article Type
Changed
Wed, 12/12/2018 - 19:59
Display Headline
Vitamin C protects kidneys against angiography contrast

Ascorbic acid may protect against contrast-induced acute kidney injury in patients undergoing coronary angiography, a meta-analysis of nine randomized controlled trials has shown.

The overall incidence of contrast-induced acute kidney injury (CI-AKI) among 740 patients who received ascorbic acid and who were included in the final analysis was 9.6%, compared with 16.8% in 796 patients who received placebo or an alternative pharmacologic treatment, Dr. Umar Sadat of Cambridge (England) University Hospitals NHS Foundation Trust, and colleagues reported.

"In the pooled analysis using random effects model, patients receiving ascorbic acid had 33% less risk of CI-AKI compared to the control group (risk ratio, 0.672)," a statistically significant difference, the investigators wrote.

The findings were published online Aug. 28 in the Journal of the American College of Cardiology.

The investigators systematically reviewed Medline, Embase, and Cochrane central databases for studies published from inception to May 2013 on the incidence of CI-AKI in patients undergoing coronary angiography. Studies that were included in the meta-analysis had at least one arm that involved treatment with ascorbic acid alone or in combination with saline hydration. Ultimately, nine studies involving a total of 1,536 patients with baseline renal impairment were included (J. Am. Coll. Cardiol. 2013 Aug. 28).

The findings, which provide "robust evidence that ascorbic acid reduces the risk of CI-AKI, albeit by a small magnitude," suggest that ascorbic acid has nephroprotective qualities and could form part of effective prophylactic pharmacologic regimens to protect patients undergoing coronary angiography against CI-AKI.

It makes sense that ascorbic acid – a form of vitamin C – could provide nephroprotection, because strong evidence suggests it acts as a potent antioxidant by scavenging physiologically relevant reactive oxygen species (ROS), they explained, noting that ROS-induced oxidative stress and renal vasoconstriction have been implicated in the etiology of CI-AKI.

The findings are important, because the incidence of CI-AKI is rising in tandem with the increasing number of contrast media–enhanced radiologic procedures and with a rise in the octogenarian population with comorbidities such as hypertension, diabetes, and renovascular disease that predispose patients to renal impairment, the investigators said.

However, further investigation regarding the optimal dosage and route of administration of ascorbic acid in order to assess its full potential as a nephroprotective agent is warranted, they concluded.

The findings of this meta-analysis represent an advancement in the field, and frame ascorbic acid as a potential therapy to be evaluated in large-scale clinical trials, according to Dr. Peter A. McCullough and Dr. Krittapoom Akrawinthawong.

This is important given the lack of "bona fide preventive approaches" to protect against CI-AKI until less-toxic iodinated contrast becomes available, they wrote in an editorial (J. Am. Coll. Cardiol. 2013 Aug. 28).

However, several questions about the potential benefits of ascorbic acid in this setting remain unanswered, they noted.

The key question remains: "If we prevent or lessen CI-AKI as determined by serum creatinine, will we reduce the rates of clinical outcomes including end-stage renal disease, mortality, and more secondary events including the development of heart failure, recurrent acute coronary syndromes, or stroke?"

This question can be answered only by large-scale trials with effective therapies and adequate follow-up, they said.

Dr. McCullough is with Providence Hospitals and Medical Centers, Southfield and Novi, Mich. Dr. Akrawinthawong is with St. John Hospital and Medical Center, Detroit. They said they had no relevant financial disclosures.

FROM JACC

Vitals

Major finding: Ascorbic acid treatment was associated with a 33% reduction in CI-AKI risk.

Data source: A meta-analysis of nine randomized controlled trials involving 1,536 patients.

Disclosures: The investigators reported having no relevant financial disclosures.

Omega-3 fatty acid found in fish may reduce RA risk

Article Type
Changed
Fri, 12/07/2018 - 15:44
Display Headline
Omega-3 fatty acid found in fish may reduce RA risk

Consistent dietary intake of long-chain n-3 polyunsaturated fatty acids typically found in fish was associated with a reduced risk of rheumatoid arthritis in a prospective cohort study of Swedish women.

The study of 32,232 middle-aged and older Swedish women included 205 who developed rheumatoid arthritis (RA) during a mean follow-up of 7.5 years. Those who reported intake of more than 0.21 g/day of dietary long-chain n-3 polyunsaturated fatty acids (PUFAs), also called omega-3 fatty acids, at the time of an assessment in 1997 had a 35% reduction in the risk of developing RA, compared with women who reported lower intake (adjusted relative risk, 0.65), reported Dr. Daniela Di Giuseppe of the Karolinska Institute, Stockholm, and her colleagues.

© Suprijono Suharjoto/Fotolia.com
A new study found that consumption of long-chain n-3 polyunsaturated fatty acids typically found in fish was associated with a reduced risk of rheumatoid arthritis.

"The population prevented fraction, the equivalent of the population attributable risk when analyzing a protective factor, was 0.28. This indicates that 28% of the hypothetical total load of disease could be prevented by exposure to greater than 0.21 g/day of dietary long-chain n-3 PUFAs," the investigators wrote.

Consistent long-term intake of more than 0.21 g/day was associated with even greater risk reduction. Women who reported intake at that level at assessments in both 1987 and 1997 had a 52% reduction in risk, compared with those with lower intake (adjusted RR, 0.48), they said.

Those who reported consistent long-term consumption of at least one serving of fish weekly had a 29% reduction in risk, compared with those who had less than one serving weekly (adjusted RR, 0.71), they noted (Ann. Rheum. Dis. 2013 Aug. 12 [doi: 10.1136/annrheumdis-2013-203338]).

"The inverse association between fish consumption and RA can be attributed mainly to its content of long-chain n-3 PUFAs," they wrote.

Of note, the use of fish oil supplementation was not associated with the development of RA in this study (relative risk, 1.32 for ever vs. never users). However, there were only 17 users, and information about dose and duration of use was lacking.

Study subjects were participants in the population-based Swedish Mammography Cohort. The women, who were aged 54-89 years at follow-up, completed detailed food-frequency questionnaires at the 1987 and 1997 assessments.

The multivariable relative risks for dietary long-chain n-3 PUFAs were adjusted for cigarette smoking, alcohol intake, aspirin use, and energy intake. The multivariable relative risks for long-term fish consumption were also adjusted for quartiles of red meat and dairy food consumption.

The findings have implications for dietary guidelines with respect to fish consumption, the investigators said.

RA is known to be influenced by both genetic and environmental factors, but little is known about important modifiable risk factors other than smoking and alcohol consumption, they said, noting that the evidence regarding a role for dietary long-chain n-3 PUFAs has been limited, and study results have been conflicting.

However, the current study suggests that intake of more than 0.21 g/day is indeed of benefit. That intake is roughly equivalent to at least one serving of fatty fish, such as salmon, or four servings of lean fish, such as cod, each week, the investigators explained. They suggested that dietary long-chain n-3 PUFAs may protect against RA through their anti-inflammatory properties, particularly through the synthesis of eicosanoids.

These intake levels are in line with recommendations from the American Dietary Guideline Advisory Committee, which advised eating fish twice weekly.

"In conclusion, the study indicates a potentially important role for dietary long-chain n-3 PUFAs in the etiology of RA, and that adherence to existing dietary guidelines regarding fish consumption may also be beneficial in terms of RA risk," they wrote.

The study was supported by research grants from the Swedish Research Council/Committee for Research Infrastructure for Maintenance of the Swedish Mammography Cohort, and from the Karolinska Institute’s Award for Ph.D. Students. The authors reported having no disclosures.

FROM ANNALS OF THE RHEUMATIC DISEASES

Vitals

Major finding: More than 0.21 g/day intake of dietary long-chain n-3 PUFAs was associated with a 35% reduction in RA risk (adjusted relative risk, 0.65).

Data source: A population-based, prospective cohort study of 32,232 Swedish women.

Disclosures: The study was supported by research grants from the Swedish Research Council/Committee for Research Infrastructure for Maintenance of the Swedish Mammography Cohort, and from the Karolinska Institute’s Award for Ph.D. Students. The authors reported having no disclosures.

High dependency predicts BMI increase during smoking cessation

Article Type
Changed
Fri, 01/18/2019 - 12:55
Display Headline
High dependency predicts BMI increase during smoking cessation

Smokers who are heavily addicted to nicotine are significantly more likely to gain weight when they try to quit, researchers report.

Investigators studying 186 patients who successfully quit smoking after receiving smoking cessation therapy at an outpatient clinic found that mean body mass index (BMI) increased significantly, from 23.5 kg/m2 at the initial consultation to 23.9 kg/m2 at 3 months after the start of therapy. On multivariate analysis, a high Fagerstrom Test for Nicotine Dependence (FTND) score, indicating severe dependency, was the strongest predictor of that increase, based on a gender-adjusted standardized coefficient.

Dr. Maki Komiyama of Kyoto (Japan) Medical Center and colleagues reported their findings online Aug. 21 in the open access journal PLoS One (PLoS One 2013 Aug. 21 [doi:10.1371/journal.pone.0072010]).

The findings are important because, while smoking cessation is known to reduce cardiovascular and cancer risk and to reduce all-cause mortality, associated weight gain is linked with greater risk of glucose intolerance and a reduction in the beneficial effects that quitting has on pulmonary function. Concerns about weight gain also can lead to a failure to quit smoking, they said.

"Even if one is expected to experience post-cessation weight gain, quitting smoking still leads to a reduced cardiovascular risk. However, there is also a possibility that if one can prevent post-cessation weight gain, then this will further reduce the cardiovascular risk due to having ceased smoking," they wrote.

"Even if one is expected to experience post-cessation weight gain, quitting smoking still leads to a reduced cardiovascular risk"

Thus, they continued, the ability to predict which patients are likely to gain weight during smoking cessation therapy – and to institute weight control measures accordingly at the outset – could lead to improved outcomes, and the findings of this study may be useful for identifying such patients.

Study participants were 132 men and 54 women with a mean age of 59.6 years who visited the smoking cessation clinic at the National Hospital Organization Kyoto Medical Center between July 2007 and November 2011 and successfully quit smoking.

Other factors found on univariate analysis to be significantly associated with BMI increase included triglyceride level, high-density lipoprotein cholesterol, and daily cigarette consumption. "To further investigate ... we performed multivariate analysis. The results demonstrated that the triglyceride level and FTND score were factors determining the post-cessation BMI increase, and that the FTND score was the strongest one," the investigators wrote.

An FTND score of 8 or more (on a scale of 0-10) was associated with a significantly larger post-cessation BMI increase than a score of 7 or less, they noted.

"The result that a high FTND score was the most important determinant of a BMI increase supports the hypothesis that post-cessation weight gain is one of the nicotine withdrawal symptoms," they said.

As for the association between triglyceride elevation and weight gain, the mechanism is not clearly understood and requires further study, they noted.

With the exception of two patients who did not receive medical treatment, study participants were treated with either oral varenicline (95 patients) or a nicotine patch (89 patients). No difference was seen between the varenicline and nicotine patch groups with respect to BMI increase, although the varenicline group had higher nicotine dependency.

They also noted that, in their study, "although a significant increase in BMI was confirmed after smoking-cessation therapy, the BMI increase was only 0.4 kg/m2 (1.1 kg), which is much smaller than reported in previous studies for people who quit smoking on their own initiative (2.8-3.8 kg)."

Additional study is needed to determine the appropriate timing for initiating interventions against post–smoking cessation weight gain, they noted.

This study was supported by a grant-in-aid for clinical research from the National Hospital Organization and the Pfizer Health Research Foundation. The authors reported that one study drug (varenicline) is manufactured by Pfizer but confirmed "that this does not alter their adherence to all the PLoS One policies on sharing data and materials."


Article Source

FROM PLOS ONE

Vitals

Major finding: FTND score was a strong predictor of BMI increase (gender-adjusted standardized coefficient, 0.236). BMI increased significantly from 23.5 kg/m2 at an initial consultation to 23.9 kg/m2 at 3 months after the start of smoking cessation therapy.

Data source: A study of 186 adults who successfully completed smoking cessation therapy.

Disclosures: This study was supported by a grant-in-aid for clinical research from the National Hospital Organization and the Pfizer Health Research Foundation. The authors reported that one study drug (varenicline) is manufactured by Pfizer but confirmed "that this does not alter their adherence to all the PLOS ONE policies on sharing data and materials."
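For readers who want to sanity-check the weight change implied by the BMI figures above, the conversion is simple arithmetic. A minimal sketch in Python; the mean height is an assumption here, back-solved from the paper's own 0.4 kg/m2-to-1.1 kg conversion rather than reported in this summary:

```python
# Sanity check: convert the reported BMI increase into kilograms.
# BMI = weight / height**2, so delta_weight = delta_bmi * height**2.

delta_bmi = 23.9 - 23.5       # kg/m^2, initial consultation -> 3 months
assumed_height_m = 1.66       # assumed mean height; implied, not reported

delta_weight_kg = delta_bmi * assumed_height_m ** 2
print(f"BMI increase of {delta_bmi:.1f} kg/m^2 is roughly {delta_weight_kg:.1f} kg")
# -> BMI increase of 0.4 kg/m^2 is roughly 1.1 kg
```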

Vedolizumab shows promise in Crohn’s, ulcerative colitis

A new biologic on the block: a little slower but maybe safer

The humanized monoclonal antibody vedolizumab is more effective than placebo for induction and maintenance therapy in both ulcerative colitis and Crohn’s disease, according to two separate randomized, controlled phase III studies: GEMINI 1 and GEMINI 2.

The findings from the double-blind, multinational studies were published in the Aug. 22 issue of the New England Journal of Medicine.

GEMINI 1

In GEMINI 1, the clinical response rates at 6 weeks in 374 patients (cohort 1) with active ulcerative colitis who were randomized to receive either induction therapy with vedolizumab or placebo were 47.1% and 25.5%, respectively (P less than .001). In 521 patients (cohort 2) who received open-label vedolizumab, the clinical response rate at 6 weeks was 44.3%.
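The article does not report it, but an absolute risk difference and number needed to treat fall directly out of these induction figures; a minimal sketch, assuming the quoted 6-week response rates:

```python
import math

vedolizumab, placebo = 0.471, 0.255   # 6-week clinical response, cohort 1

abs_diff = vedolizumab - placebo      # absolute difference in response rates
nnt = math.ceil(1 / abs_diff)         # patients treated per extra responder

print(f"absolute difference {abs_diff:.1%}, NNT about {nnt}")
# -> absolute difference 21.6%, NNT about 5
```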

In a trial of maintenance therapy, those patients from both cohorts who responded to vedolizumab at week 6 were then randomized to receive either vedolizumab or placebo every 4 or 8 weeks for up to 52 weeks. The clinical remission rates at 52 weeks were 44.8% in the group that received vedolizumab every 4 weeks, 41.8% in the group that received it every 8 weeks, and 15.9% among the patients who were switched to placebo, Dr. Brian G. Feagan of the University of Western Ontario, London, and his colleagues reported on behalf of the GEMINI 1 Study Group.

Patients in the vedolizumab groups were treated with 300 mg IV at weeks 0 and 2. Clinical response was defined as a reduction of at least 3 points in the 0- to 12-point Mayo Clinic score and a decrease of at least 30% from the baseline score, along with a decrease in the rectal bleeding subscore of at least 1 point or an absolute rectal bleeding subscore of 0 or 1. The secondary outcome of clinical remission was defined as a Mayo Clinic score of 2 or less, with no subscore higher than 1, along with mucosal healing, defined by an endoscopic subscore of 0 or 1, the investigators said (N. Engl. J. Med. 2013;369:699-710).
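As a reading aid, the response and remission definitions above can be written out as predicates. This is a minimal sketch of the stated criteria, not the trial's analysis code; the function and variable names are illustrative:

```python
def clinical_response(base_mayo, wk6_mayo, base_bleed, wk6_bleed):
    """GEMINI 1 clinical response as defined above: a drop of >= 3 points
    and >= 30% in the 0-12 Mayo Clinic score, plus a rectal bleeding
    subscore that falls >= 1 point or sits at 0 or 1 in absolute terms."""
    drop = base_mayo - wk6_mayo
    return (drop >= 3
            and drop >= 0.30 * base_mayo
            and (base_bleed - wk6_bleed >= 1 or wk6_bleed <= 1))


def clinical_remission(wk6_mayo, subscores, endoscopic):
    """Mayo score <= 2 with no subscore > 1, plus mucosal healing
    (endoscopic subscore of 0 or 1)."""
    return wk6_mayo <= 2 and max(subscores) <= 1 and endoscopic <= 1


# A patient dropping from Mayo 9 to 4 with bleeding 2 -> 0 is a responder.
print(clinical_response(9, 4, 2, 0))                      # True
print(clinical_remission(2, [1, 0, 1, 0], endoscopic=0))  # True
```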

The GEMINI 1 researchers noted that "all prespecified, primary and secondary outcomes in the trial of induction and maintenance therapy were superior in vedolizumab-treated patients versus those who received placebo," and added that longitudinal assessment of a number of factors, such as Mayo Clinic scores and use or dose of glucocorticoids, provided further evidence of a treatment benefit.

Furthermore, disease had been refractory to other treatments in many patients, they noted.

While the study was not designed to identify the time of the maximal effect of treatment as induction therapy, or a minimally effective dose regimen, it appears that treatment every 8 weeks may be an acceptable starting regimen – with dose intensification if needed, they said. "Vedolizumab is effective as both induction and maintenance therapy for patients with moderately to severely active ulcerative colitis," they concluded.

GEMINI 2

In GEMINI 2, the clinical remission rates at 6 weeks in 368 patients with active Crohn’s disease who were randomized to receive either induction therapy with vedolizumab or placebo were 14.5% and 6.8%, respectively (P = .02), and a total of 31.4% and 25.7% of patients, respectively (P = .23), had a Crohn’s Disease Activity Index-100 (CDAI-100) response, defined as a decrease in the CDAI score of at least 100 points. Of 747 patients who received open-label vedolizumab, 17.7% had a clinical remission and 34.4% had a CDAI-100 response at 6 weeks.

Those patients who responded to vedolizumab in the induction phase were randomly assigned to receive either placebo or maintenance treatment every 4 or 8 weeks until week 52. The clinical remission rates at 52 weeks were 36.4% in the group that received the drug every 4 weeks, 39.0% in the group receiving it every 8 weeks, and 21.6% in the placebo group, Dr. William J. Sandborn of the University of California, San Diego, La Jolla, and his colleagues reported on behalf of the GEMINI 2 Study Group (N. Engl. J. Med. 2013; 369:711-21).

As in GEMINI 1, patients in GEMINI 2 who were assigned to the vedolizumab groups were treated with 300 mg IV at weeks 0 and 2. In GEMINI 2, clinical remission was defined as a CDAI score of 150 or less.
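The corresponding GEMINI 2 endpoints are even simpler to state; a minimal sketch of the definitions above, with illustrative names:

```python
def cdai100_response(baseline_cdai, week6_cdai):
    """CDAI-100 response: a decrease of at least 100 points."""
    return baseline_cdai - week6_cdai >= 100

def cdai_remission(cdai):
    """Clinical remission: a CDAI score of 150 or less."""
    return cdai <= 150

print(cdai100_response(320, 210), cdai_remission(140))  # True True
```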

In GEMINI 1, the frequency of adverse events was similar in both the treatment and placebo groups, whereas in GEMINI 2, vedolizumab, compared with placebo, was associated with a higher rate of serious adverse events (24.4% vs. 15.3%), infections (44.1% vs. 40.2%), and serious infections (5.5% vs. 3.0%), the investigators said.

Patients in both GEMINI 1 and 2 were aged 18-80 years and were enrolled between 2008 and 2012 at more than 200 participating medical centers in more than 30 countries.
The findings are important because existing medical therapies for ulcerative colitis and Crohn’s disease have significant limitations, including toxic effects, and new treatment strategies are needed, the investigators said.

The GEMINI 2 investigators noted that patients with moderately to severely active Crohn’s disease, in whom conventional therapy failed, were more likely than those receiving placebo to experience remission at 6 weeks. They were not more likely to have a CDAI-100 response, however.

While the modest effect of treatment on induction of clinical remission, as well as the nonsignificant effect on the CDAI-100, require consideration, and while questions remain about which specific patients with Crohn’s disease may derive the most benefit from vedolizumab and about potential synergistic effects of combining vedolizumab with immunosuppressive agents, the findings nonetheless suggest a role for vedolizumab in Crohn’s disease, they noted.

"In an analysis of patients who had a response to induction therapy with vedolizumab, the rates of clinical remission, CDAI-100 response, and glucocorticoid-free remission at week 52 were higher among patients receiving vedolizumab every 8 weeks or every 4 weeks than among patients who were switched to placebo," they said.

GEMINI 1 and GEMINI 2 were funded by Millennium Pharmaceuticals. The authors disclosed multiple potential conflicts of interest; the details are available with the full text of the articles at NEJM.org.

Dr. Siddhartha Parker

Vedolizumab uses gut-selective blockade of the alpha4beta7 subunit to affect lymphocyte trafficking. In the GEMINI studies, benefit over placebo was seen in both ulcerative colitis (UC) and Crohn’s disease. While both primary and secondary endpoints were clinically significant in UC, the results were not quite as robust for Crohn’s disease, which may be due to the relatively early timing (6 weeks) of the coprimary endpoint assessment. Even so, the GEMINI studies show some of the most promising maintenance data seen for inflammatory bowel disease therapy, in addition to a low rate of developing antibodies against vedolizumab (4%).

The safety profile is equally encouraging. Likely due to vedolizumab’s gut-selective blockade, serious infections may occur less often than with other biologic agents. Furthermore, its alpha4beta7 selectivity differentiates it from natalizumab, theoretically eliminating the risk of progressive multifocal leukoencephalopathy (PML). No cases of PML have been reported in the large drug development program.

Dr. Corey A. Siegel

With its apparent durability of response and reassuring safety profile, vedolizumab may in fact be positioned earlier in the treatment paradigm than other immune-suppressive agents. At least for UC, it is reasonable to consider its use after 5-aminosalicylates fail. Vedolizumab’s somewhat slower onset, compared with anti–tumor necrosis factor (anti-TNF) agents, may require either patience if symptoms are tolerable, or the coadministration of corticosteroids to induce remission while waiting for its maintenance benefit to kick in. We hope to use what we’ve learned about biologics from 15 years of anti-TNFs to quickly determine how best to optimize vedolizumab in our clinical practice.

Dr. Siddhartha Parker is a fellow in gastroenterology at Dartmouth-Hitchcock Medical Center, Lebanon, N.H., and Dr. Corey A. Siegel is associate professor of medicine at the Geisel School of Medicine at Dartmouth, Hanover, N.H., and director of the Dartmouth-Hitchcock Inflammatory Bowel Disease Center. Dr. Siegel serves on the advisory boards for Takeda Pharmaceuticals, Abbvie, Janssen, and UCB.


Article Source

FROM THE NEW ENGLAND JOURNAL OF MEDICINE


Vitals

Major finding: Six-week clinical response with vedolizumab vs. placebo for ulcerative colitis: 47.1% vs. 25.5%; 6-week clinical remission with vedolizumab vs. placebo for Crohn’s disease: 14.5% vs. 6.8%.

Data source: Two separate phase III studies including a total of more than 2,000 patients.

Disclosures: GEMINI 1 and GEMINI 2 were funded by Millennium Pharmaceuticals. The authors disclosed multiple potential conflicts of interest; the details are available with the full text of the articles at NEJM.org.

Meta-analysis: Lateral wedges don’t reduce medial knee OA pain


Lateral wedge insoles are ineffective, compared with control interventions, for reducing pain in patients with medial knee osteoarthritis, according to a meta-analysis of data from 12 randomized, controlled trials.

The findings of this meta-analysis suggest that although lateral wedge insoles have been considered a possible means for reducing medial loading by easing "the physical stress applied to that compartment of the joint" and thereby reducing painful knee symptoms, the available evidence does not support their use for this indication, first author Matthew J. Parkes of the University of Manchester (England) Institute of Inflammation and Repair and his colleagues reported. The study was published in the Aug. 21 issue of JAMA.

"...We found that compared with neutral inserts, lateral wedges had no association with knee pain (SMD, -0.03) and heterogeneity was much lower across trial findings."

When data from all 12 trials were considered, the overall effect estimate for lateral wedge insoles was a standardized mean difference (SMD) in pain between interventions of –0.47. This represents a moderate, statistically significant pain reduction with lateral wedges, and translates into an effect size of –2.12 on the 0-20 Western Ontario and McMaster Universities Arthritis Index (WOMAC) pain scale.

However, the effects were highly heterogeneous across the studies, and a significant difference in treatment effect was noted based on the type of control condition used, with a lesser effect seen in the seven trials that used a neutral wedge as the control, the investigators said.

"When trials were grouped according to the control group treatment, we found that compared with neutral inserts, lateral wedges had no association with knee pain (SMD, -0.03) and heterogeneity was much lower across trial findings," they wrote.

The SMD of –0.03 based on these studies represented an effect size of only –0.12 between lateral wedges and neutral wedges on the WOMAC pain subscale (JAMA 2013;310:722-30).
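The SMD-to-WOMAC conversion is worth making explicit: multiplying an SMD by the pooled standard deviation of the outcome expresses it in scale points. In the sketch below, the roughly 4.5-point SD is an assumption, back-solved from the article's own –0.47 to –2.12 conversion rather than reported in this summary, which is why the subgroup figure differs slightly from the published –0.12:

```python
# Back-of-envelope conversion of standardized mean differences to WOMAC points.
pooled_sd_womac = 4.5   # assumed SD on the 0-20 WOMAC pain scale (-2.12 / -0.47)

for smd, label in ((-0.47, "all 12 trials"), (-0.03, "neutral-insert controls")):
    print(f"SMD {smd:+.2f} ({label}) -> about {smd * pooled_sd_womac:+.2f} points")
# SMD -0.47 -> about -2.12 points; SMD -0.03 -> about -0.14 points
# (the article reports -0.12 for the subgroup, i.e., a slightly smaller pooled SD)
```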

The investigators identified the studies included in the meta-analysis through an extensive search of the literature, including searches of multiple databases for studies published from the earliest available date to May 2013. The 12 trials that met inclusion criteria involved a total of 885 patients, including 502 who received lateral wedge treatment. The primary outcome in the trials was self-reported pain.

The findings are of note because the results of studies examining knee pain after lateral wedge treatment have been inconsistent and have led to conflicting recommendations.

"For example, in recent osteoarthritis treatment guidelines, the American College of Rheumatology did not recommend lateral wedge insoles as a treatment for medial knee osteoarthritis. On the other hand, the Osteoarthritis Research Society International treatment guidelines state, ‘Lateral wedged insoles can be of symptomatic benefit for some patients with medial tibiofemoral compartment [osteoarthritis] OA,’ " the investigators wrote, adding that in the United Kingdom, the National Institute for Health and Care Excellence considers "footwear with shock-absorbing properties" to be worth consideration in the absence of well-designed trial data.

The identification of effective nonsurgical treatment for knee OA is a high priority, given the increasing prevalence of the disease, the limited efficacious treatment options, and the increase in the rates of knee replacement, they said, noting that medial osteoarthritis is one of the most common subtypes of knee osteoarthritis.

This study was funded by a grant from Arthritis Research UK and by a grant from the National Institute for Health and Care Excellence to two individual authors. Multiple authors disclosed potential conflicts of interest, including a National Institute for Health Research clinical doctoral fellowship; institutional salary or grant support from Arthritis Research UK, serving as a consultant for Sunovion Pharmaceuticals and Knee Creations Ltd., and as associate editor for Arthritis Care & Research; serving as a continuing medical education activity editor and receiving payment for CME case presentations from Vindico Medical Education; and receiving grants from the Arthritis Foundation, the National Institute on Aging, and the Foundation for Physical Medicine & Rehabilitation.


Article Source

FROM JAMA


Vitals

Major finding: Lateral wedges vs. neutral controls did not reduce pain, based on a standardized mean difference of –0.03 and an effect size of –0.12 out of 20 points on the WOMAC pain scale.

Data source: A meta-analysis of 12 trials involving a total of 885 participants.

Disclosures: This study was funded by a grant from Arthritis Research UK and by a grant from the National Institute for Health and Care Excellence to two individual authors. Multiple authors disclosed potential conflicts of interest, including a National Institute for Health Research clinical doctoral fellowship; institutional salary or grant support from Arthritis Research UK, serving as a consultant for Sunovion Pharmaceuticals and Knee Creations Ltd., and as associate editor for Arthritis Care & Research; serving as a continuing medical education activity editor and receiving payment for CME case presentations from Vindico Medical Education; and receiving grants from the Arthritis Foundation, the National Institute on Aging, and the Foundation for Physical Medicine & Rehabilitation.

Urinary albumin, incident heart disease linked in black adults

Findings reinforce urinary ACR importance

A higher urinary albumin-to-creatinine ratio was associated with an increased risk of incident coronary heart disease in black adults in the large, population-based REGARDS study.

No such association was seen in white adults in the prospective cohort study, suggesting that black individuals are more susceptible to vascular injury, according to Dr. Orlando M. Gutierrez of the University of Alabama at Birmingham and his colleagues, who reported the findings on behalf of the Reasons for Geographic and Racial Differences in Stroke (REGARDS) investigators.

Over a mean follow-up period of 4.5 years, 616 incident coronary heart disease events – 421 nonfatal myocardial infarctions and 195 CHD deaths – occurred in 23,273 individuals who were free of CHD at baseline. The incidence rates of CHD per 1,000 person-years of follow-up increased with increasing albumin-to-creatinine ratio (ACR) in these patients, and the increases were significantly greater for black adults, compared with white adults, the investigators reported. The study was published in the Aug. 21 issue of JAMA.
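As a rough check on these figures, the overall event rate implied by the cohort size and mean follow-up can be computed directly; person-time is approximated below as cohort size times mean follow-up, whereas the published analysis uses exact per-person follow-up, so this is only a sketch. The category-specific rates follow in the next paragraph.

```python
events = 616                 # incident CHD events (421 nonfatal MIs + 195 deaths)
cohort = 23_273              # participants free of CHD at baseline
mean_followup_years = 4.5

person_years = cohort * mean_followup_years
rate = events / person_years * 1_000
print(f"about {rate:.1f} incident CHD events per 1,000 person-years overall")
# -> about 5.9 incident CHD events per 1,000 person-years overall
```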

Age- and sex-adjusted incidence rates were nearly 1.5-fold greater in black adults than in white adults in the two highest categories of ACR: For black vs. white participants with ACR of 30-300 mg/g, the incidence rates per 1,000 person-years were 11.2 and 8.0, respectively, and for those with ACR greater than 300 mg/g, the rates were 20.6 and 13.6, respectively, both significant differences.

After adjustment for traditional cardiovascular risk factors and medications, higher baseline urinary ACR (greater than 300 mg/g vs. less than 10 mg/g) was associated with greater risk of incident CHD among blacks, but not whites (hazard ratios of 3.21 and 1.49, respectively), they said.
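For reference, the ACR categories used in these comparisons can be written as a small helper; a minimal sketch, with category labels drawn from standard nomenclature rather than the authors' code:

```python
def acr_category(albumin_mg: float, creatinine_g: float) -> str:
    """Urinary albumin-to-creatinine ratio in mg/g, bucketed as in this report."""
    acr = albumin_mg / creatinine_g
    if acr < 10:
        return "low normal (<10 mg/g)"
    if acr < 30:
        return "high normal (10-30 mg/g)"
    if acr <= 300:
        return "microalbuminuria (30-300 mg/g)"
    return "macroalbuminuria (>300 mg/g)"

print(acr_category(45.0, 1.0))   # microalbuminuria (30-300 mg/g)
```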

A similar association was not seen for recurrent CHD. Over 4.4 years of follow-up, 468 recurrent CHD events – 279 nonfatal MIs and 189 CHD deaths – occurred in 4,934 individuals who had CHD at baseline. No differences were seen between black and white adults in this group with respect to baseline urinary ACR and first recurrent CHD event (JAMA 2013; 310:706-13).

The REGARDS study, a population-based investigation of stroke incidence in black and white adults in the United States, enrolled individuals aged 45 years and older between 2003 and 2007 and oversampled those who self-reported as black and those living in the U.S. stroke belt.

Black individuals are known to have higher levels of urinary albumin excretion than white individuals – a finding that may contribute to racial disparities in cardiovascular outcomes. Previous REGARDS study findings showed that an association between urinary ACR and incident stroke differed by race, and that higher urinary ACR was independently associated with a greater risk of incident stroke in blacks, but not whites, the investigators said.

However, little is known about racial differences with respect to the association of urinary ACR and cardiovascular outcomes apart from stroke, they noted.

"These findings confirm the results of prior studies showing that urinary ACR is an important biomarker for CHD risk in the general population, even among individuals with ACR values that are less than the current threshold for defining microalbuminuria (30 mg/g). Additionally, to our knowledge, this is the first study to demonstrate that the higher risk of incident CHD associated with excess ACR differs by race," they said.

The findings contribute to increasing evidence suggesting that blacks are more susceptible than are whites to vascular injury, and suggest that this greater susceptibility may account for much of the excess risk of cardiovascular disease events, including stroke and CHD, in black individuals, they added.

This study is limited by a number of factors, including the use of a single measure of ACR, which may have led to exposure misclassification for some patients, and also by reduced power to detect significant associations due to relatively few events occurring in some ACR categories. Also, only black and white adults were included in REGARDS, which may limit the applicability of the results to other races and ethnicities, the investigators noted.

Nonetheless, the findings indicate that higher urinary ACR is a stronger risk factor for incident CHD events (but not recurrent CHD events) in black individuals than in white individuals, they said, concluding that future studies should examine whether the addition of ACR can improve the diagnosis and management of CHD in black individuals.

This study was supported by a cooperative agreement from the National Institute of Neurological Disorders and Stroke and from the National Heart, Lung, and Blood Institute. Dr. Gutierrez was supported by grants from the National Institute of Diabetes and Digestive and Kidney Diseases and from NINDS. Amgen provided funding in the form of an investigator-initiated grant-in-aid. Several study authors disclosed ties with Amgen, REATA Pharmaceuticals, Arbor Research, Sanofi-Genzyme, and/or diaDexus.


These findings from the REGARDS study highlight the complexities inherent in the relation between albuminuria and cardiovascular disease risk, and underscore the importance of urine ACR elevations, Dr. Daniel E. Weiner and Dr. Wolfgang C. Winkelmayer wrote in an editorial.

Key questions raised by the study are, why do black individuals have higher levels of albuminuria than white individuals, and what can be done to reduce associated cardiovascular disease risk in those at higher risk, they said.

The questions could only be answered in a setting of equal care access and use, and equally healthy living strategies beginning early in life, "such that genetic factors that may influence kidney disease can be distinguished from factors related to indolent chronic diseases (metabolic syndrome, hypertension, type 2 diabetes, and prediabetes)," they said, noting that such diseases are at least somewhat preventable with healthy living, are more common in black individuals and people of lower socioeconomic status, and are associated with cardiovascular disease and higher albuminuria (JAMA. 2013;310:697-8).

"Until these complex relationships are better disentangled, the study by Dr. Gutierrez and colleagues reinforces that even mild elevations in urine ACR are associated with increased CVD risk, even though this level of albuminuria will have no meaningful systemic effects," they said, adding that differentiating between low normal (less than 10 mg/g) and high normal (10-30 mg/g) urinary ACR may help with cardiovascular risk stratification, particularly in black individuals, perhaps leading to preventive efforts and improved monitoring.

Dr. Weiner is with Tufts Medical Center, Boston. He reported having no disclosures. Dr. Winkelmayer is with Stanford (Calif.) University. He reported having served as an adviser or consultant to Amgen and numerous other pharmaceutical and device manufacturers.


A higher urinary albumin-to-creatinine ratio was associated with an increased risk of incident coronary heart disease in black adults in the large, population-based REGARDS study.

No such association was seen in white adults in the prospective cohort study, suggesting that black individuals are more susceptible to vascular injury, according to Dr. Orlando M. Gutierrez of the University of Alabama at Birmingham and his colleagues, who reported the findings on behalf of the Reasons for Geographic and Racial Differences in Stroke (REGARDS) investigators.

Over a mean follow-up period of 4.5 years, 616 incident coronary heart disease events – 421 nonfatal myocardial infarctions and 195 CHD deaths – occurred in 23,273 individuals who were free of CHD at baseline. The incidence rates of CHD per 1,000 person-years of follow-up increased with increasing albumin-to-creatinine ratio (ACR) in these patients, and the increases were significantly greater for black adults, compared with white adults, the investigators reported. The study was published in the Aug. 21 issue of JAMA.
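As a rough illustration of how such rates are computed (assuming, purely for arithmetic's sake, that every participant contributed the mean follow-up of 4.5 years; actual person-time accrual varies by participant), the overall incident CHD rate works out to about

\[
\frac{616\ \text{events}}{23{,}273 \times 4.5\ \text{person-years}} \times 1000 \approx 5.9\ \text{events per 1,000 person-years.}
\]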

Age- and sex-adjusted incidence rates were nearly 1.5-fold greater in black adults than in white adults in the two highest categories of ACR: For black vs. white participants with ACR of 30-300 mg/g, the incidence rates per 1,000 person-years were 11.2 and 8.0, respectively, and for those with ACR greater than 300 mg/g, the rates were 20.6 and 13.6, respectively; both differences were significant.

After adjustment for traditional cardiovascular risk factors and medications, higher baseline urinary ACR (greater than 300 mg/g vs. less than 10 mg/g) was associated with greater risk of incident CHD among blacks, but not whites (hazard ratios of 3.21 and 1.49, respectively), they said.
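For readers curious how covariate-adjusted hazard ratios of this kind are typically estimated, the sketch below fits a Cox proportional hazards model to hypothetical data. The column names, the single adjustment covariate, and the use of the open-source lifelines library are illustrative assumptions, not the investigators' actual analysis:

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: time at risk, a CHD event indicator, an indicator for
# the highest ACR category (>300 mg/g vs. the <10 mg/g referent), and one
# traditional risk factor to adjust for (real analyses adjust for many).
rng = np.random.default_rng(42)
n = 5_000
cohort = pd.DataFrame({
    "followup_years": rng.exponential(scale=4.5, size=n),
    "chd_event": rng.integers(0, 2, size=n),
    "acr_over_300": rng.integers(0, 2, size=n),
    "hypertension": rng.integers(0, 2, size=n),
})

# All columns other than the duration and event columns are treated as
# covariates, so the ACR estimate is adjusted for hypertension here.
cph = CoxPHFitter()
cph.fit(cohort, duration_col="followup_years", event_col="chd_event")

# Exponentiating the fitted coefficients yields adjusted hazard ratios,
# the quantity reported in the study (e.g., 3.21 among black adults).
print(np.exp(cph.params_))

Race-specific estimates such as the 3.21 and 1.49 reported here would come from race-stratified models or a race-by-ACR interaction term; the sketch shows only the mechanics of an adjusted hazard ratio.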

A similar association was not seen for recurrent CHD. Over 4.4 years of follow-up, 468 recurrent CHD events – 279 nonfatal MIs and 189 CHD deaths – occurred in 4,934 individuals who had CHD at baseline. No differences were seen between black and white adults in this group with respect to the association between baseline urinary ACR and a first recurrent CHD event (JAMA. 2013;310:706-13).

The REGARDS study, a population-based investigation of stroke incidence in black and white adults in the United States, enrolled individuals aged 45 years and older between 2003 and 2007, oversampling those who self-reported as black and those living in the U.S. stroke belt.

Black individuals are known to have higher levels of urinary albumin excretion than white individuals – a finding that may contribute to racial disparities in cardiovascular outcomes. Previous REGARDS findings showed that the association between urinary ACR and incident stroke differed by race: Higher urinary ACR was independently associated with a greater risk of incident stroke in black, but not white, adults, the investigators said.

However, little is known about racial differences with respect to the association of urinary ACR and cardiovascular outcomes apart from stroke, they noted.

"These findings confirm the results of prior studies showing that urinary ACR is an important biomarker for CHD risk in the general population, even among individuals with ACR values that are less than the current threshold for defining microalbuminuria (30 mg/g). Additionally, to our knowledge, this is the first study to demonstrate that the higher risk of incident CHD associated with excess ACR differs by race," they said.

The findings contribute to increasing evidence suggesting that blacks are more susceptible than are whites to vascular injury, and suggest that this greater susceptibility may account for much of the excess risk of cardiovascular disease events, including stroke and CHD, in black individuals, they added.

This study is limited by a number of factors, including the use of a single measure of ACR, which may have led to exposure misclassification for some patients, and also by reduced power to detect significant associations due to relatively few events occurring in some ACR categories. Also, only black and white adults were included in REGARDS, which may limit the applicability of the results to other races and ethnicities, the investigators noted.

Nonetheless, the findings indicate that higher urinary ACR is a strong risk factor for incident CHD events (but not recurrent CHD events) in black, but not white, individuals, they said, concluding that future studies should examine whether the addition of ACR can improve the diagnosis and management of CHD in black individuals.

This study was supported by a cooperative agreement from the National Institute of Neurological Disorders and Stroke and from the National Heart, Lung, and Blood Institute. Dr. Gutierrez was supported by grants from the National Institute of Diabetes and Digestive and Kidney Diseases and from NINDS. Amgen provided funding in the form of an investigator-initiated grant-in-aid. Several study authors disclosed ties with Amgen, REATA Pharmaceuticals, Arbor Research, Sanofi-Genzyme, and/or diaDexus.

Article Source

FROM JAMA


Vitals

Major finding: Adjusted hazard ratios for incident CHD with urinary ACR greater than 300 mg/g (vs. less than 10 mg/g) were 3.21 in black adults and 1.49 in white adults.

Data source: The prospective, population-based REGARDS cohort study of 28,207 black and white adults (23,273 free of CHD at baseline and 4,934 with CHD at baseline).

Disclosures: This study was supported by a cooperative agreement from the National Institute of Neurological Disorders and Stroke and from the National Heart, Lung, and Blood Institute. Dr. Gutierrez was supported by grants from the National Institute of Diabetes and Digestive and Kidney Diseases and from NINDS. Amgen provided funding in the form of an investigator-initiated grant-in-aid. Several study authors disclosed ties with Amgen, REATA Pharmaceuticals, Arbor Research, Sanofi-Genzyme, and/or diaDexus.