Study busts three migraine trigger myths
SAN FRANCISCO – Half of migraineurs suspect chocolate can trigger their migraine attacks, but new evidence from a large prospective study suggests almost 99% of them are mistaken.
Additional analyses of this same dataset of 774 individuals with migraine threw cold water on two other widely accepted putative migraine triggers: neck pain/tension and dietary nitrate intake, Stephen Donoghue, PhD, reported at the annual meeting of the American Headache Society.
He presented a series of statistical analyses of often-cited migraine attack triggers, conducted in 774 migraineurs who registered to use N1–Headache, a sophisticated proprietary digital headache diary. At the outset, participants rated on a 0-10 scale how strongly they suspected chocolate, neck pain or tension, and other factors acted as triggers for migraine attacks in their own experience. They then spent 2-3 minutes daily tracking more than 70 migraine-related elements for at least 90 days, recording the data on an iPhone or iPad rather than in a more hit-or-miss conventional paper diary.
The N1–Headache software, developed by Curelator of Cambridge, Mass., then tackled the key issue of what proportion of individuals who suspected a given trigger actually showed a statistically significant association prospectively between an individual’s day-to-day variations in experiencing that trigger and their headache risk, explained Dr. Donoghue, vice president for clinical development for Curelator.
Chocolate
The Curelator concept involves bringing personalized medicine to headache patients by identifying their true migraine attack triggers, which can enable individuals to deal with those triggers without the disruption involved in unnecessarily avoiding numerous nontriggers or missing real triggers.
The study results punctured some widely held beliefs. For example, at baseline 51% of migraineurs indicated they suspected chocolate served to some degree as a trigger for their own migraine attacks; the majority of them rated their suspicion as moderate or strong. However, in the individually determined correlations, chocolate was indeed associated with migraine attacks in only 1.3% of those who suspected it to be a trigger. Moreover, in another 3.9% of those who suspected chocolate, consumption was actually associated with decreased risk; in other words, for them, chocolate appeared to protect against migraine, despite their preconceptions.
Also, among the 49% of participants who didn’t consider chocolate to be a personal migraine attack trigger, the Curelator analysis demonstrated that chocolate consumption was associated with a significantly increased risk of migraine attack in 2.2%, and a significantly lower risk of migraine in another 1.5%.
Neck pain/tension
Eighty percent of migraineurs who registered to use N1–Headache via the company’s website or the App Store indicated they believed neck pain/tension to be a migraine trigger for them; 46% rated it as a strong trigger. The detailed analysis of 90 days’ worth of data revealed that 32% of participants with adequate data showed a statistically significant association between neck pain/tension and migraine headache.
The strength of an individual’s suspicion of neck pain/tension as a trigger was associated with the frequency of a statistically confirmed association. However, unlike for chocolate, there were zero instances in which neck pain/tension was associated with protection against migraine.
In a twist, when the investigators reanalyzed their data after eliminating those instances where neck pain/tension occurred 1 day before the start of the headache, the association disappeared.
“The temporal association that we find using the lag-day analysis strongly suggests that neck pain/tension is part of the symptomatology of migraine attacks rather than acting as a trigger,” Dr. Donoghue observed.
This concept is strongly supported by a recent German study that used neck muscle electromyography to establish the same point (J Headache Pain. 2018 Mar 20;19[1]:26), he added.
Dr. Donoghue offered one caveat regarding the Curelator analyses: “We can’t say whether or not neck pain/tension and these other factors are actually triggers. What we’re looking at is associations. We’re not showing causality.”
Nitrates in food
Of the participants in this Curelator study, 45% suspected nitrate intake was a trigger for their migraine attacks, including 24% who rated the strength of their suspicion as moderate or strong. Among those who suspected nitrates, Cox proportional hazards modeling identified 2.2% in whom a significant association between nitrate consumption and increased migraine attack risk was present and another 1.1% in whom nitrates were associated with decreased risk, suggesting a possible protective effect in some cases. An individual’s strength of suspicion regarding nitrates proved unrelated to the likelihood of an actual association.
In subjects who did not suspect nitrates as a migraine trigger, Dr. Donoghue and colleagues identified 3.5% in whom nitrate intake was actually statistically associated with increased risk and 1.7% in whom it was linked to decreased risk.
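The per-individual association testing described above can be pictured, in highly simplified form, as a Cox proportional hazards analysis run separately on each person’s diary, with time to the next attack as the outcome and the trigger exposure during that interval as a covariate. The sketch below, written in Python with the open-source lifelines library and invented diary rows, is purely illustrative of that idea; the actual N1–Headache algorithm is proprietary and is not reproduced here.

# Illustrative only: a per-person Cox model of time-to-next-attack versus trigger
# exposure. The diary rows are invented and the modeling choices are assumptions,
# not Curelator's published method.
import pandas as pd
from lifelines import CoxPHFitter

# One row per inter-attack interval for a single diary user:
# duration = days until the next attack, event = 1 if an attack ended the interval
# (0 = censored), chocolate = whether chocolate was eaten during the interval.
diary = pd.DataFrame({
    "duration":  [3, 7, 2, 5, 4, 6, 1, 8],
    "event":     [1, 1, 1, 0, 1, 1, 1, 0],
    "chocolate": [1, 0, 1, 0, 0, 1, 0, 1],
})

cph = CoxPHFitter()
cph.fit(diary, duration_col="duration", event_col="event")

# A significant positive coefficient would flag chocolate as associated with increased
# attack risk for this individual; a significant negative one, with decreased risk.
print(cph.summary[["coef", "exp(coef)", "p"]])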
The study was funded by Curelator, where Dr. Donoghue is employed.
SOURCE: Donoghue S et al. Headache. 2018 Jun;58(S2):104, 109, 110.
REPORTING FROM THE AHS ANNUAL MEETING
Key clinical point: Migraine patients are most often way off base regarding their suspected attack triggers.
Major finding: Half of migraineurs suspect chocolate triggers their headache attacks, and almost 99% of them are wrong.
Study details: This prospective study included 774 migraineurs who completed a detailed electronic headache diary for at least 90 consecutive days, with correlations between migraine attacks and putative triggers being analyzed by proprietary software.
Disclosures: The study was sponsored by Curelator and presented by a company executive.
Source: Donoghue S et al. Headache. 2018 Jun;58(S2):104, 109, 110.
Migraine and menopause: Longitudinal study shows what to expect
SAN FRANCISCO – What can women with migraine expect during the menopausal transition?
About 60% will experience a change in their headache pattern. And for 60% of that group, it’s a change for the worse, Yu-Chen Cheng, MD, reported at the annual meeting of the American Headache Society.
She presented a retrospective longitudinal study of 60 women with a preexisting history of migraine who were followed through the menopausal transition. All had long-term medical records available, including brain imaging results and hormonal laboratory data.
The impetus for the study was the fact that even though three-quarters of America’s estimated 38 million migraineurs are women, all of whom will eventually undergo menopause, the question of what happens to them headache-wise as they go through this process of permanent cessation of ovarian function has received little research attention.
“This longitudinal study addresses the pattern of change of migraine during menopausal transition, an important but underestimated and undermanaged issue. We need more awareness of this. We hope in the future that physicians can pay more attention to this and provide better treatment for our patients with impaired quality of life,” said Dr. Cheng, a neurologist and postdoctoral fellow at Massachusetts General Hospital and Harvard Medical School, Boston.
Of the 35 women who experienced a change in their migraine attacks in association with menopause, the change occurred perimenopausally or postmenopausally – that is, after the final menstrual period – in 84% of cases. Premenopausal change in migraine in women who hadn’t yet missed a menstrual period in the past 12 months was a less frequent event.
No significant demographic differences existed between the 35 women with migraine change during the menopausal transition and the 25 women whose headache pattern remained stable. However, there were significant differences between the two groups in terms of the change over time in serum estradiol and follicle-stimulating hormone (FSH) levels. The median estradiol level in women whose migraine pattern remained stable went from 29 pg/mL premenopausally to 16.5 pg/mL post menopause, a statistically nonsignificant difference. In contrast, the median estradiol in women who experienced a change in migraine pattern dropped from 52.6 pg/mL premenopausally to 22.5 pg/mL post menopause, which was a significant difference.
Similarly, the pre- to postmenopause change in median FSH from 38.6 to 62.8 IU/L in the stable migraine group didn’t attain statistical significance, while the bigger shift in the migraine change group – from 13.5 IU/L premenopausally to 62.2 IU/L post menopause – was statistically significant.
“So we can say there’s a greater hormonal change in the migraine change group for women in the menopausal transition,” the neurologist said. “This suggests the possibility that a significant steep decline in estradiol level may stimulate migraine change.”
Brain imaging findings in the two groups were similar: Nearly two-thirds of women in both groups had normal brain MRI results, while the rest had nonspecific findings.
Several female headache specialists in the audience rose to thank Dr. Cheng for shining new light on a major understudied issue with far-reaching quality-of-life implications. Could hormone replacement therapy possibly prevent worsening of migraine attacks in association with menopause? she was asked.
Dr. Cheng noted that hormone replacement therapy was used by about two-thirds of women whose migraines remained stable and a similar proportion of those whose headaches changed. But the study wasn’t designed or sized to examine any possible migraine-preventive effect of hormone therapy. That would properly be addressed in a large prospective study. Anecdotally, however, it has been her clinical impression as well as that of some of her fellow neurologists at Massachusetts General that hormone replacement therapy does seem to protect against worsening migraine attacks in menopause, she added.
Dr. Cheng reported having no financial conflicts regarding her National Institutes of Health–funded study.
SOURCE: Cheng Y-C and Maleki N. Headache. 2018;58:71. Abstract OR16.
REPORTING FROM THE AHS ANNUAL MEETING
Key clinical point: For migraineurs, the menopausal transition is a time of change in headache pattern, often for the worse.
Major finding: Sixty percent of migraineurs experienced a change in headache pattern during the menopausal transition, and for 60% of them it involved worsening migraine intensity and/or frequency.
Study details: This retrospective longitudinal study followed 60 women with migraine before and through the menopausal transition.
Disclosures: The presenter reported having no financial conflicts regarding her National Institutes of Health–funded study.
Source: Cheng Y-C and Maleki N. Headache. 2018;58:71. Abstract OR16.
TAVR-related stroke risk unrelated to anatomy
PARIS – The most appropriate stroke prevention strategy in patients undergoing transcatheter aortic valve replacement is routine use of a cerebroembolic protection device for all, because no identifiable high-risk anatomic subsets exist, Hasan Jilaihawi, MD, said at the annual meeting of the European Association of Percutaneous Cardiovascular Interventions.
“We looked at the anatomy in great detail. I’d hoped to find a strata that was truly high risk, but there is no clear strata that is truly higher risk. So stroke remains an unpredictable event in TAVR, and in the ideal world we would use cerebroembolic protection in everyone,” said Dr. Jilaihawi, codirector of transcatheter valve therapy at New York University.
“I put it to you that, as in carotid stenting, where we routinely use cerebroembolic protection, perhaps we need to consider the same in TAVR,” the cardiologist added.
The SENTINEL trial randomized 363 patients undergoing TAVR 2:1 to the use of the Sentinel intraluminal filter device or no neuroprotection during the procedure. The use of the cerebroembolic protection device was associated with a statistically significant 63% reduction in the incidence of neurologist-adjudicated stroke within 72 hours, from 8.2% to 3.0% (J Am Coll Cardiol. 2017 Jan 31;69[4]:367-77). The device was cleared for marketing by the Food and Drug Administration in 2017 and approved by European authorities several years earlier.
A wealth of evidence shows that the average stroke rate associated with contemporary TAVR is 4.4%, although this figure is probably on the low side because most of the data come from nonrandomized registries, which typically underreport neurologic outcomes. The stroke rate is independent of operator experience and volume, surgical risk score, and institutional TAVR volume. Moreover, in the SENTINEL trial, embolic debris was captured in 99% of patients fitted with the cerebroembolic protection device.
“A huge variety of material was captured, including thrombus, valve tissue, calcified material, and – alarmingly – foreign material in 35% of cases,” Dr. Jilaihawi noted.
Nonetheless, the issue of routine versus selective use of cerebroembolic protection remains controversial at a time when interventionalists are trying to make TAVR a simpler, briefer procedure, even though the approved Sentinel device is successfully deployed in a median of only 4 minutes. This was the impetus for Dr. Jilaihawi to examine baseline anatomy as a potential predictor of stroke.
He looked at four key anatomic features: aortic arch type, aortic root angulation, aortic arch calcium, and aortic valve calcification. The bottom line: The benefit of cerebroembolic protection with the Sentinel device was consistent across all anatomic subgroups. For example, in patients with an aortic root angulation angle of less than 50 degrees, the incidence of stroke within 3 days post TAVR was 3.2% with and 5.9% without cerebroembolic protection, while in those with an angle of 50 degrees or more the stroke rate was 2.6% with and 9.1% without the Sentinel device. With a total of only 16 strokes by day 3 in the study, those stroke rates in the absence of cerebroembolic protection aren’t significantly different.
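To see why 16 total strokes leave subgroup comparisons such as 5.9% versus 9.1% statistically inconclusive, consider an exact test on event counts. The counts in the sketch below are hypothetical stand-ins chosen only to match those percentages (the actual subgroup sizes are not reported here), and the test shown is an illustration, not the investigators' analysis.

# Illustrative only: with so few strokes, visibly different percentages are still
# compatible with chance. These counts are hypothetical, not SENTINEL data.
from scipy.stats import fisher_exact

# Hypothetical unprotected patients: 3 of 51 strokes (~5.9%) vs. 4 of 44 (~9.1%)
table = [[3, 51 - 3],
         [4, 44 - 4]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")  # p is nowhere near 0.05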
There was, however, one unexpected and counterintuitive finding: The greatest stroke reduction with cerebroembolic protection was seen in patients with the least aortic valve calcium. This prompted session cochair Alain Cribier, MD, professor of medicine at the University of Rouen, France, to observe that perhaps valve repositioning is an important factor in TAVR-related strokes. After all, he noted, valve repositioning occurs more often when a patient’s valves are softer and less calcified.
“This is a very important point,” Dr. Jilaihawi responded. “I think there is an interplay between procedural aspects and the anatomy which is not completely captured in this study because we don’t know whose valve was repositioned multiple times.”
He added that the finding that TAVR-related stroke is more common in patients with less calcified aortic valves is consistent with the earlier experience in carotid stenting.
“If you look 10 years ago in the field of carotid stenting, there were a lot of analyses done which concluded that the highest-risk lesions are the least calcified lesions, even though it’s counterintuitive,” he said.
Discussant Saibal Kar, MD, director of interventional cardiac research at Cedars-Sinai Medical Center in Los Angeles, said the take-home point from the SENTINEL analysis is clear: “Cerebroembolic protection is like a seat belt: You should wear it. All patients should wear it.”
The SENTINEL trial was sponsored by Claret Medical. Dr. Jilaihawi reported receiving research grants from Abbott and Medtronic and serving as a consultant to Edwards Lifesciences and Venus Medtech.
SOURCE: Jilaihawi H. EuroPCR 2018.
REPORTING FROM EUROPCR 2018
Key clinical point: Stroke risk during TAVR cannot be predicted from aortic anatomy, supporting routine rather than selective use of cerebroembolic protection.
Major finding: Neither aortic arch type, root angulation, nor calcium burden identifies a subgroup of TAVR patients at higher stroke risk.
Study details: The SENTINEL trial randomized 363 TAVR patients 2:1 to the use of a cerebroembolic protection device or no neuroprotection during the procedure.
Disclosures: The SENTINEL trial was sponsored by Claret Medical. The presenter reported receiving research grants from Abbott and Medtronic and serving as a consultant to Edwards Lifesciences and Venus Medtech.
Source: Jilaihawi H. EuroPCR 2018.
Amplatzer Amulet slashes stroke risk in A-fib
PARIS – The Amplatzer Amulet left atrial appendage occlusion device reduced stroke risk by nearly 60% at 1 year in a large, real-world registry of patients with atrial fibrillation at dual high risk for stroke and bleeding, Ulf Landmesser, MD, reported at the annual meeting of the European Association of Percutaneous Cardiovascular Interventions.
One of the most impressive findings was that this feat was accomplished by and large without background oral anticoagulation. Indeed, 83% of the 1,088 patients in this 61-center, 17-country study had contraindications to oral anticoagulation. Only 11% of subjects were discharged on oral anticoagulation after device implantation, while 22.5% were discharged on aspirin or clopidogrel monotherapy. By 1-3 months post implantation, 60% of patients were either on a single antiplatelet agent or no antithrombotic medication at all.
“Antithrombotic therapy was individualized by the patient’s physician. There didn’t seem to be an increased risk of device-related thrombus in these patients on single antiplatelet therapy. Our data suggest that, given the high bleeding risk, single antiplatelet therapy seems to be a good option for these patients,” said Dr. Landmesser, a professor in and the chair of the department of cardiology at Charité Medical School in Berlin.
Participants in the Global Prospective Amulet Study averaged 75 years of age, and 72% had a history of major bleeding. The average CHA2DS2-VASc score was 4.2, with a HAS-BLED score of 3.3, which emphasizes the high-risk nature of study participants.
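For readers unfamiliar with the scores, CHA2DS2-VASc is a simple additive index of stroke risk factors in atrial fibrillation. The short Python sketch below encodes the standard published weights for orientation only; it is not part of the registry’s methodology.

# Standard CHA2DS2-VASc weights (maximum score, 9); shown for orientation only.
def cha2ds2_vasc(age: int, female: bool, chf: bool, hypertension: bool,
                 diabetes: bool, prior_stroke_tia: bool, vascular_disease: bool) -> int:
    score = 0
    score += 1 if chf else 0                              # C: heart failure / LV dysfunction
    score += 1 if hypertension else 0                     # H: hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A: age bands
    score += 1 if diabetes else 0                         # D: diabetes mellitus
    score += 2 if prior_stroke_tia else 0                 # S2: prior stroke/TIA/embolism
    score += 1 if vascular_disease else 0                 # V: vascular disease
    score += 1 if female else 0                           # Sc: sex category (female)
    return score

# Example: a 75-year-old woman with hypertension and a prior TIA scores 6.
print(cha2ds2_vasc(age=75, female=True, chf=False, hypertension=True,
                   diabetes=False, prior_stroke_tia=True, vascular_disease=False))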
On the basis of the CHA2DS2-VASc score, the predicted 1-year ischemic stroke rate without oral anticoagulation was 6.7%, so the actual 2.9% rate represented a 57% reduction in risk. Similarly, for the composite endpoint of ischemic stroke, transient ischemic attack, or systemic embolism, the predicted rate was 9.4%, but the achieved rate was 3.8%, which represented a 60% reduction in risk.
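The risk reductions quoted above are simple relative reductions of the observed rate against the score-predicted rate; a minimal check of that arithmetic, using the rates reported in the study:

# Relative risk reduction = (predicted rate - observed rate) / predicted rate.
def relative_reduction(predicted_pct: float, observed_pct: float) -> float:
    return (predicted_pct - observed_pct) / predicted_pct

print(f"{relative_reduction(6.7, 2.9):.0%}")  # ischemic stroke: ~57%
print(f"{relative_reduction(9.4, 3.8):.0%}")  # stroke/TIA/systemic embolism: ~60%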
The annualized major bleeding rate was 10.3% despite the low usage of oral anticoagulation or dual-antiplatelet therapy. However, the rate of procedure- or device-related major bleeding was only 3.2%; the other 7.1% was unrelated to Amulet and reflected the underlying high risk of the study population.
The 1-year mortality rate was 8.4%. Thirty-five deaths had cardiovascular causes, 35 were noncardiovascular, and in 18 patients, cause of death couldn’t be determined.
The device-related thrombus rate through 1 year was 1.7%; 10 of 18 cases occurred within the first 90 days.
Dr. Landmesser emphasized that this was a particularly rigorously conducted registry. A unique feature was its use of an independent echocardiography core lab to assess procedural success, as well as an independent clinical events committee to adjudicate serious adverse events. Prior studies of other left atrial appendage (LAA) occlusion devices didn’t use these measures.
The Amplatzer Amulet is a second-generation occlusion device designed for easier placement and more complete sealing than its predecessor and comes in eight sizes to address anatomic variations. At implantation, adequate LAA occlusion as defined by the echocardiography core laboratory was achieved in 99.3% of patients; at that time, 89.4% of patients had no residual flow, and another 9.9% had a residual flow of less than 3 mm. At 1-3 months of follow-up, echocardiography showed 98.4% of patients had adequate occlusion.
Session cochair Alberto Cremonesi, MD, pronounced this to be “really important data.”
“I want to stress that these device implantations were transesophageal echocardiography–guided. In my mind this is absolutely essential to your excellent long-term results,” observed Dr. Cremonesi of Maria Cecilia Hospital in Cotignola, Italy.
Asked to speculate on what outcomes might have looked like had patients been treated with an oral anticoagulant rather than the Amulet occlusion device, Dr. Landmesser predicted the major bleeding rate would have been substantially higher than 10.3%. Most of the bleeding events in the study were gastrointestinal, and the novel oral anticoagulants are known to boost the risk of GI bleeding.
But that’s speculation. He noted that two ongoing randomized trials – one in Germany, the other in Scandinavia – are randomizing high-risk patients to an LAA occlusion device or best medical care, including a novel oral anticoagulant when not contraindicated. The Scandinavian study uses the Amulet, while the German trial uses both the Amulet and the Watchman device. The primary endpoint is the ischemic stroke rate.
The Amulet registry, which will continue for a second year of follow-up, was sponsored by Abbott Laboratories, which developed the Amulet device. Dr. Landmesser reported serving as a consultant to Abbott, as well as Biotronik, Rewa, and Bayer.
PARIS – The Amplatzer Amulet left atrial appendage occlusion device reduced stroke risk by nearly 60% at 1 year in a large, real-world registry of patients with atrial fibrillation at dual high risk for stroke and bleeding, Ulf Landmesser, MD, reported at the annual meeting of the European Association of Percutaneous Cardiovascular Interventions.
One of the most impressive findings was that this feat was accomplished by and large without background oral anticoagulation. Indeed, 83% of the 1,088 patients in this 61-center, 17-country study had contraindications to oral anticoagulation. Only 11% of subjects were discharged on oral anticoagulation after device implantation, while 22.5% were discharged on aspirin or clopidogrel monotherapy. By 1-3 months post implantation, 60% of patients were either on a single antiplatelet agent or no antithrombotic medication at all.
“Antithrombotic therapy was individualized by the patient’s physician. There didn’t seem to be an increased risk of device-related thrombus in these patients on single antiplatelet therapy. Our data suggest that, given the high bleeding risk, single antiplatelet therapy seems to be a good option for these patients,” said Dr. Landmesser, a professor in and the chair of the department of cardiology at Charité Medical School in Berlin.
Participants in the Global Prospective Amulet Study averaged 75 years of age, and 72% had a history of major bleeding. The average CHA2DS2-VASc score was 4.2, with a HAS-BLED score of 3.3, which emphasizes the high-risk nature of study participants.
On the basis of the CHA2DS2-VASc score, the predicted 1-year ischemic stroke rate without oral anticoagulation was 6.7%, so the actual 2.9% rate represented a 57% reduction in risk. Similarly, for the composite endpoint of ischemic stroke, transient ischemic attack, or systemic embolism, the predicted rate was 9.4%, but the achieved rate was 3.8%, which represented a 60% reduction in risk.
The annualized major bleeding rate was 10.3% despite the low usage of oral anticoagulation or dual-antiplatelet therapy. However, the rate of procedure- or device-related major bleeding was only 3.2%; the other 7.1% was unrelated to Amulet and reflected the underlying high risk of the study population.
The 1-year mortality rate was 8.4%. Thirty-five deaths had cardiovascular causes, 35 were noncardiovascular, and in 18 patients, cause of death couldn’t be determined.
The device-related thrombus rate through 1 year was 1.7%; 10 of 18 cases occurred within the first 90 days.
Dr. Landmesser emphasized that this was a particularly rigorously conducted registry. A unique feature was its use of an independent echocardiography core lab to assess procedural success, as well as an independent clinical events committee to adjudicate serious adverse events. Prior studies of other left atrial appendage (LAA) occlusion devices didn’t use these measures.
The Amplatzer Amulet is a second-generation occlusion device designed for easier placement and more complete sealing than its predecessor and comes in eight sizes to address anatomic variations. At implantation, adequate LAA occlusion as defined by the echocardiography core laboratory was achieved in 99.3% of patients; at that time, 89.4% of patients had no residual flow, and another 9.9% had a residual flow of less than 3 mm. At 1-3 months of follow-up, echocardiography showed 98.4% of patients had adequate occlusion.
Session cochair Alberto Cremonesi, MD, pronounced this to be “really important data.”
“I want to stress that these device implantations were transesophageal echocardiography–guided. In my mind this is absolutely essential to your excellent long-term results,” observed Dr. Cremonesi of Maria Cecilia Hospital in Cotignola, Italy.
Asked to speculate on what outcomes might have looked like had patients been treated with an oral anticoagulant rather than the Amulet occlusion device, Dr. Landmesser predicted the major bleeding rate would have been substantially higher than 10.3%. Most of the bleeding events in the study were gastrointestinal, and the novel oral anticoagulants are known to boost the risk of GI bleeding.
But that’s speculation. He noted that two ongoing randomized trials – one in Germany, the other in Scandinavia – are randomizing high-risk patients to an LAA occlusion device or best medical care, including a novel oral anticoagulant when not contraindicated. The Scandinavian study uses the Amulet, while the German trial uses both the Amulet and the Watchman device. The primary endpoint is the ischemic stroke rate.
The Amulet registry, which will continue for a second year of follow-up, was sponsored by Abbott Laboratories, which developed the Amulet device. Dr. Landmesser reported serving as a consultant to Abbott, as well as Biotronik, Rewa, and Bayer.
REPORTING FROM EUROPCR 2018
Key clinical point: Left atrial appendage occlusion with the Amplatzer Amulet markedly reduced stroke risk in atrial fibrillation patients at high risk for both stroke and bleeding, largely without oral anticoagulation.
Major finding: The 1-year ischemic stroke rate in Amulet recipients was 2.9%, compared with a predicted rate of 6.7% based on CHA2DS2-VASc score.
Study details: This prospective all-comers registry included 1,088 atrial fibrillation patients who received the Amulet device at 61 centers in 17 countries.
Disclosures: The study was sponsored by Abbott Laboratories, which developed the device. The presenter reported serving as a consultant to the company, as well as Biotronik, Rewa, and Bayer.
Better stent technology needed for diabetes patients
PARIS – Interventional cardiologists are hopeful that a new generation of investigational coronary stents designed specifically for use in diabetes patients will improve upon the relatively poor current outcomes of percutaneous coronary intervention in that population.
The operative word here is “abluminal.” Both of the novel drug-eluting stents featured at the annual meeting of the European Association of Percutaneous Cardiovascular Interventions position their antirestenosis drugs abluminally: that is, aimed toward the vessel wall surface, not the lumen.
The hypothesis is that directing the antirestenosis drug toward the vessel wall, rather than the lumen, will better suppress the aggressive restenosis seen in diabetic arteries, explained Luca Testa, MD, PhD, head of the coronary revascularization unit at San Donato Hospital in Milan.
There is a major unmet need for improved stent technology that addresses the special needs of diabetes patients, who tend to have more diffuse and rapidly progressive coronary artery disease (CAD) with longer lesions. Target lesion revascularization rates at 5 years of follow-up in diabetes patients with current generation drug-eluting stents (DES) remain high, at 20% or more. And diabetes patients are roughly 3.5-fold more likely to have nonfocal, diffuse coronary lesions than are nondiabetic patients with CAD, the cardiologist noted.
The sense of urgency surrounding this unmet need stems from the ongoing worldwide epidemic of diabetes. The global prevalence of diabetes was estimated at 382 million in 2013 and is projected to climb to nearly 600 million by 2035. Diabetes patients are two to four times more likely to develop CAD than are those without the disease. Because of the current suboptimal results with percutaneous coronary intervention (PCI), many of them are being referred for coronary artery bypass surgery.
Dr. Testa presented the 1-year results of the ongoing en-ABL e-Registry, a 5-year, multicenter, prospective, all-comers registry of 859 diabetic and 1,641 nondiabetic CAD patients who received the Abluminus DES at 31 centers in India. The novel stent, developed by Envision Scientific of India, is coated with sirolimus on the abluminal side. The device is actually both a DES and a drug-coated balloon: the balloon, including its proximal and distal ends, is also sirolimus coated to maximize exposure of diseased artery to the drug, and it must be inflated in position for at least 30 seconds to deliver its portion of sirolimus. The sirolimus is carried in a biodegradable polymer matrix that is metabolized within 6-8 months.
The primary endpoint at 1 year of follow-up was the composite of cardiac death, target vessel MI, and target lesion or vessel revascularization. The rate was 3.12% in the diabetic population, which wasn’t significantly different from the 2.1% rate in nondiabetic patients. Of note, the rate was 5.17% in the 138 insulin-dependent diabetes patients, compared with 2.76% in 721 non–insulin-dependent patients.
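As a rough illustration of why a difference of about 1 percentage point between these groups was not statistically significant, the sketch below runs a standard chi-square test on event counts back-calculated from the quoted percentages and group sizes; the counts are approximations, not the registry’s actual numbers.

```python
# Chi-square test on approximate event counts reconstructed from the quoted
# rates (3.12% of 859 diabetic vs. 2.1% of 1,641 nondiabetic patients).
from scipy.stats import chi2_contingency

diabetic_events, diabetic_n = round(0.0312 * 859), 859          # ~27 events
nondiabetic_events, nondiabetic_n = round(0.021 * 1641), 1641   # ~34 events

table = [
    [diabetic_events, diabetic_n - diabetic_events],
    [nondiabetic_events, nondiabetic_n - nondiabetic_events],
]
chi2, p_value, _, _ = chi2_contingency(table)
print(f"p = {p_value:.2f}")  # above 0.05, consistent with "not significantly different"
```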
Among diabetes patients, the composite endpoint occurred in 2.82% of those who underwent primary PCI with the Abluminus DES for an acute MI, 3.96% of those treated for lesions in small vessels 2.75 mm or less in diameter, 3.75% in diabetes patients treated for long lesions, and 4.18% in the subgroup with long lesions in small vessels.
On the basis of these encouraging results, Dr. Testa has been named the principal investigator for the new prospective, multicenter, observational DEDICATE registry, restricted to diabetic patients treated with the Abluminus DES.
Also getting underway is a randomized, investigator-initiated, multicenter, single-blind pilot study involving 165 diabetes patients assigned 2:1 to the Abluminus DES or the Xience everolimus-eluting stent, widely considered the current gold standard DES. The study, known as the ABILITY trial, has as its primary endpoint the in-stent neointimal volume as measured by optical coherence tomography 6 months post PCI. The medical director of the study is Antonio Colombo, MD, director of the cardiac catheterization laboratory and the interventional cardiology unit at San Raffaele Hospital in Milan.
Elsewhere at EuroPCR 2018, officials at Alvimedica Medical Technologies announced that the company’s abluminal stent, known as the Cre8 EVO, will be pitted against the everolimus-eluting stent in a 55-center trial of 3,040 diabetes patients. The hypothesis of the Diab8 trial, based on preliminary data from pilot studies, is that the abluminal stent will show clinical superiority – not merely equivalence – at 1 year.
The Cre8 EVO stent utilizes a proprietary, polymer-free, drug-release technology involving reservoirs located on the stent’s outer surface that direct the controlled release of a mixture of sirolimus and fatty acids that the company calls the amphilimus formulation. The drug mixture is designed to enhance tissue permeation and sirolimus bioavailability. The body of the stent is cobalt; the polymer-free design was chosen based on the conviction that polymers are more proinflammatory.
Dr. Colombo is also the principal investigator of the Diab8 trial, sponsored by Alvimedica.
Dr. Testa reported having no financial conflicts regarding his work on the en-ABL e-Registry, funded by a nonprofit Italian cardiovascular research foundation.
REPORTING FROM EUROPCR 2018
Norovirus vaccine appears promising in children
MALMO, SWEDEN – Takeda’s investigational bivalent norovirus vaccine produced near-universal protective seroresponses and a clinically acceptable safety profile in young children in an interim analysis of an ongoing phase 2 study, Taisei Masuda, MD, reported at the annual meeting of the European Society for Paediatric Infectious Diseases.
The randomized, double-blind, multinational trial remains blinded because follow-up is continuing, so – to the disappointment of the ESPID audience – there are as yet no data on duration of antibody persistence or clinical efficacy.
However, an earlier phase 2 study in 420 healthy participants aged 18-64 years showed that the Takeda vaccine elicited persistent immune responses 1 year post vaccination and that higher antibody levels correlated with a reduced frequency of moderate to severe vomiting and diarrheal illness following oral challenge with norovirus (Clin Vaccine Immunol. 2015 Aug;22[8]:923-9). Follow-up will continue in order to learn how long the protective immune response lasts in adults, according to Dr. Masuda, of Takeda Pharmaceuticals in Zurich.
The bivalent Takeda vaccine is the first candidate vaccine to reach the randomized trial stage. An oral vaccine in tablet form under development by Vaxart, a San Francisco Bay Area biotech company, recently completed preliminary phase 1 studies.
Dr. Masuda explained that the Takeda vaccine contains virus-like particle antigens from norovirus strains GI.1 and GII.4c, which together account for the majority of human norovirus illness. These noninfectious particles are assembled from viral capsid proteins and mimic the outer surface of the virus. Of note, virus-like particle–based vaccines against hepatitis B and human papillomavirus have won regulatory approval in the United States, Europe, and elsewhere.
He presented data on 120 healthy subjects aged 1 year to less than 4 years old and another 120 aged 4 years to less than 9 years. They are part of a larger phase 2 study of 840 children as young as age 6 weeks. This was a dose-finding study, so participants received various doses of the vaccine on day 1 and either a second dose or a saline injection 28 days later. The vaccine, which contains aluminum hydroxide to enhance immunogenicity, comes in prefilled syringes.
At 57 days of follow-up in this interim analysis, protective seroresponse rates as defined by at least a fourfold increase in histo-blood group antigen–blocking titers approached 100%. In the older group, this was typically achieved with a single dose of vaccine. However, the younger group of children generally derived further benefit from a second dose, according to Dr. Masuda.
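A minimal sketch of the fold-rise criterion described above, with invented titer values purely for illustration:

```python
# Seroresponse = at least a fourfold rise in histo-blood group antigen-blocking
# titer from baseline to day 57. The titer pairs below are hypothetical.
def is_seroresponder(baseline_titer, day57_titer, fold=4):
    return day57_titer >= fold * baseline_titer

titer_pairs = [(10, 80), (20, 40), (10, 320)]  # (baseline, day 57)
responders = sum(is_seroresponder(pre, post) for pre, post in titer_pairs)
print(f"Seroresponse rate: {responders / len(titer_pairs):.0%}")  # 67% in this toy example
```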
In terms of safety concerns, he said no serious adverse events occurred in the study and no one withdrew from the trial because of vaccine-related side effects. The overall safety picture was the same in the two age groups. The incidence of fever of 38° C or higher was similar after administration of vaccine and placebo. Injection site pain occurred in one-quarter of younger vaccine recipients, in 38%-63% of those aged 4 years or older, and in 17%-22% who got placebo injections. Those and other local and systemic adverse events were mostly mild and transient. Their incidence and severity weren’t related to vaccine dosage.
In sum, Dr. Masuda deemed the safety profile “clinically acceptable.”
Session chair Karina Butler, MD, of Temple Street Children’s University Hospital, Dublin, asked how this vaccine, which may require two doses in younger children, might fit into an already crowded pediatric immunization schedule – and whether parents and physicians will embrace it.
Dr. Masuda replied that noroviruses are the No. 1 cause of acute gastroenteritis worldwide and there is a clamor for development of effective vaccines to protect the groups that bear the greatest burden of disease, including children, the elderly, military personnel, cruise ship vacationers, and others who experience crowded conditions. He expressed confidence that a safe and effective vaccine will be in high demand.
“In the future, we’ll look at the possibility of a combination vaccine,” he added.
In response to audience questions, Dr. Masuda said that in adult studies higher levels of immunogenicity have been achieved after vaccination, compared with natural infection; however, there are as yet no pediatric data on that score. Also, investigators have seen evidence of cross-reactivity to the vaccine in some but not all naturally circulating nonvaccine strains.
The vaccine formulation being carried forward into advanced clinical trials in adults is 15 mcg of GI.1/50 mcg of GII.4c (J Infect Dis. 2018 Jan 30;217[4]:597-607).
The phase 2 study presented by Dr. Masuda was supported by the U.S. Army.
REPORTING FROM ESPID 2018
Key clinical point: Hope runs high that an effective norovirus vaccine is in the works.
Major finding: Protective seroresponse rates against the two chief disease-causing strains of norovirus were seen in nearly 100% of vaccinated children aged 1-8 years.
Study details: This is an ongoing prospective, multicenter, double-blind, phase 2 randomized trial including 840 children.
Disclosures: The study was supported by the U.S. Army and presented by an employee of Takeda Pharmaceuticals.
Registry data provide evidence that Mohs surgery remains underutilized
CHICAGO – Analysis of U.S. national cancer registry data shows that, contrary to expectation, the use of Mohs micrographic surgery for melanoma in situ did not increase after implementation of the Affordable Care Act, Sean Condon, MD, reported at the annual meeting of the American College of Mohs Surgery.
Ditto for the use of Mohs in patients with the rare cutaneous malignancies for which published evidence clearly demonstrates Mohs outperforms wide local excision, which is employed seven times more frequently than Mohs in such situations.
“Mohs utilization did not increase after the Affordable Care Act [ACA], despite new health insurance coverage for 20 million previously uninsured adults,” Dr. Condon said. “Surprisingly, after the ACA we actually saw a decrease in Mohs use for melanoma in situ.”
Indeed, his retrospective study of more than 25,000 patients in the National Cancer Institute’s SEER (Surveillance, Epidemiology, and End Results) registries showed that the proportion of patients with melanoma in situ treated with Mohs declined from 13.9% during 2008-2009 – prior to ACA implementation – to 12.3% in 2011-2013, after the ACA took effect. That’s a statistically significant 13% drop, even though numerous published studies have shown outcomes in melanoma in situ are better with Mohs, said the dermatologist, who conducted the study while completing a Mohs surgery fellowship at the Cleveland Clinic. He is now in private practice in Thousand Oaks, Calif.
His analysis included 19,013 patients treated in 2008-2014 for melanoma in situ and 6,309 others treated for rare cutaneous malignancies deemed appropriate for Mohs according to the criteria formally developed jointly by the American Academy of Dermatology, the American College of Mohs Surgery, the American Society for Dermatologic Surgery Association, and the American Society for Mohs Surgery (J Am Acad Dermatol. 2012 Oct;67[4]:531-50). These rare malignancies include adnexal carcinoma, Merkel cell carcinoma, dermatofibrosarcoma, extramammary Paget disease, sebaceous adenocarcinoma, and leiomyosarcoma.
“These rare cutaneous malignancies were historically treated with wide local excision. However, numerous studies have lately shown that lower recurrence rates were found with Mohs compared with wide local excision,” Dr. Condon noted.
Nonetheless, the proportion of the rare cutaneous malignancies treated using Mohs was unaffected by implementation of the ACA. Nor was it influenced one way or the other by publication of the joint Mohs appropriate use criteria in 2012: The Mohs-treated proportion of such cases was 15.25% in 2010-2011 and 14.6% in 2013-2014.
Similarly, even though the appropriate use criteria identified melanoma in situ as Mohs appropriate, the proportion of those malignancies treated via Mohs was the same before and after the 2012 release of the criteria.
“It’s commonly thought that Mohs is overused. However, our study and our data clearly identify that Mohs is being underutilized for melanoma in situ and for rare cutaneous malignancies. This represents a knowledge gap for other specialties regarding best-practice therapy,” Dr. Condon said.
He and his coinvestigators searched for socioeconomic predictors of Mohs utilization by matching the nationally representative SEER data with U.S. census data. They examined the impact of three metrics: insurance status, income, and poverty. They found that low-income patients and those in the highest quartile of poverty were significantly less likely to have Mohs surgery for their melanoma in situ and rare cutaneous malignancies throughout the study years. Lack of health insurance had no impact on Mohs utilization for melanoma in situ but was independently associated with decreased likelihood of Mohs for the rare cutaneous malignancies. White patients were 2-fold to 2.4-fold more likely to have Mohs surgery for their rare cutaneous malignancies than were black patients.
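The kind of patient-level model behind such an analysis can be sketched as a logistic regression of Mohs use on the three metrics; the data frame below is synthetic and the variable names are placeholders, not the investigators’ actual SEER–census linkage.

```python
# Schematic logistic regression of Mohs use on socioeconomic indicators,
# fitted to simulated data (a synthetic stand-in for the SEER-census linkage).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "low_income": rng.integers(0, 2, n),
    "high_poverty": rng.integers(0, 2, n),
    "uninsured": rng.integers(0, 2, n),
})
# Simulate lower odds of Mohs for low-income and high-poverty patients.
log_odds = -1.0 - 0.5 * df.low_income - 0.4 * df.high_poverty - 0.2 * df.uninsured
df["mohs"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

model = smf.logit("mohs ~ low_income + high_poverty + uninsured", data=df).fit(disp=False)
print(np.exp(model.params))  # odds ratios; values below 1 indicate reduced likelihood of Mohs
```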
“One can conclude that Mohs micrographic surgery may be skewed toward more affluent patients, and lower socioeconomic status areas have less Mohs access. So our data from this study support a role for targeted education and improved patient access to Mohs,” Dr. Condon said.
He noted that because the SEER registries don’t track squamous or basal cell carcinomas, it’s unknown whether Mohs is also underutilized for the higher-risk forms of these most common of all skin cancers.
Dr. Condon reported having no financial conflicts regarding his study, conducted free of commercial support.
REPORTING FROM THE ACMS 50TH ANNUAL MEETING
Key clinical point: Mohs micrographic surgery remains seriously underutilized for the skin cancers for which it is most advantageous.
Major finding: The use of Mohs micrographic surgery to treat melanoma in situ declined significantly after passage of the Affordable Care Act.
Study details: This was a retrospective study of national SEER data on more than 25,000 patients treated for melanoma in situ or rare cutaneous malignancies during 2008-2014.
Disclosures: The presenter reported having no financial conflicts regarding his study, conducted free of commercial support.
Biomarker duo rapidly identifies serious bacterial infections
MALMO, SWEDEN – The combination of serum procalcitonin and C-reactive protein levels upon admission to a pediatric ICU displayed high utility for early diagnosis of serious bacterial infection in critically ill children in a large prospective observational study presented at the annual meeting of the European Society for Paediatric Infectious Diseases.
This winning combination significantly outperformed neutrophil gelatinase-associated lipocalin, activated partial thromboplastin time, and resistin, both individually and in various combinations, for the vital task of making a rapid distinction between infectious and noninfectious causes of pediatric systemic inflammatory response syndrome, reported Enitan D. Carrol, MD, professor of pediatric infection at the University of Liverpool (England).
“One of the clinical dilemmas we face in intensive care is being able to differentiate between infectious and noninfectious causes of systemic inflammatory response syndrome. This is important because we need to identify which children have life-threatening infections so that we can promptly initiate antimicrobial therapy,” she explained.
One in four deaths in pediatric ICUs is infection related, Dr. Carrol noted.
“There is an urgent need for infection markers which, firstly, change early in the course of bacterial infection, secondly, correlate with real-time clinical progression, and thirdly, have a rapid turn-around time to allow effective clinical decision making,” she observed.
The combination of procalcitonin and C-reactive protein (CRP) levels measured at admission fits the bill, Dr. Carrol continued. Of the five biomarkers evaluated in her study – all backed by some supporting evidence of efficacy in earlier studies – the top two individual performers in terms of negative predictive value (NPV) were a CRP less than 4.2 mg/dL, with an NPV of 99%, and a procalcitonin less than 1.52 ng/mL, with an NPV of 96%. The positive predictive value of each of these biomarkers was 37%. The sensitivity and specificity of procalcitonin for diagnosis of serious bacterial infection were 78% and 80%, respectively. For CRP, the figures were 93% and 76%.
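How those predictive values relate to sensitivity, specificity, and prevalence can be reproduced from the standard definitions; using the rounded figures above and the 14% prevalence of serious bacterial infection reported below, the computed values land close to (though, because of rounding, not exactly on) the quoted numbers.

```python
# Standard Bayes-rule relationships between sensitivity, specificity,
# prevalence, and the predictive values (rounded inputs, so outputs are
# approximate).
def predictive_values(sensitivity, specificity, prevalence):
    ppv = (sensitivity * prevalence) / (
        sensitivity * prevalence + (1 - specificity) * (1 - prevalence))
    npv = (specificity * (1 - prevalence)) / (
        specificity * (1 - prevalence) + (1 - sensitivity) * prevalence)
    return ppv, npv

for name, sens, spec in [("CRP", 0.93, 0.76), ("Procalcitonin", 0.78, 0.80)]:
    ppv, npv = predictive_values(sens, spec, prevalence=0.14)
    print(f"{name}: PPV ~{ppv:.0%}, NPV ~{npv:.0%}")
```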
The combination of procalcitonin and CRP outperformed a multitude of other two-, three-, and four-biomarker combinations tested, with an area under the curve of 93% for combined sensitivity and specificity.
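One common way to score a two-marker combination is to feed both markers into a logistic model and compute the ROC area under the curve of its predicted probabilities; the sketch below does this on simulated data sized like the study cohort, not on the Alder Hey measurements.

```python
# ROC AUC of a two-biomarker logistic combination, demonstrated on simulated
# log-scale procalcitonin and CRP values (92 infected vs. 565 other children).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_infected, n_other = 92, 565
X = np.vstack([
    rng.normal([1.5, 2.0], 0.8, size=(n_infected, 2)),  # infected: higher marker levels
    rng.normal([0.2, 0.8], 0.8, size=(n_other, 2)),     # noninfectious SIRS
])
y = np.concatenate([np.ones(n_infected), np.zeros(n_other)])

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"Combined procalcitonin + CRP AUC: {auc:.2f}")
```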
The study included 657 children admitted to the pediatric ICU at Alder Hey Children’s Hospital in Liverpool with systemic inflammatory response syndrome. All had blood samples measured for the five biomarkers on days 1-7. Clinicians were blinded as to the biomarker results. Ninety-two (14%) patients were ultimately found to have a serious bacterial infection – essentially, bacterial meningitis or septic shock – and 565 (86%) had a nonbacterial etiology.
The 28-day mortality rate was 9% in the group with serious bacterial infection, significantly higher than the 2% rate in the group with other causes of their systemic inflammatory response syndrome.
Longitudinal trends in procalcitonin and CRP as evidenced in the study can be used in clinical decision making, according to Dr. Carrol. Mean values of procalcitonin plummeted by 80% from day 1 to day 5 in response to antimicrobial therapy in the group with serious bacterial infections. In contrast, CRP levels rose sharply from day 1 to a peak on day 2, then fell, although the 50% drop from day 2 to day 5 in response to antimicrobial therapy wasn’t as pronounced as the change in procalcitonin.
“There is an additive benefit for both biomarkers compared with CRP alone. The problem with CRP on admission, as I’ve demonstrated in this study, is it often hasn’t risen yet early after admission. So although it gave the best area under the curve of any of the biomarkers, I think that combined with procalcitonin you get a much better discriminator,” Dr. Carrol said.
The median duration of ICU stay in the patients with serious bacterial infection at admission was 5 days, compared with 3 days when the cause of systemic inflammatory response syndrome lay elsewhere. Their median duration of ventilation was significantly longer, too: 4 days versus 2 in children without a serious bacterial infection.
Stepwise logistic regression analysis pinpointed several clinical variables as being associated with prolonged ICU stay.

In addition, initiation of antibiotic therapy prior to admission to the pediatric ICU was associated with a 50% reduction in the likelihood of a prolonged ICU stay. “This reflects the fact that early antibiotics give you a better prognosis if you have sepsis,” according to Dr. Carrol.
She and her coinvestigators now have embarked on a multicenter U.K. study looking at the impact of procalcitonin to guide duration of antimicrobial therapy in critically ill children.
The Alder Hey study was funded by the U.K. National Institute for Health Research. Dr. Carrol reported having no financial conflicts. Although she serves as a consultant to several health care companies, all remuneration goes directly to the University of Liverpool.
MALMO, SWEDEN – The combination of serum procalcitonin and C-reactive protein levels upon admission to a pediatric ICU displayed high utility for early diagnosis of serious bacterial infection in critically ill children in a large prospective observational study presented at the annual meeting of the European Society for Paediatric Infectious Diseases.
This winning combination significantly outperformed neutrophil gelatinase-associated lipocalin, activated partial thromboplastin time, and resistin, both individually and in various combinations, for the vital task of making a rapid distinction between infectious and noninfectious causes of pediatric systemic inflammatory response syndrome, reported Enitan D. Carrol, MD, professor of pediatric infection at the University of Liverpool (England).
“One of the clinical dilemmas we face in intensive care is being able to differentiate between infectious and noninfectious causes of systemic inflammatory response syndrome. This is important because we need to identify which children have life-threatening infections so that we can promptly initiate antimicrobial therapy,” she explained.
One in four deaths in pediatric ICUs is infection related, Dr. Carrol noted.
“There is an urgent need for infection markers which, firstly, change early in the course of bacterial infection, secondly, correlate with real-time clinical progression, and thirdly, have a rapid turn-around time to allow effective clinical decision making,” she observed.
The combination of procalcitonin and C-reactive protein (CRP) levels measured at admission fits the bill, Dr. Carrol continued. Of the five biomarkers evaluated in her study – all backed by some supporting evidence of efficacy in earlier studies – the top two individual performers in terms of negative predictive value (NPV) were a CRP less than 4.2 mg/dL, with an NPV of 99%, and a procalcitonin less than 1.52 ng/mL, with an NPV of 96%. The positive predictive value of each of the biomarkers was 37%. The sensitivity and specificity of procalcitonin for diagnosis of serious bacterial infection were 78% and 80%, respectively. For CRP, the figures were 93% and 76%.
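To make the arithmetic behind those figures concrete, here is a minimal Python sketch, offered only as an illustration and not as the study's software: it applies the reported admission cutoffs as a rule-out screen and spells out the standard definitions of the sensitivity, specificity, PPV, and NPV quoted above. The function names, the example patient values, and the confusion-matrix counts are hypothetical.

```python
# Illustration only (not the study's code): applying the reported admission
# cutoffs as a rule-out screen, plus the standard test-metric definitions.

def low_risk_of_bacterial_infection(crp_mg_dl: float, procalcitonin_ng_ml: float) -> bool:
    """Rule-out screen using the cutoffs reported at ESPID:
    CRP < 4.2 mg/dL (NPV 99%) and procalcitonin < 1.52 ng/mL (NPV 96%)."""
    return crp_mg_dl < 4.2 and procalcitonin_ng_ml < 1.52

def test_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard definitions of the metrics quoted in the article."""
    return {
        "sensitivity": tp / (tp + fn),  # true infections correctly flagged
        "specificity": tn / (tn + fp),  # non-infectious SIRS correctly cleared
        "ppv": tp / (tp + fp),          # chance a positive screen is a true infection
        "npv": tn / (tn + fn),          # chance a negative screen is truly non-infectious
    }

# Hypothetical numbers for illustration only (not the Alder Hey data):
print(low_risk_of_bacterial_infection(crp_mg_dl=2.0, procalcitonin_ng_ml=0.4))  # True
print(test_metrics(tp=80, fp=40, tn=400, fn=20))
```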
The combination of procalcitonin and CRP outperformed a multitude of other two-, three-, and four-biomarker combinations tested, with an area under the curve of 93% for combined sensitivity and specificity.
The study included 657 children admitted to the pediatric ICU at Alder Hey Children’s Hospital in Liverpool with systemic inflammatory response syndrome. All had blood samples measured for the five biomarkers on days 1-7. Clinicians were blinded as to the biomarker results. Ninety-two (14%) patients were ultimately found to have a serious bacterial infection – essentially, bacterial meningitis or septic shock – and 565 (86%) had a nonbacterial etiology.
The 28-day mortality rate was 9% in the group with serious bacterial infection, significantly higher than the 2% rate in the group with other causes of their systemic inflammatory response syndrome.
Longitudinal trends in procalcitonin and CRP as evidenced in the study can be used in clinical decision making, according to Dr. Carrol. Mean values of procalcitonin plummeted by 80% from day 1 to day 5 in response to antimicrobial therapy in the group with serious bacterial infections. In contrast, CRP levels rose sharply from day 1 to a peak on day 2, then fell, although the 50% drop from day 2 to day 5 in response to antimicrobial therapy wasn’t as pronounced as the change in procalcitonin.
“There is an additive benefit for both biomarkers compared with CRP alone. The problem with CRP on admission, as I’ve demonstrated in this study, is it often hasn’t risen yet early after admission. So although it gave the best area under the curve of any of the biomarkers, I think that combined with procalcitonin you get a much better discriminator,” Dr. Carrol said.
The median duration of ICU stay in the patients with serious bacterial infection at admission was 5 days, compared with 3 days when the cause of systemic inflammatory response syndrome lay elsewhere. Their median duration of ventilation was significantly longer, too: 4 days versus 2 in children without a serious bacterial infection.
Stepwise logistic regression analysis pinpointed several clinical variables as being associated with prolonged ICU stay.

In addition, initiation of antibiotic therapy prior to admission to the pediatric ICU was associated with a 50% reduction in the likelihood of a prolonged ICU stay. “This reflects the fact that early antibiotics give you a better prognosis if you have sepsis,” according to Dr. Carrol.
She and her coinvestigators now have embarked on a multicenter U.K. study looking at the impact of procalcitonin to guide duration of antimicrobial therapy in critically ill children.
The Alder Hey study was funded by the U.K. National Institute for Health Research. Dr. Carrol reported having no financial conflicts. Although she serves as a consultant to several health care companies, all remuneration goes directly to the University of Liverpool.
REPORTING FROM ESPID 2018
Key clinical point: Procalcitonin plus CRP measured at pediatric ICU admission can rapidly distinguish serious bacterial infection from noninfectious causes of systemic inflammatory response syndrome.
Major finding: The procalcitonin/CRP combination yielded an area under the curve combining sensitivity and specificity of 93%.
Study details: This was a prospective, observational, single-center, clinician-blinded study of 657 patients admitted to a pediatric ICU with symptoms of systemic inflammatory response syndrome.
Disclosures: The study was funded by the U.K. National Institute for Health Research. The presenter reported having no relevant financial conflicts. Although she serves as a consultant to several health care companies, all remuneration goes directly to the University of Liverpool.
MI risk prediction after noncardiac surgery simplified
ORLANDO – The risk of perioperative MI or death associated with noncardiac surgery is vanishingly low in patients free of diabetes, hypertension, and smoking, Tanya Wilcox, MD, reported at the annual meeting of the American College of Cardiology.
How small is the risk? A mere 1 in 1,000, said Dr. Wilcox of New York University, based on her analysis of more than 3.8 million major noncardiac surgeries in the American College of Surgeons National Surgical Quality Improvement Program database for 2009-2015.
Physicians are frequently asked by surgeons to clear patients for noncardiac surgery in terms of cardiovascular risk. Because current risk scores are complex, aren’t amenable to rapid bedside calculations, and may entail cardiac stress testing, Dr. Wilcox decided it was worth assessing the impact of three straightforward cardiovascular risk factors – current smoking and treatment for hypertension or diabetes – on 30-day postoperative MI-free survival. For this purpose she turned to the National Surgical Quality Improvement Program database, a validated, risk-adjusted, outcomes-based program to measure and improve the quality of surgical care utilizing data from 250 U.S. surgical centers.
Of the 3,817,113 patients who underwent major noncardiac surgery, 1,586,020 (42%) had none of the three cardiovascular risk factors of interest, 1,541,846 (40%) had one, 643,424 (17%) had two, and 45,823 (1.2%) had all three. The patients’ mean age was 57 years, 75% were white, and 57% were women. About half of all patients underwent various operations within the realm of general surgery; next most frequent were orthopedic procedures, accounting for 18% of total noncardiac surgery. Of note, only 23% of patients with zero risk factors were American Society of Anesthesiologists Class 3-5, compared with 51% of those with one cardiovascular risk factor, 76% with two, and 71% with all three.
The incidence of acute MI or death within 30 days of noncardiac surgery climbed in stepwise fashion according to a patient’s risk factor burden. In a multivariate analysis adjusted for age, race, and gender, patients with any one of the cardiovascular risk factors had a 30-day risk of acute MI or death that was 1.52 times greater than those with no risk factors, patients with two risk factors were at 2.4-fold increased risk, and those with all three were at 3.63-fold greater risk than those with none. The degree of increased risk associated with any single risk factor ranged from 1.47-fold for hypertension to 1.94-fold for smoking.
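As a rough sketch of how such an analysis could be set up, the Python example below counts the three bedside risk factors per patient and fits a logistic regression for 30-day MI or death adjusted for age, race, and sex. The DataFrame and its column names are assumptions made for the example; this is not the study's code or the NSQIP data dictionary.

```python
# Illustrative sketch only: risk-factor counting and an adjusted logistic model.
import pandas as pd
import statsmodels.formula.api as smf

def add_risk_factor_count(df: pd.DataFrame) -> pd.DataFrame:
    """Sum three binary flags: current smoking, treated hypertension, treated diabetes."""
    df = df.copy()
    df["n_risk_factors"] = df[["smoker", "htn_treated", "dm_treated"]].sum(axis=1)
    return df

def fit_adjusted_model(df: pd.DataFrame):
    """Odds of 30-day MI or death by risk-factor count, adjusted for age, race, and sex."""
    return smf.logit(
        "mi_or_death_30d ~ C(n_risk_factors) + age + C(race) + C(sex)", data=df
    ).fit(disp=False)

# Usage with a hypothetical DataFrame of NSQIP-like records:
# df = add_risk_factor_count(raw_df)
# print(df.groupby("n_risk_factors")["mi_or_death_30d"].mean())  # stepwise incidence
# print(fit_adjusted_model(df).summary())
```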
“Further study is needed to determine whether aggressive risk factor modifications in the form of blood pressure control, glycemic control, and smoking cessation could reduce the incidence of postoperative MI,” Dr. Wilcox observed.
She reported having no financial conflicts regarding her study.
REPORTING FROM ACC 2018
Key clinical point: Noncardiac surgery patients can breathe easier regarding perioperative cardiovascular risk provided they don’t smoke and aren’t hypertensive or diabetic.
Major finding: Patients with all three risk factors had an adjusted 3.63-fold greater 30-day risk of acute MI or death than patients with none.
Study details: This was a retrospective analysis of more than 3.8 million noncardiac surgeries contained in the American College of Surgeons National Surgical Quality Improvement Program database for 2009-2015.
Disclosures: The study presenter reported having no financial conflicts.
Galectin-3: A new post-MI prognostic biomarker?
ORLANDO – An elevated circulating galectin-3 level after an acute MI is a potent long-term predictor of both heart failure and mortality, independent of known prognostic markers, Rabea Asleh, MD, PhD, reported at the annual meeting of the American College of Cardiology.
“These findings suggest that galectin-3 measurement may have a role in the risk stratification of patients presenting with MI,” according to Dr. Asleh, an Israeli cardiologist doing a fellowship in advanced heart failure and transplant cardiology at the Mayo Clinic in Rochester, Minn.
“The changing clinical presentation of MI necessitates evolution in our approach to risk stratification,” he explained. “Over the last 2 decades we’ve observed a change in the epidemiology of MI, with more patients developing non-ST-elevation MI compared to STEMI. They present at an older age and develop heart failure with preserved ejection fraction more than heart failure with reduced ejection fraction.”
He presented a prospective population-based community cohort study of 1,401 Olmsted County, Minn., residents who had a validated MI during 2002-2012. Their mean age was 67 years, 61% were men, and 79% presented with non-STEMI. During a mean follow-up of 5.3 years, 389 of the participants developed heart failure and 512 patients died.
Galectin-3 was measured a median of 2 days post MI. The median level was 18.4 ng/mL. Patients were divided into tertiles based upon their galectin-3 measurement: Tertile 1 comprised post-MI galectin-3 levels below 15.2 ng/mL; tertile 2, levels of 15.2-22.6 ng/mL; and the top tertile, levels above 22.6 ng/mL.
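Expressed as a simple classification rule, the tertile assignment looks like the short Python sketch below; it is an illustration using the cutoffs reported in the presentation, not code from the study.

```python
# Illustration: assign the reported post-MI galectin-3 tertiles (cutoffs in ng/mL).
def galectin3_tertile(level_ng_ml: float) -> int:
    """Return 1, 2, or 3 using the reported cutoffs of 15.2 and 22.6 ng/mL."""
    if level_ng_ml < 15.2:
        return 1
    if level_ng_ml <= 22.6:
        return 2
    return 3

# The reported median of 18.4 ng/mL falls in tertile 2.
print(galectin3_tertile(18.4))  # -> 2
```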
Of note, patients with a higher galectin-3 level were older and had a higher prevalence of diabetes, hypertension, hyperlipidemia, and anterior MI, a higher Killip class, a higher Charlson comorbidity score, and a lower peak troponin T level. They also had a lower estimated glomerular filtration rate; indeed, the median eGFR in the top galectin-3 tertile was 48 mL/min per 1.73 m2, compared with 68 mL/min per 1.73 m2 in the lowest tertile. Women accounted for 27% of patients in tertile 1, 41% in tertile 2, and fully half of those in tertile 3.
In an unadjusted analysis, the risk of mortality during follow-up was sixfold greater for patients in galectin-3 tertile 3 than in tertile 1; the risk of heart failure was increased 5.5-fold.
More meaningfully, in a Cox multivariate analysis extensively adjusted for age, gender, comorbidities, malignancy, standard cardiovascular risk factors, MI characteristics, eGFR, Killip class, cardiac troponin T, and other potential confounders, patients in galectin-3 tertile 2 had a 1.6-fold increased risk of death and a 1.62-fold increased likelihood of heart failure during follow-up, compared with subjects in tertile 1, Dr. Asleh noted.
Patients in tertile 3 had a 2.4-fold increased risk of death and were at 2.1 times greater risk of heart failure than those in tertile 1. The degree of risk for heart failure associated with elevated galectin-3 was virtually identical for heart failure with preserved as compared with reduced ejection fraction, he added.
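For readers curious how such an adjusted time-to-event analysis is typically assembled, the Python sketch below uses the lifelines library. It is offered only as a hedged illustration of the approach; the DataFrame, column names, and covariate list are assumptions for the example and do not reproduce the Mayo analysis.

```python
# Illustrative sketch of an adjusted Cox proportional hazards analysis (not the study's code).
import pandas as pd
from lifelines import CoxPHFitter

def fit_adjusted_cox(df: pd.DataFrame) -> CoxPHFitter:
    """Time to death modeled against galectin-3 tertile indicators plus adjustment covariates.
    Hazard ratios for tertile_2 and tertile_3 are read against tertile 1 as the reference."""
    covariates = [
        "tertile_2", "tertile_3",                 # indicator columns vs. tertile 1
        "age", "male", "diabetes", "hypertension",
        "egfr", "killip_class", "troponin_t",
    ]
    cph = CoxPHFitter()
    cph.fit(
        df[covariates + ["followup_years", "died"]],
        duration_col="followup_years",
        event_col="died",
    )
    return cph

# Usage with a hypothetical cohort DataFrame:
# print(fit_adjusted_cox(df).summary)  # hazard ratios and confidence intervals
```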
Session cochair L. Kristin Newby, MD, of Duke University, Durham, N.C., noted that the Mayo study did not adjust for brain natriuretic peptide (BNP) or N-terminal pro-B-type natriuretic peptide (NT-proBNP), both of which are known to be strong predictors of heart failure and mortality after acute MI. Doesn’t their absence weaken the strength of galectin-3’s prognostic power as demonstrated in the study? she asked.
Dr. Asleh replied that those biomarkers weren’t collected in this study, which began in 2002.
“What I can tell you is, other studies show there is only a weak correlation between galectin-3 and NT-proBNP post MI. Some studies have even shown an inverse correlation,” he said. “The pathophysiological explanation is that galectin-3 is more implicated in fibrosis before the stage of development of left ventricular loading and stretching of the myocardium. So galectin-3 may be implicated in LV fibrosis leading to heart failure before the NT-proBNP comes into play.”
Also, he cited a study by other investigators conducted in patients with a left ventricular assist device for advanced heart failure. Upon device-induced left ventricular unloading, the patients’ NT-proBNP levels dropped significantly while their galectin-3 remained high and unchanged. This suggests the two biomarkers are implicated in different disease pathways.
Both animal and human studies indicate galectin-3 is involved specifically in fibrosis, as opposed to, say, C-reactive protein, a well-established marker of systemic inflammation, the cardiologist added.
Dr. Asleh reported having no financial conflicts of interest regarding his study, which was supported by the National Institutes of Health.
REPORTING FROM ACC 2018
Key clinical point: Galectin-3 level post-MI is a potent long-term predictor of both heart failure and mortality independent of known prognostic markers.
Major finding: Post-MI patients in the top tertile of circulating galectin-3 were at an adjusted 2.4-fold increased mortality risk and a 2.05-fold greater risk of developing heart failure compared with those in the lowest tertile.
Study details: This prospective population-based cohort study included 1,401 MI patients followed for a mean of 5.3 years.
Disclosures: The National Institutes of Health supported the study. The presenter reported having no financial conflicts of interest.