Meta-Analyses Indicate Asthma Boosts Lung Cancer Risk
DENVER — Asthma may be a risk factor for lung cancer, according to two new meta-analyses.
The public health implications of such an association would be enormous. Asthma affects at least 15 million Americans, 40% of them children. Its prevalence has been climbing steadily for decades in developed countries, more than doubling in the United States during a recent 20-year period. And lung cancer is the second most common noncutaneous malignancy in this country, with 10% of lung cancer deaths not attributable to smoking, Chanis Mercado said at the annual meeting of the American Public Health Association.
One of the two meta-analyses she performed as a Ph.D. candidate in public health at the Ponce (P.R.) School of Medicine involved 17 high-quality case-control studies with a total of 54,238 subjects. The conclusion was that individuals with asthma had 34% greater odds of having lung cancer, compared with matched controls without asthma.
A separate meta-analysis that included 16 high-quality cohort studies and 1,384,824 subjects showed that those with asthma were 46% more likely to develop lung cancer than were subjects without asthma.
These results were statistically robust: eliminating any individual study did not substantially change the findings, and tests for publication bias were reassuringly negative.
One biologically plausible mechanism for the observed asthma–lung cancer link is that the chronic inflammation that is a defining feature of asthma causes DNA damage to cells in the airway. Another possibility is that asthma patients have defective clearance of toxins in the bronchoalveolar epithelium, resulting in prolonged local exposure to carcinogens, she said.
The clinical implication of these two meta-analyses is that asthma patients ought to be screened earlier and more often for signs and symptoms of lung cancer, Ms. Mercado continued. This screening might take the form of chest x-rays, sputum cytology tests, and/or a low threshold for acting on symptoms of weight loss or hemoptysis.
Ms. Mercado declared having no relevant financial interests.
Screening Mammography Rates Are Below Guideline Recommendations
DENVER – Even before the U.S. Preventive Services Task Force issued its controversial 2009 recommendation, only a slim majority of women with health insurance were getting even one mammogram every 2 years.
Thus, the utilization rate for mammography – be it standard, digital, or MRI – remains well below recommendations, Judie Mopsik said at the annual meeting of the American Public Health Association.
She presented an analysis of longitudinal medical claims data for 4.5 million women aged 18 years or older covered by a national health insurance company with 20 million enrollees. These were women with full access to preventive care. During the study period, 2006-2008, 1.9 million of the 4.5 million women had a mammogram.
The mammographic screening rate during this 2-year window was 9% among 18- to 39-year-olds. The rate was 53% among women aged 40-49 years, a group for whom routine screening isn’t recommended in the latest USPSTF guidelines. At the time of the study, however, the USPSTF’s previous guidelines were in effect, which recommended mammography every 1-2 years starting at age 40, noted Ms. Mopsik, who is president of the Council of Professional Associations on Federal Statistics and vice president for business development at the Lewin Group in Falls Church, Va.
The screening rate within the 2-year study window was 59% among women in their 50s, for whom mammography is routinely recommended at least once every 2 years. And the screening rate was 49% in women aged 60 years or older.
Among women who had two or more mammograms during the 2-year study period, the majority – 56% to 84%, depending upon the age group – had their most recent mammogram within 11-18 months of the prior one. These assiduous adherents to preventive care are likely to find the USPSTF’s reversal of its longtime guidelines calling for screening every 1-2 years particularly troubling, especially since the American Cancer Society still recommends annual mammography starting at age 40 years.
This study was supported by the National Center for Health Statistics. Ms. Mopsik declared that she has no relevant financial conflicts of interest.
FROM THE ANNUAL MEETING OF THE AMERICAN PUBLIC HEALTH ASSOCIATION
Major Finding: The screening mammography rate within a 2-year study window was 59% for women in their 50s and 49% for women aged 60 years or older.
Data Source: Longitudinal medical claims data for 2006-2008 for 4.5 million women covered by a national health insurance company with 20 million covered lives and full access to preventive care.
Disclosures: The study was supported by the National Center for Health Statistics. Ms. Mopsik declared having no relevant financial interests.
Onychomycosis Is Best Tackled With Evidence-Based Strategies
GOTHENBURG, SWEDEN – Onychomycosis remains a difficult disorder to treat and cure, even with modern antifungal agents. But the chances of success can be greatly enhanced through application of several proven, evidence-based strategies.
A recent study identified multiple baseline factors associated with a low cure rate following a standard 3-month course of oral terbinafine for onychomycosis. One preemptive strategy in patients with several of these poor-prognosis factors is to consider combination therapy from the outset. Alternatively, the standard 3 months of terbinafine could be stretched to 5-6 months, Dr. Bardur Sigurgeirsson said at the annual congress of the European Academy of Dermatology and Venereology.
The host-related prognostic factors were identified in Dr. Sigurgeirsson’s recent secondary retrospective analysis of 3-year outcomes in 199 Icelandic participants in a large international randomized trial of continuous versus intermittent terbinafine (J. Eur. Acad. Dermatol. Venereol. 2010;24:679-84).
Several of the prognostic factors were already known, but the study provided the first data validating them, said Dr. Sigurgeirsson of the University of Iceland, Reykjavik. The new information is particularly useful in everyday clinical practice because no universal classification of disease severity exists.
In the multivariate logistic regression analysis, baseline factors associated with a negative outcome at 72 weeks of follow-up – that is, failure to achieve mycologic or clinical cure – included matrix involvement, lateral nail edge involvement, and dermatophytoma. Slow nail growth from screening to baseline was another predictor of lack of cure; this makes sense, as patients with faster-growing nails are likely to shed the infected part sooner, he noted.
Other factors enabling physicians to select good candidates for up-front combination or extended therapy were being over age 65 years, being male, having a history of prior fungal toe infection, and having a positive culture at 24 weeks’ follow-up, even if the nails look good at that point.
Several factors held in popular dermatologic lore to predict poor outcome were not borne out in the study: the extent of involvement, the number of infected toenails, the duration of infection, and the presence of spikes were all unrelated to the 72-week cure rate. There was a trend for patients with thicker nail plates or subungual hyperkeratosis to be less likely to reach cure, but this did not achieve statistical significance, he reported.
The greatest likelihood of cure at 72 weeks’ follow-up after the standard 3 months of oral terbinafine was seen in younger female patients with fast nail growth.
An earlier, randomized, multicenter study by Dr. Sigurgeirsson and coworkers made the case for up-front combination therapy with amorolfine hydrochloride 5% nail lacquer and oral terbinafine for treating onychomycosis in patients with terbinafine monotherapy lack-of-cure risk factors. The trial involved 249 patients; one of the strongest predictors of poor outcome was baseline nail matrix involvement. The success rate at 18 months was 59% for combination therapy, compared with 45% for oral terbinafine monotherapy. The cost per cure was significantly less with combination therapy (Br. J. Dermatol. 2007;157:149-57).
Onychomycosis is best viewed as a chronic relapsing condition, as evidenced by a 5-year, blinded, prospective follow-up study Dr. Sigurgeirsson and colleagues conducted in terbinafine- or itraconazole-treated patients (Arch. Dermatol. 2002;138:353-7). The mycologic relapse rates were 53% in the itraconazole arm and 48% with terbinafine.
In a subsequent study of nearly 4,000 patients, the investigators identified a number of risk factors for recurrent onychomycosis: cancer, 3.4-fold increased risk; psoriasis, 2.4-fold increased risk; tinea pedis interdigitalis, 3.9-fold increased risk; moccasin form of tinea pedis, 4.3-fold increased risk; regular swimming, 2.6-fold increased risk; and having a spouse, parents, or children with onychomycosis, 2.5- to 3.5-fold increased risk (J. Eur. Acad. Dermatol. Venereol. 2004;18:48-51).
These findings were recently confirmed and expanded upon in a Japanese survey of 30,000 dermatology patients. Dermatologists at Teikyo University in Itabashi found most of the same risk factors earlier identified by Dr. Sigurgeirsson and coworkers. In addition, the Japanese investigators identified two previously undescribed risk factors for recurrent infection: more time spent wearing shoes, and having a higher temperature in the home (J. Dermatol. 2010;37:397-406).
Prophylactic therapy is worth considering following cure of onychomycosis in patients at increased risk for relapse based upon their risk factor profile, Dr. Sigurgeirsson said. He and his coworkers recently showed that amorolfine nail lacquer applied once every 2 weeks is safe and effective for this purpose (J. Eur. Acad. Dermatol. Venereol. 2010;24:910-5).
Many of his studies of terbinafine for onychomycosis were supported by research grants from Novartis.
FROM THE ANNUAL CONGRESS OF THE EUROPEAN ACADEMY OF DERMATOLOGY AND VENEREOLOGY
Sun Exposure Recommendations to Boost Vitamin D Criticized
GOTHENBURG, SWEDEN – The popular practice of trying to improve serum vitamin D status through controlled sun exposure is a no-win proposition that's unlikely to result in adequate vitamin D levels year-round without compromising skin health, Brian L. Diffey, Ph.D., asserted in a plenary lecture at the annual congress of the European Academy of Dermatology and Venereology.
"Failure to understand the nature of human exposure to sunlight has led to widespread misguided public health advice concerning the sun exposure necessary for adequate vitamin D status. Messages concerning sun exposure should remain focused on the detrimental effects of excessive sun exposure and avoid giving specific advice on what may be thought to be optimal sun exposure," said Dr. Diffey, professor emeritus of photobiology at the University of Newcastle (England) who has been publishing studies on the relationship between sun exposure and skin cancer for more than 20 years.
"The recommendation for short, casual sun exposure as adequate for a healthy vitamin D status is simply ubiquitous. We read it everywhere. It has become part of our conventional wisdom. Nobody really questions it. But there's been a gross oversight in all of these recommendations: These calculations relate only to exposure under a clear sky with no clouds, [while] lying horizontal in the middle of the day in midsummer with no shade and roughly 25% of our body surface exposed," he explained.
That's simply not how sun exposure occurs in contemporary life. A person walking around in an urban environment with shade from nearby buildings and trees receives a sun exposure on the vertical body surfaces that’s typically one-sixth of that of a sunbather lying horizontally, Dr. Diffey continued.
As examples of the widespread public health messages encouraging limited sun exposure to enhance vitamin D levels, he noted that the U.K.'s National Osteoporosis Society recommends trying to get 10 minutes of sun exposure once or twice a day without sunscreen between May and September for bone health. The U.K. Health Protection Agency states that short periods outdoors will produce sufficient vitamin D. And "The UV Advantage," by Dr. Michael Holick, professor of medicine at Boston University and winner of the 2009 Linus Pauling prize for health research, containing the "Holick formula for safe sun," is a brisk seller.
Dr. Diffey pointed to a recent large international study of month-by-month serum vitamin D levels in individuals living at various latitudes, which concluded that most people have adequate but suboptimal levels during the summer months, with a mean of 70 nmol/L; the investigators deemed a level greater than 75 nmol/L to be optimal. In the winter months, most people fall into the 'inadequate' range, with a mean serum vitamin D level of 48 nmol/L (BMJ 2010;340:b5664 [doi:10.1136/bmj.b5664]).
Study data show that most people in Europe and North America spend an average of 1-2 hours per day outdoors during the summer, yet they are generally regarded as having suboptimal vitamin D levels during those months and as being vitamin D insufficient the rest of the year. A recommendation for 10-20 minutes of daily casual sun exposure followed by sun avoidance would therefore be "grossly insufficient" to maintain adequate vitamin D levels, he said.
"In fact, if people really did follow the conventional public health advice, we would be much more vitamin D insufficient than we now are," according to the photobiologist.
The safe and effective ways to raise vitamin D levels, Dr. Diffey said, are more widespread fortification of foods or the use of supplements, especially during the winter months.
For dermatologists, he added, there's another effective option: "Pop into your UVB cabin once a week from November to February when nobody's looking and give yourself 1 SED [standard erythema dose], which is about one-third of the minimal erythema dose." His recently published mathematical model (Br. J. Dermatol. 2010;162:1342-8) predicts that this modest UVB exposure, which adds up to a little over one-tenth of a typical UVB treatment course for psoriasis, would keep the recipient in the adequate range for serum vitamin D throughout the dark months.
Dr. Diffey said he has no relevant financial conflicts of interest.
EXPERT ANALYSIS FROM THE ANNUAL CONGRESS OF THE EUROPEAN ACADEMY OF DERMATOLOGY AND VENEREOLOGY
Briakinumab Boosts Quality of Life for Psoriasis Patients
GOTHENBURG, SWEDEN – Psoriasis patients given the investigational interleukin-12/interleukin-23 inhibitor briakinumab had significantly better scores on health-related quality of life measures than did patients given etanercept in a phase III randomized, double-blind clinical trial.
"These results further enhanced the treatment benefits of briakinumab on patients’ lives beyond the previously described clinical efficacy in significantly reducing psoriasis symptoms versus placebo and etanercept," Yanjun Bao, Ph.D., said at the annual congress of the European Academy of Dermatology and Venereology.
The 12-week study involved 347 psoriasis patients who were randomized double-blind 2:2:1 to briakinumab, etanercept at 50 mg twice weekly, or placebo. Briakinumab was dosed at 200 mg at weeks 0 and 4, then 100 mg at week 8.
Treatment with briakinumab resulted in a mean 10.3-point reduction in Dermatology Life Quality Index scores from a baseline of 12.4, which was significantly greater than the 8.1-point decrease in the etanercept group or the 3.0-point decline in the placebo arm, reported Dr. Bao of Abbott Laboratories in Abbott Park, Ill.
In addition, the briakinumab group’s mean 29.1-point improvement in the visual analog scale for psoriasis-related pain from a baseline score of 34.5 was significantly larger than the 24-point reduction with etanercept and the 6.1-point decrease with placebo. The mean score on the Short Form-36 mental component summary improved by 5.4 points in the briakinumab group from a baseline of 45.9, a significantly greater response than the 3.2-point change with etanercept or the 1-point change with placebo, she continued.
The phase III study was sponsored by Abbott.
FROM THE ANNUAL CONGRESS OF THE EUROPEAN ACADEMY OF DERMATOLOGY AND VENEREOLOGY
Major Finding: Scores on the Dermatology Life Quality Index declined by a mean of 10.3 points with briakinumab, 8.1 points with etanercept, and 3.0 points with placebo.
Data Source: A 12-week double-blind study of 347 psoriasis patients randomized 2:2:1 to briakinumab, etanercept at 50 mg twice weekly, or placebo. Briakinumab was dosed at 200 mg at weeks 0 and 4, then 100 mg at week 8.
Disclosures: The phase III study was sponsored by Abbott. Dr. Bao is employed by Abbott Laboratories.
Marijuana Use May Protect Against Diabetes
Major Finding: The age-adjusted prevalence of diabetes was 4% in nonusers and significantly lower at 3% in marijuana users. In a multiple logistic regression analysis adjusted for sociodemographic factors, comorbid conditions, laboratory values, and inflammatory markers, marijuana users had a 66% lower likelihood of having diabetes.
Data Source: A cross-sectional study involving 10,896 NHANES III participants aged 20-59 years.
Disclosures: The study was funded by Omics Biotechnology, which is pursuing potential medical applications for nonpsychotropic cannabinoid receptor agonists. Dr. Shaheen declared she has no relevant financial relationships.
DENVER — Marijuana use may be associated with a markedly decreased risk of diabetes.
A provocative new analysis of data from the Third National Health and Nutrition Examination Survey (NHANES III) indicates marijuana users had 66% lower odds of having diabetes after adjustment for numerous potential confounding factors, Dr. Magda Shaheen reported at the meeting.
This robust observed benefit has a biologically plausible mechanism, she noted.
In addition to defects in pancreatic beta-cell function and insulin sensitivity, the pathogenesis of diabetes is thought to involve systemic inflammation. Marijuana contains bioactive cannabinoids that have been shown to have an anti-inflammatory effect. This was borne out in the NHANES III analysis, in which the prevalence of an elevated C-reactive protein level (greater than 0.5 mg/dL) was significantly higher in nonusers of marijuana, at 18.9%, than in past users (13%), current light users (16%), or current heavy users of the drug (9%), according to Dr. Shaheen of Charles R. Drew University of Medicine and Science, Los Angeles.
The study population consisted of 10,896 NHANES III participants aged 20-59 years; they constituted a statistically representative sample of the broader U.S. civilian population in 1988-1994, when the survey was conducted.
The majority of subjects – 55% – reported never having used marijuana. Another 37% were past users, meaning they hadn't used marijuana during the previous month. The 6% of subjects who reported currently using the drug 1-4 days per month were categorized as current light users, while 3.3% of subjects were current heavier users.
The age-adjusted prevalence of diabetes in this cross-sectional study was 4% in nonusers and significantly lower at 3% in marijuana users.
Current and past users of marijuana were significantly younger, had a lower body mass index, and were more physically active than were nonusers. They were also more likely to smoke cigarettes, drink alcohol, and use cocaine. In addition, they were more likely to have an HDL level greater than 40 mg/dL and had lower mean total cholesterol, LDL, and triglyceride levels.
In a multiple logistic regression analysis adjusted for sociodemographic factors, comorbid conditions, laboratory values, and inflammatory markers, marijuana users had a 66% lower likelihood of having diabetes. This benefit was confined to the 41- to 59-year-old age group, where the reduction in diabetes risk associated with marijuana use was 67%. In contrast, the 7% reduction in risk among 20- to 40-year-olds was not statistically significant. These findings could be the result of the markedly higher occurrence of diabetes in middle age.
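The "66% lower likelihood" phrasing is the conventional way of reporting an adjusted odds ratio of roughly 0.34. The odds ratios in the sketch below are inferred from the reported percentage reductions rather than quoted from the presentation, so they should be read as illustrative of the conversion only.

```python
# Converting between an adjusted odds ratio and a "% lower odds" statement.
# The odds ratios below are inferred from the reported 66%, 67%, and 7% reductions;
# they were not stated explicitly in the presentation.

def percent_reduction(odds_ratio: float) -> float:
    """Percent lower odds implied by an odds ratio below 1."""
    return (1.0 - odds_ratio) * 100.0

for label, odds_ratio in [("all adults (inferred)", 0.34),
                          ("41-59 years (inferred)", 0.33),
                          ("20-40 years (inferred)", 0.93)]:
    print(f"{label}: OR {odds_ratio:.2f} -> {percent_reduction(odds_ratio):.0f}% lower odds")
```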
In contrast to diabetes, marijuana use was not associated with a lower prevalence of the other chronic diseases examined by Dr. Shaheen and her coworkers in which systemic inflammation also plays a role: myocardial infarction, heart failure, stroke, and hypertension. “This was probably due to the lower prevalence of these diseases in this age group,” she commented.
From the Annual Meeting of the American Public Health Association
Managing Rheumatologic Diseases in Pregnancy
SNOWMASS, COLO. – Corticosteroids can be thought of as the 'go-to' drugs for the management of rheumatologic disorders in pregnancy.
“Corticosteroids have been my ace in the hole in treating many patients during pregnancy. They're potent immunosuppressives that can get you out of a lot of trouble. And although they can have side effects, if used judiciously they are a reasonable treatment choice,” Dr. Bonnie L. Bermas stressed at the symposium.
Reassuringly, transplant registries comprising many tens of thousands of organ recipients have shown no increased rate of congenital anomalies with the use of corticosteroids in pregnancy.
However, an influential University of Toronto meta-analysis has concluded that “although prednisone does not represent a major teratogenic risk in humans at therapeutic doses, it does increase by an order of 3.4-fold the risk of oral cleft” (Teratology 2000;62:385-92).
“What this translates to in your practice is, the cleft palate incidence increases from 1 in 1,000 in the general population to about 1 in 300 live births exposed to steroids in utero. That's how I counsel my patients who need to be on corticosteroids in the first trimester,” said Dr. Bermas, clinical director of the lupus center at Brigham and Women's Hospital in Boston.
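The 1-in-300 counseling figure follows from applying the meta-analysis's 3.4-fold estimate to the 1-in-1,000 baseline; a quick arithmetic check, treating the reported fold increase as acting directly on the baseline risk (a reasonable approximation for a rare outcome):

```python
# Checking the counseling figure against the meta-analysis estimate.
# For a rare outcome, the 3.4-fold increase can be applied directly to the
# baseline risk as an approximation (odds ~ risk when the risk is small).

baseline_risk = 1 / 1000          # cleft palate incidence in the general population
relative_increase = 3.4           # from the Toronto meta-analysis

exposed_risk = baseline_risk * relative_increase
print(f"~1 in {round(1 / exposed_risk)}")   # ~1 in 294, i.e. roughly 1 in 300
```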
After 12-14 weeks' gestation, however, the palate is formed. And although steroids are no longer associated with an increased risk for cleft palate after that point in gestation, other risks remain. These include gestational diabetes, gestational hypertension, osteoporosis in the mother, premature rupture of the membranes, and small-for-gestational-age infants.
Prednisone and methylprednisolone—the steroids rheumatologists utilize most often—don't cross the placenta efficiently, and hence are much less likely to cause fetal adverse effects than are dexamethasone or betamethasone.
Steroids that are administered to the mother make their way into breast milk only in low concentrations. If she's on less than 20 mg/day of prednisone, she can breastfeed normally. For women on higher doses, Dr. Bermas advises pumping and discarding the breast milk for the first 4 hours after a dose is taken.
Dr. Bermas emphasized that the key to successful treatment of rheumatologic disorders during pregnancy is a clear-eyed assessment of and accommodation to the patient's tolerance for risk—and the physician's, as well.
“There are some women who do not drink caffeinated beverages or take any medications, not even a Tylenol, and who will eat only organic foods while pregnant. There are others who are willing to tolerate some risk during pregnancy. And as clinicians, we have our own risk tolerances, too. For example, azathioprine is a medication that I feel comfortable using during pregnancy, but I have colleagues who won't because they wouldn't be able to sleep at night,” she explained.
The reason she prescribes azathioprine during pregnancy—despite its category D rating from the Food and Drug Administration, indicating “positive evidence of risk”—is that there's an enormous transplant literature showing no increase in congenital anomalies with in utero exposure to this drug.
Mycophenolate mofetil (CellCept) also has a category D rating. But unlike azathioprine, mycophenolate mofetil has no extensive and reassuring transplant literature. As a result, Dr. Bermas said that she avoids it in pregnancy and nursing.
Other rheumatologic medications to avoid in pregnancy are methotrexate, penicillamine, 6-mercaptopurine, and chlorambucil, she continued.
The use of tumor necrosis factor inhibitors during pregnancy is an extremely challenging question. Although at present the FDA rates them as category B (“no evidence of risk in humans”), that could very well change as a result of a reported association (J. Rheumatol. 2009;36:635-41) with VACTERL anomalies, which include vertebral anomalies, anal atresia, cardiac defects, tracheoesophageal fistula, renal anomalies, and limb abnormalities. On the other hand, several editorials and review articles have expressed the view that the risk of VACTERL anomaly after in utero exposure is overstated.
When lupus patients on antimalarials become pregnant, Dr. Bermas said she generally keeps them on the medication. She also allows patients to remain on antimalarials while nursing, which is consistent with the position of the American Academy of Pediatrics.
For mild cases of rheumatologic disease in pregnancy, Dr. Bermas reported that she relies on NSAIDs and/or prednisone at 5-10 mg/day. She halts the NSAID after the second trimester in order to avoid premature closure of a patent ductus arteriosus. For an inflammatory mild arthritis, she considers adding sulfasalazine.
She said she manages moderate disease with higher-dose steroids, azathioprine, or cyclosporine. For severe disease, Dr. Bermas reported that she turns to pulse steroids, azathioprine, cyclosporine, or intravenous immunoglobulin. In life-or-death situations, there are many case reports of cyclophosphamide being used successfully in the third trimester, a time by which most organogenesis is completed.
Dr. Bermas reported having no financial conflicts of interest.
“Corticosteroids have been my ace in the hole in treating many patients … [They] can get you out of a lot of trouble.”
Source: DR. BERMAS
From a Symposium Sponsored by the American College of Rheumatology
Suture Selection Optimizes Surgical Repair
DENVER – Use of unidirectional knotless barbed suture in laparoscopic myomectomy offers several key advantages over conventional continuous suture with intraoperative knots, according to an award-winning prospective randomized trial.
The unidirectional knotless barbed suture resulted in faster repair of uterine wall defects as well as less intraoperative blood loss, Dr. Simone Ferrero reported at the meeting.
Thus, using unidirectional knotless barbed suture to close uterine wall defects after laparoscopic enucleation of fibroids solves two of the biggest challenges laparoscopic surgeons face in endoscopic suturing: the difficulties in knot tying and maintenance of adequate tension on the suture line, said Dr. Ferrero of San Martino Hospital and the University of Genoa (Italy).
He reported on 44 women undergoing laparoscopic myomectomy who were randomized to closure of uterine wall defects with either the V-Loc 180 barbed absorbable suture made by Covidien or Ethicon's Vicryl suture material.
Participants had a median of one intramural fibroid averaging 7.5 cm at its greatest diameter.
Location of the myomas was similar in the two groups.
The V-Loc features a surgical needle at one end and a loop at the other, which is used to secure the barbed suture; when the suture is completed, the surgeon snips off and removes the needle.
Total operative time in the two study arms was similar.
However, the mean 11.5 minutes required to suture the uterine wall defect in the V-Loc group was significantly shorter than the 17.4 minutes required with continuous suture and intraoperative knots.
Moreover, the mean difference in hemoglobin concentration between the day before surgery and the day after was 0.6 g/dL in the V-Loc group compared with 0.9 g/dL with Vicryl with intraoperative knots, indicating significantly less intraoperative blood loss occurred in women whose uterine wall defects were repaired with the unidirectional knotless barbed suture, he reported.
The reduced blood loss with V-Loc likely stems from faster closure of the defects, coupled with the fact that the barbed suture resists migration and so maintains tension on the suture line, Dr. Ferrero continued.
After each operation, the surgeons rated the degree of difficulty in suturing the uterine wall defects using a 1-10 visual analog scale.
Surgeons rated suturing as markedly more difficult with continuous suture and intraoperative knots, giving it a mean score of 6.1 out of a possible 10, compared with 3.7 for the unidirectional knotless barbed suture.
The Italian single-center clinical trial was awarded the Society of Reproductive Surgeons' Prize as the outstanding study in that field presented at the ASRM meeting.
Planned future studies include an evaluation of whether the use of a unidirectional knotless barbed suture affects the risk of uterine rupture during pregnancy.
Dr. Ferrero and his colleagues also have an ongoing study examining whether the appearance of the uterine wall scar 6 months after myomectomy differs according to the type of suture used in the repair.
Dr. Ferrero said he had no financial conflicts of interest.
From the Annual Meeting of the American Society for Reproductive Medicine