Women usually given dabigatran in lower dose
In real-world practice, women who are prescribed dabigatran for atrial fibrillation (AF) are usually given the lower dose of the drug, even though the higher dose appears to be more protective against stroke, according to a Canadian database analysis published online Oct. 27 in Circulation: Cardiovascular Quality and Outcomes.
In RE-LY, the main randomized clinical trial showing that dabigatran was more effective at preventing stroke and provoked fewer bleeding episodes than warfarin, only 37% of the participants were women (N Engl J Med. 2009 Sep 17;361(12):1139-51). This raises the question of whether the study’s results are truly applicable to women. In addition, the women in that trial showed plasma concentrations of dabigatran that were 30% higher than those in men, suggesting that the drug’s safety profile may differ between women and men. However, there was no mention of sex differences related to outcomes, said Meytal Avgil Tsadok, Ph.D., of the division of clinical epidemiology, McGill University Health Center, Montreal, and her associates.
To examine sex-based differences in prescribing patterns in real-world practice, the investigators performed a population-based cohort study among 631,110 residents of Quebec who were discharged from the hospital with either a primary or a secondary diagnosis of AF during a 14-year period. They identified 15,918 dabigatran users and matched them for comorbidity, age at AF diagnosis, and date of first prescription for anticoagulants with 47,192 warfarin users (control subjects). The 31,786 women and 31,324 men participating in this study were followed for a median of 1.3 years (range, 0-3.2 years) for the development of stroke/TIA, bleeding events, or hospitalization for MI.
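The matching step described above pairs each dabigatran user with warfarin controls who share key covariates. As a loose sketch of how exact matching on a covariate key works (the field names, groupings, and the up-to-three-controls rule below are illustrative assumptions, not the study's actual algorithm):

```python
from collections import defaultdict

# Illustrative sketch of exact matching on a covariate key.
# Field names and the up-to-3-controls rule are assumptions for
# demonstration; the study's actual matching procedure may differ.

def match_controls(cases, controls, keys, max_controls=3):
    """Pair each case with up to `max_controls` controls sharing the same covariate key."""
    pool = defaultdict(list)
    for ctrl in controls:
        pool[tuple(ctrl[k] for k in keys)].append(ctrl)
    matched = []
    for case in cases:
        key = tuple(case[k] for k in keys)
        picked = pool[key][:max_controls]
        pool[key] = pool[key][len(picked):]  # use each control only once
        if picked:
            matched.append((case, picked))
    return matched

cases = [{"id": 1, "age_group": "75+", "comorbidity": 2, "rx_year": 2012}]
controls = [
    {"id": 10, "age_group": "75+", "comorbidity": 2, "rx_year": 2012},
    {"id": 11, "age_group": "75+", "comorbidity": 2, "rx_year": 2012},
    {"id": 12, "age_group": "<75", "comorbidity": 0, "rx_year": 2010},
]
pairs = match_controls(cases, controls, keys=("age_group", "comorbidity", "rx_year"))
print(len(pairs), len(pairs[0][1]))  # 1 case matched, with 2 eligible controls
```

A roughly 1:3 case-to-control ratio like the study's (15,918 dabigatran users to 47,192 warfarin users) can result from this kind of design, though the investigators' exact procedure is not described in this summary.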
The researchers found that dabigatran use differed markedly between women and men. Men were prescribed the lower dose (110 mg) in nearly equal numbers with the higher dose (150 mg) of dabigatran, but women were prescribed the lower dose (64.8%) much more often than the higher dose (35.2%). In a further analysis of the data, women were much more likely than men to fill prescriptions of the lower dose (odds ratio, 1.35). This was true even though women, but not men, showed a trend toward a lower incidence of stroke when prescribed the higher dose of dabigatran, compared with warfarin.
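The odds ratio reported above compares the odds of low-dose prescribing in women versus men. As a rough illustration of the calculation (the counts below are hypothetical, chosen only to echo the article's percentages; the study's reported odds ratio of 1.35 came from an adjusted model, so a crude two-by-two calculation will not reproduce it):

```python
# Illustrative only: hypothetical counts echoing the article's percentages
# (64.8% low dose vs. 35.2% high dose in women; roughly even in men).
# The published OR of 1.35 was adjusted for covariates, so this crude
# unadjusted estimate differs from it.

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table [[a, b], [c, d]]: (a/b) / (c/d)."""
    return (a / b) / (c / d)

women_low, women_high = 648, 352  # hypothetical counts per 1,000 women
men_low, men_high = 500, 500      # hypothetical counts per 1,000 men

crude_or = odds_ratio(women_low, women_high, men_low, men_high)
print(round(crude_or, 2))  # 1.84
```

The "35% higher chances" quoted later in the article is simply the adjusted odds ratio of 1.35 restated as a percentage increase in odds.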
These prescribing practices remained consistent even when the participants were categorized according to age. More women than men used low-dose dabigatran whether they were younger than 75 years of age (22.8% vs. 18.5%) or older than 75 years (83.5% vs. 76.0%). These findings show that, regardless of patient age or comorbidities, “women have 35% higher chances to be prescribed a lower dabigatran dose than men, although women have a higher baseline risk for stroke,” Dr. Tsadok and her associates said (Circ Cardiovasc Qual Outcomes. 2015 Oct 27 [doi: 10.1161/circoutcomes.114.001398]).
The reason for this discrepancy is not yet known. A similar pattern of prescribing was noted in a Danish population-based cohort study. It’s possible that clinicians perceive women as frailer patients than men, “so they tend to be more concerned about safety and, therefore, prescribe women with a lower dose, compromising efficacy,” the investigators said.
Their study was limited in that follow-up was relatively short at approximately 1 year. It is therefore possible that they underestimated the risks of stroke/TIA, bleeding events, or MI hospitalization, Dr. Tsadok and her associates added.
This study was supported by the Canadian Institutes of Health Research. Dr. Tsadok and her associates reported having no relevant financial disclosures.
FROM CIRCULATION: CARDIOVASCULAR QUALITY AND OUTCOMES
Key clinical point: Women prescribed dabigatran for AF are usually given the lower dose of the drug, for unknown reasons.
Major finding: Men were prescribed the lower dose (110 mg) in nearly equal numbers with the higher dose (150 mg) of dabigatran, but women were prescribed the lower dose (64.8%) much more often than the higher dose (35.2%).
Data source: An analysis of prescribing patterns in a population-based cohort of 31,786 women and 31,324 men who had AF living in Quebec.
Disclosures: This study was supported by the Canadian Institutes of Health Research. Dr. Tsadok and her associates reported having no relevant financial disclosures.
Physicians unlikely to scale back doses of BP, glycemic meds in elderly
Physicians seem unwilling to reduce the doses of antihypertensive and hypoglycemic medications in older patients, even when these treatments reduce blood pressure and hemoglobin A1c to well below recommended levels and can cause clear harm, according to a report published online Oct. 26 in JAMA Internal Medicine.
This indicates that clinicians must adopt a new perspective regarding cardiovascular treatments and “assess the harms of intensive therapy just as they do the benefits,” wrote Dr. Jeremy B. Sussman of the Department of Veterans Affairs Center for Clinical Management Research and the University of Michigan’s Institute of Healthcare Policy and Innovation, both in Ann Arbor.
To examine the frequency of cutting the intensity of treatment among older patients with type 2 diabetes, Dr. Sussman and his colleagues performed a retrospective analysis of a Veterans Affairs database, focusing on all primary care patients aged 70 years and older with type 2 diabetes. They assessed pharmacy records from a 1-year period to identify deintensification among 211,667 patients who were receiving antihypertensive medications and 179,991 who were receiving medications to reduce HbA1c. Many had multiple comorbidities, and many were nearing the end of their lives.
A total of 51% of the BP cohort and 20% of the HbA1c cohort achieved blood pressure readings or HbA1c levels either lower or much lower than recommended target levels, yet physicians did not reduce or change their medications. Just as worrisome, patients with very low BP or very low HbA1c were no more likely than were those with normal levels to undergo medication adjustments, the investigators said (JAMA Intern Med. 2015 Oct 26. doi: 10.1001/jamainternmed.2015.5110).
In fact, the majority (61.6%) of patients with very low, potentially dangerous blood pressure did not have their blood pressure measured during the ensuing 6 months, and the majority (79.8%) of patients with very low, potentially dangerous HbA1c did not have their HbA1c measured during the ensuing 6 months. This suggests that health care professionals did not recognize very low levels as a problem in need of monitoring, Dr. Sussman and his associates noted.
Most concerning of all, even patients with very low BP and/or very low HbA1c levels who had a short life expectancy were unlikely to have their medication regimen eased up. Such patients are particularly unlikely to benefit from these therapies and are particularly vulnerable to their adverse effects, the researchers said.
One reason for this kind of overtreatment is that its harms usually are not addressed in clinical guidelines, quality-of-care measures, or pay-for-performance programs. “Until guidelines and performance measures specifically call for deintensification for patients who are at risk for being harmed by overtreatment, rates [of deintensification] are likely to remain low,” they added.
The failure to reduce or change antihypertensive or glycemic medication – even when the patient’s blood pressure and HbA1c are very low and even when the patient has a short life expectancy – indicates that physicians are generally reluctant to reduce the intensity of treatment.
Sussman et al. call for changing clinical guidelines, quality measures, and performance management to include recommendations and incentives to avoid overtreatment. But before this can be done, the harms of overtreatment must be better documented, specific risk groups must be identified, and particular target levels for BP and HbA1c must be determined, using data from both clinical trials and large observational studies.
Clinical performance measures that couple ratcheting down the intensity of treatment with appropriate clinical assessment and monitoring seem a reasonable way to safely discontinue unnecessary and potentially harmful treatments while retaining the benefits of cardiovascular prevention.
Dr. Enrico Mossello is in the division of geriatric medicine and cardiology and the department of experimental and clinical medicine at the University of Florence (Italy) and Careggi Teaching Hospital. He reported having no relevant financial disclosures. Dr. Mossello made these remarks in an Invited Commentary accompanying Dr. Sussman’s report (JAMA Intern Med. 2015 Oct 26. doi: 10.1001/jamainternmed.2015.5941).
FROM JAMA INTERNAL MEDICINE
Key clinical point: Ratcheting down the doses of antihypertensive and hypoglycemic medications remains an uncommon clinical practice in older patients even when these treatments could be harmful.
Major finding: In a study of patients with diabetes mellitus, 51% of those being treated for hypertension and 20% of those on medication for diabetes achieved blood pressure or hemoglobin A1c levels either lower or much lower than recommended target levels, yet their medications were not reduced or changed.
Data source: A 1-year retrospective cohort study involving 211,667 patients older than 70 years with type 2 diabetes who were receiving BP medications and 179,991 receiving hypoglycemic medications.
Disclosures: This study was supported in part by the Veterans Health Administration’s Office of Informatics and Analytics and the Veterans Affairs Health Services Research and Development Service. Dr. Sussman and his associates reported having no relevant financial disclosures.
Benefits, risks of total knee replacement for OA illuminated in trial
Total knee replacement was superior to nonsurgical treatment in relieving pain, restoring function, and improving quality of life for patients with moderate to severe knee osteoarthritis, according to a report published online Oct. 22 in the New England Journal of Medicine.
Even though the number of total knee replacements performed each year is large and steadily increasing – with more than 670,000 done in 2012 in the United States alone – no high-quality randomized, controlled trials have ever compared the effectiveness of the procedure against nonsurgical treatment, said Søren T. Skou, Ph.D., of the Research Unit for Musculoskeletal Function and Physiotherapy, Institute of Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, and his associates.
Dr. Skou and his colleagues remedied that situation by randomly assigning 100 adults (mean age, 66 years) who were eligible for unilateral total knee replacement either to undergo the procedure and then receive a comprehensive nonsurgical intervention (50 patients) or to receive the comprehensive nonsurgical intervention alone (50 patients) at two specialized university clinics in Denmark. The 12-week nonsurgical intervention comprised a twice-weekly group exercise program to restore neutral, functional realignment of the legs; two 1-hour education sessions regarding osteoarthritis characteristics, treatments, and self-help strategies; a dietary (weight-loss) program; provision of individually fitted insoles with medial arch support and a lateral wedge for patients with knee-lateral-to-foot positioning; and as-needed pain medication (acetaminophen and ibuprofen) plus pantoprazole, a proton-pump inhibitor.
The primary outcome measure in the trial was the between-group difference at 1 year in improvement on four subscales of the Knee Injury and Osteoarthritis Outcome Scores (KOOS) for pain, symptoms, activities of daily living, and quality of life. The surgical group showed a significantly greater improvement (32.5 out of a possible 100 points) than the nonsurgical group (16.0 points) in this outcome. The surgical group also showed significantly greater improvements in all five individual subscales and in a timed chair-rising test, a timed 20-meter walk test, and on a quality-of-life index, the investigators said (N Engl J Med. 2015;373(17):1597-606).
However, it is important to note that patients who had only the nonsurgical intervention showed clinically relevant improvements, and only 26% of them chose to have the surgery after the conclusion of the study. As expected, the surgical group had more serious adverse events than did the nonsurgical group (24 vs. 6), including three cases of deep venous thrombosis and three cases of knee stiffness requiring brisement forcé while the patient was anesthetized, Dr. Skou and his associates said.
This study was supported by the Obel Family Foundation, the Danish Rheumatism Association, the Health Science Foundation of the North Denmark Region, Foot Science International, Spar Nord Foundation, the Bevica Foundation, the Association of Danish Physiotherapists Research Fund, the Medical Specialist Heinrich Kopp’s Grant, and the Danish Medical Association Research Fund. Dr. Skou and his associates reported having no relevant financial disclosures.
This study provides the first rigorously controlled data to inform discussions about whether patients should undergo total knee replacement or opt for comprehensive nonsurgical treatment. Surgery proved markedly superior in this trial, with 85% of surgical patients reporting a clinically important improvement in pain and function at 1 year, compared with 68% of nonsurgical patients.
But surgery was associated with several severe adverse events, including deep venous thrombosis, deep wound infection, supracondylar fracture, and stiffness requiring treatment under general anesthesia. Each patient must weigh these considerations; each physician should present the relevant data to their patients and then listen carefully to their preferences.
Dr. Jeffrey N. Katz is in the departments of medicine and orthopedic surgery at Brigham and Women’s Hospital and Harvard University, Boston. He reported having no relevant financial disclosures. Dr. Katz made these remarks in an editorial accompanying Dr. Skou’s report (N Engl J Med. 2015;373(17):1668-9).
![]() |
Dr. Jeffrey N. Katz |
This study provides the first rigorously controlled data to inform discussions about whether patients should undergo total knee replacement or opt for comprehensive nonsurgical treatment. Surgery proved markedly superior in this trial, with 85% of surgical patients reporting a clinically important improvement in pain and function at 1 year, compared with 68% of nonsurgical patients.
But surgery was associated with several severe adverse events, including deep venous thrombosis, deep wound infection, supracondylar fracture, and stiffness requiring treatment under general anesthesia. Each patient must weigh these considerations; each physician should present the relevant data to their patients and then listen carefully to their preferences.
Total knee replacement was superior to nonsurgical treatment in relieving pain, restoring function, and improving quality of life for patients with moderate to severe knee osteoarthritis, according to a report published online Oct. 22 in the New England Journal of Medicine.
Even though the number of total knee replacements performed each year is large and steadily increasing – with more than 670,000 done in 2012 in the United States alone – no high-quality randomized, controlled trials have ever compared the effectiveness of the procedure against nonsurgical treatment, said Søren T. Skou, Ph.D., of the Research Unit for Musculoskeletal Function and Physiotherapy, Institute of Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, and his associates.
Dr. Skou and his colleagues remedied that situation by randomly assigning 100 adults (mean age, 66 years) who were eligible for unilateral total knee replacement to either undergo the procedure and then receive a comprehensive nonsurgical intervention (50 patients) or receive the comprehensive nonsurgical intervention alone (50 patients) at two specialized university clinics in Denmark. The 12-week nonsurgical intervention comprised a twice-weekly group exercise program to restore neutral, functional realignment of the legs; two 1-hour education sessions regarding osteoarthritis characteristics, treatments, and self-help strategies; a dietary (weight-loss) program; provision of individually fitted insoles with medial arch support and a lateral wedge if patients had knee-lateral-to-foot positioning; and as-needed pain medication – acetaminophen and ibuprofen – plus pantoprazole, a proton-pump inhibitor.
The primary outcome measure in the trial was the between-group difference at 1 year in improvement on four subscales of the Knee Injury and Osteoarthritis Outcome Scores (KOOS) for pain, symptoms, activities of daily living, and quality of life. The surgical group showed a significantly greater improvement (32.5 out of a possible 100 points) than the nonsurgical group (16.0 points) in this outcome. The surgical group also showed significantly greater improvements in all five individual subscales and in a timed chair-rising test, a timed 20-meter walk test, and on a quality-of-life index, the investigators said (N Engl J Med. 2015;373[17]:1597-606).
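The composite primary outcome amounts to a simple average: the KOOS4 score is the unweighted mean of the four subscale change scores. The sketch below illustrates the arithmetic; the individual subscale values are hypothetical, chosen only so that the group means match the reported 32.5- and 16.0-point improvements.

```python
# KOOS4: unweighted mean of four KOOS subscale change scores (0-100 scale).
# The subscale values below are illustrative, not the trial's actual data;
# they are chosen so the group means match the reported 32.5 and 16.0 points.
def koos4(changes: dict) -> float:
    """Composite score: mean of the pain, symptoms, ADL, and QoL changes."""
    return sum(changes.values()) / len(changes)

surgical = {"pain": 35.0, "symptoms": 30.0, "adl": 34.0, "qol": 31.0}
nonsurgical = {"pain": 18.0, "symptoms": 14.0, "adl": 17.0, "qol": 15.0}

print(koos4(surgical))                        # 32.5
print(koos4(nonsurgical))                     # 16.0
print(koos4(surgical) - koos4(nonsurgical))   # 16.5-point between-group difference
```

The 16.5-point gap in composite improvement is the quantity the trial tested for significance.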
However, it is important to note that patients who had only the nonsurgical intervention showed clinically relevant improvements, and only 26% of them chose to have the surgery after the conclusion of the study. As expected, the surgical group had more serious adverse events than did the nonsurgical group (24 vs. 6), including three cases of deep venous thrombosis and three cases of knee stiffness requiring brisement forcé while the patient was anesthetized, Dr. Skou and his associates said.
This study was supported by the Obel Family Foundation, the Danish Rheumatism Association, the Health Science Foundation of the North Denmark Region, Foot Science International, Spar Nord Foundation, the Bevica Foundation, the Association of Danish Physiotherapists Research Fund, the Medical Specialist Heinrich Kopp’s Grant, and the Danish Medical Association Research Fund. Dr. Skou and his associates reported having no relevant financial disclosures.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Total knee replacement is superior to nonsurgical treatment in decreasing pain and improving function and quality of life.
Major finding: The surgical group showed a significantly greater improvement 1 year from baseline (32.5 out of a possible 100 points) than did the nonsurgical group (16.0 points) in mean Knee Injury and Osteoarthritis Outcome Scores (KOOS) for pain, symptoms, activities of daily living, and quality of life.
Data source: A randomized, controlled trial comparing 1-year outcomes after total knee replacement (50 patients) vs. nonsurgical treatment (50 patients) for osteoarthritis.
Disclosures: This study was supported by the Obel Family Foundation, the Danish Rheumatism Association, the Health Science Foundation of the North Denmark Region, Foot Science International, Spar Nord Foundation, the Bevica Foundation, the Association of Danish Physiotherapists Research Fund, the Medical Specialist Heinrich Kopp’s Grant, and the Danish Medical Association Research Fund. Dr. Skou and his associates reported having no relevant financial disclosures.
American Cancer Society recommends annual mammography starting at age 45
For asymptomatic women at average risk of breast cancer, the American Cancer Society recommends annual mammograms from age 45 until age 54, with a transition to biennial screening mammography starting at age 55, according to new guidelines published Oct. 20.
This is the first time the American Cancer Society (ACS) has updated its breast cancer screening guidelines since 2003. The new version makes several changes, including shifting the start of annual mammography from age 40 to 45 years, and increasing the suggested screening interval for postmenopausal women (JAMA. 2015;314[15]:1599-1614. doi:10.1001/jama.2015.12783).
For the first time, the guidelines address the question of when to stop routine mammography, recommending a halt to routine screening for women with a life expectancy under 10 years. The ACS guidelines also recommend against clinical breast examinations at any age.
These changes bring the ACS guidelines more into line with recommendations from the U.S. Preventive Services Task Force, Dr. Nancy L. Keating and Dr. Lydia E. Pace, both of Brigham and Women’s Hospital, Boston, wrote in an editorial accompanying the report.
The two organizations are now in agreement on most recommendations and emphasize that breast cancer screening decisions should be individualized to reflect a woman’s values and preferences, not just her underlying risk. Both sets of recommendations also give greater consideration to the potential harms of mammography: overdiagnosis and overtreatment of indolent breast cancers, as well as false-positive results, additional imaging studies, and unnecessary biopsies.
The ACS updated the guideline after noting that new evidence had accumulated from long-term follow-up of both randomized controlled trials and population-based screening programs. The guideline development group, which included four clinicians, two biostatisticians, two epidemiologists, an economist, and two patient representatives, based its revised recommendations on an independent systematic evidence review of the breast cancer screening literature conducted by the Duke University Evidence Synthesis Group, as well as an analysis of screening intervals and outcomes from the Breast Cancer Surveillance Consortium.
For asymptomatic women at average risk of developing breast cancer, the ACS guideline makes the following recommendations:
Begin routine annual screening mammography at age 45 years (rather than age 40). Assessing the burden of breast cancer by 5-year rather than 10-year age categories demonstrated that the risk/benefit profiles of women aged 40-44 years differed markedly from those of older women and no longer warranted a recommendation to begin screening at age 40, wrote Dr. Kevin C. Oeffinger of Memorial Sloan Kettering Cancer Center, New York, and his associates in the ACS Guideline Development Group.
However, the ACS encourages clinicians to discuss breast cancer screening with patients “around the age of 40 years.” Women who want to begin annual screening mammography before age 45, based on a clear consideration of the trade-offs, should be given that choice, they wrote.
“Some women will value the potential early detection benefit and will be willing to accept the risk of additional testing,” Dr. Oeffinger and his associates wrote. “Other women will choose to defer beginning screening, based on the relatively lower risk of breast cancer.”
Women aged 45-54 years should receive annual screening mammography and at age 55 women should transition to biennial screening. The relative benefits of annual screening decline after menopause and as women age, and the majority of women are postmenopausal at age 55. At the same time, the relative harms of annual screening increase at this age, because the chance of false-positive results rises as the number of screenings rises. However, women who prefer to continue annual screening after age 55 should be given that opportunity, according to the ACS guidelines.
Women should continue screening mammography as long as their overall health is good and they have a life expectancy of 10 years or longer. Breast cancer incidence continues to increase with age until the age of 75-79 years, and mammography’s sensitivity and specificity improve with increasing age, so screening mammography in this age group will likely reduce breast cancer deaths. However, the authors noted that recent studies have raised concerns that older women with serious or even terminal disorders are still subjected to mammograms even though screening will not increase their life expectancy or improve their quality of life.
“Health and life expectancy, not simply age, must be considered in screening decisions,” Dr. Oeffinger and his associates wrote.
Clinical breast examination is no longer recommended at any age. Historically, the ACS had advised periodic clinical breast exams for women younger than 40 and annual exams for women 40 and older. But there is no evidence that these exams, whether they are performed alone or in conjunction with mammography, enhance the detection of breast cancer, according to the guidelines.
Given that clinical breast exams are somewhat time consuming, “clinicians should use this time instead for ascertaining family history and counseling women regarding the importance of being alert to breast changes and the potential benefits, limitations, and harms of screening mammography,” the authors wrote.
“This new recommendation should not be interpreted to discount the potential value of clinical breast exams in low-resource settings where mammography screening may not be feasible,” they added.
In the accompanying editorial, Dr. Keating and Dr. Pace called this recommendation “a marked deviation from prior ACS guidelines and a stronger statement than that of the USPSTF,” which states only that the evidence is insufficient to recommend for or against clinical breast exams.
They noted that the majority of women who are diagnosed as having breast cancer “will do well regardless of whether their cancer was found by mammography.”
According to the most recent data, approximately 85% of women in their 40s and 50s who die of breast cancer would have died regardless of mammography screening. And even that 15% relative benefit translates to a very small absolute benefit: only 5 of 10,000 women in their 40s and 10 of 10,000 women in their 50s are likely to have a breast cancer death prevented by regular mammography, Dr. Keating and Dr. Pace wrote (JAMA. 2015;314[15]:1569-71).
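The gap between relative and absolute benefit comes down to one multiplication: absolute risk reduction equals baseline risk times relative risk reduction. The sketch below illustrates this; the baseline figure is a back-of-envelope value implied by the editorial's numbers (5 averted deaths at a 15% relative reduction implies roughly 33 baseline deaths per 10,000), not data reported in the paper.

```python
def deaths_averted_per_10k(baseline_per_10k: float, rrr: float) -> float:
    """Absolute benefit = baseline risk x relative risk reduction."""
    return baseline_per_10k * rrr

# Illustrative baseline implied by the editorial's figures for women in
# their 40s: roughly 33 breast cancer deaths per 10,000 without screening.
averted = deaths_averted_per_10k(33, 0.15)
print(round(averted, 2))  # roughly 5 deaths averted per 10,000 women screened
```

The same 15% relative reduction applied to a higher baseline (women in their 50s) yields the larger absolute figure of about 10 per 10,000, which is why absolute benefit, not relative benefit, drives the age-based recommendations.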
“It is important to remember and emphasize with average-risk women older than 40 years that there is no single right answer to the question ‘Should I have a mammogram?’ ” they wrote.
The American Cancer Society and the National Cancer Institute sponsored this work. Dr. Oeffinger reported having no relevant financial disclosures, and his associates reported ties to numerous industry sources.
After a little over a decade, the American Cancer Society has updated its screening guidelines for the average-risk population. These guidelines provide some flexibility on the initiation of screening mammograms but strongly recommend starting at age 45. As this is an average-risk population, in which the rate of breast cancer below this age is low and the rate of false positives is increased (due to dense breast tissue), this recommendation is based on sound logic.
The question of screening frequency is a little more challenging. The authors provide a sound rationale for biennial screening after the age of 55. Unfortunately, patients are often hesitant to “skip a year,” and this may be harder to implement. In addition, practitioners are often slow to adopt new practices, as seen with changes in Pap test guidelines. Though this is a reasonable recommendation, it will take some education for patients to understand and will likely see lower adherence, at least in the beginning.
The final question of when to stop screening is fantastic. As mostly left-brain thinkers, we are often set on an actual age, completely disregarding the health of the patient. As the life expectancy of women in the United States is nearing 80 and many are surviving beyond that age with a high functioning status, consideration of this factor will allow for screening in women who can undergo management (if required) with favorable outcomes. Though practice changes take some time, it is likely that these recommendations will reduce unnecessary costs without impacting outcomes.
Given the substantial reduction in overall mortality, breast cancer screening is an integral part of women’s health. Providers in obstetrics and gynecology are often the primary source of education and the ordering team for breast cancer screening. Thus, it is critical for us to stay current on the recommendations for screening as well as the identification of high-risk women, allowing for well-informed decisions regarding individualized screening.
Dr. Ritu Salani is associate professor in gynecologic oncology at The Ohio State University, Columbus. Dr. Monica Hagan Vetter is a third-year resident in ob.gyn. at The Ohio State University. They reported having no financial disclosures.
For asymptomatic women at average risk of breast cancer, the American Cancer Society recommends annual mammograms from age 45 until age 54, with a transition to biennial screening mammography starting at age 55, according to new guidelines published Oct. 20.
This is the first time the American Cancer Society (ACS) has updated its breast cancer screening guidelines since 2003. The new version makes several changes, including shifting the start of annual mammography from age 40 to 45 years, and increasing the suggested screening interval for postmenopausal women (JAMA. 2015;314[15]:1599-1614. doi:10.1001/jama.2015.12783).
The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel
For the first time, the guidelines address the question of when to stop routine mammography, recommending a halt to routine screening for women with a life expectancy under 10 years. The ACS guidelines also recommend against clinical breast examinations at any age.
These changes bring the ACS guidelines more into line with recommendations from the U.S. Preventive Services Task Force, Dr. Nancy L. Keating and Dr. Lydia E. Pace, both of Brigham and Women’s Hospital, Boston, wrote in an editorial accompanying the report.
The two organizations are now in agreement on most recommendations and emphasize that breast cancer screening decisions should be individualized to reflect a woman’s values and preferences, not just her underlying risk. Both sets of recommendations also give greater consideration to the potential harms of mammography: overdiagnosis and overtreatment of indolent breast cancers, as well as false-positive results, additional imaging studies, and unnecessary biopsies.
The ACS updated the guideline after noting that new evidence had accumulated from long-term follow-up of both randomized controlled trials and population-based screening programs. The guideline development group, which included four clinicians, two biostatisticians, two epidemiologists, an economist, and two patient representatives, based its revised recommendations on an independent systemic evidence review of the breast cancer screening literature conducted by the Duke University Evidence Synthesis Group, as well as an analysis screening interval and outcomes from the Breast Cancer Surveillance Consortium.
For asymptomatic women at average risk of developing breast cancer, the ACS guideline makes the following recommendations:
Begin routine annual screening mammography at age 45 years (rather than age 40). Assessing the burden of breast cancer by 5-year rather than 10-year age categories demonstrated that the risk/benefit profiles of women aged 40-44 years differed markedly from those of older women and no longer warranted a recommendation to begin screening at age 40, wrote Dr. Kevin C. Oeffinger of Memorial Sloan Kettering Cancer Center, New York, and his associates in the ACS Guideline Development Group.
However, the ACS encourages clinicians to discuss breast cancer screening with patients “around the age of 40 years.” Women who want to begin annual screening mammography before age 45, based on a clear consideration of the trade-offs, should be given that choice, they wrote.
“Some women will value the potential early detection benefit and will be willing to accept the risk of additional testing,” Dr. Oeffinger and his associates wrote. “Other women will choose to defer beginning screening, based on the relatively lower risk of breast cancer.”
Women aged 45-54 years should receive annual screening mammography and at age 55 women should transition to biennial screening. The relative benefits of annual screening decline after menopause and as women age, and the majority of women are postmenopausal at age 55. At the same time, the relative harms of annual screening increase at this age, because the chance of false-positive results rises as the number of screenings rises. However, women who prefer to continue annual screening after age 55 should be given that opportunity, according to the ACS guidelines.
Women should continue screening mammography as long as their overall health is good and they have a life expectancy of 10 years or longer. Breast cancer incidence continues to increase with age until the age of 75-79 years, and mammography’s sensitivity and specificity improve with increasing age, so screening mammography in this age group will likely reduce breast cancer deaths. However, the authors noted that recent studies have raised concerns that older women with serious, or even terminal disorders, are still subjected to mammograms even though it will not increase their life expectancy or improve their quality of life.
“Health and life expectancy, not simply age, must be considered in screening decisions,” Dr. Oeffinger and his associates wrote.
Clinical breast examination is no longer recommended at any age. Historically, the ACS had advised periodic clinical breast exams for women younger than 40 and annual exams for women 40 and older. But there is no evidence that these exams, whether they are performed alone or in conjunction with mammography, enhance the detection of breast cancer, according to the guidelines.
Given that clinical breast exams are somewhat time consuming, “clinicians should use this time instead for ascertaining family history and counseling women regarding the importance of being alert to breast changes and the potential benefits, limitations, and harms of screening mammography,” the authors wrote.
“This new recommendation should not be interpreted to discount the potential value of clinical breast exams in low-resource settings where mammography screening may not be feasible,” they added.
In the accompanying editorial, Dr. Keating and Dr. Pace called this recommendation “a marked deviation from prior ACS guidelines and a stronger statement than that of the USPSTF,” which states only that the evidence is insufficient to recommend for or against clinical breast exams.
They noted that the majority of women who are diagnosed as having breast cancer “will do well regardless of whether their cancer was found by mammography.”
According to the most recent data, approximately 85% of women in their 40s and 50s who die of breast cancer would have died regardless of mammography screening. And even that 15% relative benefit translates to a very small absolute benefit: only 5 of 10,000 women in their 40s and 10 of 10,000 women in their 50s are likely to have a breast cancer death prevented by regular mammography, Dr. Keating and Dr. Pace wrote (JAMA 2015;314[15]:1569-71).
“It is important to remember and emphasize with average-risk women older than 40 years that there is no single right answer to the question ‘Should I have a mammogram?’ ” they wrote.
The American Cancer Society and the National Cancer Institute sponsored this work. Dr. Oeffinger reported having no relevant financial disclosures, and his associates reported ties to numerous industry sources.
For asymptomatic women at average risk of breast cancer, the American Cancer Society recommends annual mammograms from age 45 until age 54, with a transition to biennial screening mammography starting at age 55, according to new guidelines published Oct. 20.
This is the first time the American Cancer Society (ACS) has updated its breast cancer screening guidelines since 2003. The new version makes several changes, including shifting the start of annual mammography from age 40 to 45 years, and increasing the suggested screening interval for postmenopausal women (JAMA. 2015;314[15]:1599-1614. doi:10.1001/jama.2015.12783).
The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel
For the first time, the guidelines address the question of when to stop routine mammography, recommending a halt to routine screening for women with a life expectancy under 10 years. The ACS guidelines also recommend against clinical breast examinations at any age.
These changes bring the ACS guidelines more into line with recommendations from the U.S. Preventive Services Task Force, Dr. Nancy L. Keating and Dr. Lydia E. Pace, both of Brigham and Women’s Hospital, Boston, wrote in an editorial accompanying the report.
The two organizations are now in agreement on most recommendations and emphasize that breast cancer screening decisions should be individualized to reflect a woman’s values and preferences, not just her underlying risk. Both sets of recommendations also give greater consideration to the potential harms of mammography: overdiagnosis and overtreatment of indolent breast cancers, as well as false-positive results, additional imaging studies, and unnecessary biopsies.
The ACS updated the guideline after noting that new evidence had accumulated from long-term follow-up of both randomized controlled trials and population-based screening programs. The guideline development group, which included four clinicians, two biostatisticians, two epidemiologists, an economist, and two patient representatives, based its revised recommendations on an independent systemic evidence review of the breast cancer screening literature conducted by the Duke University Evidence Synthesis Group, as well as an analysis screening interval and outcomes from the Breast Cancer Surveillance Consortium.
For asymptomatic women at average risk of developing breast cancer, the ACS guideline makes the following recommendations:
Begin routine annual screening mammography at age 45 years (rather than age 40). Assessing the burden of breast cancer by 5-year rather than 10-year age categories demonstrated that the risk/benefit profiles of women aged 40-44 years differed markedly from those of older women and no longer warranted a recommendation to begin screening at age 40, wrote Dr. Kevin C. Oeffinger of Memorial Sloan Kettering Cancer Center, New York, and his associates in the ACS Guideline Development Group.
However, the ACS encourages clinicians to discuss breast cancer screening with patients “around the age of 40 years.” Women who want to begin annual screening mammography before age 45, based on a clear consideration of the trade-offs, should be given that choice, they wrote.
“Some women will value the potential early detection benefit and will be willing to accept the risk of additional testing,” Dr. Oeffinger and his associates wrote. “Other women will choose to defer beginning screening, based on the relatively lower risk of breast cancer.”
Women aged 45-54 years should receive annual screening mammography and should transition to biennial screening at age 55. The relative benefits of annual screening decline after menopause and as women age, and most women are postmenopausal at age 55. At the same time, the relative harms of annual screening increase at this age, because the chance of false-positive results rises with the number of screenings. However, women who prefer to continue annual screening after age 55 should be given that opportunity, according to the ACS guidelines.
Women should continue screening mammography as long as their overall health is good and they have a life expectancy of 10 years or longer. Breast cancer incidence continues to increase with age until the age of 75-79 years, and mammography’s sensitivity and specificity improve with increasing age, so screening mammography in this age group will likely reduce breast cancer deaths. However, the authors noted that recent studies have raised concerns that older women with serious or even terminal disorders are still subjected to mammograms, even though screening will not increase their life expectancy or improve their quality of life.
“Health and life expectancy, not simply age, must be considered in screening decisions,” Dr. Oeffinger and his associates wrote.
Clinical breast examination is no longer recommended at any age. Historically, the ACS had advised periodic clinical breast exams for women younger than 40 and annual exams for women 40 and older. But there is no evidence that these exams, whether they are performed alone or in conjunction with mammography, enhance the detection of breast cancer, according to the guidelines.
Given that clinical breast exams are somewhat time consuming, “clinicians should use this time instead for ascertaining family history and counseling women regarding the importance of being alert to breast changes and the potential benefits, limitations, and harms of screening mammography,” the authors wrote.
“This new recommendation should not be interpreted to discount the potential value of clinical breast exams in low-resource settings where mammography screening may not be feasible,” they added.
In the accompanying editorial, Dr. Keating and Dr. Pace called this recommendation “a marked deviation from prior ACS guidelines and a stronger statement than that of the USPSTF,” which states only that the evidence is insufficient to recommend for or against clinical breast exams.
They noted that the majority of women who are diagnosed as having breast cancer “will do well regardless of whether their cancer was found by mammography.”
According to the most recent data, approximately 85% of women in their 40s and 50s who die of breast cancer would have died regardless of mammography screening. And even that 15% relative benefit translates to a very small absolute benefit: only 5 of 10,000 women in their 40s and 10 of 10,000 women in their 50s are likely to have a breast cancer death prevented by regular mammography, Dr. Keating and Dr. Pace wrote (JAMA 2015;314[15]:1569-71).
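The relative-to-absolute conversion behind these figures can be sketched in a few lines. This is an illustration only, not the editorialists’ method: the baseline breast cancer death rates per 10,000 women are back-calculated from the quoted figures and are assumptions for this sketch.

```python
# Illustrative arithmetic: converting a relative mortality benefit of
# screening into absolute deaths prevented per 10,000 women screened.
# Baseline death rates below are back-calculated assumptions, not data
# reported in the editorial.

def deaths_prevented(baseline_deaths_per_10k: float, relative_benefit: float) -> float:
    """Absolute breast cancer deaths prevented per 10,000 women screened."""
    return baseline_deaths_per_10k * relative_benefit

# A ~15% relative benefit and the quoted 5 per 10,000 for women in their
# 40s imply a baseline of about 33 deaths per 10,000 (5 / 0.15); the
# quoted 10 per 10,000 for women in their 50s implies about 67 per 10,000.
print(round(deaths_prevented(33.3, 0.15), 1))  # ~5.0
print(round(deaths_prevented(66.7, 0.15), 1))  # ~10.0
```

The point of the arithmetic is that even a seemingly large relative benefit shrinks to a small absolute one when the baseline event rate is low.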
“It is important to remember and emphasize with average-risk women older than 40 years that there is no single right answer to the question ‘Should I have a mammogram?’ ” they wrote.
The American Cancer Society and the National Cancer Institute sponsored this work. Dr. Oeffinger reported having no relevant financial disclosures, and his associates reported ties to numerous industry sources.
FROM JAMA
Key clinical point: The American Cancer Society recommends annual mammograms for average-risk, asymptomatic women aged 45-54 years.
Major finding: The risk/benefit profiles of women aged 40-44 years differed markedly from those of older women and no longer warranted a recommendation to begin annual mammographic screening at age 40.
Data source: An update of the 2003 ACS guideline on breast cancer screening for women at average risk, based on a review of current evidence and an analysis of registry data for 15,440 women diagnosed during a 15-year period.
Disclosures: The American Cancer Society and the National Cancer Institute sponsored this work. Dr. Oeffinger reported having no relevant financial disclosures, and his associates reported ties to numerous industry sources.
Menopause status could guide breast cancer screening interval
Among postmenopausal women, breast cancers diagnosed following biennial mammography intervals are no more “unfavorable” than those diagnosed following annual intervals, according to a report published online Oct. 20 in JAMA Oncology.
“When considering recommendations regarding screening intervals, the potential benefit of diagnosing cancers at an earlier stage must be weighed against the increased potential for harms associated with more frequent screening, such as false-positive recalls and biopsies, which are 1.5 to 2 times higher in annual vs. biennial screeners,” wrote Diana L. Miglioretti, Ph.D., of the University of California, Davis, and her associates in the Breast Cancer Surveillance Consortium (BCSC).
The optimal frequency of mammographic screening remains controversial. The American Cancer Society commissioned the BCSC to analyze the most recent information on this issue as part of its effort to update the ACS guideline for breast cancer screening for women at average risk.
BCSC registries collect patient and clinical data from community radiology facilities across the country. For this analysis, Dr. Miglioretti and her colleagues focused on 15,440 women aged 40-85 years in these registries who were diagnosed as having breast cancer from 1996 to 2012. A total of 12,070 of the women underwent annual mammographic screening and 3,370 underwent biennial mammographic screening.
Among premenopausal women, those diagnosed after biennial mammograms were more likely to have tumors with unfavorable prognostic characteristics than were those diagnosed after annual mammograms (relative risk, 1.11). In contrast, among postmenopausal women, those diagnosed after biennial mammograms were not more likely to have tumors with unfavorable prognostic characteristics than were those diagnosed after annual mammograms (RR, 1.03), the investigators wrote (JAMA Oncol. 2015 Oct 20. doi: 10.1001/jamaoncology.2015.3084).
In an editorial accompanying this report, Dr. Wendy Y. Chen of Brigham and Women’s Hospital, Dana Farber Cancer Institute, and Harvard Medical School, all in Boston, wrote, “Although the authors do not endorse annual or biennial screening, they imply that biennial screening would be acceptable for postmenopausal women but inferior for premenopausal women.”
Most developed countries outside the United States – including the United Kingdom, Canada, and Australia – recommend screening every 2 or 3 years, Dr. Chen noted (JAMA Oncol. 2015 Oct 20. doi: 10.1001/jamaoncology.2015.3286).
This study and others clearly show that, with less frequent mammography, breast cancers will be larger and have a slightly more advanced stage when they are discovered, Dr. Chen wrote. But with a better understanding of tumor biology and improvements in targeted therapy, the best approach may not be simply trying to identify a smaller tumor, she added.
“Efforts should be focused on a better understanding of how screening interacts with tumor biology with a better understanding of the types of interval cancers and sojourn times and how these characteristics differ by age and/or menopausal status,” Dr. Chen wrote.
This study was supported by the American Cancer Society and the National Cancer Institute. Dr. Miglioretti reported having no relevant financial disclosures. One of the investigators reported being an unpaid advisor on General Electric Health Care’s breast medical advisory board.
FROM JAMA ONCOLOGY
Key clinical point: After menopause, breast cancers diagnosed after 2-year mammography intervals are no more unfavorable than those arising after 1-year intervals.
Major finding: Among postmenopausal women, those diagnosed after biennial mammograms were not more likely to have tumors with unfavorable prognostic characteristics than were those diagnosed after annual mammograms (relative risk, 1.03).
Data source: A prospective cohort study involving 15,440 women diagnosed with breast cancer from 1996 to 2012.
Disclosures: This study was supported by the American Cancer Society and the National Cancer Institute. Dr. Miglioretti reported having no relevant financial disclosures. One of the investigators reported being an unpaid advisor on General Electric Health Care’s breast medical advisory board.
Susceptible S. aureus more deadly than MRSA
In neonatal intensive care units across the United States, invasive Staphylococcus aureus infections that are susceptible to methicillin are more common and more deadly than methicillin-resistant S. aureus infections, according to a report published Oct. 19 in JAMA Pediatrics.
This means infection prevention and control strategies in NICUs should not focus solely on MRSA but should broadly target methicillin-susceptible S. aureus as well, according to Dr. Jessica Ericson of Duke Clinical Research Institute and the department of pediatrics at Duke University, Durham, N.C., and her associates.
To assess the epidemiology of all invasive S. aureus infections among hospitalized neonates, the investigators analyzed data from 348 academic and community NICUs in 34 states during a 15-year period. They reviewed the medical records of a nationally representative sample of 887,910 infants cared for in these NICUs, of whom 3,888 (0.4%) developed 3,978 invasive S. aureus infections.
A total of 2,868 (72.1%) of these infections were susceptible to methicillin, while 1,110 (27.9%) were methicillin resistant. Thus, methicillin-susceptible organisms caused 2.6 times more invasive S. aureus infections than did MRSA. In the subgroup of approximately 2,500 affected neonates for whom mortality data were available, the rate of in-hospital death was similar between those with susceptible infections (10%) and those with MRSA (12%), Dr. Ericson and her associates reported (JAMA Pediatr. 2015 Oct 19. doi: 10.1001/jamapediatrics.2015.2380).
In a further analysis that adjusted for gestational age, sex, and race/ethnicity, there was no significant difference in risk of death between neonates who developed susceptible infections and those who developed MRSA infections at 7 days, 30 days, or hospital discharge, they added.
At present, most medical centers consider only MRSA in their screening and decolonization protocols. Given that the absolute numbers of infections and deaths caused by methicillin-susceptible S. aureus exceed those due to MRSA, hospitals should consider expanding their infection control efforts, Dr. Ericson and her associates said.
The key to minimizing morbidity and mortality from any organism, including S. aureus, is to prevent horizontal transmission that can lead to NICU outbreaks. Hand hygiene, the mainstay of this approach, is feasible, cost effective, and provides protection against other pathogens in addition to S. aureus.
In contrast, preventing vertical transmission by expanding MRSA-prevention techniques would require massive screening of neonates and the institution of contact precautions for thousands of colonized infants. This could be so labor intensive, time consuming, and costly that it wouldn’t be feasible.
Dr. Cantey and his associates made these remarks in an editorial accompanying Dr. Ericson’s report (JAMA Pediatr. 2015 Oct 19. doi: 10.1001/jamapediatrics.2015.2980). Dr. Joseph B. Cantey is with the department of pediatrics at Texas A & M Health Science Center, College Station. He and his associates reported having no relevant financial conflicts.
FROM JAMA PEDIATRICS
Key clinical point: In NICUs, invasive S. aureus infections susceptible to methicillin are more common and more deadly than MRSA infections.
Major finding: 72.1% of invasive S. aureus infections were methicillin susceptible and 27.9% were methicillin resistant; in-hospital mortality was similar for both (10% vs. 12%).
Data source: A multicenter retrospective cohort study involving 887,910 infants (3,978 S. aureus infections) in 348 U.S. NICUs during a 15-year period.
Disclosures: This study was supported by multiple agencies within the U.S. Department of Health and Human Services. Dr. Ericson reported having no relevant financial conflicts; her associates reported ties to numerous industry sources.
Picosecond-domain laser removes multicolor tattoos
A prototype picosecond-domain Nd:YAG laser was safe and effective at removing multicolor decorative tattoos in a preliminary study, achieving 79% removal in an average of 6.5 treatments, according to a report published in Lasers in Surgery and Medicine.
Nanosecond-domain Q-switched lasers have been the standard tools for tattoo removal for decades, but a new class of device that generates picosecond-domain pulses was developed to remove tattoos more efficiently. In this prospective study, a prototype device (PicoWay, Syneron-Candela Corporation) was used to remove 31 multicolor tattoos on 21 patients aged 19-55 years (average age, 32 years), according to Dr. Eric F. Bernstein of Main Line Center for Laser Surgery, Ardmore, Pa., and his associates.
All the tattoos were previously untreated and measured no more than 10 cm by 10 cm in area. All were treated through a hydrogel dressing to protect the epidermis and minimize scarring. Treatment sessions were done at 6- to 10-week intervals until the tattoos were cleared or demonstrated a lack of further improvement, for a maximum of 7 treatments.
Three physicians blinded to treatment conditions assessed digital photographs of the treated areas taken at two fixed focal lengths before each treatment session, at 6-10 weeks following each treatment, and at 12 weeks after the final treatment session. This panel of physicians assessed the photographs, which were presented in a randomized order, grading them on percentage improvement for overall clearance and for clearance of each color contained within a given tattoo. The photographs had been taken with a cross-polarized flash, which enhances the view of the tattoo beyond what is normally seen by the naked eye.
Overall, the panel judged that the new device produced 79% clearance after an average of 6.5 treatments. Clearance scores were 92% for black ink, 85% for yellow ink, 80% for red ink, 78% for purple ink, 65% for green ink, and 43% for blue ink. Black and red inks were removed the most effectively, as expected; green and blue inks were more difficult to remove, also as expected. However, the 85% clearance of yellow ink, usually the most difficult color to remove, in an average of four sessions was “surprising and encouraging,” Dr. Bernstein and his associates said.
“It was hoped that picosecond-domain lasers would be ‘color blind’ and remove all colors equally; however, we found this not to be the case,” they noted (Lasers Surg Med. 2015;47[7]:542-8).
Regarding the safety of the new device, purpura was noted immediately after one treatment in one patient but completely resolved by the next session. Mild pinpoint bleeding developed immediately after 14% of sessions, almost always with attendant edema and erythema; all of these effects resolved with time. No immediate blistering was observed in any patient. At 3-month follow-up, no scarring and no moderate to severe pigmentary alterations were noted. Mild hypopigmentation occurred in red or yellow portions of two tattoos, and mild hyperpigmentation occurred in black areas of five tattoos.
“The true benefits of picosecond-domain devices for tattoo removal and other applications should become more apparent as these devices are used more frequently in clinical practice,” the investigators added.
This study was funded by Syneron-Candela Corporation, maker of the prototype picosecond-domain laser tested here. Dr. Bernstein reported serving as a consultant for Syneron-Candela, and two of his associates are employees of the company.
A prototype picosecond-domain Nd:YAG laser was safe and effective at removing multicolor decorative tattoos in a preliminary study, achieving 79% removal in an average of 6.5 treatments, according to a report published in Lasers in Surgery and Medicine.
Nanosecond-domain Q-switched lasers have been the standard tools for tattoo removal for decades, but a new class of the device that generates picosecond-domain pulses was developed to remove tattoos more efficiently. In this prospective study, a prototype device (PicoWay, Syneron-Candela Corporation) was used to remove 31 multicolor tattoos on 21 patients aged 19-55 years (average age, 32 years), according to Dr. Eric F. Bernstein of Main Line Center for Laser Surgery, Ardmore, Pa., and his associates.
A prototype picosecond-domain Nd:YAG laser was safe and effective at removing multicolor decorative tattoos in a preliminary study, achieving 79% removal in an average of 6.5 treatments, according to a report published in Lasers in Surgery and Medicine.
Nanosecond-domain Q-switched lasers have been the standard tools for tattoo removal for decades, but a new class of device that generates picosecond-domain pulses was developed to remove tattoos more efficiently. In this prospective study, a prototype device (PicoWay, Syneron-Candela Corporation) was used to remove 31 multicolor tattoos on 21 patients aged 19-55 years (average age, 32 years), according to Dr. Eric F. Bernstein of Main Line Center for Laser Surgery, Ardmore, Pa., and his associates.
All the tattoos were previously untreated and measured no more than 10 cm by 10 cm in area. All were treated through a hydrogel dressing to protect the epidermis and minimize scarring. Treatment sessions were done at 6- to 10-week intervals until the tattoos were cleared or demonstrated a lack of further improvement, for a maximum of 7 treatments.
Three physicians blinded to treatment conditions assessed digital photographs of the treated areas, taken at two fixed focal lengths before each treatment session, at 6-10 weeks after each treatment, and at 12 weeks after the final treatment session. The panel reviewed the photographs in randomized order, grading them on percentage improvement for overall clearance and for clearance of each color within a given tattoo. The photographs had been taken with a cross-polarized flash, which reveals more of the tattoo than is normally seen by the naked eye.
Overall, the panel judged that the new device produced 79% clearance after an average of 6.5 treatments. Clearance scores were 92% for black ink, 85% for yellow ink, 80% for red ink, 78% for purple ink, 65% for green ink, and 43% for blue ink. Black and red inks were removed the most effectively, as expected; green and blue inks were more difficult to remove, also as expected. However, the 85% clearance of yellow ink, usually the most difficult color to remove, in an average of four sessions was “surprising and encouraging,” Dr. Bernstein and his associates said.
“It was hoped that picosecond-domain lasers would be ‘color blind’ and remove all colors equally; however, we found this not to be the case,” they noted (Lasers Surg Med. 2015;47[7]:542-8).
Regarding the safety of the new device, purpura was noted immediately after one treatment in one patient but completely resolved by the next session. Mild pinpoint bleeding developed immediately after 14% of sessions, almost always with attendant edema and erythema; all of these effects resolved with time. No immediate blistering was observed in any patient. At 3-month follow-up, no scarring and no moderate to severe pigmentary alterations were noted. Mild hypopigmentation occurred in red or yellow portions of two tattoos, and mild hyperpigmentation occurred in black areas of five tattoos.
“The true benefits of picosecond-domain devices for tattoo removal and other applications should become more apparent as these devices are used more frequently in clinical practice,” the investigators added.
This study was funded by Syneron-Candela Corporation, maker of the prototype picosecond-domain laser tested here. Dr. Bernstein reported serving as a consultant for Syneron-Candela, and two of his associates are employees of the company.
FROM LASERS IN SURGERY AND MEDICINE
Key clinical point: A prototype picosecond-domain laser achieved 79% removal of multicolor tattoos in an average of 6.5 treatments.
Major finding: Removal scores were 92% for black ink, 85% for yellow ink, 80% for red ink, 78% for purple ink, 65% for green ink, and 43% for blue ink.
Data source: A preliminary prospective study of the efficacy and safety of a picosecond-domain laser for removing 31 tattoos on 21 patients.
Disclosures: This study was funded by Syneron-Candela Corporation, maker of the prototype picosecond-domain laser tested here. Dr. Bernstein reported serving as a consultant for Syneron-Candela, and two of his associates are employees of the company.
Vitamin D, calcium don’t cut recurrent colorectal adenomas
Daily vitamin D and calcium supplements, taken alone or in combination, failed to reduce the risk of recurrent colorectal adenomas in middle-aged adults, according to a study published online Oct. 14 in the New England Journal of Medicine.
A great deal of evidence suggests that both vitamin D and calcium have antineoplastic properties, particularly in the colorectum. Researchers performed a multicenter, double-blind, placebo-controlled, randomized clinical trial at 11 academic medical centers to test the hypothesis that both agents individually would prevent recurrences in middle-aged patients who had just undergone removal of one or more colorectal adenomas, and that both agents taken together would be more chemoprotective than either one alone.
The investigators enrolled 2,259 individuals aged 45-75 years who were in good general health and who were expected to undergo follow-up colonoscopy either 3 or 5 years after they had the initial adenoma(s) removed. A total of 419 were randomly assigned to receive calcium alone (1,200 mg daily), 420 to receive vitamin D alone (1,000 IU daily), 421 to receive both supplements, 415 to receive two placebos, 295 to receive calcium plus placebo, and 289 to receive calcium plus vitamin D. Adherence to study medications and to follow-up colonoscopy was excellent, said Dr. John A. Baron of the Geisel School of Medicine at Dartmouth, Hanover, N.H., and his associates.
A total of 880 participants were found to have recurrent adenomas during follow-up. Contrary to the investigators’ expectations, the study interventions, both alone and in combination, failed to exert any significant effect on the risk of recurrence. They also failed to affect the risk for advanced adenomas and for distal vs. proximal adenomas, the investigators noted (N Engl J Med. 2015 Oct 15. doi: 10.1056/NEJMoa1500409).
In subgroup analyses, the study interventions also failed to exert any meaningful effect according to patients’ baseline levels of vitamin D or calcium, or according to changes in patients’ vitamin D or calcium intake during the study period.
This study was large enough to detect modest chemopreventive effects. However, the dose of vitamin D was lower than that currently recommended by many experts, and was used for a limited time. So it is possible that vitamin D may have a modest chemopreventive potential, but not one as marked as some have proposed, Dr. Baron and his associates said.
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Daily vitamin D and calcium supplements, taken alone or together, failed to reduce the risk of recurrent colorectal adenomas.
Major finding: During follow-up, 880 adults had recurrent adenomas; vitamin D and calcium supplements failed to exert any significant effect on the risk of recurrence.
Data source: A multicenter, placebo-controlled, double-blind, randomized clinical trial involving 2,259 adults followed for 3-5 years for recurrence of colorectal adenomas.
Disclosures: The National Cancer Institute supported the study. Pfizer Consumer Healthcare provided the study medications. Financial disclosures of Dr. Baron and his associates are available with the full text of the article at NEJM.org.
Early inhaled budesonide cuts bronchopulmonary dysplasia, but ups mortality
Inhaled budesonide delivered within 24 hours of birth decreases the incidence of bronchopulmonary dysplasia in extremely preterm neonates, but this benefit may be offset by a possible increase in mortality, according to a report published online Oct. 15 in the New England Journal of Medicine.
Systemic glucocorticoids reduce the rate of bronchopulmonary dysplasia but appear to cause severe short- and long-term adverse effects including intestinal perforation and cerebral palsy. Administering the drugs by inhalation may avert these adverse systemic effects, but until now most studies of this mode of delivery have been small, haven’t initiated the treatment immediately after birth, and have produced inconclusive results. So researchers performed a large double-blind placebo-controlled randomized trial in which inhaled budesonide or a matching placebo was administered within 24 hours of birth to 863 extremely preterm neonates.
The infants were treated at 40 medical centers in nine countries during a 3-year period, until they no longer needed supplemental oxygen and positive-pressure support or reached a postmenstrual age of 32 weeks, said Dr. Dirk Bassler of the department of neonatology, University Hospital Zurich, and his associates.
The primary outcome measure – a composite of death or bronchopulmonary dysplasia at 36 weeks postmenstrual age – occurred in 40% of the budesonide group and 46% of the placebo group (relative risk, 0.86), indicating a benefit of borderline significance for the active drug, Dr. Bassler and his associates noted (N Engl J Med. 2015 Oct 14. doi: 10.1056/NEJMoa1501917). However, when the two components of the composite outcome were examined separately, inhaled budesonide was significantly better than placebo at reducing the rate of bronchopulmonary dysplasia but was associated with a nonsignificant excess in mortality. The lung disorder developed in 28% of neonates assigned to active treatment and in 38% of those assigned to placebo (RR, 0.74), while mortality was 17% with budesonide and 14% with placebo (RR, 1.24); the investigators noted that this mortality difference may have been due to chance.
Budesonide also significantly reduced the incidence of two important secondary outcomes: patent ductus arteriosus requiring surgical ligation (RR, 0.55) and the need for reintubation after completion of the study drug (RR, 0.58). The therapy did not offer any benefit over placebo in the frequency of all other secondary outcomes, including retinopathy of prematurity, brain injury, necrotizing enterocolitis, patent ductus arteriosus requiring medical treatment, infections, oral candidiasis requiring treatment, hypertension requiring treatment, hyperglycemia requiring treatment, length of hospital stay, increase in weight or head circumference, and age at the last use of respiratory pressure support.
The rates of adverse events did not differ significantly between the two study groups.
The overall efficacy of early inhaled budesonide, as well as its associated risks, cannot be ascertained from these short-term outcomes alone. “Follow-up of our study cohort, including assessment of neurodevelopmental outcomes at 18-22 months of corrected age, is currently under way,” Dr. Bassler and his associates wrote.
This study was supported by the European Union and Chiesi Farmaceutici. Chiesi supplied the study drugs free of charge and Trudell Medical International supplied free spacers for the inhalers. Dr. Bassler and three of his associates reported receiving grant support and personal fees from Chiesi Farmaceutici. The other authors reported no relevant financial disclosures.
The risk/benefit profile of inhaled budesonide to prevent bronchopulmonary dysplasia remains uncertain, given that the treatment effects on the composite outcome in this study moved in opposite directions.
Inhaled budesonide’s reductions in the rates of bronchopulmonary dysplasia, severe patent ductus arteriosus, and reintubation are probably real. But according to the available data, it is still uncertain whether the mortality differential in favor of placebo represents truth or artifact.
Barbara Schmidt, M.D., is in the division of neonatology at Children’s Hospital of Philadelphia. She reported receiving nonfinancial support from Chiesi Farmaceutici outside of this work. Dr. Schmidt made these remarks in an editorial accompanying Dr. Bassler’s report (N Engl J Med. 2015 Oct 14. doi: 10.1056/NEJMe1509243).
FROM THE NEW ENGLAND JOURNAL OF MEDICINE
Key clinical point: Early inhaled budesonide decreases bronchopulmonary dysplasia in extremely preterm neonates, but may increase mortality.
Major finding: Bronchopulmonary dysplasia developed in 28% of neonates assigned to inhaled budesonide and in 38% of those assigned to placebo (RR, 0.74), while mortality was 17% with budesonide and 14% with placebo (RR, 1.24).
Data source: An international double-blind placebo-controlled randomized trial involving 863 extremely preterm neonates treated at 40 medical centers and followed to a postmenstrual age of 36 weeks.
Disclosures: This study was supported by the European Union and Chiesi Farmaceutici. Chiesi supplied the study drugs free of charge and Trudell Medical International supplied free spacers for the inhalers. Dr. Bassler and three of his associates reported receiving grant support and personal fees from Chiesi Farmaceutici. The other authors reported no relevant financial disclosures.
Antibiotics to reduce microbiota may improve treatment of sickle-cell disease
The human body’s microbiota regulates the aging of circulating neutrophils, and aged neutrophils, which are excessively active and adherent, promote tissue injury in inflammatory diseases. These two discoveries appear to point the way toward a simple, effective antibiotic treatment for sickle-cell disease, and may eventually lead to similar therapies for other disorders that induce inflammation-related organ damage, such as septic shock, according to a Research Letter published online Sept. 16 in Nature.
“To our knowledge, this is the first therapy shown to alleviate the chronic tissue damage induced by sickle-cell disease,” said Dachuan Zhang of the Gottesman Institute for Stem Cell and Regenerative Medicine Research and the department of cell biology, Albert Einstein College of Medicine, New York, and his associates. “Our results raise the possibility that manipulation of the microbiome may have sustained implications in disease outcome that should be further studied in clinical trials.”
In a series of in vitro and in vivo studies, the researchers demonstrated that aging neutrophils differ from others in that they are overactive and extra-adherent. Adherent neutrophils are already known to precipitate the acute vaso-occlusion that characterizes sickle-cell disease. Aging neutrophils also displayed other traits suggesting that exogenous inflammatory mediators may contribute to their excessive activity and adherence.
Dr. Zhang and his colleagues suspected that molecules in the microbiota – the ecologic community of all microorganisms residing in the body – may be involved, as such molecules are known to cross the intestinal barrier and affect multiple systemic immune-cell populations, and a recent study suggested that the microbiota may regulate neutrophil production and function. To test this hypothesis, they treated mice with broad-spectrum antibiotics, which dramatically depleted the gut microbiota and altered its composition. This in turn significantly reduced aged neutrophils in the circulation, which rebounded immediately when the antibiotic effect was reversed.
Further mouse studies revealed that neutrophil aging is delayed in a bacterially depleted environment, and that microbiota-derived molecules actually induce neutrophil aging. In a subsequent study of an in vivo model of septic shock, mice that were given antibiotics were protected from neutrophil-mediated damage in the vasculature and showed markedly prolonged survival, compared with untreated mice, the investigators noted (Nature. 2015 Sep 24;525[7570]. doi: 10.1038/nature15367).
In an in vivo model of sickle-cell disease, untreated mice with the disease showed markedly increased neutrophil activity and adhesion while affected mice given antibiotics showed marked microbiota depletion; enhanced blood flow; significantly reduced splenomegaly; and marked alleviation of liver necrosis, fibrosis, and inflammation. Survival was significantly improved in the treated mice. Finally, a laboratory-induced replenishment of aging neutrophils in the circulation resulted in acute vaso-occlusive crises and death within 10-30 hours in all affected mice.
“Together, these data suggest that the microbiota regulates aged neutrophil numbers, thereby affecting both acute vaso-occlusive crisis and the ensuing chronic tissue damage in sickle-cell disease,” Dr. Zhang and his associates said.
To assess whether their findings applied to humans, the investigators next studied 23 patients with sickle-cell disease who were not taking antibiotics, 11 patients with sickle-cell disease who were taking penicillin to prevent life-threatening infections, and 9 healthy control subjects. Compared with controls, only the patients who weren’t taking antibiotics showed a dramatic increase in circulating aged neutrophils. This protective effect of antibiotics was consistent across ages and in both sexes, regardless of hydroxyurea intake. A prospective study involving age-matched participants is now needed to confirm that antibiotics, by reducing the gut microbiota, decrease aged neutrophils in the circulation and thereby improve vaso-occlusive disease, the researchers said.
The American Heart Association, the National Institutes of Health, and the New York State Stem Cell Science Program funded the study. Dr. Zhang and his associates reported having no relevant disclosures.
The human body’s microbiota regulates the aging of circulating neutrophils, and aged neutrophils, which are excessively active and adherent, promote tissue injury in inflammatory diseases. These two discoveries appear to point the way toward a simple, effective antibiotic treatment for sickle-cell disease, and may eventually lead to similar therapies for other disorders that induce inflammation-related organ damage, such as septic shock, according to a Research Letter published online Sept. 16 in Nature.
“To our knowledge, this is the first therapy shown to alleviate the chronic tissue damage induced by sickle-cell disease,” said Dachuan Zhang of the Gottesman Institute for Stem Cell and Regenerative Medicine Research and the department of cell biology, Albert Einstein College of Medicine, New York, and his associates. “Our results raise the possibility that manipulation of the microbiome may have sustained implications in disease outcome that should be further studied in clinical trials.”
In a series of in vitro and in vivo studies, the researchers demonstrated that aging neutrophils differ from younger ones in being overactive and excessively adherent. Adherent neutrophils are already known to precipitate the acute vaso-occlusion that characterizes sickle-cell disease. Aging neutrophils also displayed other traits suggesting that exogenous inflammatory mediators may contribute to their excessive activity and adherence.
Dr. Zhang and his colleagues suspected that molecules in the microbiota – the ecologic community of all microorganisms residing in the body – may be involved, as such molecules are known to cross the intestinal barrier and affect multiple systemic immune-cell populations, and a recent study suggested that the microbiota may regulate neutrophil production and function. To test this hypothesis, they treated mice with broad-spectrum antibiotics, which dramatically depleted the gut microbiota and altered its composition. This in turn significantly reduced the number of aged neutrophils in the circulation, which rebounded as soon as the antibiotic effect was reversed.
Further mouse studies revealed that neutrophil aging is delayed in a bacterially depleted environment, and that microbiota-derived molecules actually induce neutrophil aging. In a subsequent study of an in vivo model of septic shock, mice that were given antibiotics were protected from neutrophil-mediated damage in the vasculature and showed markedly prolonged survival, compared with untreated mice, the investigators noted (Nature. 2015 Sep 24;525[7570]. doi: 10.1038/nature15367).
In an in vivo model of sickle-cell disease, untreated mice showed markedly increased neutrophil activity and adhesion, whereas affected mice given antibiotics showed marked microbiota depletion; enhanced blood flow; significantly reduced splenomegaly; and marked alleviation of liver necrosis, fibrosis, and inflammation. Survival also was significantly improved in the treated mice. Finally, laboratory-induced replenishment of aged neutrophils in the circulation resulted in acute vaso-occlusive crises and death within 10-30 hours in all affected mice.
“Together, these data suggest that the microbiota regulates aged neutrophil numbers, thereby affecting both acute vaso-occlusive crisis and the ensuing chronic tissue damage in sickle-cell disease,” Dr. Zhang and his associates said.
To assess how their findings applied to human beings, the investigators next studied 23 patients with sickle-cell disease who were not taking antibiotics, 11 patients with sickle-cell disease who were taking penicillin to prevent life-threatening infections, and 9 healthy control subjects. Compared with controls, only the patients who were not taking antibiotics showed a dramatic increase in circulating aged neutrophils. This protective effect of antibiotics was consistent across all ages and both sexes, and regardless of hydroxyurea intake. A prospective study involving age-matched participants is now needed to confirm that antibiotics, by reducing the gut microbiota, decrease aged neutrophils in the circulation and thereby improve vaso-occlusive disease, the researchers said.
The American Heart Association, the National Institutes of Health, and the New York State Stem Cell Science Program funded the study. Dr. Zhang and his associates reported having no relevant disclosures.
FROM NATURE
Key clinical point: The body’s microbiota was found to regulate the aging of circulating neutrophils, a discovery that points the way toward a simple treatment for the chronic tissue damage induced by sickle-cell disease and perhaps other inflammatory disorders.
Major finding: In an in vivo mouse model of sickle-cell disease, mice given antibiotics showed marked microbiota depletion; enhanced blood flow; significantly reduced splenomegaly; marked alleviation of liver necrosis, fibrosis, and inflammation; and significantly improved survival.
Data source: A series of in vitro, in vivo, and human studies, the latter involving 23 patients with sickle-cell disease not taking antibiotics, 11 patients with sickle-cell disease taking prophylactic penicillin, and 9 healthy control subjects.
Disclosures: The American Heart Association, the National Institutes of Health, and the New York State Stem Cell Science Program funded the study. Dr. Zhang and his associates reported having no relevant disclosures.