
Mandatory disclosures may make docs avoid COIs

Doctor consults with a family

Credit: Rhoda Baer

Previous research has indicated that requiring conflict of interest (COI) disclosures may lead advisers to give more biased advice.

However, virtually all of the prior studies examined COIs that advisers had no ability to avoid.

With a new study, researchers examined situations in which advisers have the ability to avoid COIs—such as doctors who can decide whether to accept gifts or payments from pharmaceutical companies.

And the results showed that when COIs can be avoided, disclosure successfully deters advisers from accepting COIs, so they have nothing to disclose except the absence of conflicts.

The research was published in Psychological Science.

“Prior research has cast doubt as to the effectiveness of disclosure for managing conflicts of interest, particularly when consumers have the burden of interpreting and reacting to the information,” said study author Sunita Sah, PhD, MB ChB, of Georgetown University in Washington, DC.

“Our findings suggest that disclosure can become a successful intervention to managing some conflicts of interest if it motivates professionals or providers to avoid such conflicts.”

For this study, the researchers conducted 3 experiments to determine how COIs influence advisers. In the first experiment, 97 adviser–advisee pairs participated in an online game with Amazon.com gift cards at stake.

Advisers told the advisees how many filled dots appeared on a grid. The advisees were paid based on the accuracy of their estimates, but advisers had a conflict: they were paid more if advisees gave an estimate that was higher than the true value.

The setup, in which advisees saw only a small subset of the complete grid, was designed to simulate a situation in which a consumer receives advice from a better-informed but conflicted professional. The results replicated previous research: disclosure led advisers to give higher (and more biased) recommendations than nondisclosure.

In the second experiment, the researchers again randomly assigned pairs of advisees and advisers to conditions in which the conflict was either disclosed or not disclosed.

There was, however, an important change from the first study. Advisers were given a choice of whether to accept or reject the COI.

Without disclosure, a majority of advisers (63%) chose the incentives that created a COI. But with disclosure, a minority (33%) accepted the conflict.

Advice was higher (and more biased) for those who chose incentives with conflicts than for those who did not, and advisers in the disclosure condition gave significantly less biased advice than those in the nondisclosure condition.

Finally, in a third experiment with 248 participants, the researchers added a voluntary-disclosure condition to the design of the second experiment. In this condition, advisers decided both whether to choose incentives that entailed a conflict and whether to disclose that conflict if they had one.

Similar to mandatory disclosure, voluntary disclosure led advisers to avoid COIs and then disclose their freedom from conflicts to advisees.

“Disclosure doesn’t seem to be much good when conflicts are unavoidable, but it does seem to help when advisers have a choice about whether to subject themselves to conflicts,” said study author George Loewenstein, PhD, of Carnegie Mellon University in Pittsburgh.

“A nice feature of disclosure is that it is, in effect, ‘self-calibrating.’ Doctors, for example, are unlikely to find it worth it to accept small gifts such as pens or calendars if the gifts are going to be disclosed. Although larger gifts would be more tempting, doctors are likely to be deterred from accepting them because disclosure of large gifts would be more damaging to their reputations.”


High-volume centers better for severe sepsis patients


Doctor and patient

Credit: CDC

A new study suggests that “practice makes perfect” when it comes to caring for patients with severe sepsis.

Researchers found that patients admitted to academic medical centers with a higher volume of severe sepsis patients had significantly lower mortality rates than patients treated at centers with lower volumes of sepsis patients.

And the superior outcomes did not come at a greater cost.

Allan J. Walkey, MD, of the Boston University School of Medicine in Massachusetts, and his colleagues reported these findings in the American Journal of Respiratory and Critical Care Medicine.

The researchers noted that processes of care can influence outcomes in patients with severe sepsis. However, it hasn’t been clear whether a hospital’s level of experience in caring for patients with severe sepsis affects patient outcomes.

So Dr Walkey and his colleagues conducted a large, retrospective study to find out. The team analyzed data from academic hospitals across the US, provided by the University HealthSystem Consortium.

They identified 56,997 patients with severe sepsis who were admitted to 124 academic medical centers in 2011.

The patients’ median length of stay was 12.5 days, the median direct cost per patient was $26,304, and the average hospital mortality rate was 25.6% ± 5.3%.

Hospitals caring for more sepsis patients had a mortality rate 7 percentage points lower than that of hospitals with lower volumes.

The high-volume medical centers (604-977 cases) had a 22.2% adjusted mortality rate, and the lower-volume hospitals (30-317 cases) had a 29.2% adjusted mortality rate (P<0.01).

There was no significant difference in direct costs between the low-volume and high-volume centers (P=0.79).
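The study reports adjusted mortality rates rather than raw counts per arm. As a rough sanity check, a two-proportion z-test on hypothetical group sizes (the actual per-arm patient counts are not given here, so the counts below are illustrative round numbers) shows why a 7-percentage-point gap in a cohort of roughly 57,000 patients is highly significant:

```python
import math

# Adjusted mortality rates reported in the study.
p_high, p_low = 0.222, 0.292

# Hypothetical group sizes summing to roughly the study's 56,997
# patients; these are NOT the study's actual arm sizes.
n_high, n_low = 30000, 27000

diff = p_low - p_high  # absolute risk difference (percentage points / 100)

# Pooled proportion and standard error for a two-proportion z-test
pooled = (p_high * n_high + p_low * n_low) / (n_high + n_low)
se = math.sqrt(pooled * (1 - pooled) * (1 / n_high + 1 / n_low))
z = diff / se

print(f"absolute difference: {diff:.3f}")  # 0.070, i.e. 7 percentage points
print(f"z statistic: {z:.1f}")             # far above the ~1.96 threshold
```

With groups of this size, the z statistic is enormous, consistent with the reported P<0.01; the exact value depends on the assumed arm sizes.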

“Given the lack of new drugs to treat severe sepsis, medical professionals must look at other ways to increase patient safety and positive outcomes, including the process of how we deliver care,” Dr Walkey said.

“Our study results demonstrate that hospitals with more experience caring for patients with severe sepsis were able to achieve better outcomes than hospitals with less experience with sepsis, possibly due to better processes of care for patients with sepsis.”


Group simulates blood vessel growth


Angiogenesis

Credit: Louis Heiser & Robert Ackland

Bioengineers say they’ve found a way to accurately predict blood vessel growth, and this finding has implications for cancers and other diseases.

The team discovered that tiny blood vessels grow better in the lab if the tissue surrounding them is less dense.

And this discovery allowed them to create a computer simulation that can accurately predict such growth.

“Better understanding of the processes that regulate the growth of blood vessels puts us in a position, ultimately, to develop new treatments for diseases related to blood vessel growth,” said study author Jeff Weiss, PhD, of the University of Utah in Salt Lake City.

Dr Weiss and his colleagues described their research in PLOS ONE.

Like some previous studies, the group’s research showed that capillaries grow, branch, and interconnect best when the density of the surrounding tissue, the extracellular matrix, is lower rather than higher. But unlike earlier research, Dr Weiss and his colleagues used pieces of real blood vessels from rats (rather than single cells).

Earlier work also focused on how the extracellular matrix, made mostly of collagen, sends chemical signals to promote capillary growth. The current study focused more on how the collagen’s mechanical or physical properties—specifically, the density or stiffness of the matrix—affect blood vessel growth.

Both the lab experiments and computer simulations showed that the denser or stiffer this collagen matrix, the more difficult it is for blood vessels to form a network necessary to supply blood to living tissue.

Growing blood vessels

To grow a network of blood vessels, the researchers extracted blood vessel fragments from the fat tissues of rats and suspended them in liquid. This extract contained 35,000 of those blood-vessel fragments per mL of solution.

The blood vessel fragments were grown in plastic plates with tiny mold-like wells filled with gel-like collagen as the extracellular matrix. The team cultured the fragments for 6 days with 3 densities of collagen: 2 mg, 3 mg, and 4 mg of collagen per mL of solution.

Vessels in the lower-density collagen grew and branched more, had fewer dead ends, and interconnected with each other better than the vessels growing in the higher-density collagen. These blood vessel networks mirrored those found in living mammals.

Simulating growth

The vessels grown in the lab provided data on the total length of the vessels, the degree to which they connected into a network, and the number of vessel branches and dead ends.

And these data allowed the researchers to program a 3-D computer simulation that accurately predicted blood vessel network formation based on collagen matrix density.

“Now, we can answer all sorts of ‘what if’ questions about the geometry of these tissues, their shape, boundaries, initial densities, and mechanical properties,” Dr Weiss said. “We can use the computer to predict the influence that these factors have in the layout of a vascular network structure.”

The 3-D computer simulation also enabled the researchers to “conduct” experiments that couldn’t be done in the lab. One simulation showed blood vessels grow easily from denser toward less-dense collagen, but not the other way around.

A second simulation showed that vessels grew in collagen, except where a dense piece of collagen was placed in the center of less-dense collagen.

The third simulation showed that when researchers simulated 2 bands of less-dense collagen surrounded by bands of stiffer collagen, the blood vessels grew along the bands of lower density.

Applications for cancer, other diseases

The researchers said these findings could ultimately be applied to aid the development of treatments for patients with cancer or diabetes, as well as patients who have had a heart attack and those who require tissue implants.


By better understanding the role that density of surrounding tissue plays in vessel formation, bioengineers could prepare “prevascularized” implantable tissues already equipped with blood vessels that match a patient’s blood vessel structure.

Prevascularized tissues might also help diabetes patients suffering from wounds that heal slowly—if at all—due to impaired blood microcirculation. Implanted skin grafts with their own blood vessels could stimulate blood flow to promote healing of diabetic ulcers.

Dr Weiss said he envisions prevascularized patches rehabilitating heart muscle that is damaged when a heart attack cuts off part of the heart’s oxygen supply, turning some of the heart into stiff scar tissue. A tissue patch implanted on the scar tissue could encourage blood vessel regrowth to repair the damaged, oxygen-deprived heart muscle.

As for cancer metastasis, most tumors begin as dense, blood-free masses. To grow and spread, the tumor tricks the body into fueling it with oxygenated blood vessels.

“The vessels grow in and then provide a pathway for the tumor to spread,” Dr Weiss noted. “This research will help us understand the physical parameters that control whether blood vessels reach the tumor.”

Publications
Topics

Angiogenesis

Louis Heiser & Robert Ackland

Bioengineers say they’ve found a way to accurately predict blood vessel growth, and this finding has implications for cancers and other diseases.

The team discovered that tiny blood vessels grow better in the lab if the tissue surrounding them is less dense.

And this discovery allowed them to create a computer simulation that can accurately predict such growth.

“Better understanding of the processes that regulate the growth of blood vessels puts us in a position, ultimately, to develop new treatments for diseases related to blood vessel growth,” said study author Jeff Weiss, PhD, of the University of Utah in Salt Lake City.

Dr Weiss and his colleagues described their research in PLOS ONE.

Like some previous studies, the group’s research showed that capillaries grow, branch, and interconnect best when the density of the surrounding tissue, the extracellular matrix, is lower rather than higher. But unlike earlier research, Dr Weiss and his colleagues used pieces of real blood vessels from rats (rather than single cells).

Earlier work also focused on how the extracellular matrix, made mostly of collagen, sends chemical signals to promote capillary growth. The current study focused more on how the collagen’s mechanical or physical properties—specifically, the density or stiffness of the matrix—affect blood vessel growth.

Both the lab experiments and computer simulations showed that the denser or stiffer this collagen matrix, the more difficult it is for blood vessels to form a network necessary to supply blood to living tissue.

Growing blood vessels

To grow a network of blood vessels, the researchers extracted blood vessel fragments from the fat tissues of rats and suspended them in liquid. This extract contained 35,000 of those blood-vessel fragments per mL of solution.

The blood vessel fragments were grown in plastic plates with tiny mold-like wells filled with gel-like collagen as the extracellular matrix. The team cultured the fragments for 6 days with 3 densities of collagen: 2 mg, 3 mg, and 4 mg of collagen per mL of solution.

Vessels in the lower-density collagen grew and branched more, had fewer dead ends, and interconnected with each other better than the vessels growing in the higher-density collagen. These blood vessel networks mirrored those found in living mammals.

Simulating growth

The vessels grown in the lab provided data on total length of the vessels, the degree to which they connected into a network of vessels, and the number of vessels branches and dead ends.

And these data allowed the researchers to program a 3-D computer simulation that accurately predicted blood vessel network formation based on collagen matrix density.

“Now, we can answer all sorts of ‘what if’ questions about the geometry of these tissues, their shape, boundaries, initial densities, and mechanical properties,” Dr Weiss said. “We can use the computer to predict the influence that these factors have in the layout of a vascular network structure.”

The 3-D computer simulation also enabled the researchers to “conduct” experiments that couldn’t be done in the lab. One simulation showed blood vessels grow easily from denser toward less-dense collagen, but not the other way around.

A second simulation showed that vessels grew in collagen, except where a dense piece of collagen was placed in the center of less-dense collagen.

The third simulation showed that when researchers simulated 2 bands of less-dense collagen surrounded by bands of stiffer collagen, the nerve vessels grew along the bands of lower density.

Applications for cancer, other diseases

The researchers said these findings could ultimately be applied to aid the development of treatments for patients with cancer or diabetes, as well as patients who have had a heart attack and those who require tissue implants.

 

 

By better understanding the role that density of surrounding tissue plays in vessel formation, bioengineers could prepare “prevascularized” implantable tissues already equipped with blood vessels that match a patient’s blood vessel structure.

Prevascularized tissues might also help diabetes patients suffering from wounds that heal slowly—if at all—due to impaired blood microcirculation. Implanted skin grafts with their own blood vessels could stimulate blood flow to promote healing of diabetic ulcers.

Dr Weiss said he envisions prevascularized patches rehabilitating heart muscle that is damaged when a heart attack cuts off part of the heart’s oxygen supply, turning some of the heart into stiff scar tissue. A tissue patch implanted on the scar tissue could encourage blood vessel regrowth to repair the damaged, oxygen-deprived heart muscle.

As for cancer metastasis, most tumors begin as dense, blood-free masses. To grow and spread, the tumor tricks the body into fueling it with oxygenated blood vessels.

“The vessels grow in and then provide a pathway for the tumor to spread,” Dr Weiss noted. “This research will help us understand the physical parameters that control whether blood vessels reach the tumor.”

Angiogenesis

Louis Heiser & Robert Ackland

Bioengineers say they’ve found a way to accurately predict blood vessel growth, and this finding has implications for cancers and other diseases.

The team discovered that tiny blood vessels grow better in the lab if the tissue surrounding them is less dense.

And this discovery allowed them to create a computer simulation that can accurately predict such growth.

“Better understanding of the processes that regulate the growth of blood vessels puts us in a position, ultimately, to develop new treatments for diseases related to blood vessel growth,” said study author Jeff Weiss, PhD, of the University of Utah in Salt Lake City.

Dr Weiss and his colleagues described their research in PLOS ONE.

Like some previous studies, the group’s research showed that capillaries grow, branch, and interconnect best when the density of the surrounding tissue, the extracellular matrix, is lower rather than higher. But unlike earlier research, Dr Weiss and his colleagues used pieces of real blood vessels from rats (rather than single cells).

Earlier work also focused on how the extracellular matrix, made mostly of collagen, sends chemical signals to promote capillary growth. The current study focused more on how the collagen’s mechanical or physical properties—specifically, the density or stiffness of the matrix—affect blood vessel growth.

Both the lab experiments and computer simulations showed that the denser or stiffer this collagen matrix, the more difficult it is for blood vessels to form a network necessary to supply blood to living tissue.

Growing blood vessels

To grow a network of blood vessels, the researchers extracted blood vessel fragments from the fat tissues of rats and suspended them in liquid. This extract contained 35,000 of those blood-vessel fragments per mL of solution.

The blood vessel fragments were grown in plastic plates with tiny mold-like wells filled with gel-like collagen as the extracellular matrix. The team cultured the fragments for 6 days with 3 densities of collagen: 2 mg, 3 mg, and 4 mg of collagen per mL of solution.

Vessels in the lower-density collagen grew and branched more, had fewer dead ends, and interconnected with each other better than the vessels growing in the higher-density collagen. These blood vessel networks mirrored those found in living mammals.

Simulating growth

The vessels grown in the lab provided data on total length of the vessels, the degree to which they connected into a network of vessels, and the number of vessels branches and dead ends.

And these data allowed the researchers to program a 3-D computer simulation that accurately predicted blood vessel network formation based on collagen matrix density.

“Now, we can answer all sorts of ‘what if’ questions about the geometry of these tissues, their shape, boundaries, initial densities, and mechanical properties,” Dr Weiss said. “We can use the computer to predict the influence that these factors have in the layout of a vascular network structure.”

The 3-D computer simulation also enabled the researchers to “conduct” experiments that couldn’t be done in the lab. One simulation showed that blood vessels grow easily from denser toward less-dense collagen, but not the other way around.

A second simulation showed that vessels grew throughout the collagen except where a piece of denser collagen was placed in the center of less-dense collagen.

A third simulation showed that when 2 bands of less-dense collagen were surrounded by bands of stiffer collagen, the blood vessels grew along the bands of lower density.

Applications for cancer, other diseases

The researchers said these findings could ultimately be applied to aid the development of treatments for patients with cancer or diabetes, as well as patients who have had a heart attack and those who require tissue implants.

By better understanding the role that density of surrounding tissue plays in vessel formation, bioengineers could prepare “prevascularized” implantable tissues already equipped with blood vessels that match a patient’s blood vessel structure.

Prevascularized tissues might also help diabetes patients suffering from wounds that heal slowly—if at all—due to impaired blood microcirculation. Implanted skin grafts with their own blood vessels could stimulate blood flow to promote healing of diabetic ulcers.

Dr Weiss said he envisions prevascularized patches rehabilitating heart muscle that is damaged when a heart attack cuts off part of the heart’s oxygen supply, turning some of the heart into stiff scar tissue. A tissue patch implanted on the scar tissue could encourage blood vessel regrowth to repair the damaged, oxygen-deprived heart muscle.

As for cancer metastasis, most tumors begin as dense, blood-free masses. To grow and spread, the tumor tricks the body into fueling it with oxygenated blood vessels.

“The vessels grow in and then provide a pathway for the tumor to spread,” Dr Weiss noted. “This research will help us understand the physical parameters that control whether blood vessels reach the tumor.”

Display Headline
Group simulates blood vessel growth

Study explains why FDA rejects new drug applications

Article Type
Changed
Display Headline
Study explains why FDA rejects new drug applications

Preparing medication for a trial

Credit: Esther Dyson

New research suggests that drugs are often rejected by the US Food and Drug Administration (FDA), not because they are unsafe or ineffective, but because there is not enough evidence to determine the drugs’ safety and efficacy.

Investigators reviewed about 300 drug applications and discovered a number of reasons why drugs were denied approval on the first try.

The FDA cited issues with dosing, trial populations, study endpoints, and inconsistencies in data as reasons for denial.

Leonard V. Sacks, MBBCh, of the FDA in Silver Spring, Maryland, and his colleagues conducted this research and reported the results in JAMA.

The team reviewed marketing applications for all new molecular entities (NMEs; active ingredients never before marketed in the US) first submitted to the FDA between 2000 and 2012.

They used FDA correspondence and reviews to determine the scientific and regulatory reasons approvals were delayed or denied.

Of the 302 NME applications, 222 (73.5%) ultimately achieved marketing approval.

Half of all NMEs (151) were rejected on the first try, but 71 (47.0%) of these were approved following resubmission. The median time to approval was 435 days after the first action letter (range, 47-2374 days).

Drugs were denied approval for a number of reasons, including:

  • Uncertainty about the optimal dose to maximize efficacy and minimize safety risks (15.9%)
  • Inconsistent results for multiple predefined study endpoints (13.2%)
  • Unsatisfactory trial endpoints (13.2%)
  • Inconsistent efficacy across portions of the study population (11.3%)
  • Study populations that did not reflect the populations likely to use the drug (7.3%).

The investigators also found that the frequency of safety deficiencies was similar among never-approved drugs and drugs with delayed approval. However, efficacy deficiencies were significantly more frequent among the never-approved drugs than among those with delayed approvals.

There were 48 drugs with initial efficacy concerns, and only 31.3% of these were eventually approved, compared to 61.5% of the 39 drugs with safety concerns alone.

There were 20 drugs (13.2%) that, despite showing superiority to placebo, were considered to have inadequate efficacy compared with the standard of care.

The investigators said that, taken together, these findings suggest there is room for improvement in new drug applications. If drug sponsors increased communication with the FDA, particularly with regard to study design, they could reduce delays in drug approval.
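The approval figures above are internally consistent; a quick sketch (using only the counts reported in the JAMA paper, as summarized here) reproduces the percentages:

```python
# Reproduce the approval rates from the raw counts reported above.
total_applications = 302          # NME applications, 2000-2012
approved_overall = 222            # ultimately achieved marketing approval
rejected_first_try = 151          # rejected on first submission
approved_after_resubmission = 71  # of those rejected, later approved

overall_rate = round(100 * approved_overall / total_applications, 1)
resubmission_rate = round(100 * approved_after_resubmission / rejected_first_try, 1)

print(overall_rate)       # 73.5
print(resubmission_rate)  # 47.0
```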


FDA working to end IV fluid shortage

Article Type
Changed
Display Headline
FDA working to end IV fluid shortage

Bag of saline solution

The US Food and Drug Administration (FDA) has acknowledged the current shortage of intravenous (IV) solutions, particularly 0.9% sodium chloride injection (ie, saline), which is used as a priming solution and to provide patients with the necessary fluids for hydration.

The agency said the shortage has been triggered by a range of factors. A few manufacturers have cited increased demand as the cause, and the FDA said this could be a result of flu season.

The agency is now working with 3 manufacturers of IV solutions (Baxter Healthcare Corp., B. Braun Medical Inc., and Hospira Inc.) to help preserve the supply of these products.

However, the FDA noted that addressing the shortage will depend on demand for these products and on supplier production. Healthcare professionals use millions of units of these IV solutions each week.

Visit the FDA’s drug shortage webpage for updates.


FDA doesn’t hold drug trials to same standards

Article Type
Changed
Display Headline
FDA doesn’t hold drug trials to same standards

Drug production

Credit: FDA

A new study suggests the US Food and Drug Administration (FDA) does not hold drug trials to the same set of standards.

The research revealed substantial differences in trials used to support drugs approved between 2005 and 2012.

Some drugs were approved based on results from multiple studies, while other approvals were based on data from a single trial.

Furthermore, trials varied greatly with regard to size, length of study period, type of comparator, and metrics of efficacy.

These results appear in the current issue of JAMA.

“Based on our analysis, some drugs are approved on the basis of large, high-quality clinical trials, while others are approved based on results of smaller trials,” said senior study author Joseph Ross, MD, of the Yale School of Medicine in New Haven, Connecticut.

“There was a lack of uniformity in the level of evidence the FDA used. We also found that only 40% of drug approvals involved a clinical trial that compared a new drug to existing treatment offerings. This is an important step for determining whether the new drug is a better option than existing, older drugs.”

Dr Ross and his colleagues evaluated the strength of clinical trial evidence supporting FDA approval decisions by characterizing key features of efficacy trials, such as size, duration, and endpoints.

The researchers used publicly available FDA documents to identify 188 drugs approved between 2005 and 2012 for 206 indications on the basis of 448 pivotal efficacy trials.

The team identified trials for 201 of the indications. Four drugs (including 1 used for 2 different indications) were approved without a pivotal efficacy trial.

Among the 201 indications, the median number of trials reviewed per indication was 2 (interquartile range [IQR], 1-2.5). Seventy-four indications (36.8%) were approved on the basis of a single trial, 77 (38.3%) on data from 2 trials, and 50 (24.9%) on data from 3 or more trials.

Most trials were randomized (89.3%) and double-blinded (79.5%). The median duration of a trial was 14.0 weeks (IQR, 6.0-26.0 weeks), and 113 trials (25.2%) lasted 6 months or longer.

The median number of total subjects enrolled in a trial was 446 (IQR, 205-678), and the median number of patients in the intervention arm of a study was 271 (IQR, 133-426).

More than half of trials (55.1%) used a placebo for comparison, 31.9% used an active comparator (such as another drug), and 12.9% had no comparator.

The primary endpoint was a surrogate outcome in 48.9% of trials, a clinical outcome in 29%, and a clinical scale in 22.1%.

These results suggest the quality of clinical trial evidence the FDA uses to make approval decisions varies widely across indications, the researchers said.

Study author Nicholas S. Downing, a student at the Yale School of Medicine, noted that survey data suggest patients expect drugs approved by the FDA to be both safe and effective.

“Based on our study of the data, we can’t be certain that this expectation is necessarily justified,” he said, “given the quantity and quality of the variability we saw in the drug approval process.”
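The per-indication counts reported above can be cross-checked directly; a minimal sketch using only the numbers in the article:

```python
# Cross-check the per-indication trial counts and percentages reported above.
indications = 201
by_trial_count = {"1 trial": 74, "2 trials": 77, "3+ trials": 50}
assert sum(by_trial_count.values()) == indications  # counts sum to 201

for label, n in by_trial_count.items():
    print(label, round(100 * n / indications, 1))  # 36.8, 38.3, 24.9

# Share of the 448 pivotal trials lasting 6 months or longer
print(round(100 * 113 / 448, 1))  # 25.2
```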


‘Blood-Type Diet’ theory doesn’t hold up

Article Type
Changed
Display Headline
‘Blood-Type Diet’ theory doesn’t hold up

Blood sample collection

Credit: Juan D. Alfonso

The theory behind the “Blood-Type Diet”—which claims an individual’s nutritional needs vary by blood type—is not valid, according to a study published in PLOS ONE.

The study showed that 3 of the 4 blood-type-specific diets conferred positive effects. But these benefits occurred independently of a person’s ABO genotype.

“The way an individual responds to any one of these diets has absolutely nothing to do with their blood type and has everything to do with their ability to stick to a sensible vegetarian or low-carbohydrate diet,” said study author Ahmed El-Sohemy, PhD, of the University of Toronto in Ontario, Canada.

About the diet(s)

The Blood-Type Diet was popularized in the book Eat Right for Your Type, written by Peter D’Adamo, ND. His theory holds that people with different blood types process food differently and that a person’s diet should match the dietary habits of the ancestors who shared their ABO blood type.

According to the theory, individuals adhering to a blood-type-specific diet can improve their health and decrease the risk of chronic illness such as cardiovascular disease.

The Type-A diet recommends that subjects consume mostly grains, fruits, and vegetables. The Type-B diet promotes a high intake of dairy products and a moderate intake of other food groups.

The Type-AB diet is similar to the Type-B diet but has more restrictions on specific foods. And the Type-O diet recommends that subjects consume mostly meat and avoid grain products.

Study results

To test the Blood-Type Diet theory, Dr El-Sohemy and his colleagues analyzed a population of 1455 adults aged 20 to 29 years. Subjects provided detailed information about their usual diets, as well as fasting blood samples.

The researchers used the samples to determine subjects’ ABO blood type and their level of cardiometabolic risk factors, such as insulin, cholesterol, and triglycerides.

The team also calculated diet scores based on the food items listed in Eat Right for Your Type to determine subjects’ relative adherence to each of the 4 blood-type diets.
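Adherence scoring of this kind can be pictured with a toy example. The sketch below is purely illustrative (the food lists and the `adherence` function are hypothetical, not the study’s actual scoring method): it rates a subject’s reported diet by the share of food items that appear on a given blood-type diet’s list.

```python
# Hypothetical sketch of a diet-adherence score: the fraction of foods a
# subject reports eating that appear on a given blood-type diet's food list.
# Food lists here are illustrative, not taken from Eat Right for Your Type.
TYPE_A_FOODS = {"grains", "fruits", "vegetables", "tofu"}
TYPE_O_FOODS = {"meat", "fish", "vegetables"}

def adherence(reported_foods, diet_foods):
    """Share of a subject's distinct reported foods that belong to the diet's list."""
    reported = set(reported_foods)
    if not reported:
        return 0.0
    return len(reported & diet_foods) / len(reported)

subject = ["grains", "fruits", "meat", "vegetables"]
print(adherence(subject, TYPE_A_FOODS))  # 0.75
print(adherence(subject, TYPE_O_FOODS))  # 0.5
```

A subject then gets one score per diet, and each score can be related to cardiometabolic markers regardless of the subject’s actual blood type, which is the comparison the study made.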

Subjects whose diets closely resembled the Type-A diet had a lower body mass index and waist circumference, as well as reduced blood pressure, serum cholesterol, triglycerides, insulin, HOMA-IR, and HOMA-Beta (P<0.05). But these effects were seen regardless of the individual’s blood type.

Subjects whose diets resembled the Type-AB diet had reduced blood pressure, serum cholesterol, triglycerides, insulin, HOMA-IR, and HOMA-Beta (P<0.05), regardless of their blood type.

Adhering to the recommendations of the Type-O diet was associated with lower triglyceride levels (P<0.0001), regardless of blood type.

And there were no significant associations for subjects whose eating habits corresponded to the Type-B diet.

“[W]e found no evidence to support the Blood-Type Diet theory,” Dr El-Sohemy said. “It was an intriguing hypothesis, so we felt we should put it to the test. We can now be confident in saying that the Blood-Type Diet hypothesis is false.”

Publications
Topics

Blood sample collection

Credit: Juan D. Alfonso

The theory behind the “Blood-Type Diet”—which claims an individual’s nutritional needs vary by blood type—is not valid, according to a study published in PLOS ONE.

The study showed that 3 of the 4 blood-type-specific diets conferred positive effects. But these benefits occurred independently of a person’s ABO genotype.

“The way an individual responds to any one of these diets has absolutely nothing to do with their blood type and has everything to do with their ability to stick to a sensible vegetarian or low-carbohydrate diet,” said study author Ahmed El-Sohemy, PhD, of the University of Toronto in Ontario, Canada.

About the diet(s)

The Blood-Type Diet was popularized in the book Eat Right for Your Type, written by Peter D’Adamo, ND. His theory is that people with different blood types process food differently, and the ABO blood type should match the dietary habits of our ancestors.

According to the theory, individuals adhering to a blood-type-specific diet can improve their health and decrease the risk of chronic illness such as cardiovascular disease.

The Type-A diet recommends that subjects consume mostly grains, fruits, and vegetables. The Type-B diet promotes a high intake of dairy products and a moderate intake of other food groups.

The Type-AB diet is similar to the Type-B diet but has more restrictions on specific foods. And the Type-O diet recommends that subjects consume mostly meat and avoid grain products.

Study results

To test that Blood-Type Diet theory, Dr El-Sohemy and his colleagues analyzed a population of 1455 adults aged 20 to 29 years. Subjects provided detailed information about their usual diets, as well as fasting blood samples.

The researchers used the samples to determine subjects’ ABO blood type and their level of cardiometabolic risk factors, such as insulin, cholesterol, and triglycerides.

The team also calculated diet scores based on the food items listed in Eat Right for Your Type to determine subjects’ relative adherence to each of the 4 blood-type diets.

Subjects whose diets closely resembled the Type-A diet had a lower body mass index and waist circumference, as well as reduced blood pressure, serum cholesterol, triglycerides, insulin, HOMA-IR, and HOMA-Beta (P<0.05). But these effects were seen regardless of the individual’s blood type.

Subjects whose diets resembled the Type-AB diet had reduced blood pressure, serum cholesterol, triglycerides, insulin, HOMA-IR, and HOMA-Beta (P<0.05), regardless of their blood type.

Adhering to the recommendations of the Type-O diet was associated with lower triglyceride levels (P<0.0001), regardless of blood type.

And there were no significant associations for subjects whose eating habits corresponded to the Type-B diet.

“[W]e found no evidence to support the Blood-Type Diet theory,” Dr El-Sohemy said. “It was an intriguing hypothesis, so we felt we should put it to the test. We can now be confident in saying that the Blood-Type Diet hypothesis is false.”

Blood sample collection

Credit: Juan D. Alfonso

The theory behind the “Blood-Type Diet”—which claims an individual’s nutritional needs vary by blood type—is not valid, according to a study published in PLOS ONE.

The study showed that 3 of the 4 blood-type-specific diets conferred positive effects. But these benefits occurred independently of a person’s ABO genotype.

“The way an individual responds to any one of these diets has absolutely nothing to do with their blood type and has everything to do with their ability to stick to a sensible vegetarian or low-carbohydrate diet,” said study author Ahmed El-Sohemy, PhD, of the University of Toronto in Ontario, Canada.

About the diet(s)

The Blood-Type Diet was popularized in the book Eat Right for Your Type, written by Peter D’Adamo, ND. His theory is that people with different blood types process food differently, and the ABO blood type should match the dietary habits of our ancestors.

According to the theory, individuals adhering to a blood-type-specific diet can improve their health and decrease the risk of chronic illness such as cardiovascular disease.

The Type-A diet recommends that subjects consume mostly grains, fruits, and vegetables. The Type-B diet promotes a high intake of dairy products and a moderate intake of other food groups.

The Type-AB diet is similar to the Type-B diet but has more restrictions on specific foods. And the Type-O diet recommends that subjects consume mostly meat and avoid grain products.

Study results

To test that Blood-Type Diet theory, Dr El-Sohemy and his colleagues analyzed a population of 1455 adults aged 20 to 29 years. Subjects provided detailed information about their usual diets, as well as fasting blood samples.

The researchers used the samples to determine subjects’ ABO blood type and their level of cardiometabolic risk factors, such as insulin, cholesterol, and triglycerides.

The team also calculated diet scores based on the food items listed in Eat Right for Your Type to determine subjects’ relative adherence to each of the 4 blood-type diets.

Subjects whose diets closely resembled the Type-A diet had a lower body mass index and waist circumference, as well as reduced blood pressure, serum cholesterol, triglycerides, insulin, HOMA-IR, and HOMA-Beta (P<0.05). But these effects were seen regardless of the individual’s blood type.

Subjects whose diets resembled the Type-AB diet had reduced blood pressure, serum cholesterol, triglycerides, insulin, HOMA-IR, and HOMA-Beta (P<0.05), regardless of their blood type.

Adhering to the recommendations of the Type-O diet was associated with lower triglyceride levels (P<0.0001), regardless of blood type.

And there were no significant associations for subjects whose eating habits corresponded to the Type-B diet.

“[W]e found no evidence to support the Blood-Type Diet theory,” Dr El-Sohemy said. “It was an intriguing hypothesis, so we felt we should put it to the test. We can now be confident in saying that the Blood-Type Diet hypothesis is false.”


Compound active against a range of cancers


The NCI-60 cell line U251 expressing NFAT3c-GFP

A little-studied chemical compound has “wide and potent” anticancer activity, investigators have reported in Cancer Cell.

The compound, BMH-21, works by inhibiting transcription by RNA polymerase I (Pol I), thereby preventing cancer cells from functioning and replicating.

“Without this transcription machinery, cancer cells cannot function,” said study author Marikki Laiho, MD, PhD, of the Johns Hopkins University School of Medicine in Baltimore, Maryland.

She and her colleagues homed in on BMH-21 by screening a library of chemical compounds thought to have potential for anticancer activity.

Specifically, the team looked at the compounds’ ability to interfere with transcription in the National Cancer Institute’s collection of 60 human tumor cell lines (known as NCI-60).

BMH-21 demonstrated activity against all 9 cancer types studied—leukemia and melanoma, as well as breast, CNS, colon, lung, ovarian, prostate, and renal cancers.

The drug also repressed tumor growth in mouse models of colon cancer and melanoma.

Additional analyses showed that BMH-21 inhibited Pol I transcription and caused disintegration of the nucleolus. The drug activated loss of the Pol I catalytic subunit RPA194, which led to disassembly of the Pol I holocomplex from the ribosomal DNA.

And the loss of RPA194, which was a result of increased proteasome-mediated turnover, was associated with decreased cancer cell viability.

Dr Laiho and her colleagues are continuing studies of BMH-21 in animal models to confirm the drug’s anticancer activity, identify any toxicities associated with the compound, and determine the optimal dose.

And because Pol I activity is frequently deregulated in cancers, the investigators believe BMH-21 could have therapeutic potential for many malignancies.

Dr Laiho is currently collaborating with experts in multiple myeloma, medullary thyroid cancer, and prostate cancer to explore the drug’s activity in these malignancies.


E coli has applications for malaria vaccine


E coli cluster
Credit: USDA

E coli bacteria may enable inexpensive production of a transmission-blocking malaria vaccine, according to a paper published in Infection and Immunity.

Scientists used E coli to create a new process to express, purify, and refold codon-harmonized recombinant Pfs25 (CHrPfs25).

Pfs25 is a sexual-stage antigen of Plasmodium falciparum expressed on the surface of zygote and ookinete forms of the parasite.

Research has shown that monoclonal antibodies directed against native Pfs25 can prevent development of P falciparum oocysts in the midgut of the mosquito.

So Pfs25 is a potential vaccine candidate, but producing it has proven challenging and costly.

“Malaria affects the poorest of the poor,” said study author Nirbhay Kumar, PhD, of Tulane University School of Public Health and Tropical Medicine in New Orleans, Louisiana.

“And if you are trying to make a vaccine for those billions of people who are at risk, you need to make it cheaper to manufacture. We think that producing this protein in bacteria will make it very cost-effective for large-scale vaccine production.”

To create the vaccine for the current study, Dr Kumar and his colleagues expressed CHrPfs25 in E coli, purified the protein after simple oxidative refolding steps, and formulated it in several adjuvants.

The team then tested the final product in mice. Antibodies present after vaccination recognized native Pfs25 on the surface of live gametes of P falciparum and demonstrated complete malaria transmission-blocking activity.

The transmission-blocking efficacy of CHrPfs25 was 100% in both Anopheles gambiae and Anopheles stephensi mosquitoes.

Dr Kumar said the next step for this research will be to develop a version of the vaccine that can be used in clinical trials.

Transmission-blocking vaccines, though not yet widely tested in humans, have the potential to be used in conjunction with more traditional malaria vaccines and other interventions—such as malaria drugs and bed nets—to fight the disease and ultimately aid in the gradual elimination of malaria parasites.


NK cells target malaria-infected RBCs


NK cell in action

Credit: Bjorn Onfelt/Dan Davis

The parasites that cause malaria are adapted to the hosts they infect, so studying the disease in mice doesn’t necessarily reveal information that translates to human disease.

But scientists believe they may have overcome this limitation. They’ve developed a strain of mice that mimics many features of the human immune system and can be infected with Plasmodium falciparum.

Using this strain, the researchers discovered that natural killer (NK) cells preferentially interact with and kill infected red blood cells (RBCs) in a contact-dependent manner.

The group recounted this discovery in PNAS.

“Human malaria studies have been hampered by a lack of animal models,” said study author Jianzhu Chen, PhD, of the Singapore-MIT Alliance for Research and Technology in Singapore.

“This [research] paves the way to start dissecting how the host human immune system interacts with the pathogen.”

Scientists studying malaria in mice previously generated mice with human RBCs. But these mice have compromised immune systems, so they can’t be used to study the immune response to malaria infection.

Over the past several years, Dr Chen and his colleagues have developed strains of mice that have the human cells necessary for a comprehensive immune response.

To generate these cells, the researchers deliver human hematopoietic stem cells, along with cytokines that help them mature into B and T cells, NK cells, and macrophages. These mice have already proven useful to study other diseases, such as dengue fever.

To adapt the mice for the study of malaria, the scientists injected them with human RBCs every day for a week, at which point 25% of their RBCs were human. And this was enough for the malaria parasite to cause an infection.

The researchers investigated the role of NK cells and macrophages during the first 2 days of malaria infection. And they found that eliminating macrophages had very little impact on the immune response during those early stages.

However, in mice lacking NK cells, parasite levels went up 7-fold, suggesting that NK cells are critical to controlling infection early on.

To further investigate the role of NK cells, the scientists placed human NK cells in a sample of infected and uninfected RBCs. The NK cells randomly interacted with both types of cells, but they latched onto infected cells much longer, eventually killing them.

The researchers also identified a cell adhesion protein called LFA-1 that helps NK cells bind to RBCs. They are now studying this process in more detail and trying to determine what other molecules, including those produced by the malaria parasite, might be involved.

Dr Chen and his colleagues also hope to use these mice to study experimental malaria vaccines or drugs. And in another future study, they plan to inject the mice with RBCs from patients with sickle cell anemia to investigate how the sickle-shaped cells help people survive malaria infection.
