Consumers spend billions of dollars a year on organic food, believing it is healthier and may reduce their risk of cancer.
But a new study finds that eating organic food does not lower overall cancer risk.
Here’s what researchers found.
http://medicalxpress.com/news/2014-03-food-doesnt-cancer.html