Saturday, August 13, 2011

Do doctors not tell us about natural cures because they get kickbacks for the drugs they prescribe?

I really believe that doctors get money from drug companies when they prescribe their drugs. It is a shame that doctors don't take the time to learn about natural cures used around the world. Why should they, when the drug companies will pay them for "selling" their drugs?
