I really believe that doctors get money from drug companies when they prescribe their drugs. It is a shame that doctors don't take the time to learn about natural cures used around the world. Why should they, when the drug companies will pay them for "selling" their drugs?