Women’s Health Neglected in the US
If you are a woman living in America, your health is quite likely being neglected, and many women don’t even realize it. A significant report on the future of women’s health, published today, reveals that women are still being left out of research, despite a 1993…