Female doctors in US earn much less than male doctors, study finds

Female doctors in the United States earn significantly less than their male counterparts, even after adjusting for specialty, experience, and hours worked, research published in The BMJ has found.1 The researchers also found that white male doctors earned more than their black male colleagues.