News & Information
Gender Diversity in Workplaces
Top of Mind with Julie Rose
- Apr 20, 2021 8:00 pm
Research has long shown that when women take senior positions in a company, the whole business improves: it becomes more profitable, provides better customer service, and is more socially responsible. Corinne Post, a professor of Management at Lehigh University's College of Business, recently identified why: women change the way the company thinks.