News & Information

Gender Diversity in Workplaces

Top of Mind with Julie Rose
  • Apr 20, 2021 8:00 pm
  • 13:04
Research has long shown that when women take senior positions in a company, the whole business improves: it becomes more profitable, provides better customer service, and is more socially responsible. Corinne Post, a professor of management at Lehigh University's College of Business, recently figured out why: women change the way the company thinks.