Gender Diversity in Workplaces
Top of Mind with Julie Rose • Season 1, Episode 1581, Segment 5
Apr 20, 2021 • 13m
Research has long shown that when women take senior positions in a company, the whole business improves: it becomes more profitable, provides better customer service, and is more socially responsible. Corinne Post, a professor of management at Lehigh University's College of Business, recently figured out why: women change the way the company thinks.