Researchers and industry experts concur that diverse and inclusive workplaces have a positive impact on organizations. They point out that diversity, equity, and inclusion (hereafter DEI) initiatives at work improve corporate culture and client relations, and enable the organization …