Business Culture in the USA
Women in Business in the USA

Women play an active part in business in the US and have done so for many years. Although progress to the boardroom may still be more difficult for a woman than for a man, a large percentage of American executives are women, and that percentage is rising year on year.

It is important to be seen as politically correct on gender issues in the workplace. Treat women as equal to men in all business dealings; any perceived stereotyping will be regarded very badly.
