Women in Business in the USA


Women play an active part in business in the US and have done so for many years. Although the path to the boardroom may still be harder for a woman than for a man, a large percentage of American executives are women, and that percentage is rising year on year.

It is important to be seen to be 'politically correct' on gender issues in the workplace. Treat a woman as you would a man in all business dealings; any perceived stereotyping would be regarded very badly.