Women's Roles

Since the 1970s there has been an active feminist movement, or women's liberation movement, in the U.S., which aims to ensure that women have responsibilities and opportunities equal to those of men. Although there are still aspects of society in which women have not yet achieved equality, women play a public and visible role in the political, economic, cultural, and social affairs of this country. Nonetheless, some people may find that American society is more sexist than their own in certain respects.

Men and women in the U.S. may associate more freely with members of the opposite sex at work and in social situations than is common in many other countries. You may also find that the dress and behavior of women in social situations here are quite different from those in your country. While in your country it may be the man's responsibility to ask a woman out on a date, here it is also acceptable for a woman to ask a man out. Whichever person offers the invitation, both often share the expenses.

Some international students and scholars have difficulty adjusting to situations in which a woman is in a position of authority because such situations are uncommon in their own countries. American women may appear overly assertive or aggressive if judged in another cultural context. Try to approach situations involving a female authority figure with an open mind.