What is a woman's proper role in American society? Men and women are by no means identical. There will always be certain attributes of each sex that the other cannot duplicate. This does not mean that either is better than the other, but that, aside from the obvious fundamental differences, we should treat both sexes equally, making allowances for those differences.

Being male, I do not know how it feels to be treated in a gender-biased way to the extent that most women undoubtedly experience throughout their lifetimes. The piece by Susan Faludi encapsulated that fact very well. The male-female pay gap is an objective representation of an accepted form of male dominance, fair or not. To a friend of mine, this meant that women shouldn't work: men are in control, always will be in control, and women are only trying to make themselves equal to men by working. That view is ignorant. Women have the right to make their own decisions, and they have the brains to do so. If a woman is interested in a job and meets the qualifications the employer requires, I would expect her to be considered eligible for that job with no further questions asked.

The point is this: men have treated women as subordinates throughout history, and, quite frankly, old habits die hard. Women have been taught that they are to be under men's control and have only recently begun to question that idea. In short, women should be allowed to do whatever their hearts desire, just like any man. I see no reason other than ignorance that we shouldn't see a female president. Their role in society depends on their strengths: the things they do well....