There has been a lot of talk in the media about gender roles here in the United States. Research shows that women now have more power and authority at work and at home. The healthiest relationships are the ones in which both partners' voices are heard and their choices are validated. This is no surprise to me; men and women both want to be respected. I am glad that as a society we are making strides, and that women can now hold positions that were exclusive to men not too long ago. Women are CEOs, surgeons, pilots, and more.
As women gain more power in society, I wonder what the backlash will be. Do many men resent this? Do they feel they have to give up some of their power? Are they confused? Do they agree with the changes? And if they have negative feelings about the new gender roles, how do they express them?