After watching AMC's new hit show "The Walking Dead", I found it hard to look past the clear and uncomfortable gender roles that the show displays on a regular basis. For those who do not know the show, "The Walking Dead" is set in post-apocalyptic America, overrun by zombies, or "walkers," that terrorize the few left living on Earth. Filled with violence, gore, drama, and Southern accents (the show takes place in Atlanta, Georgia), "The Walking Dead" explores life without boundaries and a fight for survival that more often than not seems hopeless.
Unfortunately, even amidst a zombie invasion, gender boundaries remain strong and unforgiving. Throughout the show, men protect the lives of women and children; guns and decision making are controlled by men, while women helplessly follow the guidance of those who are just as clueless about how to survive a zombie apocalypse. Sure, there is a point where the men realize the women need to be able to protect themselves against zombies, but only once the men decide the women are ready do they get the luxury of carrying weapons. Even then, it is made clear that the women do not deserve that responsibility. In the episode in which women are granted the right to bear arms, one woman accidentally shoots one of her companions in the head, persuading the audience that she and the other women should not be allowed to protect themselves for fear of harming others. More often than not, it is the women who scream and run at the sight of the all too familiar zombies, relying on a strong male counterpart to skewer the awful creatures for them. There is no hiding the issue of gendered violence in "The Walking Dead", and men and women almost always fit stereotypical gender personalities: men are capricious and aggressive while women are passive and overly emotional. Although the show takes place in the South, the gender roles seem exaggerated even for that setting, as if the advancement of women's rights over the past few decades never happened.
While the examples provided so far reflect my own reading of the show's gender dynamics, the dialogue itself confirms the gendered nature of its violence. When one female character expresses interest in protecting the group by surveilling the surrounding area for zombies, a female counterpart exclaims, "The men can handle that on their own, they don't need your help. There are plenty of other things to do around here, cooking, cleaning". Regardless of my interpretation, this is a clear indication that women are being socialized to view their duties as cooking and cleaning rather than taking a position of leadership. "The Walking Dead" supports the notion that women should be seen and not heard, and that their duties should not extend beyond the shelter of the home.
This explicit and implicit display of gender roles and violence affects the way that women and men perceive gender in today's world. Regardless of context, "The Walking Dead" provides a lens through which people around the world are socialized to believe that men and women should occupy the roles of leadership (or lack thereof) displayed on television. We use mass media as a reference point for what is acceptable and unacceptable behavior, and "The Walking Dead" only helps to perpetuate a negative understanding of the roles women can hold in a modern context.