Pretty gross stuff, but I bet all of you get some sort of thrill from the combination of the hot babes (or guys) and the gruesome torture. I'm not going to lie and say I don't, but I do feel pretty disgusted for feeling that thrill.
It's a topic most people don't like to bring up because sex is already so taboo in our society. But are these images and ideas really beneficial to the community? We are constantly bombarded with these scenes through the media, and that makes it seem like it is okay. In the adult movie industry, women are objectified as sex toys and men are taught that "No" really means "Yes." Many ex-porn stars have come out to say that the industry is so animalistic and brutal that they must be constantly high or drunk just to get through the scenes.
So is it simply a coincidence that sex trafficking is thriving and that incidences of sexual assault, pedophilia, and rape are becoming more and more prominent? The United States claims to be a great industrialized, modern nation, but we have so many human rights violations and sickening events in our very own communities that we overlook and have grown numb to. I don't know if I would ever want my kids to grow up in this kind of world.