Why do movies portray it as funny, deserved or "badass", when female characters kick male ones in the balls over words or hurt feelings?
How is that better than the reverse?
- Anonymous · 6 months ago
Because so many movies today are woke garbage disguised as entertainment. Problem is that women are being taught misandry by Hollywood and men are being taught to accept it. Pathetic.
- LilyRT · Lv 7 · 6 months ago
I will never understand why people fixate on one small thing and then ask questions about it over and over and over... for years, decades even. Forget growing up; don't you ever get to the point where your interests shift?
- 6 months ago
Violence in the name of social justice is cheered, but it is sickening. Witness "Django Unchained" or "Inglourious Basterds".
- Cogito · Lv 7 · 6 months ago
They very rarely do.
But as with all fiction (movies, TV, novels, etc.), they aren't real life.
You shouldn't take them as moral guides to rational behaviour!
- megalomaniac · Lv 7 · 6 months ago
People shouldn't learn their morality from Hollywood. In fact, people shouldn't learn anything from Hollywood: they are so far removed from reality that even "factual" movies are mostly full of crap. They just make up whatever they think will sell, and not much more thought than that goes into it.