Again, because I think this is important and is being overlooked here:
In a lot of these discussions, particularly those involving rape culture in the Western world, the claim is not that the culture says ‘rape is a good thing’ or ‘rape is encouraged’. The claim is that rape culture manifests as victim blaming, sexual objectification, and the trivialization of rape (treating it as a joke). I think most of us can agree that this kind of thing does happen in Western cultures to some extent, yes?