So what's the deal now? Is objectifying women right or is it wrong? For the longest time, men were criticized and bashed for objectifying women. Women have agency and we should respect that. Cool. Nowadays, a lot of women, whether chasing fame, basking in their fame, or just living a normal life, are out here taking pics, making videos, and using social media to promote themselves in a way conducive to being objectified. So what was all the complaining about before? Do some of these women want to be able to promote themselves as sexual objects without being seen as sexual objects?