Hollywood’s New Sex Worker Roles Are Girlboss Heroines
In Hollywood, sex workers have become the ultimate girlbosses. The message is clear: there’s no need for collective empowerment when one can escape the low-wage economy by cashing in on bootstrap entrepreneurship.