Aside from men being horny in the 1800s, and this being a holdover from that era, is there any actual reason why this hasn't changed?
If society were dominated by women, would this be more likely to change?
I was sweating my ass off hiking in the hot sun, and the question has been bothering me all day after my top soaked through.
Pretty sure breastfeeding is protected all over the US. It certainly is in Florida.
I'm not even big in that department and don't wear a bra for yoga, but running? That needs a sports bra, and when working outside I wear long sleeves to keep the sun off.
I do think beaches here should allow topless suits for women, though. That might go a long way towards desexualizing simple nudity; people would get used to it. I agree that it ought not to be a rule that you have to cover your boobs at all if you don't want to.