Source | www.bustle.com | JR THORPE
Has a stranger told you to smile lately? Or that you’d be “so much prettier” if you slapped a grin on your face? The fact that many men feel entitled to regulate women’s smiles is not news — but the phenomenon was drawn into the limelight yet again last week, when President Trump singled out Caitríona Perry, the US bureau chief of RTÉ News, for her “beautiful smile” and told the Irish Taoiseach that “I bet she treats you well.”
But women weren’t always expected to smile — in fact, smiling women were once considered troublesome and devious. By the 20th century, however, smiling women in the US were considered the epitome of docile femininity. How did that change occur — and why do so many men now think it is their birthright to demand that you turn your frown upside down?