Did Women Get Rights Because Men Allowed It, or Did Protests Actually Matter?
Let's be real: women didn't just receive rights as some generous gift. They fought tooth and nail for them. But at the end of the day, the people in power (mostly men) had to allow it. So did protests, activism, and the suffrage movements actually force change, or was it just a matter of men eventually deciding to be "progressive" when it suited them?
Would love to hear your thoughts: did the fight force change, or were men just handing out crumbs when it was convenient?