![stephaniem0216](/avatars/45241.jpg)
stephaniem0216
22.06.2019
History
Gradually, american society came to accept that girls could be educated and that women could be
Solved
Show answers
Answer:
Gradually, American society came to accept that girls could be educated and that women could be TEACHERS
Explanation:
Through most of the 19th century, men and women lived by traditional roles in society. Men went out to work and were the breadwinners, while women were the 'homemakers' who took care of the children, cleaned the house, cooked, and did the dishes.
Under such traditional roles, it was not considered normal for a woman to be highly educated or to go out to work.
By the early 20th century, however, things had gradually begun to change. With the onset of the First and Second World Wars, more and more women left their homes to take up jobs.
Still, in the early years of the women's rights movement, conservative Americans found it hard to accept that girls could be educated and that women could become professional teachers.