erieannapickett12
09.06.2020
History

How did television influence Americans during the postwar period? Check all that apply.
It supported the roles of women in the workplace.
It encouraged Americans to be consumers.
It shaped American social values.
It helped minorities gain greater acceptance.
It portrayed examples of ideal social roles.
It emphasized the importance of family.
