trevorhenyan51
08.04.2021
Social Studies

What does the term “Southern Redemption” mean?

Southerners regained control of their states after Reconstruction.

Southerners were forced to pay the costs of the Civil War.

The South was freed from the institution of slavery.

The South became a fairer, more equal society.
