nocemoist3 08-04-2021 Social Studies, answered

What does the term "Southern Redemption" mean?

Southerners regained control of their states after Reconstruction.
Southerners were forced to pay the costs of the Civil War.
The South was freed from the institution of slavery.
The South became a fairer, more equal society.