Aside from the end of slavery, the war's major impact on the South was a lasting bitterness among white Southerners toward the North and toward the United States itself. This bitterness continued through, and was deepened by, the period of Reconstruction that followed the war.