gretaquintero0327 · 08-07-2021 · History · answered

World War II transformed the United States in many ways. What do you believe was the most important way in which the war changed America?