Most Americans saw the war in Europe as a distinctly European conflict and did not want to get involved. World War I had been a painful experience for everyone, and America still felt the wounds of having joined that earlier European war.
While leaders like President Roosevelt wanted America to aid its British allies by entering the war, much of the public remained opposed.
It wasn't until December 1941, when Hitler foolishly declared war on the United States just days after the US had declared war on Japan, that America was finally drawn into the European conflict.