Why did US influence in Hawaii increase during the 1800s? Check all THREE that apply.

Group of answer choices:

The US negotiated a deal to buy Hawaii from Spain.
US plantation owners dominated Hawaiian politics.
The US began to import many goods from Hawaii.
The US sent its military to Hawaii to overthrow the queen.
US settlers purchased land in order to start plantations.