Africa's fauna, natural riches, and culture have historically made it a continent of great importance to the West. Western travelers, explorers, and imperialists took a keen interest in it, and as a result Western interests have had a significant influence on Africa over time.

Countries have always acted in their own best interests, dating back to the Age of Exploration. The Western world only truly began pushing into the interior of Africa in the late 1800s, where explorers found a wealth of land and resources. The peoples standing in their way were armed largely with spears, while the Europeans brought firearms and other modern weaponry; this technological edge allowed Europe to conquer Africa with relative ease.