
A crow flies to a point that is 1 mile east and 20 miles south of its starting point. How far does the crow fly?

Answers:

Moly42
Make it into a right triangle, then use the Pythagorean theorem (a^2 + b^2 = c^2) to solve for c, which will give you your answer.

Answer:

The crow flew about 20.02 miles.

Step-by-step explanation:

Distance the crow travels to the east = 1 mile

Distance the crow travels to the south = 20 miles

Now we have to calculate the straight-line distance x between the crow's destination and its starting point.

Therefore, by the Pythagorean theorem, the distance is

$x = \sqrt{(\text{distance east})^{2} + (\text{distance south})^{2}}$

$x = \sqrt{1^{2} + 20^{2}} = \sqrt{1 + 400} = \sqrt{401} \approx 20.02$ miles

So the crow flies about 20.02 miles.
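
As a quick check, here is a minimal Python sketch of the same calculation (the variable names are just for illustration):

```python
import math

# Legs of the right triangle: 1 mile east and 20 miles south.
east = 1.0    # miles
south = 20.0  # miles

# Hypotenuse via the Pythagorean theorem: sqrt(east**2 + south**2)
distance = math.hypot(east, south)

print(round(distance, 2))  # 20.02
```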
