Why do you multiply when you are trying to find the percentage of a number?



Hello, I'm curious about something regarding percent problems.

So when you are given a problem such as: What is 25% of 20?

You multiply after turning 25% into fraction form:

[tex]\frac{25}{100} * \frac{20}{1} = \frac{500}{100} = 5[/tex]

You can also use the decimal form: 0.25 × 20 = 5
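To double-check that both forms give the same answer, here is a quick sketch in Python (my own illustration, using the standard-library `Fraction` type for the exact fraction form):

```python
from fractions import Fraction

# Fraction form: 25/100 * 20/1
fraction_result = Fraction(25, 100) * Fraction(20, 1)
print(fraction_result)   # exact arithmetic, no rounding

# Decimal form: 0.25 * 20
decimal_result = 0.25 * 20
print(decimal_result)

# Both forms agree
print(fraction_result == decimal_result)
```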

I understand how to do it mechanically; my confusion is in understanding why we multiply in the first place.


So I'd like to ask: exactly why do you multiply when you are trying to find the percentage of a number?