We need the concept of percent because we need a convenient way to measure the size (or amount) of one thing compared to another, basic amount. For example, if you are selling apples and you want to know how well you did in a particular year, you need to measure your success compared to how much demand there was for apples in that year.
If the market had 100 people looking to buy apples and 30 of them picked your apples, your success was 30% of the market. If, however, there were 10,000 people looking to buy apples and only 30 of them bought yours, your success was only 0.3% - much lower, which means that there could be a problem with your apples or with the way you are selling them.
This shows that the number 30 (30 apples sold) is not meaningful by itself as a measure of performance. Performance is measured by percent - by how one amount compares to another (in this case, the number of apples sold vs. the number of people looking to buy apples).
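As a quick sketch of the two market scenarios above (the helper name `percent_of` is mine, not something from the text):

```python
# Minimal sketch of the apple-market example; `percent_of` is a
# hypothetical helper, not a standard function.
def percent_of(part, whole):
    """Return what percent `part` is of `whole`."""
    return part / whole * 100

print(percent_of(30, 100))     # 30 buyers out of 100 -> 30.0
print(percent_of(30, 10_000))  # 30 buyers out of 10,000 -> 0.3
```

The same count, 30, yields wildly different percentages depending on the whole it is measured against.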
Another example: suppose you are opening a business with a few partners and you want to divide the profits. Suppose you decide that the right way to divide the profits is 50%-25%-25%. One month you get paid $1,000 - is this your fair share or not? The only way to know is to measure this payment against the overall profit for that month. If the profit was $2,000, you got 50%, which is your share; but if the profit was $10,000, then you deserve more. Again, the significant number here is the numerical relation between one amount and another.
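The fair-share check can also run in the other direction: compute the payment a given share entitles a partner to. A minimal sketch, with `expected_payment` as a made-up helper name:

```python
# Hypothetical check of the 50%-25%-25% split described in the text.
def expected_payment(share_percent, total_profit):
    """The payment a partner's percent share entitles them to."""
    return share_percent / 100 * total_profit

print(expected_payment(50, 2_000))   # -> 1000.0, so $1,000 was the fair 50% share
print(expected_payment(50, 10_000))  # -> 5000.0, so $1,000 falls well short
```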
Percent, mathematically, is a conversion between units of measurement. Knowing that you got 16/123 of the profit carries essentially the same information as knowing that you got about 13% of the profit, yet 16/123 is not an amount we can easily concretize in our head - not a number we can easily relate to. To make sense of it, we would need to see how it stands in relation to something more familiar, like one half, one third, a decimal number, or a percent.
Is there something inherent about the base of 123 that makes a quantity like 16/123 hard to measure? Not really. Had we measured percent with a base of 123, the number 16 would have been far easier to concretize (connect to an actual amount of objects in reality) than the corresponding amount in base 100 (13, in this case).
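This base-conversion idea can be sketched directly; `in_base` is an illustrative name, and "base" here means the number of units that make up the whole:

```python
# Re-express a fraction part/whole as a number of units out of `base`.
# A base of 100 gives an ordinary percent; any other base works the same way.
def in_base(part, whole, base=100):
    return part / whole * base

# 16 out of 123, re-measured with a whole of 100 units (a percent):
print(round(in_base(16, 123)))       # -> 13
# 13 out of 100, re-measured with a whole of 123 units:
print(round(in_base(13, 100, 123)))  # -> 16
```

The two calls invert each other (up to rounding), which is the sense in which 16/123 and 13% carry the same information.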
A base of 100 is easier to handle since we count in tens, but the main reason it is easier to grasp the meaning of 13/100 than of 16/123 is that a whole of 100 units is far more concrete to us - due to extensive use in daily life.
One can see this phenomenon in the need of Americans to convert kilometers to miles to get a sense of a distance, and of Europeans to convert miles to kilometers for the same purpose.
There is nothing special about the length of one kilometer or one mile except that one gets a sense of their length by repeatedly using those units and getting a sense of their physical meaning in reality.
When calculating the percent of something, we ask the following question: given that we have X units of type Y - how many units would we have if we converted the measurement into a unit in which the whole is one hundred?
For example, suppose we have 15 seconds out of a minute - the basic measurement unit is 1/60. Asking what percent 15 seconds are of one minute is the same as asking: if a minute were composed of 100 units of time, how many of those units would we have to take to equal a duration of 15 seconds?
15/60 = ?/100

The question mark is the percent (the percent that 15 seconds are of a whole minute).
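Solving that proportion for the question mark is a one-line computation - cross-multiply 15/60 = x/100:

```python
# Solve 15/60 = x/100 for x by cross-multiplying: x = 15 * 100 / 60.
part, whole = 15, 60
x = part * 100 / whole
print(x)  # -> 25.0, i.e. 15 seconds is 25% of a minute
```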
So, in conclusion, a percentage is a measurement of the size relation of one amount to another, expressed in units of 1/100 (a base in which the whole is 100 units).