### Multiplying decimals by decimals

To multiply decimals, we are told to multiply as if there were no decimal points, and then make the answer have as many decimal digits as there are in the two factors combined.

In the video below, I compare multiplying decimals by decimals to fraction multiplication:

Do you know where this rule or "shortcut" comes from?

It comes from fraction multiplication. For example, 1.1 × 0.005 becomes (11/10) × (5/1000) when written with fractions. One decimal digit means the denominator is 10; three decimal digits mean the denominator is 1,000.

When you multiply the fractions, you get 55/10,000. Ten thousand as a denominator means the corresponding decimal has four decimal digits. So, the answer is 0.0055.
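The counting rule above can be sketched in code. This is a hypothetical helper (`multiply_decimals` is my name, not something from the article) that multiplies the numbers as if the decimal points weren't there, counts the decimal digits in both factors, and places the point accordingly:

```python
def multiply_decimals(x: str, y: str) -> str:
    """Multiply two non-negative decimals given as strings,
    using the 'count the decimal digits' rule."""
    # Count the digits after the decimal point in each factor.
    digits = lambda s: len(s.split(".")[1]) if "." in s else 0
    total = digits(x) + digits(y)          # decimal digits the answer needs

    # Multiply as if there were no decimal points.
    product = int(x.replace(".", "")) * int(y.replace(".", ""))

    if total == 0:
        return str(product)
    # Pad with leading zeros so the decimal point has a digit before it,
    # then insert the point 'total' places from the right.
    s = str(product).rjust(total + 1, "0")
    return s[:-total] + "." + s[-total:]

print(multiply_decimals("1.1", "0.005"))   # 0.0055
```

For simplicity this sketch ignores negative numbers and doesn't trim trailing zeros; it is meant only to mirror the pencil-and-paper rule, not to replace real decimal arithmetic.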

If you are a teacher, you can approach the rule for decimal multiplication by starting out with fractions, and using examples like the one above or the ones in the video to show students where the rule comes from.

Anonymous said…

Nice video; I learnt something new, or remembered something I had forgotten.

I'm doing a computer science degree and stumbled across this video after reading your article on calculating the square root of a number. I used the Babylonian method to write a programming function that calculates the square root of a given number; it gives the same result as the Microsoft Windows Calculator.

Just wanted to say thanks for the useful information.
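The commenter's actual code isn't shown, but the Babylonian method they mention is simple to sketch: start with any positive guess and repeatedly average the guess with the number divided by the guess. A minimal version in Python (my own sketch, not the commenter's function) might look like this:

```python
def babylonian_sqrt(n: float, tolerance: float = 1e-12) -> float:
    """Approximate sqrt(n) with the Babylonian (Heron's) method:
    repeatedly replace the guess with the average of the guess
    and n divided by the guess."""
    if n < 0:
        raise ValueError("square root of a negative number is not real")
    if n == 0:
        return 0.0
    guess = n / 2 if n > 1 else 1.0    # any positive starting guess works
    while abs(guess * guess - n) > tolerance * n:
        guess = (guess + n / guess) / 2
    return guess

print(babylonian_sqrt(2))   # close to 1.41421356...
```

Each iteration roughly doubles the number of correct digits, which is why only a handful of loops are needed to match a calculator's precision.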