Decimals are just another way to express
fractions. After all, to produce a decimal, you simply divide the
numerator of a fraction by the denominator. For example,
1/2 = .5.
As with fractions, comparing decimals can be a bit deceptive.
As a general rule, when comparing two decimals such as .3 with .003,
the decimal with more leading zeroes is the smaller one. But if
asked to compare .003 with .0009, you might be tempted to overlook
the additional zero, and because 9 is the larger integer, choose
.0009 as the larger decimal. That would be wrong. Use caution to
avoid such mistakes. It might help to line up the decimal points
of the two decimals:
- .0009 is clearly smaller than .0030
- .000900 is smaller than .000925
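If you want to check comparisons like these mechanically, Python's standard-library decimal module (used here purely as an illustration, not something a test provides) compares exact decimal values; a minimal sketch:

```python
from decimal import Decimal

# Decimal parses each string digit for digit, so the comparisons
# below are exact -- no binary floating-point rounding is involved.
print(Decimal(".3") > Decimal(".003"))          # more leading zeroes => smaller
print(Decimal(".0009") < Decimal(".0030"))      # lining up the decimal points
print(Decimal(".000900") < Decimal(".000925"))  # trailing zeroes change nothing
```

Note that trailing zeroes (.0030 versus .003) do not change a decimal's value; they only make the comparison easier to see.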
Converting Decimals to Fractions
Knowing how to convert decimals into fractions and fractions
into decimals is a useful skill. Sometimes you’ll produce a decimal
while solving a question and then have to choose among fraction
answer choices. Other times, it may just be easier to work with fractions.
Whatever the case, both conversions can be done easily.
To convert a decimal number to a fraction:
Remove the decimal point and make the decimal number
the numerator. Then make the denominator the number 1 followed by
as many zeroes as there are decimal places in the decimal number.
Let’s convert .3875 into a fraction. First, we eliminate
the decimal point and make 3875 the numerator. Since .3875 has four
digits after the decimal point, we put four zeroes in the denominator:

.3875 = 3875/10,000

Then, by finding the GCF of 3875 and 10,000, which is 125, we can
reduce the fraction:

3875/10,000 = 31/80
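As a sketch of the same steps, Python's standard-library fractions and math modules (again just for illustration) find the GCF and reduce the fraction automatically:

```python
from fractions import Fraction
from math import gcd

# .3875 -> 3875/10,000; gcd gives the GCF used to reduce it.
print(gcd(3875, 10_000))       # 125
print(Fraction(3875, 10_000))  # 31/80 -- Fraction reduces by the GCF for you
```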
Converting fractions back to decimals is a cinch.
Simply carry out the necessary division on your calculator. For
example, 3/5 = 3 ÷ 5 = .6.
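The same division works in any language; in Python, for instance:

```python
# Fraction-to-decimal is just numerator divided by denominator.
print(3 / 5)    # 0.6
print(31 / 80)  # 0.3875 -- recovering the decimal we started from
```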