Decimals are just another way to express fractions. To
produce a decimal, divide the numerator of a fraction by the denominator.
For example, 1/2 = .5.
As with fractions, comparing decimals can be a bit deceptive.
As a general rule, when comparing two decimals such as .3 with .003,
the decimal with more zeros immediately after the decimal point is
smaller. If asked to compare .003 with .0009, however, you might
overlook the additional zero and, because 9 is the larger digit,
choose .0009 as the larger decimal. That, of course, would be wrong.
Take care to avoid such mistakes. One way is to line up the decimal
points of the two decimals and pad with trailing zeros:
- .0009 is smaller than .0030
- .000900 is smaller than .000925
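The lining-up trick can be checked programmatically. As a minimal sketch, Python's `decimal` module compares numeric magnitude rather than digit count, so it sidesteps the trap described above (the values here are the ones from the examples):

```python
from decimal import Decimal

# Decimal compares numeric value, not string length, so the extra
# digit in .0009 does not make it larger than .003.
a, b = Decimal(".003"), Decimal(".0009")
print(min(a, b))  # the smaller of the two

c, d = Decimal(".000900"), Decimal(".000925")
print(min(c, d))
```

Using `Decimal(".003")` (a string) rather than `Decimal(.003)` avoids binary floating-point rounding in the comparison.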
Converting Decimals to Fractions
Knowing how to convert decimals into fractions, and fractions
into decimals, is a useful skill. Sometimes you’ll produce a decimal
while solving a question and then have to choose from fractions
for test choices. Other times, it may be easier to work with fractions.
Whatever the case, both conversions can be done easily.
To convert a decimal number to a fraction:
- Remove the decimal point and use the resulting number as the numerator.
- The denominator is the number 1 followed by as many zeros as there are decimal places in the decimal number.
Let’s convert .3875 into a fraction. First, we eliminate
the decimal point and make 3875 the numerator. Since .3875 has four
digits after the decimal point, we put four zeros in the denominator,
giving 3875/10000. Then, by dividing both terms by the greatest common
factor of 3875 and 10000, which is 125, we can reduce the fraction to 31/80.
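The steps above can be sketched in Python, using `math.gcd` to find the greatest common factor and `fractions.Fraction` to confirm the reduced form:

```python
from fractions import Fraction
from math import gcd

# Step 1: drop the decimal point of .3875 to get the numerator.
numerator = 3875
# Step 2: four decimal places, so the denominator is 1 followed by four zeros.
denominator = 10 ** 4  # 10000

# The greatest common factor of 3875 and 10000.
factor = gcd(numerator, denominator)

# Fraction reduces to lowest terms automatically.
reduced = Fraction(numerator, denominator)
print(factor, reduced)
```

Dividing both terms by the factor (3875 ÷ 125 = 31, 10000 ÷ 125 = 80) matches what `Fraction` reports.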
To convert from fractions back to decimals is a cinch.
Simply carry out the necessary division on your calculator; for
example, 31/80 = .3875.
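The same division can be done in Python; this short sketch just carries out 31 ÷ 80 directly and via `fractions.Fraction`:

```python
from fractions import Fraction

# Divide the numerator by the denominator, as on a calculator.
print(31 / 80)                  # plain division

# Fraction offers the same result via float conversion.
print(float(Fraction(31, 80)))
```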