Ignore this at your peril! To display very large and very small numbers, we often (and computers and calculators almost always) use scientific notation:

13,000,000 = 1.3 x 10^7 = 1.3e7 (or 1.3E+07)

The "e" means "ten raised to the power of." Some calculators and computer software write it as a capital E and include two digits and the sign of the exponent, as illustrated above in parentheses. Thus, if your calculator or computer tells you a number is 2.87E-05 and you write it on your homework or exam as 2.87, ignoring the exponent because you don't know what it means (a mistake I see all the time), that answer will be counted wrong (because it is).

Scientific notation is also a very convenient way to represent significant digits. Suppose we have the value 3,500. How many significant digits are there? According to the rules presented in the previous article, there are two. In scientific notation we would write it 3.5 x 10^3. But suppose all four digits are accurate. Suppose, for example, we counted the number of people attending a political rally, and the count came to exactly 3,500. In that case, all four digits in the number are accurate and all four digits are significant. It's hard to tell that from the standard notation, but in scientific notation we could write 3.500 x 10^3. In scientific notation, trailing zeros are significant. Otherwise, they should be dropped.
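The E-notation described above is exactly what most programming languages produce and accept. As a quick sketch (in Python, using its standard `e` format specifier; the particular values are just the examples from the text), note how the number of digits you request after the decimal point doubles as a statement of significant digits:

```python
# E-notation in Python: the format spec ".1e" means one digit after
# the decimal point, in scientific notation.
value = 13_000_000
print(f"{value:.1e}")        # 1.3e+07  -- same as the calculator display

# Four significant digits: writing 3,500 as 3.500e+03 records that
# the trailing zeros are meaningful (an exact head count).
print(f"{3500:.3e}")         # 3.500e+03

# And parsing goes the other way: "2.87E-05" is a tiny number,
# not 2.87 -- dropping the exponent changes it by five orders of magnitude.
small = float("2.87E-05")
print(small == 0.0000287)    # True
```

The same `1.3e7` / `1.3E+07` syntax works as a numeric literal in most languages, so you can type exponents directly rather than writing out the zeros.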
