3.7: Modeling with Normal Distributions
INTRODUCTION
Recall from Preparation S.7 that Normal distributions have three key properties:
- The mean, median, and mode are all equal.
- The normal distribution is symmetric about the mean.
- The total area below a normal curve is 1 (that is, 100%), since all of the values must fall somewhere within the distribution.
SPECIFIC OBJECTIVES
By the end of this collaboration, you should understand that
- normal distributions model bell-shaped and symmetric distributions.
- normal curves are defined based on their mean and standard deviation.
- the area below a normal curve within a specific interval represents the probability that a randomly selected value is within that interval.
- statistical inference is the process of using sample statistics to make decisions about population parameters.
- a hypothesis test is a procedure that uses a sample statistic to make a decision about the value of an unknown population parameter.
- a P-value is the probability of obtaining a sample statistic as extreme as the one observed, assuming that the population parameter is a certain value.
By the end of this collaboration, you should be able to
- find probabilities and percentages using a normal distribution.
- apply the Empirical Rule to determine the intervals that contain 68%, 95%, and 99.7% of data values in a normal distribution.
- perform a hypothesis test about an unknown population parameter based on sample data.
- find and interpret P-values in a hypothesis test for a population parameter.
PROBLEM SITUATION 1: SUMMER TEMPERATURES IN THE SOUTHWESTERN U.S.
Finding Probabilities in a Normal Distribution
An important property of a normal distribution is that we can use it to find probabilities associated with any value in the distribution. From 2001–2016, July average temperatures in the Southwestern region of the U.S. were normally distributed with a mean of 75 (°F) and a standard deviation of 1.2 (°F). This normal distribution is shown below.
[Figure: normal curve of 2001–2016 July average temperatures, centered at the mean of 75 °F, with tick marks one, two, and three standard deviations (1.2 °F) above and below the mean, and the corresponding Z-scores beneath them]
In the image above, note that the values below the horizontal axis correspond to the mean and the positions that are one, two and three standard deviations above and below the mean. Below these values are the corresponding Z-scores.
Suppose we want to find the probability that a July average temperature in the 2001–2016 distribution is greater than 76.8 (°F). This probability is the area to the right of 76.8 below the normal curve with mean 75 (°F) and standard deviation 1.2 (°F). The area corresponding to this probability is shown in the normal curve below.
To find the area to the right of 76.8, we can convert 76.8 to a Z-score, and then find the area to the right of the Z-score using a table created from the standard normal distribution. The standard normal distribution is the normal distribution with mean 0 and standard deviation 1. A key property of normal curves is that the area of an interval of values below any normal distribution can be found by:
- converting the interval of values to an interval of Z-scores, and
- finding the area of the interval of Z-scores below the standard normal distribution.
Recall that
\[ Z\text{-score} = \dfrac{\text{Value} - \text{Mean}}{\text{Standard Deviation}} \nonumber \]
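As a quick sketch of this conversion (the function name is ours, not part of the activity), the formula can be written in a few lines of code:

```python
def z_score(value, mean, sd):
    """Convert a value to a Z-score: (value - mean) / standard deviation."""
    return (value - mean) / sd

# July average temperatures: mean 75.0 °F, standard deviation 1.2 °F
print(round(z_score(76.8, 75.0, 1.2), 2))  # → 1.5
```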
- Let’s apply this procedure to find the probability that a randomly selected 2001–2016 July average temperature is greater than 76.8 (°F). Complete the following steps:
(a) Convert the July average temperature of 76.8 to a Z-score using the mean 75.0 and standard deviation 1.2. Round the Z-score to two decimal places.
(b) At the end of the collaboration you will find a Standard Normal Distribution table. This table provides areas below the standard normal distribution to the left of Z-scores. To use the table to find the area to the left of a Z-score: (1) go to the row that contains the Z-score’s ones and tenths digits, (2) go to the column that contains the Z-score’s hundredths digit, (3) find the intersection of the row and column to obtain the area.
Using the Standard Normal Distribution table, find the area to the left of the Z-score you found in Question 1(a). Write this area as a decimal and a percentage.
(c) What is the area to the right of the Z-score you found in Question 1(a)? Write this area as a decimal and a percentage. (Hint: Remember that the total area below the standard normal distribution is 1.)
(d) What is the probability that a randomly selected 2001–2016 July average temperature is greater than 76.8 (°F)? Express this answer as a percentage. (Hint: The probability that a July average temperature is greater than 76.8 (°F) is the area to the right of the Z-score for 76.8.)
- Apply the process you used in Question 1 to find the probability that a randomly selected 2001–2016 July average temperature is less than 74.0 (°F). Hint: Begin by converting the average temperature of 74.0 (°F) to a Z-score. Round this Z-score to two decimal places.
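For checking table lookups without the printed table, the area to the left of a Z-score (the standard normal cumulative distribution) can be computed from the error function. This is a sketch for verification only; the function name is ours:

```python
import math

def area_left_of(z):
    """Area under the standard normal curve to the left of z (the CDF)."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# P(temperature > 76.8): z = (76.8 - 75.0) / 1.2 = 1.50, take the area to the right
print(round(1 - area_left_of(1.50), 4))
# P(temperature < 74.0): z = (74.0 - 75.0) / 1.2 ≈ -0.83, take the area to the left
print(round(area_left_of(-0.83), 4))
```

The results agree with the Standard Normal Distribution table at the end of the collaboration, up to the table's rounding.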
PROBLEM SITUATION 2: TESTING A CLAIM ABOUT CLIMATE CHANGE
The Power of Samples
We can use normal distributions, and the areas we calculate over intervals within them, to help with statistical inference. Statistical inference is the process of using sample statistics to make decisions about population parameters. This is a powerful tool for understanding and describing large populations, as samples are often all that we can obtain. Fortunately, well-chosen samples are powerful, and when they are combined with probability tools, samples enable us to draw logical inferences about very large populations.
College Students’ Views on Climate Change
A student at a Florida community college surveyed a random sample of 100 fellow students about their views on climate change. Of these 100 students, 62 reported that they have concerns that climate change will personally impact themselves or a family member. The sample is representative of all students at the college. The sample results are summarized in the bar graph below.
- 62% of the sample reported that they have concerns that climate change will personally impact themselves or a family member. 62% is a sample statistic. Identify the population parameter that corresponds to this sample statistic.
- Another student views these sample results and claims that the sample provides evidence that more than half (i.e., more than 50%) of all students at the college have personal concerns about the effects of climate change. What do you think? Does this sample statistic enable us to conclude with certainty that more than half of all students at the college hold this view? Explain.
Using a Sample Statistic to Test a Claim About an Unknown Population Parameter
Does the observed sample proportion (0.62) provide compelling evidence that the population proportion is greater than 0.50? Sample statistics vary from sample to sample, so we must carefully consider how we use a sample to make an inference about a population. The population proportion could be greater than 0.50, but we cannot be certain. There are two possible relationships between the observed sample statistic (0.62) and the hypothesized population proportion (0.50).
Explanation 1: The actual population proportion is greater than 0.50. The observed sample proportion could reflect (be at or near) the actual population proportion. This would mean that the proportion of all students at the college who have concerns that climate change will personally impact themselves or a family member is greater than 0.50.
Explanation 2: The actual population proportion is less than or equal to 0.50. The observed sample proportion could be due solely to chance and sampling variability. The population proportion could actually be less than or equal to 0.50. This would mean that the proportion of all students at the college who have concerns that climate change will personally impact themselves or a family member is less than or equal to 0.50. In this case, the observed sample proportion differs from the actual population proportion and is a result of chance alone.
We don’t know the actual population proportion, so we don’t know which explanation above is correct. Even with samples that are chosen randomly and reflect the population, we can never be certain. We can, however, perform a hypothesis test to say with a certain degree of confidence which explanation is correct.
Hypothesis Tests
A hypothesis test is a procedure that uses probability to determine the likely explanation behind an observed sample statistic. To perform a hypothesis test on the sample data, we perform the following steps:
- Establish a null and alternative hypothesis about the unknown population parameter.
The null and alternative hypotheses are competing hypotheses. When a hypothesis test compares a sample statistic to a hypothesized population parameter, the null hypothesis states that the population parameter is equal to a specific value. In this hypothesis test, the null hypothesis states that the population proportion is equal to 0.50. Thus, it states that half (50%) of all students at the college have concerns that climate change will personally impact them. The alternative hypothesis is a claim about a population parameter based on a claim or research question. The alternative hypothesis in this test is that the population proportion is greater than 0.50.
The hypotheses can be summarized as follows:
- Null Hypothesis: Population proportion = 0.50.
- Alternative Hypothesis: Population proportion > 0.50.
- Find the probability of observing a sample proportion as extreme as the one we observed, assuming the null hypothesis is true.
The fundamental question is: if the null hypothesis is true and the population proportion is 0.50, how likely is it that we would obtain a random sample of 100 students whose sample proportion is 0.62 or more? We will find this probability. The procedure is explained below.
- Use the probability to make a decision about the hypotheses.
If the probability of finding such a sample proportion (0.62 or more) is very low, the observed sample proportion provides strong evidence that the population proportion is not equal to 0.50. If this occurs, we reject the null hypothesis and support the alternative hypothesis. In other words, we reject the hypothesis that the proportion of all students at the college who have personal concerns about climate change is equal to 0.50, and support the hypothesis that the population proportion is greater than 0.50. When we reject the null hypothesis, we say that the sample proportion is statistically significant.
If the probability of finding such a sample proportion is not very low, we arrive at the opposite decision. In this case, we do not reject the null hypothesis and we do not support the alternative hypothesis. That is, we do not reject the hypothesis that the population proportion is 0.50, and we do not support the hypothesis that the population proportion is greater than 0.50. When we do not reject the null hypothesis, we say the sample proportion is not statistically significant.
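The decision rule in the two paragraphs above can be summarized in a few lines of code (the function name and the 5% cutoff as a default are our own sketch):

```python
def decide(p_value, cutoff=0.05):
    """Apply the hypothesis-test decision rule to a P-value."""
    if p_value < cutoff:
        return "reject the null hypothesis; the sample statistic is statistically significant"
    return "do not reject the null hypothesis; the sample statistic is not statistically significant"

print(decide(0.008))  # a very low probability
print(decide(0.30))   # not a very low probability
```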
Finding the Probability
We can find the probability of obtaining a random sample of 100 students whose sample proportion is 0.62 or more by modeling the distribution of sample proportions, from a population whose proportion is 0.50, with a normal distribution. From this population, sample proportions from random samples of size 100 are normally distributed with a mean of 0.50 and a standard deviation of 0.05. This distribution of sample proportions is displayed below.
- Note that the mean of all sample proportions in this distribution is the assumed population proportion. Using the Empirical Rule, which sample proportions are unusual in this distribution? (Hint: Unusual values in a normal distribution are values that are two or more standard deviations away from the mean.)
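As a check on the numbers quoted above, the standard deviation of sample proportions (0.05) comes from the usual formula \(\sqrt{p(1-p)/n}\), and the Empirical Rule's two-standard-deviation cutoffs follow from it. This sketch (variable names are ours) computes both:

```python
import math

p, n = 0.50, 100  # assumed population proportion and sample size
sd = math.sqrt(p * (1 - p) / n)      # standard deviation of sample proportions
low, high = p - 2 * sd, p + 2 * sd   # Empirical Rule: about 95% of values lie here
print(sd)                            # → 0.05
print(low, high)                     # sample proportions outside this interval are unusual
```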
- The observed sample proportion is 0.62. Compute the Z-score of this sample proportion using the mean of sample proportions (0.50) and the standard deviation of sample proportions (0.05). Round the Z-score to two decimal places. Interpret this Z-score.
The percentage of sample proportions that are 0.62 or more is equivalent to the probability of obtaining a random sample whose sample proportion is 0.62 or more. This probability is called a P-value (probability value). In a single sample hypothesis test, the P-value specifies the probability of randomly obtaining a sample statistic as extreme as the one observed if the null hypothesis is true.
- What is the P-value for the above hypothesis test? Interpret this P-value.
- If the population proportion is 0.50, is it likely or unlikely that a random sample of 100 students from the population would have a sample proportion of 0.62 or more?
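A sketch for checking the P-value, again using the standard normal cumulative distribution computed from the error function (the function name is ours):

```python
import math

def area_left_of(z):
    """Area under the standard normal curve to the left of z (the CDF)."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

z = (0.62 - 0.50) / 0.05       # Z-score of the observed sample proportion
p_value = 1 - area_left_of(z)  # probability of a sample proportion of 0.62 or more
print(round(z, 2))             # → 2.4
print(round(p_value, 4))       # → 0.0082
```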
Making a Decision about the Hypotheses
When a P-value is low, typically less than 5%, we say that the sample statistic is statistically significant. We then reject the null hypothesis that the population parameter is equal to a specific value, and we support the alternative hypothesis. We decide that the observed sample statistic is not due to chance alone, and instead reflects the population parameter.
Note: If a cutoff other than 5% is going to be used, it must be chosen before the hypothesis test is performed, to prevent confirmation bias. A cutoff of 5% is typically used to ensure we are at least 95% confident before we reject the null hypothesis. For example, consider a jury trial: would you want to convict someone if you weren't at least 95% certain they were guilty? Or consider a drug safety study: would you want to provide a new drug to the public if you weren't at least 95% certain it was safe enough to use?
- Is the observed sample proportion of 0.62 statistically significant? Explain.
- (a) What should we decide about the null and alternative hypotheses?
The hypotheses can be summarized as follows:
- Null Hypothesis: Population proportion = 0.50.
- Alternative Hypothesis: Population proportion > 0.50.
(b) What can we state about the proportion of all students at the college who have concerns that climate change will personally impact themselves or a family member?
- Can we be 100% certain about our decision from Question 11?
MAKING CONNECTIONS
Record the important mathematical ideas from the discussion.
Standard Normal Distribution Table (Negative Z-Scores)
| z | .00 | .01 | .02 | .03 | .04 | .05 | .06 | .07 | .08 | .09 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| -3.4 | 0.0003 | 0.0003 | 0.0003 | 0.0003 | 0.0003 | 0.0003 | 0.0003 | 0.0003 | 0.0003 | 0.0002 |
| -3.3 | 0.0005 | 0.0005 | 0.0005 | 0.0004 | 0.0004 | 0.0004 | 0.0004 | 0.0004 | 0.0004 | 0.0003 |
| -3.2 | 0.0007 | 0.0007 | 0.0006 | 0.0006 | 0.0006 | 0.0006 | 0.0006 | 0.0005 | 0.0005 | 0.0005 |
| -3.1 | 0.0010 | 0.0009 | 0.0009 | 0.0009 | 0.0008 | 0.0008 | 0.0008 | 0.0008 | 0.0007 | 0.0007 |
| -3.0 | 0.0013 | 0.0013 | 0.0013 | 0.0012 | 0.0012 | 0.0011 | 0.0011 | 0.0011 | 0.0010 | 0.0010 |
| -2.9 | 0.0019 | 0.0018 | 0.0018 | 0.0017 | 0.0016 | 0.0016 | 0.0015 | 0.0015 | 0.0014 | 0.0014 |
| -2.8 | 0.0026 | 0.0025 | 0.0024 | 0.0023 | 0.0023 | 0.0022 | 0.0021 | 0.0021 | 0.0020 | 0.0019 |
| -2.7 | 0.0035 | 0.0034 | 0.0033 | 0.0032 | 0.0031 | 0.0030 | 0.0029 | 0.0028 | 0.0027 | 0.0026 |
| -2.6 | 0.0047 | 0.0045 | 0.0044 | 0.0043 | 0.0041 | 0.0040 | 0.0039 | 0.0038 | 0.0037 | 0.0036 |
| -2.5 | 0.0062 | 0.0060 | 0.0059 | 0.0057 | 0.0055 | 0.0054 | 0.0052 | 0.0051 | 0.0049 | 0.0048 |
| -2.4 | 0.0082 | 0.0080 | 0.0078 | 0.0075 | 0.0073 | 0.0071 | 0.0069 | 0.0068 | 0.0066 | 0.0064 |
| -2.3 | 0.0107 | 0.0104 | 0.0102 | 0.0099 | 0.0096 | 0.0094 | 0.0091 | 0.0089 | 0.0087 | 0.0084 |
| -2.2 | 0.0139 | 0.0136 | 0.0132 | 0.0129 | 0.0125 | 0.0122 | 0.0119 | 0.0116 | 0.0113 | 0.0110 |
| -2.1 | 0.0179 | 0.0174 | 0.0170 | 0.0166 | 0.0162 | 0.0158 | 0.0154 | 0.0150 | 0.0146 | 0.0143 |
| -2.0 | 0.0228 | 0.0222 | 0.0217 | 0.0212 | 0.0207 | 0.0202 | 0.0197 | 0.0192 | 0.0188 | 0.0183 |
| -1.9 | 0.0287 | 0.0281 | 0.0274 | 0.0268 | 0.0262 | 0.0256 | 0.0250 | 0.0244 | 0.0239 | 0.0233 |
| -1.8 | 0.0359 | 0.0351 | 0.0344 | 0.0336 | 0.0329 | 0.0322 | 0.0314 | 0.0307 | 0.0301 | 0.0294 |
| -1.7 | 0.0446 | 0.0436 | 0.0427 | 0.0418 | 0.0409 | 0.0401 | 0.0392 | 0.0384 | 0.0375 | 0.0367 |
| -1.6 | 0.0548 | 0.0537 | 0.0526 | 0.0516 | 0.0505 | 0.0495 | 0.0485 | 0.0475 | 0.0465 | 0.0455 |
| -1.5 | 0.0668 | 0.0655 | 0.0643 | 0.0630 | 0.0618 | 0.0606 | 0.0594 | 0.0582 | 0.0571 | 0.0559 |
| -1.4 | 0.0808 | 0.0793 | 0.0778 | 0.0764 | 0.0749 | 0.0735 | 0.0721 | 0.0708 | 0.0694 | 0.0681 |
| -1.3 | 0.0968 | 0.0951 | 0.0934 | 0.0918 | 0.0901 | 0.0885 | 0.0869 | 0.0853 | 0.0838 | 0.0823 |
| -1.2 | 0.1151 | 0.1131 | 0.1112 | 0.1093 | 0.1075 | 0.1056 | 0.1038 | 0.1020 | 0.1003 | 0.0985 |
| -1.1 | 0.1357 | 0.1335 | 0.1314 | 0.1292 | 0.1271 | 0.1251 | 0.1230 | 0.1210 | 0.1190 | 0.1170 |
| -1.0 | 0.1587 | 0.1562 | 0.1539 | 0.1515 | 0.1492 | 0.1469 | 0.1446 | 0.1423 | 0.1401 | 0.1379 |
| -0.9 | 0.1841 | 0.1814 | 0.1788 | 0.1762 | 0.1736 | 0.1711 | 0.1685 | 0.1660 | 0.1635 | 0.1611 |
| -0.8 | 0.2119 | 0.2090 | 0.2061 | 0.2033 | 0.2005 | 0.1977 | 0.1949 | 0.1922 | 0.1894 | 0.1867 |
| -0.7 | 0.2420 | 0.2389 | 0.2358 | 0.2327 | 0.2296 | 0.2266 | 0.2236 | 0.2206 | 0.2177 | 0.2148 |
| -0.6 | 0.2743 | 0.2709 | 0.2676 | 0.2643 | 0.2611 | 0.2578 | 0.2546 | 0.2514 | 0.2483 | 0.2451 |
| -0.5 | 0.3085 | 0.3050 | 0.3015 | 0.2981 | 0.2946 | 0.2912 | 0.2877 | 0.2843 | 0.2810 | 0.2776 |
| -0.4 | 0.3446 | 0.3409 | 0.3372 | 0.3336 | 0.3300 | 0.3264 | 0.3228 | 0.3192 | 0.3156 | 0.3121 |
| -0.3 | 0.3821 | 0.3783 | 0.3745 | 0.3707 | 0.3669 | 0.3632 | 0.3594 | 0.3557 | 0.3520 | 0.3483 |
| -0.2 | 0.4207 | 0.4168 | 0.4129 | 0.4090 | 0.4052 | 0.4013 | 0.3974 | 0.3936 | 0.3897 | 0.3859 |
| -0.1 | 0.4602 | 0.4562 | 0.4522 | 0.4483 | 0.4443 | 0.4404 | 0.4364 | 0.4325 | 0.4286 | 0.4247 |
| -0.0 | 0.5000 | 0.4960 | 0.4920 | 0.4880 | 0.4840 | 0.4801 | 0.4761 | 0.4721 | 0.4681 | 0.4641 |
Standard Normal Distribution Table (Positive Z-Scores)
| z | .00 | .01 | .02 | .03 | .04 | .05 | .06 | .07 | .08 | .09 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 0.0 | 0.5000 | 0.5040 | 0.5080 | 0.5120 | 0.5160 | 0.5199 | 0.5239 | 0.5279 | 0.5319 | 0.5359 |
| 0.1 | 0.5398 | 0.5438 | 0.5478 | 0.5517 | 0.5557 | 0.5596 | 0.5636 | 0.5675 | 0.5714 | 0.5753 |
| 0.2 | 0.5793 | 0.5832 | 0.5871 | 0.5910 | 0.5948 | 0.5987 | 0.6026 | 0.6064 | 0.6103 | 0.6141 |
| 0.3 | 0.6179 | 0.6217 | 0.6255 | 0.6293 | 0.6331 | 0.6368 | 0.6406 | 0.6443 | 0.6480 | 0.6517 |
| 0.4 | 0.6554 | 0.6591 | 0.6628 | 0.6664 | 0.6700 | 0.6736 | 0.6772 | 0.6808 | 0.6844 | 0.6879 |
| 0.5 | 0.6915 | 0.6950 | 0.6985 | 0.7019 | 0.7054 | 0.7088 | 0.7123 | 0.7157 | 0.7190 | 0.7224 |
| 0.6 | 0.7257 | 0.7291 | 0.7324 | 0.7357 | 0.7389 | 0.7422 | 0.7454 | 0.7486 | 0.7517 | 0.7549 |
| 0.7 | 0.7580 | 0.7611 | 0.7642 | 0.7673 | 0.7704 | 0.7734 | 0.7764 | 0.7794 | 0.7823 | 0.7852 |
| 0.8 | 0.7881 | 0.7910 | 0.7939 | 0.7967 | 0.7995 | 0.8023 | 0.8051 | 0.8078 | 0.8106 | 0.8133 |
| 0.9 | 0.8159 | 0.8186 | 0.8212 | 0.8238 | 0.8264 | 0.8289 | 0.8315 | 0.8340 | 0.8365 | 0.8389 |
| 1.0 | 0.8413 | 0.8438 | 0.8461 | 0.8485 | 0.8508 | 0.8531 | 0.8554 | 0.8577 | 0.8599 | 0.8621 |
| 1.1 | 0.8643 | 0.8665 | 0.8686 | 0.8708 | 0.8729 | 0.8749 | 0.8770 | 0.8790 | 0.8810 | 0.8830 |
| 1.2 | 0.8849 | 0.8869 | 0.8888 | 0.8907 | 0.8925 | 0.8944 | 0.8962 | 0.8980 | 0.8997 | 0.9015 |
| 1.3 | 0.9032 | 0.9049 | 0.9066 | 0.9082 | 0.9099 | 0.9115 | 0.9131 | 0.9147 | 0.9162 | 0.9177 |
| 1.4 | 0.9192 | 0.9207 | 0.9222 | 0.9236 | 0.9251 | 0.9265 | 0.9279 | 0.9292 | 0.9306 | 0.9319 |
| 1.5 | 0.9332 | 0.9345 | 0.9357 | 0.9370 | 0.9382 | 0.9394 | 0.9406 | 0.9418 | 0.9429 | 0.9441 |
| 1.6 | 0.9452 | 0.9463 | 0.9474 | 0.9484 | 0.9495 | 0.9505 | 0.9515 | 0.9525 | 0.9535 | 0.9545 |
| 1.7 | 0.9554 | 0.9564 | 0.9573 | 0.9582 | 0.9591 | 0.9599 | 0.9608 | 0.9616 | 0.9625 | 0.9633 |
| 1.8 | 0.9641 | 0.9649 | 0.9656 | 0.9664 | 0.9671 | 0.9678 | 0.9686 | 0.9693 | 0.9699 | 0.9706 |
| 1.9 | 0.9713 | 0.9719 | 0.9726 | 0.9732 | 0.9738 | 0.9744 | 0.9750 | 0.9756 | 0.9761 | 0.9767 |
| 2.0 | 0.9772 | 0.9778 | 0.9783 | 0.9788 | 0.9793 | 0.9798 | 0.9803 | 0.9808 | 0.9812 | 0.9817 |
| 2.1 | 0.9821 | 0.9826 | 0.9830 | 0.9834 | 0.9838 | 0.9842 | 0.9846 | 0.9850 | 0.9854 | 0.9857 |
| 2.2 | 0.9861 | 0.9864 | 0.9868 | 0.9871 | 0.9875 | 0.9878 | 0.9881 | 0.9884 | 0.9887 | 0.9890 |
| 2.3 | 0.9893 | 0.9896 | 0.9898 | 0.9901 | 0.9904 | 0.9906 | 0.9909 | 0.9911 | 0.9913 | 0.9916 |
| 2.4 | 0.9918 | 0.9920 | 0.9922 | 0.9925 | 0.9927 | 0.9929 | 0.9931 | 0.9932 | 0.9934 | 0.9936 |
| 2.5 | 0.9938 | 0.9940 | 0.9941 | 0.9943 | 0.9945 | 0.9946 | 0.9948 | 0.9949 | 0.9951 | 0.9952 |
| 2.6 | 0.9953 | 0.9955 | 0.9956 | 0.9957 | 0.9959 | 0.9960 | 0.9961 | 0.9962 | 0.9963 | 0.9964 |
| 2.7 | 0.9965 | 0.9966 | 0.9967 | 0.9968 | 0.9969 | 0.9970 | 0.9971 | 0.9972 | 0.9973 | 0.9974 |
| 2.8 | 0.9974 | 0.9975 | 0.9976 | 0.9977 | 0.9977 | 0.9978 | 0.9979 | 0.9979 | 0.9980 | 0.9981 |
| 2.9 | 0.9981 | 0.9982 | 0.9982 | 0.9983 | 0.9984 | 0.9984 | 0.9985 | 0.9985 | 0.9986 | 0.9986 |
| 3.0 | 0.9987 | 0.9987 | 0.9987 | 0.9988 | 0.9988 | 0.9989 | 0.9989 | 0.9989 | 0.9990 | 0.9990 |
| 3.1 | 0.9990 | 0.9991 | 0.9991 | 0.9991 | 0.9992 | 0.9992 | 0.9992 | 0.9992 | 0.9993 | 0.9993 |
| 3.2 | 0.9993 | 0.9993 | 0.9994 | 0.9994 | 0.9994 | 0.9994 | 0.9994 | 0.9995 | 0.9995 | 0.9995 |
| 3.3 | 0.9995 | 0.9995 | 0.9995 | 0.9996 | 0.9996 | 0.9996 | 0.9996 | 0.9996 | 0.9996 | 0.9997 |
| 3.4 | 0.9997 | 0.9997 | 0.9997 | 0.9997 | 0.9997 | 0.9997 | 0.9997 | 0.9997 | 0.9997 | 0.9998 |


