

Chapter 7: Randomization and Blocking in DOE



Randomization

Recording the observations of an experiment in a random order is referred to as randomization. Specifically, randomization is the process of assigning the various levels of the investigated factors to the experimental units in a random fashion. An experiment is said to be completely randomized if every experimental unit has an equal probability of being subjected to any level of a factor.

The importance of randomization can be illustrated with an example. Consider an experiment in which the effect of the speed of a lathe machine on the surface finish of a product is being investigated. In order to save time, the experimenter records surface finish values by running the lathe continuously and taking observations in the order of increasing speed. The analysis of the experiment data shows that an increase in lathe speed causes a decrease in the quality of the surface finish. However, the results of the experiment are disputed by the lathe operator, who claims that he has been able to obtain better surface finish quality by operating the lathe at higher speeds. It is later found that the faulty results were caused by overheating of the tool used in the machine. Since the lathe was run continuously in the order of increasing speeds, the observations were recorded in the order of increasing tool temperature. This problem could have been avoided if the experimenter had randomized the experiment and taken readings at the various lathe speeds in a random order. Doing so would have required the experimenter to stop and restart the machine for every observation, thereby keeping the temperature of the tool within a reasonable range. Randomization would have ensured that the effect of the heating of the machine tool is not included in the experiment.
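
As a simple illustration of randomizing the run order, the following Python sketch shuffles a set of hypothetical lathe speeds before the runs are executed. The speed values and the number of replicates are assumptions made for the illustration; they are not taken from the text.

<syntaxhighlight lang="python">
import random

# Hypothetical lathe speeds (rpm) and number of replicates -- illustrative only.
speeds = [500, 750, 1000, 1250, 1500]
replicates = 2

# Build the full list of runs, then randomize the run order so that any
# time-related nuisance effect (such as tool heating) is not confounded
# with increasing speed.
runs = speeds * replicates
random.shuffle(runs)

for order, speed in enumerate(runs, start=1):
    print(f"Run {order}: lathe speed = {speed} rpm")
</syntaxhighlight>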

Blocking

A factorial experiment often requires so many runs that not all of them can be completed under homogeneous conditions. This may lead to the inclusion of the effects of nuisance factors in the investigation. Nuisance factors are factors that have an effect on the response but are not of primary interest to the investigator. For example, two replicates of a two-factor factorial experiment (with two levels per factor) require eight runs. If only four runs can be completed in one day, then the full experiment requires two days. The difference in conditions on the two days may introduce effects on the response that are not the result of the two factors being investigated; therefore, the day is a nuisance factor for this experiment. Nuisance factors can be accounted for using blocking. In blocking, experimental runs are separated based on the levels of the nuisance factor. For the two-factor factorial experiment (where the day is a nuisance factor), the runs can be separated into two groups or blocks: runs carried out on the first day belong to block 1, and runs carried out on the second day belong to block 2. Thus, within each block the conditions are the same with respect to the nuisance factor. As a result, each block investigates the effects of the factors of interest, while the difference between the blocks measures the effect of the nuisance factor. For the two-factor factorial experiment, a possible assignment of runs to the blocks is as follows: one replicate of the experiment is assigned to block 1 and the second replicate is assigned to block 2 (so that each block contains all possible treatment combinations). Within each block, the runs are subjected to randomization (i.e., randomization is now restricted to the runs within a block). Such a design, where each block contains one complete replicate and the treatments within a block are randomized, is called a randomized complete block design.

In summary, blocking should always be used to account for the effects of nuisance factors if it is not possible to hold the nuisance factor at a constant level through all of the experimental runs. Randomization should be used within each block to counter the effects of any unknown variability that may still be present.
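
The following Python sketch illustrates a randomized complete block design for the two-factor example above: one complete replicate is assigned to each day (block), and the run order is randomized independently within each block. The factor names and level labels are placeholders for the illustration, not values from the text.

<syntaxhighlight lang="python">
import random
from itertools import product

# Two hypothetical two-level factors -- names and levels are placeholders.
factor_a = ["A-low", "A-high"]
factor_b = ["B-low", "B-high"]
treatments = list(product(factor_a, factor_b))  # 4 treatment combinations

# The nuisance factor (day) defines the blocks; each block holds one replicate.
blocks = ["Day 1", "Day 2"]

for block in blocks:
    order = treatments[:]   # one complete replicate per block
    random.shuffle(order)   # randomization restricted to the runs within the block
    for run, (a, b) in enumerate(order, start=1):
        print(f"{block}, run {run}: {a}, {b}")
</syntaxhighlight>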


Example

Consider the example discussed in General Full Factorial Design where the mileage of a sports utility vehicle was investigated for the effects of speed and fuel additive type. Now assume that the three replicates for this experiment were carried out on three different vehicles. To ensure that the variation from one vehicle to another does not have an effect on the analysis, each vehicle is considered as one block. See the experiment design in the following figure.


Randomized complete block design for the mileage test using three blocks.


For the purpose of the analysis, the block is treated as a main effect, except that interactions between the block and the other main effects are assumed not to exist. Therefore, this experiment has one block main effect (with three levels: block 1, block 2 and block 3), two factor main effects (speed, with three levels, and fuel additive type, with two levels) and one interaction effect (the speed-fuel additive interaction). Let [math]\displaystyle{ {{\zeta }_{i}}\,\! }[/math] represent the block effects. The hypothesis test on the block main effect checks whether there is significant variation from one vehicle to the other. The statements for the hypothesis test are:


[math]\displaystyle{ \begin{align} & {{H}_{0}}: & {{\zeta }_{1}}={{\zeta }_{2}}={{\zeta }_{3}}=0\text{ (no main effect of block)} \\ & {{H}_{1}}: & {{\zeta }_{i}}\ne 0\text{ for at least one }i \end{align}\,\! }[/math]


The test statistic for this test is:


[math]\displaystyle{ {{F}_{0}}=\frac{M{{S}_{Block}}}{M{{S}_{E}}}\,\! }[/math]


where [math]\displaystyle{ M{{S}_{Block}}\,\! }[/math] represents the mean square for the block main effect and [math]\displaystyle{ M{{S}_{E}}\,\! }[/math] is the error mean square. The hypothesis statements and test statistics to test the significance of factors [math]\displaystyle{ A\,\! }[/math] (speed), [math]\displaystyle{ B\,\! }[/math] (fuel additive) and the interaction [math]\displaystyle{ AB\,\! }[/math] (speed-fuel additive interaction) can be obtained as explained in the General Full Factorial Design example. The ANOVA model for this example can be written as:


[math]\displaystyle{ {{Y}_{ijk}}=\mu +{{\zeta }_{i}}+{{\tau }_{j}}+{{\delta }_{k}}+{{(\tau \delta )}_{jk}}+{{\epsilon }_{ijk}}\,\! }[/math]


where:


  • [math]\displaystyle{ \mu \,\! }[/math] represents the overall mean effect
  • [math]\displaystyle{ {{\zeta }_{i}}\,\! }[/math] is the effect of the [math]\displaystyle{ i\,\! }[/math]th level of the block ([math]\displaystyle{ i=1,2,3\,\! }[/math])
  • [math]\displaystyle{ {{\tau }_{j}}\,\! }[/math] is the effect of the [math]\displaystyle{ j\,\! }[/math]th level of factor [math]\displaystyle{ A\,\! }[/math] ([math]\displaystyle{ j=1,2,3\,\! }[/math])
  • [math]\displaystyle{ {{\delta }_{k}}\,\! }[/math] is the effect of the [math]\displaystyle{ k\,\! }[/math]th level of factor [math]\displaystyle{ B\,\! }[/math] ([math]\displaystyle{ k=1,2\,\! }[/math])
  • [math]\displaystyle{ {{(\tau \delta )}_{jk}}\,\! }[/math] represents the interaction effect between [math]\displaystyle{ A\,\! }[/math] and [math]\displaystyle{ B\,\! }[/math]
  • and [math]\displaystyle{ {{\epsilon }_{ijk}}\,\! }[/math] represents the random error terms (which are assumed to be normally distributed with a mean of zero and variance of [math]\displaystyle{ {{\sigma }^{2}}\,\! }[/math])


In order to calculate the test statistics, it is convenient to express the ANOVA model of the equation given above in the form [math]\displaystyle{ y=X\beta +\epsilon \,\! }[/math]. This can be done as explained next.

Expression of the ANOVA Model as y = Xβ + ε

Since the effects [math]\displaystyle{ {{\zeta }_{i}}\,\! }[/math], [math]\displaystyle{ {{\tau }_{j}}\,\! }[/math], [math]\displaystyle{ {{\delta }_{k}}\,\! }[/math], and [math]\displaystyle{ {{(\tau \delta )}_{jk}}\,\! }[/math] are defined as deviations from the overall mean, the following constraints exist.
Constraints on [math]\displaystyle{ {{\zeta }_{i}}\,\! }[/math] are:


[math]\displaystyle{ \begin{align} & \underset{i=1}{\overset{3}{\mathop \sum }}\,{{\zeta }_{i}}= & 0 \\ & \text{or }{{\zeta }_{1}}+{{\zeta }_{2}}+{{\zeta }_{3}}= & 0 \end{align}\,\! }[/math]


Therefore, only two of the [math]\displaystyle{ {{\zeta }_{i}}\,\! }[/math] effects are independent. Assuming that [math]\displaystyle{ {{\zeta }_{1}}\,\! }[/math] and [math]\displaystyle{ {{\zeta }_{2}}\,\! }[/math] are independent, [math]\displaystyle{ {{\zeta }_{3}}=-({{\zeta }_{1}}+{{\zeta }_{2}})\,\! }[/math]. (The null hypothesis to test the significance of the blocks can be rewritten using only the independent effects as [math]\displaystyle{ {{H}_{0}}:{{\zeta }_{1}}={{\zeta }_{2}}=0\,\! }[/math].) In DOE folios, the independent block effects, [math]\displaystyle{ {{\zeta }_{1}}\,\! }[/math] and [math]\displaystyle{ {{\zeta }_{2}}\,\! }[/math], are displayed as Block[1] and Block[2], respectively.

Constraints on [math]\displaystyle{ {{\tau }_{j}}\,\! }[/math] are:


[math]\displaystyle{ \begin{align} & \underset{j=1}{\overset{3}{\mathop \sum }}\,{{\tau }_{j}}= & 0 \\ & \text{or }{{\tau }_{1}}+{{\tau }_{2}}+{{\tau }_{3}}= & 0 \end{align}\,\! }[/math]


Therefore, only two of the [math]\displaystyle{ {{\tau }_{j}}\,\! }[/math] effects are independent. Assuming that [math]\displaystyle{ {{\tau }_{1}}\,\! }[/math] and [math]\displaystyle{ {{\tau }_{2}}\,\! }[/math] are independent, [math]\displaystyle{ {{\tau }_{3}}=-({{\tau }_{1}}+{{\tau }_{2}})\,\! }[/math]. The independent effects, [math]\displaystyle{ {{\tau }_{1}}\,\! }[/math] and [math]\displaystyle{ {{\tau }_{2}}\,\! }[/math], are displayed as A[1] and A[2], respectively. Constraints on [math]\displaystyle{ {{\delta }_{k}}\,\! }[/math] are:


[math]\displaystyle{ \begin{align} & \underset{k=1}{\overset{2}{\mathop \sum }}\,{{\delta }_{k}}= & 0 \\ & \text{or }{{\delta }_{1}}+{{\delta }_{2}}= & 0 \end{align}\,\! }[/math]


Therefore, only one of the [math]\displaystyle{ {{\delta }_{k}}\,\! }[/math] effects is independent. Assuming that [math]\displaystyle{ {{\delta }_{1}}\,\! }[/math] is independent, [math]\displaystyle{ {{\delta }_{2}}=-{{\delta }_{1}}\,\! }[/math]. The independent effect, [math]\displaystyle{ {{\delta }_{1}}\,\! }[/math], is displayed as B:B. Constraints on [math]\displaystyle{ {{(\tau \delta )}_{jk}}\,\! }[/math] are:


[math]\displaystyle{ \begin{align} & \underset{j=1}{\overset{3}{\mathop \sum }}\,{{(\tau \delta )}_{jk}}= & 0 \\ & \text{and }\underset{k=1}{\overset{2}{\mathop \sum }}\,{{(\tau \delta )}_{jk}}= & 0 \\ & \text{or }{{(\tau \delta )}_{11}}+{{(\tau \delta )}_{21}}+{{(\tau \delta )}_{31}}= & 0 \\ & {{(\tau \delta )}_{12}}+{{(\tau \delta )}_{22}}+{{(\tau \delta )}_{32}}= & 0 \\ & \text{and }{{(\tau \delta )}_{11}}+{{(\tau \delta )}_{12}}= & 0 \\ & {{(\tau \delta )}_{21}}+{{(\tau \delta )}_{22}}= & 0 \\ & {{(\tau \delta )}_{31}}+{{(\tau \delta )}_{32}}= & 0 \end{align}\,\! }[/math]


The last five equations given above represent four constraints, as only four of the five equations are independent. Therefore, only two of the six [math]\displaystyle{ {{(\tau \delta )}_{jk}}\,\! }[/math] effects are independent. Assuming that [math]\displaystyle{ {{(\tau \delta )}_{11}}\,\! }[/math] and [math]\displaystyle{ {{(\tau \delta )}_{21}}\,\! }[/math] are independent, the other four effects follow from the constraints as [math]\displaystyle{ {{(\tau \delta )}_{12}}=-{{(\tau \delta )}_{11}}\,\! }[/math], [math]\displaystyle{ {{(\tau \delta )}_{22}}=-{{(\tau \delta )}_{21}}\,\! }[/math], [math]\displaystyle{ {{(\tau \delta )}_{31}}=-({{(\tau \delta )}_{11}}+{{(\tau \delta )}_{21}})\,\! }[/math] and [math]\displaystyle{ {{(\tau \delta )}_{32}}={{(\tau \delta )}_{11}}+{{(\tau \delta )}_{21}}\,\! }[/math]. The independent effects, [math]\displaystyle{ {{(\tau \delta )}_{11}}\,\! }[/math] and [math]\displaystyle{ {{(\tau \delta )}_{21}}\,\! }[/math], are displayed as A[1]B and A[2]B, respectively.

The regression version of the ANOVA model can be obtained using indicator variables. Since the block has three levels, two indicator variables, [math]\displaystyle{ {{x}_{1}}\,\! }[/math] and [math]\displaystyle{ {{x}_{2}}\,\! }[/math], are required, which need to be coded as shown next:


[math]\displaystyle{ \begin{align} & \text{Block 1}: & {{x}_{1}}=1,\text{ }{{x}_{2}}=0\text{ } \\ & \text{Block 2}: & {{x}_{1}}=0,\text{ }{{x}_{2}}=1\text{ } \\ & \text{Block 3}: & {{x}_{1}}=-1,\text{ }{{x}_{2}}=-1\text{ } \end{align}\,\! }[/math]


Factor [math]\displaystyle{ A\,\! }[/math] has three levels, so two indicator variables, [math]\displaystyle{ {{x}_{3}}\,\! }[/math] and [math]\displaystyle{ {{x}_{4}}\,\! }[/math], are required:


[math]\displaystyle{ \begin{align} & \text{Treatment Effect }{{\tau }_{1}}: & {{x}_{3}}=1,\text{ }{{x}_{4}}=0 \\ & \text{Treatment Effect }{{\tau }_{2}}: & {{x}_{3}}=0,\text{ }{{x}_{4}}=1\text{ } \\ & \text{Treatment Effect }{{\tau }_{3}}: & {{x}_{3}}=-1,\text{ }{{x}_{4}}=-1\text{ } \end{align}\,\! }[/math]


Factor [math]\displaystyle{ B\,\! }[/math] has two levels and can be represented using one indicator variable, [math]\displaystyle{ {{x}_{5}}\,\! }[/math], as follows:


[math]\displaystyle{ \begin{align} & \text{Treatment Effect }{{\delta }_{1}}: & {{x}_{5}}=1 \\ & \text{Treatment Effect }{{\delta }_{2}}: & {{x}_{5}}=-1 \end{align}\,\! }[/math]


The [math]\displaystyle{ AB\,\! }[/math] interaction will be represented by [math]\displaystyle{ {{x}_{3}}{{x}_{5}}\,\! }[/math] and [math]\displaystyle{ {{x}_{4}}{{x}_{5}}\,\! }[/math]. The regression version of the ANOVA model can finally be obtained as:


[math]\displaystyle{ Y=\mu +{{\zeta }_{1}}\cdot {{x}_{1}}+{{\zeta }_{2}}\cdot {{x}_{2}}+{{\tau }_{1}}\cdot {{x}_{3}}+{{\tau }_{2}}\cdot {{x}_{4}}+{{\delta }_{1}}\cdot {{x}_{5}}+{{(\tau \delta )}_{11}}\cdot {{x}_{3}}{{x}_{5}}+{{(\tau \delta )}_{21}}\cdot {{x}_{4}}{{x}_{5}}+\epsilon \,\! }[/math]


In matrix notation this model can be expressed as:


[math]\displaystyle{ y=X\beta +\epsilon \,\! }[/math]


or:


[math]\displaystyle{ \left[ \begin{matrix} 17.3 \\ 18.9 \\ 17.1 \\ 18.7 \\ 19.1 \\ 18.8 \\ 17.8 \\ 18.2 \\ . \\ . \\ 18.3 \\ \end{matrix} \right]=\left[ \begin{matrix} 1 & 1 & 0 & 1 & 0 & 1 & 1 & 0 \\ 1 & 1 & 0 & 0 & 1 & 1 & 0 & 1 \\ 1 & 1 & 0 & -1 & -1 & 1 & -1 & -1 \\ 1 & 1 & 0 & 1 & 0 & -1 & -1 & 0 \\ 1 & 1 & 0 & 0 & 1 & -1 & 0 & -1 \\ 1 & 1 & 0 & -1 & -1 & -1 & 1 & 1 \\ 1 & 0 & 1 & 1 & 0 & 1 & 1 & 0 \\ 1 & 0 & 1 & 0 & 1 & 1 & 0 & 1 \\ . & . & . & . & . & . & . & . \\ . & . & . & . & . & . & . & . \\ 1 & -1 & -1 & -1 & -1 & -1 & 1 & 1 \\ \end{matrix} \right]\left[ \begin{matrix} \mu \\ {{\zeta }_{1}} \\ {{\zeta }_{2}} \\ {{\tau }_{1}} \\ {{\tau }_{2}} \\ {{\delta }_{1}} \\ {{(\tau \delta )}_{11}} \\ {{(\tau \delta )}_{21}} \\ \end{matrix} \right]+\left[ \begin{matrix} {{\epsilon }_{111}} \\ {{\epsilon }_{121}} \\ {{\epsilon }_{131}} \\ {{\epsilon }_{112}} \\ {{\epsilon }_{122}} \\ {{\epsilon }_{132}} \\ {{\epsilon }_{211}} \\ {{\epsilon }_{221}} \\ . \\ . \\ {{\epsilon }_{332}} \\ \end{matrix} \right]\,\! }[/math]
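
For illustration only, coded columns like those in the [math]\displaystyle{ X\,\! }[/math] matrix above could be generated programmatically, as in the following Python sketch (using NumPy). The helper function is an assumption made for the sketch, but the six runs listed correspond to the first six rows of the [math]\displaystyle{ X\,\! }[/math] matrix shown above (the block 1 runs), so the printed output can be checked against them.

<syntaxhighlight lang="python">
import numpy as np

def effects_code(levels, n_levels):
    """Effects (deviation) coding: a factor with n_levels levels is represented
    by n_levels - 1 columns; level i sets column i to 1, and the last level is
    coded as -1 in every column (matching the coding given in the text)."""
    cols = np.zeros((len(levels), n_levels - 1))
    for row, lvl in enumerate(levels):
        if lvl < n_levels:
            cols[row, lvl - 1] = 1.0
        else:
            cols[row, :] = -1.0
    return cols

# The first six runs of the example (all in block 1), in the order of the
# X matrix shown above: (block, factor A level, factor B level).
block = [1, 1, 1, 1, 1, 1]
A     = [1, 2, 3, 1, 2, 3]
B     = [1, 1, 1, 2, 2, 2]

x12  = effects_code(block, 3)   # x1, x2 (block)
x34  = effects_code(A, 3)       # x3, x4 (factor A)
x5   = effects_code(B, 2)       # x5     (factor B)
x_ab = x34 * x5                 # interaction columns x3*x5 and x4*x5

# Column order: mu, zeta1, zeta2, tau1, tau2, delta1, (tau*delta)11, (tau*delta)21
X = np.hstack([np.ones((len(block), 1)), x12, x34, x5, x_ab])
print(X)   # reproduces the first six rows of the X matrix above
</syntaxhighlight>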


Knowing [math]\displaystyle{ y\,\! }[/math], [math]\displaystyle{ X\,\! }[/math] and [math]\displaystyle{ \beta \,\! }[/math], the sum of squares for the ANOVA model and the extra sum of squares for each of the factors can be calculated. These are used to calculate the mean squares that are used to obtain the test statistics.

Calculation of the Sum of Squares for the Model

The model sum of squares, [math]\displaystyle{ S{{S}_{TR}}\,\! }[/math], for the ANOVA model of this example can be obtained as:


[math]\displaystyle{ \begin{align} & S{{S}_{TR}}= & {{y}^{\prime }}[H-(\frac{1}{{{n}_{a}}\cdot {{n}_{b}}\cdot m})J]y \\ & = & {{y}^{\prime }}[H-(\frac{1}{18})J]y \\ & = & 9.9256 \end{align}\,\! }[/math]


Since seven effect terms ([math]\displaystyle{ {{\zeta }_{1}}\,\! }[/math], [math]\displaystyle{ {{\zeta }_{2}}\,\! }[/math], [math]\displaystyle{ {{\tau }_{1}}\,\! }[/math], [math]\displaystyle{ {{\tau }_{2}}\,\! }[/math], [math]\displaystyle{ {{\delta }_{1}}\,\! }[/math], [math]\displaystyle{ {{(\tau \delta )}_{11}}\,\! }[/math] and [math]\displaystyle{ {{(\tau \delta )}_{21}}\,\! }[/math]) are used in the model the number of degrees of freedom associated with [math]\displaystyle{ S{{S}_{TR}}\,\! }[/math] is seven ([math]\displaystyle{ dof(S{{S}_{TR}})=7\,\! }[/math]).

The total sum of squares can be calculated as:


[math]\displaystyle{ \begin{align} & S{{S}_{T}}= & {{y}^{\prime }}[I-(\frac{1}{{{n}_{a}}\cdot {{n}_{b}}\cdot m})J]y \\ & = & {{y}^{\prime }}[I-(\frac{1}{18})J]y \\ & = & 10.7178 \end{align}\,\! }[/math]


Since there are 18 observed response values, the number of degrees of freedom associated with the total sum of squares is 17 ([math]\displaystyle{ dof(S{{S}_{T}})=17\,\! }[/math]). The error sum of squares can now be obtained:


[math]\displaystyle{ \begin{align} S{{S}_{E}}= & S{{S}_{T}}-S{{S}_{TR}} \\ = & 10.7178-9.9256 \\ = & 0.7922 \end{align}\,\! }[/math]


The number of degrees of freedom associated with the error sum of squares is:


[math]\displaystyle{ \begin{align} dof(S{{S}_{E}})= & dof(S{{S}_{T}})-dof(S{{S}_{TR}}) \\ = & 17-7 \\ = & 10 \end{align}\,\! }[/math]
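
The sums of squares and degrees of freedom obtained above could be reproduced along the following lines in Python (using NumPy), given the full response vector [math]\displaystyle{ y\,\! }[/math] and the [math]\displaystyle{ X\,\! }[/math] matrix shown earlier. Only part of the data is reproduced on this page, so the numerical values are left as placeholders in the usage comments; the function names are assumptions made for the sketch.

<syntaxhighlight lang="python">
import numpy as np

def hat_matrix(X):
    """H = X (X'X)^-1 X'."""
    return X @ np.linalg.inv(X.T @ X) @ X.T

def sums_of_squares(y, X):
    """Return the model, total and error sums of squares for y = X*beta + eps."""
    n = len(y)
    J = np.ones((n, n))                  # matrix of ones
    H = hat_matrix(X)
    ss_tr = y @ (H - J / n) @ y          # model sum of squares
    ss_t  = y @ (np.eye(n) - J / n) @ y  # total sum of squares
    ss_e  = ss_t - ss_tr                 # error sum of squares
    return ss_tr, ss_t, ss_e

# Usage with the 18-run response vector y and the 18x8 matrix X from the example:
#   ss_tr, ss_t, ss_e = sums_of_squares(y, X)
#   dof_tr = X.shape[1] - 1   # 7 effect terms
#   dof_t  = len(y) - 1       # 17
#   dof_e  = dof_t - dof_tr   # 10
</syntaxhighlight>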


Since there are no true replicates of the treatments (as can be seen from the design of the previous figure, where all of the treatments are seen to be run just once), all of the error sum of squares is the sum of squares due to lack of fit. The lack of fit arises because the model used is not a full model since it is assumed that there are no interactions between blocks and other effects.

Calculation of the Extra Sum of Squares for the Factors

The sequential sum of squares for the blocks can be calculated as:


[math]\displaystyle{ \begin{align} S{{S}_{Block}}= & S{{S}_{TR}}(\mu ,{{\zeta }_{1}},{{\zeta }_{2}})-S{{S}_{TR}}(\mu ) \\ = & {{y}^{\prime }}[{{H}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}-(\frac{1}{18})J]y-0 \end{align}\,\! }[/math]


where [math]\displaystyle{ J\,\! }[/math] is the matrix of ones, [math]\displaystyle{ {{H}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}\,\! }[/math] is the hat matrix, which is calculated using [math]\displaystyle{ {{H}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}={{X}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}{{(X_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}^{\prime }{{X}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}})}^{-1}}X_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}^{\prime }\,\! }[/math], and [math]\displaystyle{ {{X}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}\,\! }[/math] is the matrix containing only the first three columns of the [math]\displaystyle{ X\,\! }[/math] matrix. Thus


[math]\displaystyle{ \begin{align} S{{S}_{Block}}= & {{y}^{\prime }}[{{H}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}-(\frac{1}{18})J]y-0 \\ = & 0.1944-0 \\ = & 0.1944 \end{align}\,\! }[/math]


Since there are two independent block effects, [math]\displaystyle{ {{\zeta }_{1}}\,\! }[/math] and [math]\displaystyle{ {{\zeta }_{2}}\,\! }[/math], the number of degrees of freedom associated with [math]\displaystyle{ S{{S}_{Block}}\,\! }[/math] is two ([math]\displaystyle{ dof(S{{S}_{Block}})=2\,\! }[/math]).

Similarly, the sequential sum of squares for factor [math]\displaystyle{ A\,\! }[/math] can be calculated as:


[math]\displaystyle{ \begin{align} S{{S}_{A}}= & S{{S}_{TR}}(\mu ,{{\zeta }_{1}},{{\zeta }_{2}},{{\tau }_{1}},{{\tau }_{2}})-S{{S}_{TR}}(\mu ,{{\zeta }_{1}},{{\zeta }_{2}}) \\ = & {{y}^{\prime }}[{{H}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}},{{\tau }_{1}},{{\tau }_{2}}}}-(\frac{1}{18})J]y-{{y}^{\prime }}[{{H}_{\mu ,{{\zeta }_{1}},{{\zeta }_{2}}}}-(\frac{1}{18})J]y \\ = & 4.7756-0.1944 \\ = & 4.5812 \end{align}\,\! }[/math]


Sequential sums of squares for the other effects are obtained as [math]\displaystyle{ S{{S}_{B}}=4.9089\,\! }[/math] and [math]\displaystyle{ S{{S}_{AB}}=0.2411\,\! }[/math]. As a check, the sequential sums of squares add up to the model sum of squares: [math]\displaystyle{ 0.1944+4.5812+4.9089+0.2411=9.9256=S{{S}_{TR}}\,\! }[/math].
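
A possible way to compute these sequential (extra) sums of squares is sketched below in Python (using NumPy), assuming the full [math]\displaystyle{ y\,\! }[/math] vector and the 18×8 [math]\displaystyle{ X\,\! }[/math] matrix shown earlier are available. The function name is an assumption for the sketch; the column indices in the usage comments follow the column order of that [math]\displaystyle{ X\,\! }[/math] matrix.

<syntaxhighlight lang="python">
import numpy as np

def hat_matrix(X):
    """H = X (X'X)^-1 X'."""
    return X @ np.linalg.inv(X.T @ X) @ X.T

def sequential_ss(y, X, cols_reduced, cols_full):
    """Extra sum of squares gained by adding the columns in cols_full
    beyond those in cols_reduced (both lists include column 0, the intercept)."""
    n = len(y)
    J = np.ones((n, n))
    def model_ss(cols):
        H = hat_matrix(X[:, cols])
        return y @ (H - J / n) @ y
    return model_ss(cols_full) - model_ss(cols_reduced)

# Usage with the 18x8 X matrix of the example (columns: mu, zeta1, zeta2,
# tau1, tau2, delta1, (tau*delta)11, (tau*delta)21):
#   ss_block = sequential_ss(y, X, [0], [0, 1, 2])                      # blocks
#   ss_a     = sequential_ss(y, X, [0, 1, 2], [0, 1, 2, 3, 4])          # factor A
#   ss_b     = sequential_ss(y, X, [0, 1, 2, 3, 4], [0, 1, 2, 3, 4, 5]) # factor B
#   ss_ab    = sequential_ss(y, X, list(range(6)), list(range(8)))      # interaction AB
</syntaxhighlight>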


Calculation of the Test Statistics

Knowing the sum of squares, the test statistics for each of the factors can be calculated. For example, the test statistic for the main effect of the blocks is:


[math]\displaystyle{ \begin{align} {{({{f}_{0}})}_{Block}}= & \frac{M{{S}_{Block}}}{M{{S}_{E}}} \\ = & \frac{S{{S}_{Block}}/dof(S{{S}_{Block}})}{S{{S}_{E}}/dof(S{{S}_{E}})} \\ = & \frac{0.1944/2}{0.7922/10} \\ = & 1.227 \end{align}\,\! }[/math]


The [math]\displaystyle{ p\,\! }[/math] value corresponding to this statistic based on the [math]\displaystyle{ F\,\! }[/math] distribution with 2 degrees of freedom in the numerator and 10 degrees of freedom in the denominator is:


[math]\displaystyle{ \begin{align} p\text{ }value= & 1-P(F\le {{({{f}_{0}})}_{Block}}) \\ = & 1-0.6663 \\ = & 0.3337 \end{align}\,\! }[/math]
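
For reference, the test statistic and [math]\displaystyle{ p\,\! }[/math] value above could be reproduced with a few lines of Python using scipy.stats; the sums of squares and degrees of freedom are the values obtained in the preceding calculations.

<syntaxhighlight lang="python">
from scipy.stats import f

# Values obtained above for the block main effect and the error term.
ss_block, dof_block = 0.1944, 2
ss_e, dof_e = 0.7922, 10

f0 = (ss_block / dof_block) / (ss_e / dof_e)   # ratio of mean squares
p_value = f.sf(f0, dof_block, dof_e)           # 1 - P(F <= f0) with (2, 10) dof

print(f"f0 = {f0:.3f}, p value = {p_value:.4f}")   # approximately 1.227 and 0.33
</syntaxhighlight>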


Assuming that the desired significance level is 0.1, since [math]\displaystyle{ p\,\! }[/math] value > 0.1, we fail to reject [math]\displaystyle{ {{H}_{0}}:{{\zeta }_{i}}=0\,\! }[/math] and conclude that there is no significant variation in the mileage from one vehicle to the other. Statistics to test the significance of other factors can be calculated in a similar manner. The complete analysis results obtained from the DOE folio for this experiment are presented in the following figure.


[[Image:doe6_14.png|center|644px|Analysis results for the experiment in the example.]]