Comparing Two Means Hypothesis Testing


When facing different scenarios, you will need to adapt your hypothesis testing method. One scenario that frequently arises is one where you wish to test whether there is a difference between two means. You might have done this already using the normal distribution. But what happens if you don't know the variances of these populations and your sample sizes are small?

That's where the \(t\)-distribution comes in. This article will take you through a **hypothesis test for the difference in means** of two independent, normally distributed populations.

The \(t\)-distribution can also be used to test the **means of two independent normal distributions** when the **variances are unknown and the sample sizes are small**. To do so, you will need to assume the populations have the same variance and therefore need to use a **pooled estimate of variance**.

For a reminder on the \(t\)-distribution and its properties, see the article T-distribution.

Unlike the **paired \(t\)-test**, where you are comparing the results of an experiment before and after some treatment, here you are comparing samples drawn from two independent populations.

Describe the kind of hypothesis test you would use in the following scenarios.

1. A mobile phone company has released a new software update. They have asked you to find statistical evidence to support their claim that the software update has improved battery life.

2. A pet store sells Welsh Corgi puppies from two different breeders. They wish to determine whether there is a significant difference between the weights of the puppies from each breeder.

**Solution**

1. In order to conduct this experiment, you would need to collect samples of information on phone battery life **before and after** the software update. Since the samples will be taken from the **same population after a change has been made**, they are not independent. Therefore, you need to use a **paired \(t\)-test**.

2. In this case, you would be required to take samples of weights from two different breeders and therefore **two independent distributions**. You should **assume that the populations have the same variance**, so you will need to use a **pooled estimate of variance** to find the \(t\)-value, **not a paired \(t\)-test**.

The hypothesis test for the difference of two means follows these steps:

1. Find the **null hypothesis** and **alternative hypothesis**, \(H_0\) and \(H_1\).

2. Determine the **significance level**, \(\alpha\), from the question.

3. Determine the number of **degrees of freedom**, \(\upsilon\).

4. Find the **critical region**.

5. Calculate the **pooled estimate of the variance**, \(s^2_p\).

6. Calculate \(t\).

7. Compare the value of \(t\) with your critical region and state your conclusion, addressing whether the result is **significant** in the **context of the question**.
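The computational steps above can be sketched as a small Python helper using only the standard library (the function name `two_sample_t` is illustrative, not from the article); it returns the test statistic and degrees of freedom, which you then compare against a tabulated critical value:

```python
import statistics
from math import sqrt

def two_sample_t(x, y):
    """Pooled two-sample t statistic and degrees of freedom,
    under H0: mu_x = mu_y with equal but unknown variances."""
    nx, ny = len(x), len(y)
    s2x = statistics.variance(x)  # sample variance of x
    s2y = statistics.variance(y)  # sample variance of y
    # pooled estimate of the variance
    s2p = ((nx - 1) * s2x + (ny - 1) * s2y) / ((nx - 1) + (ny - 1))
    t = (statistics.mean(x) - statistics.mean(y)) / sqrt(s2p * (1 / nx + 1 / ny))
    df = nx + ny - 2
    return t, df
```

For a two-tailed test at significance level \(\alpha\), you would then reject \(H_0\) when \(|t|\) exceeds the critical value \(t_{\upsilon}(\alpha /2)\) from tables.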

Next let's take a look at the hypotheses you will need to do the test.

While comparing two means, your null hypothesis will state that the difference between the two populations you are testing is equal to zero. In other words, the null hypothesis is that there is no difference in the population means.

Samples are taken from two distributions, \(X\) and \(Y\), under the assumption that they are independent and normally distributed.

To perform a hypothesis test for the **difference between the means **of these distributions, use the following **null hypothesis,**

\[H_0:\, \mu _x =\mu _y.\]

What about the alternative hypothesis?

The alternative hypothesis for comparing two means will depend on whether you wish to test whether one particular distribution is greater than the other (a one-tailed test), or simply whether there is any difference at all (a two-tailed test).

When using a two-tailed test, remember to divide the significance level between the two tails!

Remember to read the question carefully to determine which sort of alternative hypothesis to use.

Samples are taken from two distributions, \(X\) and \(Y\), under the assumption that they are independent and normally distributed.

In the case that you wish to test whether the means are **different** (that is, a two-tailed test), you will have the following **alternative hypothesis**,

\[H_1:\, \mu _x \neq \mu _y.\]

In the case that you wish to test whether the mean of \(X\) is greater than the mean of \(Y\) (that is a one-tailed test), you will have the following **alternative hypothesis,**

\[H_1:\, \mu _x > \mu _y.\]

Next let's see some of the calculations involved.

When testing for the difference between means, there are some extra calculations that you'll need to perform to find the **pooled estimate of the variance** and the value of \(t\) that you wish to test.

Using sample variances, \(s^2_x\) and \(s^2_y\), and the size of each sample, \(n_x\) and \(n_y\), the **pooled estimate of the variance** is given by the formula

\[s^2_p=\frac{(n_x-1)s^2_x+(n_y-1)s^2_y}{(n_x-1)+(n_y-1)}.\]
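This formula translates directly into code; a minimal sketch (the function name is illustrative):

```python
def pooled_variance(s2x, s2y, nx, ny):
    """Pooled estimate of variance from two sample variances
    s2x, s2y with sample sizes nx, ny."""
    return ((nx - 1) * s2x + (ny - 1) * s2y) / ((nx - 1) + (ny - 1))

print(round(pooled_variance(0.0389, 0.0302, 4, 5), 4))  # 0.0339
```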

Once you have found \(s^2_p\), you will need to find the \(t\)-critical value that goes with it.

Given sample means \(\bar{x}\) and \(\bar{y}\), sample variances \(s^2_x\) and \(s^2_y\), and the pooled estimate of variance \(s^2_p\), the **test statistic**, \(t^*\), is:

\[t^*=\frac{(\bar{x}-\bar{y})-(\mu _x - \mu _y)}{\sqrt{s^2_p\left(\dfrac{1}{n_x}+\dfrac{1}{n_y}\right)}}.\]
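In code, the statistic is a one-liner; a sketch with illustrative names, where `mu_diff` is the hypothesised value of \(\mu _x-\mu _y\) (zero under the usual null hypothesis):

```python
from math import sqrt

def t_statistic(xbar, ybar, mu_diff, s2p, nx, ny):
    """Pooled two-sample test statistic for the difference of means."""
    return (xbar - ybar - mu_diff) / sqrt(s2p * (1 / nx + 1 / ny))

print(round(t_statistic(10, 8, 0, 4, 5, 5), 4))  # 1.5811
```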

Next, let's look at a couple of examples on how to use and calculate these statistics within an actual hypothesis test.

A pet store sells Welsh Corgi puppies on behalf of two puppy breeders, \(X\) and \(Y\). They have sampled the weights of puppies from each breeder.

Weights of puppies from breeder \(X\) in kilograms: \(5.44,5.32,5.21,5.67.\)

Weights of puppies from breeder \(Y\) in kilograms: \(5.02,4.99,5.42,5.21,5.11.\)

The pet store wishes to know whether there is a statistically significant difference between the weights of the puppies from each breeder.

a. If you wanted to test the difference in the weights of the puppies, what assumptions need to be made?

b. Test whether the mean weights of puppies from the two breeders are different at the \(10\%\) significance level.

**Solution**

a. In order to test the difference in the weights of the puppies, you need to assume that the samples come from populations that are normally distributed, independent, and have the same variance.

b. The test is two-tailed, so the hypotheses are,

\[ \begin{align} &H_0:\, \mu _x=\mu _y \\ &H_1: \,\mu _x \neq \mu _y.\end{align}\]

Since this is a two-tailed test, the \(10\%\) significance level is split so that the critical region has probability \(0.05\) in each tail of the distribution.

The number of degrees of freedom is

\[\upsilon = (4-1)+(5-1)=7.\]

To find degrees of freedom in this case, you need to add together the degrees of freedom from each sample. Or, you can use the formula \(\upsilon = n_x+n_y-2\).

The critical value can be found using a calculator or probability tables:

\[t_{\upsilon =7}(0.05)=1.895.\]
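If you have SciPy available, the same critical value can be read off programmatically (a sketch; `scipy.stats.t.ppf` gives the percentile of the \(t\)-distribution):

```python
from scipy.stats import t

# one tail of probability 0.05 with 7 degrees of freedom:
# the critical value is the 95th percentile
critical = t.ppf(0.95, df=7)
print(round(critical, 3))  # 1.895
```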

Next, find the pooled estimate of variance. You should have \(\bar{x}=5.41\) and \(\bar{y}=5.15.\)

The sample variances are \(s^2_x=0.038866667 \) and \(s^2_y=0.03015\).

Therefore, the pooled estimate of variance is,

\[\begin{align} s^2_p &= \frac{(n_x-1)s^2_x+(n_y-1)s^2_y}{(n_x-1)+(n_y-1)} \\&= \frac{(4-1)0.038867 +(5-1)0.03015 }{(4-1)+(5-1)} \\&=0.033886 \text{ to 5 s.f.} \end{align}\]

Your value of \(t^*\) is then:

\[\begin{align} t^*&=\frac{(\bar{x}-\bar{y})-(\mu _x - \mu _y)}{\sqrt{s^2_p\left(\dfrac{1}{n_x}+\dfrac{1}{n_y}\right)}}\\&=\dfrac{(5.41-5.15)-(0)}{\sqrt{0.033886\left(\dfrac{1}{4}+\dfrac{1}{5}\right)}}\\&=2.1055\end{align}\]

Since \(t^*=2.1055>1.895=t_\upsilon\), your value of \(t^*\) falls within the critical region. Therefore, at the \(10\%\) significance level, you can reject the null hypothesis.

In conclusion, there is evidence to suggest there is a difference between the means of the weights of Welsh Corgi puppies from the two breeders.

This second example is slightly different from the first, and the method will need to be adapted.

A food delivery service, \(A\), claims that their average food delivery time is more than \(5\) minutes faster than the delivery time of their competitor, \(B\).

A random sample of delivery times from each company is collected:

- Food delivery time for \(A\), in minutes: \(22,16,45,23,39,32.\)
- Food delivery time for \(B\), in minutes: \(34,42,63,18,25,46,47.\)

Food delivery service \(B\) hires you to test whether this claim is statistically significant at the \(10\%\) significance level. Complete a hypothesis test for the difference between means and explain what this means for the two food delivery services.

**Solution**

Since the samples are independent, the null hypothesis would normally be that the two means are the same. However, the claim is that service \(A\) averages \(5\) minutes faster than its competitor, so the null hypothesis is instead \(\mu _A=\mu _B -5 \). Since you are only interested in whether the food delivery time is greater for one service, the hypotheses are:

\[ \begin{align} &H_0:\,\mu _A=\mu _B -5 \\ &H_1: \,\mu_A < \mu _B-5. \end{align}\]

This is a one-tailed test. The significance level is \(10\%\), so the critical region will have probability \(0.10\) in the left tail of the distribution.

The number of degrees of freedom is

\[\upsilon = (6-1)+(7-1)=11.\]

The critical value can be found using a calculator or probability tables,

\[t_{\upsilon =11}(0.10)=1.363.\]

Since you are only interested in whether \(\mu _a\) is **less than **\(\mu _b -5\), the critical value is \(t_\upsilon = -1.363\).

If the alternative hypothesis had been **greater than**, you would have used \(t_\upsilon = 1.363\) instead.

Next, find the pooled estimate of variance. You have \(\bar{a}=29.5\) and \(\bar{b}=39.3\) (to \(3\) s.f.). The sample variances are \(s^2_a=123.50 \) and \(s^2_b=226.57\). Therefore, the pooled estimate of variance is:

\[\begin{align} s^2_p &= \frac{(n_a-1)s^2_a+(n_b-1)s^2_b}{(n_a-1)+(n_b-1)} \\&= \frac{(6-1)123.50 +(7-1)226.57 }{(6-1)+(7-1)} \\&=179.72\text{ to 5 s.f.} \end{align}\]

The value of \(t^*\) is therefore,

\[\begin{align} t^*&=\frac{(\bar{a}-\bar{b})-(\mu _a - \mu _b)}{\sqrt{s^2_p\left(\dfrac{1}{n_a}+\dfrac{1}{n_b}\right)}}\\&=\dfrac{(29.5 -39.3)-(-5)}{\sqrt{179.72 \left(\dfrac{1}{6}+\dfrac{1}{7}\right)}}\\&=-0.64357.\end{align}\]

Since the null hypothesis states that \(\mu _a=\mu _b-5\), you have \(\mu _a-\mu _b=-5\).

Since \(t^*=-0.64357>-1.363=t_\upsilon \), the value of \(t^*\) falls within the acceptance region. Therefore, at the \(10\%\) significance level, you fail to reject the null hypothesis.

This means that there is not sufficient evidence to suggest that delivery service \(A\) is more than \(5\) minutes faster than delivery service \(B\).
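This example can be cross-checked with SciPy's pooled two-sample \(t\)-test: shifting sample \(B\) down by \(5\) minutes turns the null hypothesis into an ordinary test of equal means (a sketch assuming SciPy 1.6+ for the `alternative` argument; the small difference from the \(-0.64357\) in the worked solution comes from rounding the sample means there):

```python
from scipy.stats import ttest_ind

a = [22, 16, 45, 23, 39, 32]        # delivery times for A
b = [34, 42, 63, 18, 25, 46, 47]    # delivery times for B

# H0: mu_a = mu_b - 5 is equivalent to equal means after shifting B down by 5
b_shifted = [x - 5 for x in b]
res = ttest_ind(a, b_shifted, equal_var=True, alternative="less")
print(round(res.statistic, 4), res.pvalue > 0.10)  # -0.6417 True
```

Since the \(p\)-value is well above \(0.10\), this agrees with the conclusion above: fail to reject the null hypothesis.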

For a more detailed explanation of the pooled estimate of variance, check out the article Pooled Estimate of Variance.

- The \(t\)-distribution can be used to test the means of two independent normal distributions when the variances are unknown and the sample sizes are small.
- The assumptions are that the populations are independent, normal, and have the same variance.
- The pooled estimate of variance formula is \[s^2_p=\frac{(n_x-1)s^2_x+(n_y-1)s^2_y}{(n_x-1)+(n_y-1)}.\]
- The \(t^*\) value is \[t^*=\dfrac{(\bar{x}-\bar{y})-(\mu _x - \mu _y)}{\sqrt{s^2_p\left(\dfrac{1}{n_x}+\dfrac{1}{n_y}\right)}}.\]
