Errors in Hypothesis Testing


Nobody is above mistakes, but there are situations where a single mistake can cause physical, psychological, and economic problems. In these situations, one has to be very careful. Imagine going to a hospital and getting misdiagnosed, being convicted of a crime you did not commit, or receiving false information that causes you to panic for no reason. These are all serious problems that can result from an error somewhere.

In this article, you will learn about errors in hypothesis testing, their causes and how to balance them.

Before we can talk about hypothesis testing and the errors that can occur, you should first know what a hypothesis is.

A **hypothesis** is a proposed claim or idea about the characteristics of a population.

A hypothesis can be challenged based on some newfound knowledge about a population. It can be tested and compared to other claims to confirm or to draw a new conclusion.

Let's see the definition of hypothesis testing.

**Hypothesis testing** is a procedure that uses sample data to confirm a hypothesis or claim about a population by comparing it with another claim.

See the example below.

A battery company claims that its batteries last 18 hours, but new information suggests that the batteries last less than 18 hours. Testing these claims by comparing them will help decide which is true and which is false.

To find out more about hypothesis testing, see our article on Hypothesis Testing.

Hypothesis testing is used to find out whether a claim is true, but during this process of testing, errors can occur. These errors can affect the conclusion of the test and lead to wrong results and decisions. Before we dive deep into errors in hypothesis testing, there are some terms you should know.

- **Null Hypothesis** - This is the claim that is first accepted to be true. It is denoted by \(H_0\).
- **Alternative Hypothesis** - This is the opposing or contradicting claim. It is denoted by \(H_a\).

From the example above, the original claim by the battery company that the batteries last for 18 hours is the null hypothesis while the claim that it lasts for less is the alternative hypothesis.

There are two types of errors in hypothesis testing: Type I and Type II errors. Read on to find out more about them.

What is a Type I error?

**Type I error** is the error that occurs when the null hypothesis (\(H_0\)) is concluded to be false or is rejected when it is actually true.

Let's take a look at an example.

A man is being accused of murder and the judge is trying to decide if he is guilty or not. The possibility that he is not guilty is the **null hypothesis** and the possibility that he is guilty is the **alternative hypothesis**.

The hypotheses will be written as:

\[ \begin {align} &H_0: y = \text{man is not guilty}\\ &H_a: y = \text{man is guilty} \end {align} \]

where \(y\) represents the man's verdict.

If at the end of the trial, the judge concludes that he is guilty when he actually isn't, that will be a **Type I error** and an innocent man will go to jail.

The probability of a Type I error is called the **level of significance** of the test and it is denoted by \(\alpha\). So, if you have \(\alpha = 0.05\), it means that the level of significance of the test is \(0.05\).

You can also define \(\alpha\) as the probability of rejecting a null hypothesis when it is true.

\[ P(\text{rejecting } H_0 \mid H_0 \text{ is true}) = \alpha. \]
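This definition can be checked empirically. The sketch below is a hypothetical setup (the mean of 18 echoes the battery example; the standard deviation, sample size, and seed are assumptions): it simulates many samples for which \(H_0\) is actually true and counts how often a one-sample t-test at \(\alpha = 0.05\) rejects it. The long-run rejection rate lands close to \(\alpha\).

```python
import numpy as np
from scipy import stats

# Hypothetical setup: H0 says the population mean is 18.  We draw samples
# for which H0 is actually TRUE and count how often a two-sided one-sample
# t-test at significance level alpha rejects it anyway.
rng = np.random.default_rng(0)
alpha = 0.05
n_trials = 5000
rejections = 0
for _ in range(n_trials):
    sample = rng.normal(loc=18, scale=2, size=30)  # H0 is true here
    _, p_value = stats.ttest_1samp(sample, popmean=18)
    if p_value < alpha:  # rejecting a true H0 is a Type I error
        rejections += 1

type_i_rate = rejections / n_trials
print(round(type_i_rate, 3))  # close to alpha = 0.05
```

The simulation makes the point that \(\alpha\) is not a property of any single test run; it is the long-run rate at which a true null hypothesis gets rejected.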

What is a Type II error?

**Type II error** is the error that occurs when the null hypothesis (\(H_0\)) is accepted when it is false.

Let's use the previous example about a murder case to illustrate a Type II error.

A man is being accused of murder and the judge is trying to decide if he is guilty or not. The possibility that the man is not guilty is the **null hypothesis** and the possibility that he is guilty is the **alternative hypothesis**.

The hypotheses will be written as:

\[ \begin {align} &H_0: y = \text{man is not guilty}\\ &H_a: y = \text{man is guilty} \end {align} \]

where \(y\) represents the man's verdict.

If at the end of the trial, the judge concludes that he is not guilty when he actually is, that will be a **Type II error** and a murderer will be left unpunished.

The probability of a Type II error is denoted by \(\beta\):

\[ P(\text{accepting } H_0 \mid H_0 \text{ is false}) = \beta. \]
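The same kind of simulation can estimate \(\beta\). In the hypothetical sketch below (all numbers assumed), \(H_0: \mu = 18\) is false because the true population mean is \(17\); \(\beta\) is the fraction of tests that still fail to reject \(H_0\).

```python
import numpy as np
from scipy import stats

# Hypothetical setup: H0 claims the mean is 18, but the true mean is 17,
# so H0 is FALSE.  beta is the fraction of tests that fail to reject it.
rng = np.random.default_rng(1)
alpha = 0.05
n_trials = 5000
failures_to_reject = 0
for _ in range(n_trials):
    sample = rng.normal(loc=17, scale=2, size=30)  # true mean is not 18
    _, p_value = stats.ttest_1samp(sample, popmean=18)
    if p_value >= alpha:  # accepting a false H0 is a Type II error
        failures_to_reject += 1

beta = failures_to_reject / n_trials
print(round(beta, 3))
```

Unlike \(\alpha\), which you set in advance, \(\beta\) depends on how far the truth is from \(H_0\) and on the sample size, which is why it has to be estimated rather than chosen.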

Either Type I or Type II error can occur in any test but the error that is more serious or significant depends on the situation.

Let's take a look at an example.

A doctor's diagnosis of a patient is to be confirmed with a test. The null hypothesis is that the patient does not have the disease and the alternative hypothesis is that he does.

If the test concludes that the patient does not have the disease when he actually does, that is a Type II error. In this case, a Type II error is much more serious than a Type I because assuring someone that they are healthy when they are not can cause serious problems.

Ideally, the probabilities of a Type I and a Type II error should both be zero, i.e. \(\alpha = \beta = 0\), but the only way to achieve this is to take a census of the entire population instead of a sample. Since that is usually impractical, you have to prepare for errors. If you decrease the value of \(\alpha\), or choose a small value for \(\alpha\), you reduce the chance of a Type I error happening. This might sound good, but doing so increases the chance of a Type II error. So, as you decrease \(\alpha\), \(\beta\) increases.

Both errors have their consequences, but you have to try to balance them. To do this, you have to use appropriate values for \(\alpha\) and \(\beta\). The right values will depend on the situation. To create a balance, try not to make the value of \(\alpha\) too small. You should assess the consequences of both errors and use the largest values of \(\alpha\) and \(\beta\) that can be accepted for the situation.

When you are dealing with a sample from a population, you are bound to risk a Type I or Type II error. In trying to balance and minimize the errors, you will notice their relationship, which appears when varying the values of \(\alpha\) and \(\beta\): the smaller the value of \(\alpha\), the bigger the value of \(\beta\) gets, and as \(\alpha\) increases, \(\beta\) decreases. The errors are not independent of each other; they are inversely related.

As stated in the section above, making the value of \(\alpha\) as small as possible in the hope of reducing Type I errors results in a higher probability of a Type II error occurring. The solution is to find a balance by choosing tolerable values for both \(\alpha\) and \(\beta\).
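The trade-off can be made concrete with a small calculation. For an illustrative one-sided z-test (\(H_0: \mu = 0\) versus \(H_a: \mu > 0\), true mean \(0.5\), known \(\sigma = 1\), \(n = 16\); all values are assumptions), \(\beta = \Phi\!\left(z_{1-\alpha} - \delta\sqrt{n}/\sigma\right)\), so shrinking \(\alpha\) visibly inflates \(\beta\):

```python
import math
from scipy.stats import norm

# Illustrative one-sided z-test (all numbers are assumptions for the sketch):
# H0: mu = 0 versus Ha: mu > 0, true mean delta = 0.5, known sigma = 1, n = 16.
delta, sigma, n = 0.5, 1.0, 16
betas = []
for alpha in (0.10, 0.05, 0.01):
    z_crit = norm.ppf(1 - alpha)  # rejection threshold for this alpha
    beta = norm.cdf(z_crit - delta * math.sqrt(n) / sigma)
    betas.append(beta)
    print(f"alpha = {alpha:.2f}  ->  beta = {beta:.3f}")
# beta grows as alpha shrinks: the two error probabilities trade off
```

Running this shows \(\beta\) climbing steadily as \(\alpha\) drops from \(0.10\) to \(0.01\), which is exactly the inverse relationship described above.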

Some of the causes of Type I error are listed below.

- Type I errors can occur as a result of a small sample size. A study or research is usually done on a sample of a population and sometimes, the sample may not contain enough information to come to an accurate conclusion. Increasing the sample size will help reduce the chances of a Type I error happening.
- Using poor research methods can lead to a Type I error. If you don't use good methods and don't gather sufficient and correct data, a Type I error can occur.
- An external factor can affect the variables involved and can cause the result of the test to be against the null hypothesis.
- Setting the level of significance (\(\alpha\)) before the test without taking into account the sample size and how long the test will take can cause a Type I error.

Some of the causes of Type II errors are listed below.

- A Type II error is more likely to occur when the power of the test is low.
- Using a small sample size can cause a Type II error.

The power of a test is the probability that the test rejects the null hypothesis when it is false. For more information, see the article on Hypothesis Testing.
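Sample size is the main lever on power. For an illustrative one-sided z-test (\(H_0: \mu = 0\) versus \(H_a: \mu > 0\), true mean \(0.5\), known \(\sigma = 1\), \(\alpha = 0.05\); all values are assumptions), power \(= 1 - \beta\) climbs quickly as \(n\) grows:

```python
import math
from scipy.stats import norm

# Illustrative one-sided z-test (all numbers are assumptions for the sketch):
# H0: mu = 0 versus Ha: mu > 0, true mean delta = 0.5, known sigma = 1.
delta, sigma, alpha = 0.5, 1.0, 0.05
z_crit = norm.ppf(1 - alpha)  # fixed rejection threshold
powers = []
for n in (10, 30, 100):
    power = 1 - norm.cdf(z_crit - delta * math.sqrt(n) / sigma)
    powers.append(power)
    print(f"n = {n:3d}  ->  power = {power:.3f}")
# a larger sample makes a Type II error (missing a false H0) less likely
```

This is why increasing the sample size appears in both lists of causes above: it simultaneously stabilizes the test against spurious rejections and raises the power against real effects.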

Let's see some examples of errors in hypothesis testing.

According to a poll, \(15\%\) of the students in a school do not like eating food from the cafeteria. The principal of the school decides to take a sample of her school's population to test whether this claim is true. Let \(y\) be the proportion of students who do not like eating food from the cafeteria. The hypotheses she used are below. \[ \begin {align} &H_0:y = 0.15 \\&H_a:y \neq 0.15. \end {align} \]

In which of the following conditions did the principal commit a Type I error?

- She concludes the percentage of students is not \(15\%\) when it isn't.
- She concludes the percentage of students is not \(15\%\) when it is.
- She concludes the percentage of students is \(15\%\) when it is.
- She concludes the percentage of students is \(15\%\) when it isn't.

**Solution:**

The correct option is option B. Concluding that it is not \(15\%\) when it actually is means rejecting a true null hypothesis, which is a Type I error.

Option A is not an error. Concluding that it is not \(15\%\) when it indeed is not is a correct conclusion.

Option C is also not an error. Concluding that it is \(15\%\) when it actually is \(15\%\) is correct.

Option D is a Type II error. Concluding that it is \(15\%\) when it is not means accepting a false null hypothesis.
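In practice, the principal would run a binomial test on her sample. The sketch below uses hypothetical numbers (200 students surveyed, 40 saying they dislike the food; both are assumptions, not from the example) to show how the decision that risks these errors is actually made:

```python
from scipy.stats import binomtest

# Hypothetical sample (assumed for the sketch): 40 of 200 students surveyed
# say they dislike the cafeteria food.  H0: the true proportion is 0.15.
result = binomtest(40, n=200, p=0.15, alternative='two-sided')
alpha = 0.05
print(f"p-value = {result.pvalue:.4f}")
if result.pvalue < alpha:
    # If the true proportion really were 0.15, this would be a Type I error.
    print("Reject H0")
else:
    # If the true proportion really differed from 0.15, this would be a Type II error.
    print("Fail to reject H0")
```

Whichever branch the data lands in, the comments mark which error is being risked: the test can never tell you that an error has occurred, only bound how likely each kind is.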

Let's take a look at another example.

After research, it was concluded that the men in Town A are \(5\) times more likely to have lung cancer than the men in Town B. In an attempt to verify this result, it was found that the men in Town A are not \(5\) times more likely to have lung cancer than the men in Town B. What type of error occurred here?

**Solution:**

The null hypothesis \(H_0\) is that men in Town A are \(5\) times more likely to have lung cancer than the men in Town B. The alternative hypothesis \(H_a\) is that the men in Town A are **not** \(5\) times more likely to have lung cancer than the men in Town B.

After verification, it was found that \(H_0\) is false. This means that \(H_0\) was accepted as true when it was in fact false. This is a Type II error.

Classifying which kind of error is happening and the cause of it can help improve testing and quality.

All the bottles manufactured by a certain company have an average diameter of \(4\, \mathrm{cm}\). The company suspects that this measurement has changed, meaning they would need to re-calibrate their machine. Before they do that, they decide to take a sample of their product for testing. The hypotheses are:

\[ \begin {align} &H_0: x = 4\, \mathrm{cm} \\ &H_a: x \neq 4\, \mathrm{cm} \end {align} \]

where \(x\) is the average diameter of the bottle. After testing, \(H_0\) was rejected in favor of \(H_a\) when \(H_0\) was true.

What type of error is this and what may have caused it?

**Solution:**

The error here is a Type I error. A Type I error occurs when the null hypothesis is rejected in favor of the alternative hypothesis even though the null hypothesis is true.

This may have been caused by using a very small sample size for testing, an external factor may have contributed to altering the result, or the testing method may have been faulty.
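A decision like the company's can be sketched as a one-sample t-test on hypothetical diameter measurements (the sample size, spread, and seed below are all assumptions for illustration):

```python
import numpy as np
from scipy import stats

# Hypothetical data (assumed for the sketch): 20 bottle diameters in cm,
# drawn from a machine that is actually still centered on 4 cm (H0 true).
rng = np.random.default_rng(42)
diameters = rng.normal(loc=4.0, scale=0.05, size=20)

t_stat, p_value = stats.ttest_1samp(diameters, popmean=4.0)
alpha = 0.05
print(f"p-value = {p_value:.4f}")
if p_value < alpha:
    # H0 is true by construction here, so this branch is a Type I error.
    print("Reject H0: recalibrate the machine")
else:
    print("Fail to reject H0: leave the machine alone")
```

Because the simulated machine is genuinely centered on \(4\, \mathrm{cm}\), any rejection here is by construction a Type I error, mirroring the scenario in the example.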

Let's see one more example.

The following are the null and alternative hypotheses for a new medical treatment:

\[ \begin {align} &H_0: p = 0.75 \\ &H_a: p < 0.75 \end {align} \] where \(p\) represents the success of the treatment. In which of the following conditions is there a Type II error? Explain the effect or consequences of each of the options on a patient.

- Accepting \(H_0\) when \(H_0\) is true.
- Rejecting \(H_0\) in favor of \(H_a\) when \(H_0\) is true.
- Accepting \(H_0\) when \(H_0\) is false.
- Accepting \(H_a\) when \(H_a\) is true.

**Solution:**

The correct option is **option C**. Accepting the null hypothesis when it is false is a Type II error. The consequence of this error is that you will give a patient a treatment that is not as effective as it is believed to be. There will be little or no improvement in the patient's condition, which can lead to worsened health or even death.

**Option A** is not an error at all because \(H_0\) is accepted when it is indeed true. If you give the treatment to a patient, you will see the expected change in the patient's condition.

**Option B** is a Type I error because \(H_0\) is rejected when it is true. This error will mean that you will not give the patient the treatment because it is now thought of to be less effective than expected. The patient will not have the opportunity to improve their health with this treatment.

**Option D** is not an error at all because \(H_a\) is accepted when it is indeed true. Time and effort will not be wasted in giving a patient a treatment that is not effective.

- Type I error is the error that occurs when the null hypothesis (\(H_0\)) is concluded to be false or is rejected when it is actually true.
- Type II error is the error that occurs when the null hypothesis (\(H_0\)) is accepted when it is false.
- Type I and Type II errors are inversely related: reducing the probability of one increases the probability of the other.

