Confidence Interval for Slope of Regression Line


With what confidence would you say that the hours of sleep you get at night and your success in school are related? And with what confidence would you say that this relationship is linear?

In this article, you will learn about a confidence interval for the slope of a regression model, its meaning, the conditions necessary to be able to construct them, the formula, and how to actually determine them. For information on drawing conclusions about a population from the confidence interval, see the article Justifying Claims Based on the Confidence Interval for the Slope of a Regression Model.

By now you know that when there is a **linear relationship** between a variable \(x\) and a variable \(y\) – the linear correlation coefficient \(r\) is non-zero – you can model it with a **linear regression**. This regression consists of:

\[\hat{y}=\beta_0+\beta_1x\]

where:

\(\beta_0\) is the y-intercept;

\(\beta_1\) is the slope of the regression;

\(x\) is the independent variable; and

\(\hat{y}\) the predicted value of the dependent variable.

For a better reminder of this topic, see our article Least-Squares Regression. Remember that the correlation coefficient \(r\) tells you how much of a correlation there is between the two variables. If \(r\) is close to zero, then there is little to no correlation between the variables, while \(r\) values close to \(-1\) or \(1\) indicate that there is a strong correlation between the two variables.

On the other hand, the slope \(\beta_1\) represents how much \(\hat{y}\) changes in response to changes in the \(x\)-values; that is, **for each unit increase in \(x\), \(\hat{y}\) increases by \(\beta_1\) units**.

Suppose you suspect that an increase in book price means that fewer books will be sold. You collect data, and find the line of best fit to be:

\[\hat{y}=3500-10x\]

where \(x\) is the price of the book and \(\hat{y}\) is the predicted number of books sold. What does a \(\$1\) increase in \(x\) mean for the number of books you predict will sell?

**Solution:**

From the equation given you can see that \(\beta_0 = 3500\) and \(\beta_1 = -10\). Notice that the slope of the regression model is negative. That means an increase of \(\$1\) in the book price corresponds to a predicted increase of \(-10\) books sold, or in other words you can predict that 10 fewer books will be sold for every dollar increase in book price.
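Treating the fitted line as an ordinary function makes this interpretation concrete. A minimal sketch (the helper name `predicted_sales` is just for illustration):

```python
# Fitted model from the example: y-hat = 3500 - 10x
def predicted_sales(price):
    """Predicted number of books sold at a given price (illustrative helper)."""
    return 3500 - 10 * price

# A $1 price increase changes the prediction by exactly the slope, -10 books:
change = predicted_sales(21) - predicted_sales(20)
print(change)  # -10
```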

By calculating a confidence interval with a *high confidence level*, say \(c\%\), for the slope \(\beta_1\), you get two values that define the limits of a range of values in which you can find the slope. You can say with **\(c\%\) confidence that the value of the slope will be between those two values**.

Furthermore, you can say that the method used to construct the interval is successful in capturing the actual slope of the linear regression model about \(c\%\) of the time.

The conditions for constructing a confidence interval for the slope of a linear regression are the same as for constructing a linear regression. These conditions are:


- Quantitative variable condition: correlation only applies if both variables are quantitative.

- Straight enough condition: look at the scatter plot and make sure your data has an approximately linear relationship, since correlation only measures the strength of a linear association. You can also check this with the correlation coefficient of the data.

- Independence: the data should be collected randomly, and if sampling is done without replacement, the sample size should be at most \(10\%\) of the total population.

- Normal: for each value of \(x\), the response values vary approximately normally around the regression line.

Like any confidence interval you have studied so far, a **confidence interval for the slope \(\beta_1\) of the least squares regression line** has the following structure:

sample statistic – margin of error \(\le \beta_1\le\) sample statistic + margin of error,

where margin of error = critical value \(\times\) standard error.

Now, you just have to understand what each of those three elements is for the slope \(\beta_1\):

The sample statistic will be \(\hat{\beta}_1\), the point estimator of the slope \(\beta_1\);

For the margin of error:

this time the critical value will be of a \(t\)-distribution with \(n-2\) degrees of freedom, i.e., \(t\) with \(df=n-2\);

the standard error for the slope, written \(SE_{\beta_1}\), will be:\[SE_{\beta_1}=\frac{s}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}}\]where \(s\) is the sample standard deviation calculated as:\[s=\sqrt{\frac{\sum_{i=1}^{n}(y_i-\hat{y}_i)^2}{n-2}}\]

The reason you use a critical \(t\) value instead of a critical \(z\) value is that the standard error of the slope \(\hat{\beta}_1\) is only an estimate: you don't know the true standard deviation of the sampling distribution, so you estimate it from the sample.

Thus, the **formula for a confidence interval for the slope** \(\beta_1\) is:

\[\hat{\beta}_1- t\cdot SE_{\beta_1}\le \beta_1\le \hat{\beta}_1+ t\cdot SE_{\beta_1}\]

or an even shorter version:

\[\hat{\beta}_1\pm t\cdot SE_{\beta_1}\]

This confidence interval is for any confidence level, but confidence levels that you will see most often are \(90\%\), \(95\%\), and \(99\%\). These are the values you should consider when calculating the critical value \(t\).

From what you have read so far, the formula for a confidence interval for the slope suggests a set of steps you should follow when you want to find it.

**Step 1**: **Find the sample statistic** \(\hat{\beta}_1\).

You get the value of the **point estimator** \(\hat{\beta}_1\) by constructing the regression line for the data set you are working with.

**Step 2**: **Select a confidence level** \(c\%\).

The confidence level describes the uncertainty of a sampling method. You will most often be asked for a confidence level of \(90\%\), \(95\%\), or \(99\%\).

The purpose of knowing the confidence level is to be able to find the critical value \(t\), by consulting a \(t\) table, with two bits of information:

the degrees of freedom, given by \[ \text{sample size } -2 = n-2,\] where \(n\) is the sample size; and

the confidence level adjusted for the table you are using.

Depending on the table you consult, the confidence level may have to be adjusted to \(1-\tfrac{\alpha}{2}\) or to \(\tfrac{\alpha}{2} \).

For example, for a confidence level of \(99\%\), you know that \(c=100(1-\alpha)\%\) and so:

\[\begin{align} 99\%&=100\%(1-\alpha) \\ 0.99&=1-\alpha \\ \alpha&=0.01 .\end{align}\]

Now, depending on the table you consult, you'll do:

\[1-\frac{\alpha}{2}=1-\frac{0.01}{2}=0.995\]

or

\[\frac{\alpha}{2} = \frac{0.01}{2}=0.005\]
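This adjustment is simple arithmetic, and it can help to see both table conventions side by side for the common confidence levels. A quick sketch:

```python
# For a confidence level c% = 100(1 - alpha)%, compute the two
# quantities a t table may be indexed by.
for confidence in (0.90, 0.95, 0.99):
    alpha = 1 - confidence
    cumulative = 1 - alpha / 2   # tables indexed by cumulative probability
    upper_tail = alpha / 2       # tables indexed by upper-tail area
    print(f"{confidence:.0%}: 1 - a/2 = {cumulative:.3f}, a/2 = {upper_tail:.3f}")
```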

**Step 3**: **Find the margin of error** \(t\cdot SE_{\beta_1}\).

As you already know, the margin of error is the product of the critical value \(t\) with the value of the standard error. The formula for the standard error is:

\[SE_{\beta_1}=\frac{s}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}}\]

where \(s\) is the sample standard deviation.

**Step 4**: **Find the confidence interval.**

Here you just have to replace the values you got in the previous step in the formula:

\[\hat{\beta}_1\pm t\cdot SE_{\beta_1}\]
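The four steps above can be sketched in code. This is a minimal illustration assuming the critical value \(t\) has already been looked up in a \(t\) table; the function name `slope_ci` is just for this sketch:

```python
import math

def slope_ci(xs, ys, t):
    """Confidence interval for the slope of the least-squares regression line.

    Step 1: fit the slope estimate from the data.
    Steps 2-3: with the given critical value t (for n - 2 degrees of
    freedom), compute the margin of error t * SE.
    Step 4: return the interval (lower, upper).
    """
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    sxx = sum((x - x_bar) ** 2 for x in xs)
    # Step 1: point estimates for slope and intercept
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sxx
    intercept = y_bar - slope * x_bar
    # Step 3: sample standard deviation about the line, then the margin
    residual_ss = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    s = math.sqrt(residual_ss / (n - 2))
    margin = t * s / math.sqrt(sxx)
    # Step 4: the interval
    return slope - margin, slope + margin
```

With the five-point data set used in the next example and \(t = 3.182\), this returns an interval close to the hand calculation; small differences come from intermediate rounding in the hand arithmetic.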

Let's look at an example where you can apply the steps by hand.

Given the data set in the table below,

| \(x\) | \(y\) |
| --- | --- |
| \(1\) | \(3\) |
| \(2\) | \(4\) |
| \(2\) | \(7\) |
| \(3\) | \(8\) |
| \(5\) | \(9\) |

Table 1. Example data.

find a \(95\%\) confidence interval for the slope, knowing that the least squares regression line of this data is

\[\hat{y}=2.41+1.46x,\]

the sample variance is \(s^2=2.39\), and the critical value is \(t=3.182\).

**Solution:**

**Step 1**: Find the sample statistic \(\hat{\beta}_1\)

You were given the equation of the regression line, so you know that \(\hat{\beta}_1=1.46\).

**Step 2**: Select a confidence level \(c\%\)

The confidence level is given: \(c=95\%\). You’re also given the critical value \(t=3.182\).

If you had to consult a \(t\) table, you would first see that \(df=5-2=3\), second that \(95\%=100\%(1-\alpha)\) if and only if \(0.95=1-\alpha\) if and only if \(\alpha=0.05\), and then that \(1-\alpha/2=1-0.05/2=0.975\).

**Step 3**: Find the margin of error \(t\cdot SE_{\beta_1}\).

You know that:

\[SE_{\beta_1}=\frac{s}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}}\]

You know \(s^2=2.39\), so the sample standard deviation is \(s=1.55\).

For the sum in the denominator, you first need the sample mean of the \(x-\)values.

\[\bar{x}=\frac{1+2+2+3+5}{5}=2.6\]

Now the sum:

\[\begin{align} \sum_{i=1}^{n}(x_i-\bar{x})^2&=(1-2.6)^2+(2-2.6)^2+(2-2.6)^2\\ &\quad+(3-2.6)^2+(5-2.6)^2 \\ &=9.2 \end{align}\]

Finally, for the margin of error:

\[\begin{align} t\cdot SE_{\beta_1}&=3.182\left( \frac{1.55}{\sqrt{9.2}}\right)\\ &=3.182(0.51)\\ &=1.62282. \end{align} \]

**Step 4**: Find the confidence interval

Now just substitute the values you determined in the previous steps into the formula:

\[\hat{\beta}_1\pm t\cdot SE_{\beta_1}= 1.46\pm 1.62282\]

which gives you

\[ -0.16282\le \beta_1 \le 3.08282\ \]

If you have satisfied the conditions for doing a confidence interval for the slope of a regression model, you can say with \(95\%\) confidence that the true value of the slope \(\beta_1\) is between \(-0.16282\) and \(3.08282\).
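You can reproduce this arithmetic directly. A quick check, using the quantities given in the problem; note that the hand calculation rounds \(SE\) to \(0.51\) before multiplying, which this sketch imitates:

```python
import math

beta1_hat = 1.46        # slope from the given regression line
s = math.sqrt(2.39)     # sample standard deviation from s^2 = 2.39
sxx = 9.2               # sum of (x_i - x_bar)^2 computed above
t = 3.182               # critical value for df = 3 at 95% confidence

se = s / math.sqrt(sxx)
margin = t * round(se, 2)      # the worked example rounds SE to 0.51
print(round(se, 2))            # 0.51
# interval endpoints: approximately -0.16282 and 3.08282
print(beta1_hat - margin, beta1_hat + margin)
```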

Let's look at an example of doing the calculations necessary for finding the confidence interval for the slope of a regression line.

Between \(2010\) and \(2022\), data was collected on the average cost of college textbooks required for a semester that year. That data is in the table below. Find the confidence interval for the slope of the regression line at a \(99\%\) confidence level.

| Year | Average Book Cost (in \(\$\)) | Year | Average Book Cost (in \(\$\)) |
| --- | --- | --- | --- |
| \(2010\) | \(660\) | \(2017\) | \(1125\) |
| \(2011\) | \(678\) | \(2018\) | \(1100\) |
| \(2012\) | \(596\) | \(2019\) | \(1300\) |
| \(2013\) | \(550\) | \(2020\) | \(1320\) |
| \(2014\) | \(770\) | \(2021\) | \(1369\) |
| \(2015\) | \(790\) | \(2022\) | \(1400\) |
| \(2016\) | \(860\) |  |  |

Table 2. Data sample.

**Solution:**

First, draw a scatter plot of the data.

It certainly looks reasonable to consider a linear regression model, and there are no obvious outliers. Assume year \(2010\) corresponds to \(x=1\). You can find the correlation coefficient \(r = 0.96\) and the line of best fit \(\hat{y} = 79.9x+ 458.1\). With the correlation coefficient being close to \(1\) you can see there is a strong linear relationship between the year and the average book cost.

For a reminder of how to find the correlation coefficient and the line of best fit, see Linear Regression and Least-Squares Regression.

In fact if you graph the line of best fit you can see immediately that there is a strong linear relationship.

Now let's follow the steps to find the confidence interval for the slope of the regression line.

**Step 1**: **Find the sample statistic** \(\hat{\beta}_1\).

The line of best fit is \(\hat{y} = 79.9x + 458.1\), so \(\hat{\beta}_1 = 79.9\). This is the point estimate of the slope.

**Step 2**: **Select a confidence level** \(c\%\).

The confidence level for this problem is \(99\%\). There are \(13\) data points, so the degrees of freedom are \(13-2=11\). Consulting a \(t\)-table then gives the critical value \(t = 3.11\).

**Step 3**: Find the margin of error \(t\cdot SE_{\beta_1}\).

To do this you first need to calculate \(s\). Given the equation of the line, each residual is:

\[ y_i-\hat{y}_i = y_i - (79.9x_i + 458.1 ) \]

To make the calculations for \(s\) a little easier to follow, it can help to make a table.

| \(x_i\) | \(y_i\) | \(\hat{y}_i\) | \((y_i-\hat{y}_i )^2 \) |
| --- | --- | --- | --- |
| \(1\) | \(660\) | \(538\) | \(14884\) |
| \(2\) | \(678\) | \(617.9\) | \(3612.01\) |
| \(3\) | \(596\) | \(697.8\) | \(10363.24\) |
| \(4\) | \(550\) | \(777.7\) | \(51847.29\) |
| \(5\) | \(770\) | \(857.6\) | \(7673.76\) |
| \(6\) | \(790\) | \(937.5\) | \(21756.25\) |
| \(7\) | \(860\) | \(1017.4\) | \(24774.76\) |
| \(8\) | \(1125\) | \(1097.3\) | \(767.29\) |
| \(9\) | \(1100\) | \(1177.2\) | \(5959.84\) |
| \(10\) | \(1300\) | \(1257.1\) | \(1840.41\) |
| \(11\) | \(1320\) | \(1337\) | \(289\) |
| \(12\) | \(1369\) | \(1416.9\) | \(2294.41\) |
| \(13\) | \(1400\) | \(1496.8\) | \(9370.24\) |

Table 3. Data sample.

Using the formula and the information in the table above:

\[\begin{align} s &=\sqrt{\frac{\sum_{i=1}^{n}(y_i-\hat{y}_i)^2}{n-2}} \\ &= \sqrt{\frac{\sum_{i=1}^{13}(y_i-\hat{y}_i)^2}{11}} \\ &= \sqrt{\frac{155432.5 }{11}} \\ &\approx 118.9 \end{align}\]

Then, since \(\bar{x}=7\) and \(\sum_{i=1}^{13}(x_i-\bar{x})^2=182\), you have:

\[\begin{align} SE_{\beta_1}&=\frac{s}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}} \\ &= \frac{118.9}{\sqrt{182}} \\ &\approx 8.81 \end{align} \]

You have already found the critical value \(t = 3.11\), so:

\[ \begin{align} \text{margin of error} &= t\cdot SE_{\beta_1} \\ &= (3.11)(8.81 ) \\ &\approx 27.4 \end{align}\]

**Step 4**: Find the confidence interval

Substituting the values you found in the previous steps into the formula:

\[\hat{\beta}_1\pm t\cdot SE_{\beta_1}= 79.9\pm 27.4\]

which gives you a confidence interval of \( (52.5, 107.3) \).

If you have satisfied the conditions for doing a confidence interval for the slope of a regression model, you can say with \(99\%\) confidence that the true value of the slope \(\beta_1\) is between \(52.5\) and \(107.3\).
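As a sanity check, the residual sum, \(s\), and the interval can be recomputed directly from the data in Table 2 and the given line \(\hat{y}=79.9x+458.1\). A sketch (the variable names are just for illustration):

```python
import math

# Average book cost for 2010..2022, coded as x = 1..13
ys = [660, 678, 596, 550, 770, 790, 860, 1125, 1100, 1300, 1320, 1369, 1400]
xs = list(range(1, 14))
t = 3.11  # critical value for df = 11 at 99% confidence

# Residual sum of squares about the given line y-hat = 79.9x + 458.1
residual_ss = sum((y - (79.9 * x + 458.1)) ** 2 for x, y in zip(xs, ys))
s = math.sqrt(residual_ss / (len(xs) - 2))
x_bar = sum(xs) / len(xs)
sxx = sum((x - x_bar) ** 2 for x in xs)
margin = t * s / math.sqrt(sxx)
print(round(s, 1), round(margin, 1))   # approximately 118.9 and 27.4
print(79.9 - margin, 79.9 + margin)    # approximately (52.5, 107.3)
```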

- By calculating a confidence interval with a *high confidence level*, say \(c\%\), for the slope \(\beta_1\), you get two values that define the limits of a range of values in which you can find the slope. You can say with \(c\%\) confidence that the value of the slope will be between those two values.
- The method used to construct the interval is successful in capturing the actual slope of the linear regression model about \(c\%\) of the time.
- The formula for the confidence interval for the slope of a regression model is \[\hat{\beta}_1\pm t\cdot SE_{\beta_1}\, ,\] where
  - \(\hat{\beta}_1\) is the estimate of the slope \(\beta_1\);
  - \(t\cdot SE_{\beta_1}\) is the margin of error;
  - \(t\) is the critical value from the \(t\)-distribution with \(df=n-2\) (\(n-2\) degrees of freedom); and
  - \(SE_{\beta_1}\) is the standard error for the slope.
