Q14E
Let \(\overline{x} = \frac{1}{n}\left( x_1 + \cdots + x_n \right)\) and \(\overline{y} = \frac{1}{n}\left( y_1 + \cdots + y_n \right)\). Show that the least-squares line for the data \(\left( x_1, y_1 \right), \ldots, \left( x_n, y_n \right)\) must pass through \(\left( \overline{x}, \overline{y} \right)\). That is, show that \(\overline{x}\) and \(\overline{y}\) satisfy the linear equation \(\overline{y} = \hat{\beta}_0 + \hat{\beta}_1 \overline{x}\). (Hint: Derive this equation from the vector equation \(\mathbf{y} = X\hat{\boldsymbol{\beta}} + \boldsymbol{\epsilon}\). Denote the first column of \(X\) by \(\mathbf{1}\). Use the fact that the residual vector \(\boldsymbol{\epsilon}\) is orthogonal to the column space of \(X\) and hence is orthogonal to \(\mathbf{1}\).)
It is verified that \(\bar{x}\) and \(\bar{y}\) satisfy the linear equation \(\bar{y} = \hat{\beta}_0 + \hat{\beta}_1\bar{x}\).
The general linear model is given by the equation:
\(\mathbf{y} = X\boldsymbol{\beta} + \boldsymbol{\epsilon}\)
Here \(\mathbf{y} = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix}\) is the observation vector, \(X = \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{pmatrix}\) is the design matrix for the least-squares line, \(\boldsymbol{\beta} = \begin{pmatrix} \beta_0 \\ \beta_1 \end{pmatrix}\) is the parameter vector, and \(\boldsymbol{\epsilon} = \begin{pmatrix} \epsilon_1 \\ \epsilon_2 \\ \vdots \\ \epsilon_n \end{pmatrix}\) is the residual vector.
Write the design matrix as \(X = \begin{pmatrix} \mathbf{1} & \mathbf{x} \end{pmatrix}\), where \(\mathbf{1}\) is the column of ones and \(\mathbf{x} = \left( x_1, \ldots, x_n \right)^T\).
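As a side illustration (not part of the derivation), this design matrix is easy to assemble numerically; the sketch below assumes NumPy and uses made-up data.

```python
import numpy as np

# Hypothetical data, used only to illustrate the shape of X.
x = np.array([1.0, 2.0, 3.0, 4.0])

# Design matrix X = (1  x): a column of ones next to the data column.
X = np.column_stack([np.ones_like(x), x])
print(X)
# [[1. 1.]
#  [1. 2.]
#  [1. 3.]
#  [1. 4.]]
```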
For the least-squares solution \(\hat{\boldsymbol{\beta}}\), the model reads \(\mathbf{y} = X\hat{\boldsymbol{\beta}} + \boldsymbol{\epsilon}\). Isolating the residual vector gives:
\(\boldsymbol{\epsilon} = \mathbf{y} - X\hat{\boldsymbol{\beta}}\)
Since the residual vector is orthogonal to the column space of \(X\), it is orthogonal to the first column \(\mathbf{1}\); hence \(\mathbf{1} \cdot \boldsymbol{\epsilon} = 0\).
\(\begin{aligned} \mathbf{1} \cdot \left( \mathbf{y} - X\hat{\boldsymbol{\beta}} \right) &= 0 \\ \mathbf{1}^T\mathbf{y} - \left( \mathbf{1}^T X \right)\hat{\boldsymbol{\beta}} &= 0 \\ \left( y_1 + y_2 + \cdots + y_n \right) - \begin{pmatrix} n & x_1 + x_2 + \cdots + x_n \end{pmatrix}\begin{pmatrix} \hat{\beta}_0 \\ \hat{\beta}_1 \end{pmatrix} &= 0 \\ \sum y - \begin{pmatrix} n & \sum x \end{pmatrix}\begin{pmatrix} \hat{\beta}_0 \\ \hat{\beta}_1 \end{pmatrix} &= 0 \\ \sum y - \left( n\hat{\beta}_0 + \hat{\beta}_1\sum x \right) &= 0 \end{aligned}\)
Here \(\mathbf{1}^T X = \begin{pmatrix} n & \sum x \end{pmatrix}\), because the first column of \(X\) is all ones and the second is \(\mathbf{x}\).
Write \(\sum y \) as \(n\bar y\) and \(\sum x \) as \(n\bar x\).
\(\begin{aligned} n\bar{y} - \left( n\hat{\beta}_0 + n\bar{x}\hat{\beta}_1 \right) &= 0 \\ n\bar{y} - n\hat{\beta}_0 - n\bar{x}\hat{\beta}_1 &= 0 \\ \bar{y} - \hat{\beta}_0 - \bar{x}\hat{\beta}_1 &= 0 \\ \bar{y} &= \hat{\beta}_0 + \hat{\beta}_1\bar{x} \end{aligned}\)
This implies that \(\bar{x}\) and \(\bar{y}\) satisfy the linear equation \(\bar{y} = \hat{\beta}_0 + \hat{\beta}_1\bar{x}\), so the least-squares line passes through \(\left( \bar{x}, \bar{y} \right)\).
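As a numerical sanity check (a minimal sketch assuming NumPy and made-up data, not part of the proof), one can fit a least-squares line and confirm both the orthogonality \(\mathbf{1} \cdot \boldsymbol{\epsilon} = 0\) and that the fitted line passes through \(\left( \bar{x}, \bar{y} \right)\):

```python
import numpy as np

# Hypothetical data for illustration.
x = np.array([1.0, 2.0, 3.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 9.8])

# Least-squares fit of y = beta0 + beta1 * x.
X = np.column_stack([np.ones_like(x), x])
beta0, beta1 = np.linalg.lstsq(X, y, rcond=None)[0]

# The residual vector eps = y - X beta-hat is orthogonal to the ones column.
eps = y - X @ np.array([beta0, beta1])
print(np.isclose(eps.sum(), 0.0))                      # True: 1 . eps = 0

# Hence the fitted line passes through (x-bar, y-bar).
print(np.isclose(y.mean(), beta0 + beta1 * x.mean()))  # True
```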