Transforming Random Variables


Random variables are everywhere! Think of a grocery store. Whenever a client comes in, you cannot know in advance how much money they will spend on groceries. The manager of the store, however, needs to have a picture in mind of how much an average client spends in the store.

To address this issue, the money spent by a single client is treated as a **random variable**. However, big stores typically have hundreds or thousands of clients a day! It might also be possible that the whole store goes on sale. What happens to random variables under these circumstances? They are **transformed**. Here you will learn about **transforming random variables** and what scenarios can be modeled by these transformations.

Let's use the grocery store example to talk about the transformation of random variables. Rather than being the manager, you will now be a customer.

Suppose you go to the grocery store once a week. You usually buy the same items, with only slight variations, so you know more or less how much you will spend on your groceries. How can you estimate how much you will spend on groceries in the **month**?

Now suppose the entire store goes on a \( 10 \% \) sale, but you limit yourself to buying the usual items. How can you estimate how much you will spend on groceries while the store is on sale?

The above scenarios describe two types of transformations that can be done to random variables. In your monthly groceries shopping example, you are **adding random variables.** In the store-on-sale example, you are **multiplying a random variable by a constant.**

Transforming random variables refers to doing operations on random variables and analyzing how these operations affect the possible outcomes.

Multiplication and addition are not the only transformations available. Typically, you can:

- Add a constant to a random variable.
- Subtract a constant from a random variable.
- Multiply a random variable by a constant.
- Add or subtract two random variables.

Since there are numbers and operations involved, it is assumed that you are working with quantitative data.

The addition/subtraction of two or more random variables is usually referred to as **combining random variables**.

Keep reading this article to see how to address each scenario!

You have seen that there are four types of transformations that can be done to random variables. Here you will see how data is transformed accordingly.

You can use the grocery store example to get an idea of how to transform random variables. Suppose you spent \( \$ 14.25\) on groceries in one go, so the value of the random variable has now been determined. This value will be used to illustrate each case of the transformation of random variables.

Adding a constant to a random variable is pretty straightforward. You just need to add that constant to the value of the random variable! If you are working with a data set, then the constant is added to each value of the data set. If \( X\) is a random variable and \(k\) is a constant, this is just \[ X+k.\]

What if just before paying for your groceries, you get a phone call from your friend who is craving a chocolate bar?

You decide to pick one up for \( \$ 1.25\), so you anticipate that the sales total will be \( \$ 1.25\) more than expected. Suppose \(X\) is the random variable that represents how much you will spend on groceries under normal circumstances; then you need to find

\[X+1.25.\]

In this hypothetical scenario, you previously determined that \(X=14.25\), so if you now include the chocolate bar this would be

\[ \begin{align} X+1.25 &= 14.25+1.25 \\ &= 15.50 \end{align}\]

The idea of adding a constant to a random variable in this scenario is that, no matter how much you spend on groceries, in the end, you will have to add \( \$1.25\) to the total to satisfy your friend's cravings!
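If you like to think in code, here is a minimal Python sketch of this rule. Only the \( \$ 14.25\) trip comes from the example above; the other totals in the list are made-up values for illustration.

```python
# Hypothetical weekly grocery totals in dollars; $14.25 is the trip from the text,
# the rest are made-up values for illustration.
grocery_totals = [14.25, 13.80, 15.10, 14.60]

chocolate_bar = 1.25  # the constant k added to the random variable

# Adding a constant to a random variable adds it to every observed value.
with_chocolate = [x + chocolate_bar for x in grocery_totals]
print(with_chocolate)
```

Every value shifts up by the same \( \$ 1.25\), just as the single trip did.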

Like with the addition of a constant to a random variable, to subtract a constant from a random variable you just need to subtract the constant from the value of the random variable, that is

\[X - k.\]

Data sets work the same way: you subtract the constant from every value of the data set.

What if, just before paying for your groceries, you remember that you have a coupon for \(2\) dollars off your next purchase?

Suppose \(X\) is the random variable that represents how much you will spend on groceries under normal circumstances; then you need to find

\[X-2.\]

In this hypothetical scenario, you previously determined that \(X=14.25\), so if you now include the discount coupon this would be

\[ \begin{align} X-2 &= 14.25-2 \\ &= 12.25 \end{align}\]

This time, you will need to subtract \( \$ 2\) from the total. Thank you, coupon!

This time, rather than adding to or subtracting from the random variable, the constant multiplies its value, so

\[kX.\]

For a data set, you multiply each value of the data set by the same constant \(k\).

What if, just before paying for your groceries, you remember that you have a coupon for \(10 \%\) off your next purchase?

In this example, you have a \(10 \%\) off coupon, so you end up paying only \(90 \%\) of the price. Let \(X\) be the random variable that represents how much you will spend on groceries under normal circumstances; then you need to find

\[0.9X.\]

In this hypothetical scenario, you previously determined that \(X=14.25\), so if you now include the discount coupon this would be

\[ \begin{align} 0.9X &= 0.9(14.25) \\ &\approx 12.83 \end{align}\]

This coupon discounts by a percentage, so you will have to multiply the total by the percentage written in decimal form.
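As a quick Python sketch (again, all totals besides the \( \$ 14.25\) trip are made-up values), multiplying the random variable by a constant rescales each observation:

```python
# Hypothetical grocery totals in dollars; $14.25 is the trip from the text.
grocery_totals = [14.25, 13.80, 15.10, 14.60]

discount_factor = 0.9  # a 10% off coupon means you pay 90% of each total

# Multiplying a random variable by a constant rescales every observed value.
discounted = [discount_factor * x for x in grocery_totals]
print(discounted)
```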

Sometimes you will need to operate with more than one random variable; essentially, you will have a combination of random variables! As usual, it is assumed that both random variables represent quantitative data.

For this one, you have to be careful. The notation can be deceiving because, even though you are using the plus sign, you are dealing with **random variables** and not numbers. This means that the usual algebraic conventions, like saying that \(X+X=2X\), do not always apply! The plus sign is used as a means of **notation**.

What if you have to consider your next trip to the grocery store? You cannot know how much you are going to spend next week, but you know that you will have to add both values. Let \(Y\) be the random variable that represents how much you will pay next week; then

\[ X+Y\]

will represent the money spent in two weeks.

The notation \(X+Y\) is used to represent combined random variables, but to work with actual numbers you have to use values like the mean and the standard deviation, as you will see below.

You can get a clearer grasp of the transformation of random variables by taking a look at how the mean and the standard deviation are affected.

Whenever you do a transformation of a random variable, you are essentially transforming data. Because of this transformation, you can expect that the mean is transformed as well!

Adding a constant to a random variable essentially translates to adding the same constant to the mean. Likewise, if you subtract a constant from a random variable, then the mean will be modified by subtracting the same constant from the original mean.

For the multiplication of a random variable by a constant, you can see the operation as a rescaling of the data, so the mean will be rescaled by the same factor.

Let \(X\) be a random variable and \(k\) a constant. Then the mean \(\mu\) of the random variable has the following properties:

\[ \mu (X+k) = \mu (X) + k,\]

\[ \mu (X-k) = \mu (X) - k,\]

and

\[ \mu (kX) = k \cdot \mu(X).\]

And what if you need to combine random variables?

If \(X\) and \(Y\) are two independent random variables, then

\[ \mu (X+Y) = \mu(X)+\mu(Y).\]

The above expression gives you a way of working with combined random variables: just add their means.
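You can check the properties of the mean with a short simulation. The distributions below are hypothetical, chosen only for illustration; the identities themselves hold regardless of the distribution:

```python
import random

random.seed(1)

# Hypothetical distributions, chosen only to illustrate the rules for the mean.
X = [random.uniform(10, 20) for _ in range(100_000)]
Y = [random.uniform(5, 15) for _ in range(100_000)]

def mean(data):
    return sum(data) / len(data)

k = 1.25

# mu(X + k) = mu(X) + k: adding a constant shifts the mean by the same constant.
shift_gap = mean([x + k for x in X]) - (mean(X) + k)

# mu(kX) = k * mu(X): scaling rescales the mean.
scale_gap = mean([k * x for x in X]) - k * mean(X)

# mu(X + Y) = mu(X) + mu(Y): means of independent variables add.
sum_gap = mean([x + y for x, y in zip(X, Y)]) - (mean(X) + mean(Y))

print(shift_gap, scale_gap, sum_gap)  # each is zero up to floating-point error
```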

Variance and standard deviation are ways of measuring how spread out the possible values of a random variable are. If the random variable is modified, you can expect these measures of spread to be modified as well.

Since adding (or subtracting) a constant to a random variable essentially translates its possible values by the same amount, the spread stays the same! It is as if you moved a bunch of things together: their relative positions stay the same, so the measures of spread are not modified.

If you were to multiply the random variable by a constant \(k\), then its values will be rescaled. You can expect the spread of the data to be modified as well: the data will spread more if \(|k|>1\), or less if \(|k|<1\).

Let \(X\) be a random variable and \(k\) a constant. Then the standard deviation \(\sigma\) of the random variable has the following properties:

\[ \sigma(X+k) = \sigma(X),\]

\[ \sigma(X-k) = \sigma(X),\]

and

\[ \sigma(kX) = |k| \, \sigma(X).\]

Since the variance is the square of the standard deviation, you can also conclude that

\[ \sigma^2(X+k) = \sigma^2(X),\]

\[ \sigma^2(X-k) = \sigma^2(X),\]

and

\[ \sigma^2(kX) = k^2 \, \sigma^2 (X).\]
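A short simulation can illustrate the variance scaling rule as well; the distribution below is hypothetical, used only for illustration:

```python
import random

random.seed(3)

# A hypothetical random variable, used only to illustrate the scaling rule.
X = [random.gauss(0, 2.0) for _ in range(100_000)]
k = 3.0

def variance(data):
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / len(data)

# sigma^2(kX) = k^2 * sigma^2(X): scaling by k multiplies the variance by k^2.
ratio = variance([k * x for x in X]) / variance(X)
print(ratio)  # k**2 = 9.0, up to floating-point error
```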

Here is something to spice things up. When combining random variables, the standard deviation is not as straightforward as the mean. What is straightforward is the variance, that is:

\[ \sigma^2(X+Y) = \sigma^2(X)+\sigma^2(Y).\]

This, of course, is assuming that the variables are independent.

To find the standard deviation of the sum of two random variables, first add the variances, and then take the square root.

If \(X\) and \(Y\) are two independent random variables, then:

\[ \sigma(X+Y) = \sqrt{ \sigma^2(X)+\sigma^2(Y)}\]
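Here is a simulation sketch of this rule for two independent variables. The distributions are made up for illustration, and a finite sample only matches the formula approximately:

```python
import math
import random

random.seed(2)

N = 200_000
# Two independent, hypothetical random variables (distributions made up for illustration).
X = [random.gauss(20, 7.25) for _ in range(N)]
Y = [random.gauss(15, 4.0) for _ in range(N)]

def std(data):
    m = sum(data) / len(data)
    return math.sqrt(sum((v - m) ** 2 for v in data) / len(data))

# For independent X and Y: sigma(X + Y) = sqrt(sigma^2(X) + sigma^2(Y)).
lhs = std([x + y for x, y in zip(X, Y)])
rhs = math.sqrt(std(X) ** 2 + std(Y) ** 2)
print(lhs, rhs)  # close, but not identical: a finite simulation is only approximate
```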

When working with higher-level statistics, you might come across more elaborate transformations involving several random variables. These transformations will involve not only sums and multiplications, but functions and derivatives as well!

These types of transformations are usually analyzed using methods from calculus and more advanced statistics, so they remain beyond the scope of this article.

Typically, you will be asked to find means and standard deviations when transforming random variables. Here are some examples.

The average height of the members of the theater club is \( 67 \) inches.

- Suppose that everyone needs to wear stilts for a school play. The stilts increase the height of the wearer by \(10\) inches. What is the average height of the theater club when everyone is wearing stilts?
- Will the standard deviation of the height of the theater club change if everyone is wearing their stilts?

**Solution:**

- In this case, the random variable \(X\) represents the height of a member of the theater club. You are told that the average height is \(67\) inches, which corresponds to the mean height of the theater club. By wearing stilts, everyone becomes \(10\) inches taller, so \[X+10\] is the height of a person in the theater club who is wearing stilts. You are adding \(10\) to the random variable \(X\), so the new mean can be found by adding \(10\) to the original mean as well. This means that \[ \begin{align} \mu(X+10) &= \mu(X)+10 \\ &= 67+10 \\ &=77 \end{align} \] is the average height of the theater club when everyone is wearing their stilts.
- You just found that this situation is being described by the addition of a constant to a random variable. Since the standard deviation does not change when a constant is added to a random variable, you can conclude that the standard deviation of the height of the theater club will not change if everyone wears their stilts.

Statistics are often used in stores for administrative purposes.

The average customer of a store spends \( \$ 20\) on groceries in each visit, with a standard deviation of \( \$ 7.25\).

- What is the expected income obtained from \( 30\) customers?
- What is the standard deviation of the income obtained from \( 30\) customers?
- Suppose the store goes on sale and everything is \(50 \%\) off. What can you say about the mean income from a client?

**Solution:**

- Since each client purchases different items, the money spent by a customer can be seen as a random variable. You are told that the average customer spends \( \$ 20\), so \[ \mu(X)=20.\] You are trying to find the expected income from \(30\) customers, so you can label these as \[X_1, X_2, \dots , X_{30}.\] This means that you need to find the expected value of \(X_1+X_2+\dots+X_{30}\). Each customer is independent, so you can use the formula for the mean of combined random variables, that is, \[ \mu(X_1+X_2+\dots+X_{30}) = \mu(X_1)+\mu(X_2)+\dots+\mu(X_{30}).\] Each client is expected to spend \( \$20\), so each \(\mu\) equals \(20\): \[ \mu(X_1+X_2+\dots+X_{30}) = 20+20+\dots+20.\] Since you are adding \(20\) a total of \(30\) times, you can just multiply \(20\) by \(30\), so \[ \mu (X_1+X_2+\dots+X_{30}) = 600.\] This means that the store can expect about \( \$ 600 \) from \(30\) customers.
Each client is different! Do not make the mistake of using the same label for everyone.

- This time you need to find \[ \sigma(X_1+X_2+\dots+X_{30}).\] Whenever you need to find the standard deviation of combined random variables, you first have to find the variance. Just like before, since each random variable is independent, you can add the variance \(30\) times, that is, \[ \begin{align} \sigma^2(X_1+X_2+\dots+X_{30}) &= \sigma^2(X_1)+\sigma^2(X_2)+\dots+\sigma^2(X_{30}) \\ &= 7.25^2+7.25^2+\dots+7.25^2 \\ &= 30(7.25^2) \\ &= 1576.875.\end{align}\] Finally, find the standard deviation by taking the square root of the variance, so \[ \begin{align} \sigma(X_1+X_2+\dots+X_{30}) &= \sqrt{1576.875} \\ &\approx 39.7\end{align}\]
- While it is true that having a \( 50\%\) discount is the same as multiplying every price by \(0.5\), which in turn means that you can use the formula \[ \mu(kX) = k \, \mu(X),\] you should first think about the scenario for a while. What would you do if you went to your usual store just to find out that everything is way cheaper than usual? You would most likely buy more things! Because of this, further information on consumption habits is needed before assuming that the store will just make half its earnings from the discount. If this were true, no store would ever go on sale!
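The arithmetic of the first two parts of this example can be checked in a few lines of Python:

```python
import math

# Numbers from the worked example: mean spend $20, standard deviation $7.25, 30 customers.
mu_per_customer = 20.0
sigma_per_customer = 7.25
n_customers = 30

# Mean of a sum of independent random variables: add the means.
expected_income = n_customers * mu_per_customer

# Variance of a sum of independent random variables: add the variances...
total_variance = n_customers * sigma_per_customer ** 2
# ...then take the square root to get the standard deviation.
income_std = math.sqrt(total_variance)

print(expected_income, total_variance, round(income_std, 1))  # 600.0 1576.875 39.7
```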

- Transforming random variables refers to doing operations on random variables and analyzing how these operations affect the possible outcomes.
- When transforming random variables, you typically can:
  - Add a constant to a random variable.
  - Subtract a constant from a random variable.
  - Multiply a random variable by a constant.
  - Add or subtract two random variables.

- Let \(X\) and \(Y\) be independent random variables and \(k\) a constant. The mean \(\mu\) has the following properties:
\[ \mu (X+k) = \mu (X) + k,\]

\[ \mu (X-k) = \mu (X) - k,\]

\[ \mu (kX) = k \cdot \mu(X),\]

and

\[ \mu (X+Y) = \mu(X) + \mu(Y).\]

Let \(X\) be a random variable and \(k\) a constant. Then the standard deviation \(\sigma\) of the random variable has the following properties:

\[ \sigma(X+k) = \sigma(X),\]

\[ \sigma(X-k) = \sigma(X),\]

and

\[ \sigma(kX) = |k| \, \sigma(X).\]

The variance of the sum of two independent random variables is given by \[ \sigma^2(X+Y) = \sigma^2(X)+\sigma^2(Y),\] so the standard deviation is \[ \sigma(X+Y) = \sqrt{\sigma^2(X) + \sigma^2(Y)}.\]

