Logic


Three logicians walk into a bar. The bartender says "would you all like a drink?" The first logician answers "I'm not sure." The second logician also answers "I'm not sure." The third and final logician answers "yes."

This joke requires some knowledge of mathematical logic to understand. The first and second logicians wanted a drink but did not know whether **all three** of them wanted one, which is what the bartender asked, and hence answered "I'm not sure." If either of them had not wanted a drink, he would instead have answered "no," since in that case it would not be true that all three want a drink. The third logician, knowing that either of the other two would have answered "no" if he didn't want a drink, can deduce that all three of them want a drink and thus answers "yes." And who said math couldn't be fun?

Logic is essentially the study of truth and reasoning. Logic is used in everyday life all the time.

For example, given the statements:

If it is raining, I will stay at home.

If I'm at home, I will wear my slippers.

From these statements alone, if you see that it is raining, you can conclude that I will be wearing my slippers, even though I never directly told you this. This is an example of logical reasoning.

**Mathematical Logic** is the application of logic to Mathematics.

There are many different types of logic, but the two most fundamental branches of mathematical logic are:

- Propositional Logic (also called Propositional Calculus),
- First-Order Logic.

First-order logic is an extension of propositional logic, with a few extra components that allow you to apply the ideas of propositional logic to many other areas of mathematics. For high school mathematics, only propositional logic is needed.

Propositional logic, as the name would suggest, is the study of mathematical propositions.

Let's define what a proposition is.

A **proposition** is a statement that is either true or false.

Some examples of mathematical propositions are:

- 3 is odd.
- 3 is even.
- 7 + 4 = 12.
- 1 is a prime number.

These statements all must be true or false; they cannot be both, and they cannot be neither. In propositional logic, there is no 'maybe' or 'sort of'.
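You can even check propositions like these mechanically. Here is a quick sketch (mine, not from the article) evaluating each example proposition in Python, where every expression comes out as exactly one of `True` or `False`:

```python
# Each example proposition, written as a Python boolean expression.
# Every one evaluates to exactly one of True or False.
print(3 % 2 == 1)         # "3 is odd"            → True
print(3 % 2 == 0)         # "3 is even"           → False
print(7 + 4 == 12)        # "7 + 4 = 12"          → False
print(1 in (2, 3, 5, 7))  # "1 is a prime number" → False
                          # (membership in the small primes is a shortcut
                          # good enough for this illustration)
```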

Propositions can be joined using **connectives**.

**Connectives** join propositional statements together into a larger propositional statement.

Connectives are used all the time in both everyday life and mathematics. For example:

3 is a prime number and 3 is odd. The connective here is "and".

3 is a prime number or 2 is odd. The connective here is "or".

It is not true that 7 + 4 = 12. The connective here is "not".

6 is even implies that 6 is divisible by 2. The connective here is "implies". This can also be thought of as an "If... then..." statement.

Connectives can be combined to make more complex statements. Consider Pythagoras' Theorem:

"If \(a, b, c\) are the sides of a right-angled triangle and \(c\) is the hypotenuse, then \(a^2 + b^2 = c^2.\)"

This uses the connective "and" as well as the connective "if... then..." or "implies".

The main connectives in propositional logic are:

| Connective | Symbol |
| --- | --- |
| And | \( \land \) |
| Or | \( \lor \) |
| Not | \( \lnot \) |
| Implies | \( \implies \) |

Many different, and more complicated, propositions can be made using these connectives.

Rather than writing out sentences, mathematicians often use a letter in place of a proposition. If \(p, q, r, s\) are propositions, then the following are also propositions:

- \( (p \land q): \) \(p\) and \(q.\)
- \( (p \lor q): \) \( p\) or \(q.\)
- \( (\lnot p): \) not \(p.\)
- \( (p \implies q):\) \(p\) implies \(q.\)
- \( ((p \land q) \implies r): \) if \(p\) and \(q,\) then \(r.\)
- \( ((p \land (\lnot q)) \implies (r \land s)): \) if \(p\) and not \(q,\) then \(r\) and \(s.\)
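If you are comfortable with a little programming, these symbolic propositions map directly onto boolean expressions. The sketch below is an illustration, not part of the article; `implies` is a helper I've written myself, since Python has no built-in implication operator, and it builds each compound proposition above for one sample assignment of \(p, q, r, s:\)

```python
def implies(p: bool, q: bool) -> bool:
    # p ⟹ q is false only when p is true and q is false,
    # which is the same as (not p) or q
    return (not p) or q

# One sample assignment of truth values:
p, q, r, s = True, False, True, True

print(p and q)                           # (p ∧ q)
print(p or q)                            # (p ∨ q)
print(not p)                             # (¬p)
print(implies(p, q))                     # (p ⟹ q)
print(implies(p and q, r))               # ((p ∧ q) ⟹ r)
print(implies(p and (not q), r and s))   # ((p ∧ (¬q)) ⟹ (r ∧ s))
```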

To understand the meaning of these connectives, **truth tables** are used.

A **truth table** is a table showing when a more complicated proposition is true, based on when each individual component of it is true.

Let's first look at the truth table for 'and'. The statement "\(p\) and \(q\)" is only true if \(p\) is true **and** \(q\) is true, hence this is the only case in which the column for \(p \land q\) is true.

| \(p\) | \(q\) | \(p \land q\) |
| --- | --- | --- |
| True | True | True |
| True | False | False |
| False | True | False |
| False | False | False |

The truth table for 'or' is below. The statement "\(p\) or \(q\)" is true if \(p\) is true **or** \(q\) is true. In other words, if at least one of the statements is true, then the **or** statement is true.

| \(p\) | \(q\) | \(p \lor q\) |
| --- | --- | --- |
| True | True | True |
| True | False | True |
| False | True | True |
| False | False | False |

'Not' essentially changes the value from true to false, or false to true. The truth table for '**not**' is:

| \(p\) | \(\lnot p\) |
| --- | --- |
| True | False |
| False | True |

The truth table for '**implies**' is:

| \(p\) | \(q\) | \(p \implies q\) |
| --- | --- | --- |
| True | True | True |
| True | False | False |
| False | True | True |
| False | False | True |
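The four rows of this table can be generated by brute force. A minimal sketch in Python (my own, assuming implication written as `(not p) or q`):

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    return (not p) or q  # false only when p is true and q is false

# Enumerate every combination of truth values, matching the table rows:
for p, q in product([True, False], repeat=2):
    print(f"{str(p):5} | {str(q):5} | {implies(p, q)}")
```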

The truth tables for **and**, **or**, and **not** are quite intuitive, but the truth table for **implies** can look quite strange at first. To understand it, think of the following statement: "If you can do a backflip, I'll pay you $20." The possibilities here are:

- You manage to do the backflip, and I give you the $20. In this case, the statement was true.
- You manage to do the backflip, and I don't give you the $20. In this case, the statement was false.
- You fail to do the backflip. Then whether I pay you or not, I have still kept my word, so the statement is still true.

Truth tables can be made for more complicated statements by adding a column for each part of the statement, starting with the individual letters and building up to the full proposition. The truth table for one of the statements you saw earlier, \( ((p \land q) \implies r), \) looks like this:

| \(p\) | \(q\) | \(r\) | \(p \land q\) | \( ((p \land q) \implies r) \) |
| --- | --- | --- | --- | --- |
| True | True | True | True | True |
| True | True | False | True | False |
| True | False | True | False | True |
| True | False | False | False | True |
| False | True | True | False | True |
| False | True | False | False | True |
| False | False | True | False | True |
| False | False | False | False | True |
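As a check, the same eight rows can be generated programmatically, column by column as in the table. A sketch (mine, in Python):

```python
from itertools import product

# Build each row of the truth table for ((p ∧ q) ⟹ r):
for p, q, r in product([True, False], repeat=3):
    conj = p and q             # the (p ∧ q) column
    result = (not conj) or r   # ((p ∧ q) ⟹ r): an implication is false
                               # only when its left side is true and its
                               # right side is false
    print(p, q, r, conj, result)
```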

As you can see, truth tables grow very quickly. With each new letter added, the number of rows will double. This is why a different format is often used, called **Logic Trees**.

A **logic tree** is a tree diagram showing when a proposition is true. Each branching point is where one of the variables is determined to be true or false.

Let's first look at the logic tree for '\(p\) and \(q\)', written \(p \land q\).

You will see that at the top of this tree, there is the letter \(p.\) This is called the **root node**, meaning that it is the first point in the tree, where everything else branches off from, just like the root of a tree. This represents the first decision that has to be made for this proposition: is \(p\) true or false?

Underneath this, there are arrows pointing downwards. These are called **branches**. Each one of these represents an answer to the decision from the previous node above it. The branch on the left represents \(p\) being true, and the branch on the right represents \(p\) being false. This is shown by the labels on the sides of the arrows.

In the situation where \(p\) is false, you know for a fact that \(p \land q\) will also be false, no matter what \(q\) is. This is why underneath the 'false' branch, there is an 'F'. This is called a **leaf node**, and represents the value of the whole proposition.

If you follow the 'true' branch instead, you will reach the letter \(q.\) This is called a **decision node**. It is just like the root node, but instead represents the decision: is \(q\) true or false? Since \(p\) being true is not enough to determine whether \(p \land q\) is true, you must also consider the cases where \(q\) is true and where \(q\) is false, hence the branches underneath this decision node. Since at this point both of the variables (\(p\) and \(q\)) in the proposition have been explored, the nodes underneath \(q\) must be leaf nodes, as there are no more decisions to be made.

So far, you know how to create the logic tree. But how do you read the tree? Starting at the top!

If \(p\) is false, go down the 'F' branch. You reach an 'F' leaf node, so the proposition must be false if \(p\) is false.

If \(p\) is true, you go down the 'T' branch to reach the decision node for \(q.\) This makes sense, as it is still unclear whether \(p \land q\) is true or false.

If from this point you choose \(q\) to be true, then you go down the 'T' branch again and will reach a 'T' leaf node. This means in this case, the proposition must be true.

If from the \(q\) decision node you instead choose false, you will go down the 'F' branch instead and reach an 'F' leaf node. In this case, the proposition must also be false.
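Reading a logic tree top-down is exactly how short-circuit evaluation works in most programming languages. A sketch of the \(p \land q\) tree as nested decisions (Python; `and_tree` is a name of my choosing):

```python
def and_tree(p: bool, q: bool) -> bool:
    # Root node: is p true?
    if not p:
        return False   # the false branch ends at an 'F' leaf immediately
    # Decision node: is q true?
    if q:
        return True    # 'T' leaf
    return False       # 'F' leaf
```

Notice that when \(p\) is false, the function returns without ever looking at \(q,\) just as the tree reaches a leaf without a second decision node.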

Before you look at the logic trees for the other connectives, let's look at a quick example where you must build a logic tree and a truth table side by side, using the rain and slippers propositions from the beginning, to see that truth tables and logic trees represent the same thing.

Make a truth table and logic tree for the following statement:

"If it is raining, I will wear my slippers."

**Solution**

There are two individual propositions in this statement:

- It is raining.
- I am wearing my slippers.

These will be the first two columns in the truth table, and the two decision nodes in the logic tree. Let's begin with the truth table. The truth table will look like this with the first two columns filled in:

| Raining | Slippers | If it is raining, I will wear my slippers |
| --- | --- | --- |
| True | True | |
| True | False | |
| False | True | |
| False | False | |

If it is raining and I am wearing my slippers, the main proposition must be true. Hence, the first row in the "If it is raining, I will wear my slippers" column must be true.

| Raining | Slippers | If it is raining, I will wear my slippers |
| --- | --- | --- |
| True | True | True |
| True | False | |
| False | True | |
| False | False | |

If it is raining and I am not wearing my slippers, the proposition must be false. Hence, the second row in the main proposition column must be false.

| Raining | Slippers | If it is raining, I will wear my slippers |
| --- | --- | --- |
| True | True | True |
| True | False | False |
| False | True | |
| False | False | |

As discussed earlier when the truth table for "implies" was shown, if raining is false then the proposition is true, no matter what slippers is. For this reason, the last two rows will be true.

| Raining | Slippers | If it is raining, I will wear my slippers |
| --- | --- | --- |
| True | True | True |
| True | False | False |
| False | True | True |
| False | False | True |

You may notice that this is exactly the same as the truth table for \( p \implies q.\) This makes sense, as this is exactly the same as the proposition \( p \implies q,\) but where \(p\) and \(q\) have been given real-world meanings.

Now let's build the logic tree for this. The root node will be "It is raining." If it is raining but you do not know whether I am wearing slippers, you can see in the truth table that the proposition could be true or false. This means that the true branch coming off the root node must lead to a decision node for "I am wearing slippers." If it is not raining, you can see in the truth table that the proposition is always true. For this reason, the false branch of the root node must lead to a true leaf node.

Now, all that is left is to put in the branches and leaf nodes coming off the "I am wearing slippers." decision node. Since in this case "it is raining." is true, if "I am wearing slippers." is true, the whole proposition must be true. Hence, the true branch must lead to a true leaf node. If "I am wearing slippers" is false, the whole proposition must be false, meaning the false branch must lead to a false leaf node. The final logic tree will look like this:

You should notice that no matter which assignment of "true" and "false" you choose for each proposition, the outcome is the same in the logic tree and the truth table.

Now that you have seen an example of a logic tree side by side with a truth table, let's look at the logic trees for the other connectives.

The logic tree for '\(p\) **or** \(q\)', written '\( p \lor q\)', is:

The logic tree for '**not **\(p\)', written \(\lnot p\), is:

The logic tree for '\(p\) **implies** \(q\)', written \( p \implies q\), is:

These logic trees all come quite naturally from the truth tables.

Using the same example from the previous section with 3 initial propositions, \( ((p \land q) \implies r),\) you will see that logic trees can greatly simplify the amount of writing required. This is the logic tree for \( ((p \land q) \implies r): \)

First, let's look at some examples where you must work out the truth tables and logic trees of various propositions.

Finish the following truth table for \( (p \land (\lnot q)).\)

| \(p\) | \(q\) | \( \lnot q\) | \(p \land \lnot q\) |
| --- | --- | --- | --- |
| True | True | | |
| True | False | | |
| False | True | | |
| False | False | | |

**Solution**

First, fill in the column for \(\lnot q.\) Remember that whenever \(q\) is true, \(\lnot q\) will be false, and whenever \(q\) is false, \(\lnot q\) will be true.

| \(p\) | \(q\) | \( \lnot q\) | \(p \land \lnot q\) |
| --- | --- | --- | --- |
| True | True | False | |
| True | False | True | |
| False | True | False | |
| False | False | True | |

Now you can fill in \(p \land \lnot q.\) Remember that this will only be true if \( p\) is true and \(\lnot q\) is true, and false everywhere else.

| \(p\) | \(q\) | \( \lnot q\) | \(p \land \lnot q\) |
| --- | --- | --- | --- |
| True | True | False | False |
| True | False | True | True |
| False | True | False | False |
| False | False | True | False |

Let's look at a similar example, but using a logic tree instead.

Draw a logic tree for the proposition \((\lnot p) \implies q.\)

**Solution**

The first step, as always for logic trees, is the root node. The root node, in this case, will be \(p,\) so you can write a \(p\) at the top of the diagram to represent it. This root node will have two branches underneath it, one representing the scenario where \(p\) is true, and the other representing the scenario where \(p\) is false. If \(p\) is true, then \(\lnot p\) must be false. \(\lnot p\) is the left-hand side of the implication in our proposition, and as you saw when you first looked at the 'implies' connective, if the left-hand side of an implication is false, the whole implication is true. For this reason, \(p\) being true must lead to a 'true' leaf node.

If \(p\) is false, \(\lnot p\) must be true. Since \(\lnot p,\) the left-hand side of the implication, is true, it is impossible to tell whether the implication as a whole is true or false until \(q\) has been decided. For this reason, \(p\) being false must lead to a decision node for \(q.\) The diagram so far will look like this:

The remainder of this proposition is just like a normal implication. Hence, if \(q\) is true, the implication will be true, meaning the 'true' branch must lead to a 'true' leaf node. If \(q\) is false, the implication will be false, meaning the 'false' branch must lead to a 'false' leaf node. The logic tree for \((\lnot p) \implies q\) will thus look like:
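The finished tree can again be written as nested decisions. A sketch in Python (the function name is my own choice), mirroring the branches of the tree for \((\lnot p) \implies q:\)

```python
def not_p_implies_q(p: bool, q: bool) -> bool:
    # Root node: is p true?
    if p:
        # then ¬p is false, so the implication (¬p) ⟹ q is vacuously true
        return True    # 'T' leaf
    # p is false, so ¬p is true: the result now depends entirely on q
    return q           # decision node for q: 'T' leaf or 'F' leaf
```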

Here, you will see some important properties within logic, and learn how to prove them using truth tables. Before that, though, an important relation must be introduced.

Two propositions are **equivalent** if they are always the same, no matter what the individual variables are set as. If \(p\) and \(q\) are equivalent, then you write \( p \equiv q.\)

Don't think that equivalence is the same as 2 things being equal; they are subtly different. For example, it is fine to write \( 2x = x + 8, \) as you interpret this to be true for a particular value of \(x\) (in this case, \(x = 8\)). However, you can't write \(2x \equiv x + 8,\) because \(2x\) and \(x+8\) are not the same for every \(x\): they are completely different functions.

Now, onto the properties of logical connectives. These results are:

- Double Negation: \[ \lnot (\lnot p) \equiv p. \]

- Commutativity: \[ \begin{align} p \land q & \equiv q \land p, \\ p \lor q & \equiv q \lor p. \end{align}\]

- Associativity: \[ \begin{align} ( p \land (q \land r)) & \equiv ( (p \land q) \land r), \\ ( p \lor (q \lor r)) & \equiv ( ( p \lor q) \lor r). \end{align} \]

- Distributivity: \[ \begin{align} ( p \land (q \lor r)) & \equiv ((p \land q ) \lor ( p \land r) ), \\ (p \lor (q \land r)) & \equiv ((p \lor q) \land (p \lor r)). \end{align} \]
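Each of these properties can also be confirmed by brute force, checking both sides on every assignment of truth values. Below is a minimal Python sketch; the `equivalent` helper is my own, not from the article:

```python
from itertools import product

# Hypothetical helper: two propositions are equivalent iff they agree
# on every assignment of truth values to their variables.
def equivalent(f, g, n_vars):
    return all(f(*v) == g(*v) for v in product((True, False), repeat=n_vars))

# Double negation
assert equivalent(lambda p: not (not p), lambda p: p, 1)
# Commutativity of 'and' and 'or'
assert equivalent(lambda p, q: p and q, lambda p, q: q and p, 2)
assert equivalent(lambda p, q: p or q, lambda p, q: q or p, 2)
# Associativity of 'and' and 'or'
assert equivalent(lambda p, q, r: p and (q and r), lambda p, q, r: (p and q) and r, 3)
assert equivalent(lambda p, q, r: p or (q or r), lambda p, q, r: (p or q) or r, 3)
# Distributivity in both directions
assert equivalent(lambda p, q, r: p and (q or r), lambda p, q, r: (p and q) or (p and r), 3)
assert equivalent(lambda p, q, r: p or (q and r), lambda p, q, r: (p or q) and (p or r), 3)
print("all properties verified")
```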

These formulas can be proven by showing that their truth tables or their logic trees are the same. Let's look at an example of this.

Prove the associativity of \(\land:\) \[ ( p \land (q \land r)) \equiv ( (p \land q) \land r) \] by filling in the following truth table.

\(p\) | \(q\) | \(r\) | \(q \land r\) | \(p \land q\) | \(( p \land (q \land r)) \) | \( ((p \land q) \land r) \) |

True | True | True | ||||

True | True | False | ||||

True | False | True | ||||

True | False | False | ||||

False | True | True | ||||

False | True | False | ||||

False | False | True | ||||

False | False | False |

**Solution**

First, fill in the column for \(q \land r.\) This will only be true when \(q\) is true and \(r\) is true, and will be false everywhere else. This finished column should look like:

\(p\) | \(q\) | \(r\) | \(q \land r\) | \(p \land q\) | \( ( p \land (q \land r)) \) | \(((p \land q) \land r) \) |

True | True | True | True | |||

True | True | False | False | |||

True | False | True | False | |||

True | False | False | False | |||

False | True | True | True | |||

False | True | False | False | |||

False | False | True | False | |||

False | False | False | False |

Now, you can fill in the column for \(p \land q.\) This works exactly the same as the last column you filled in: it will be true whenever \(p\) is true and \(q\) is true, and false everywhere else.

\(p\) | \(q\) | \(r\) | \(q \land r\) | \(p \land q\) | \(( p \land (q \land r)) \) | \( ((p \land q) \land r) \) |

True | True | True | True | True | ||

True | True | False | False | True | ||

True | False | True | False | False | ||

True | False | False | False | False | ||

False | True | True | True | False | ||

False | True | False | False | False | ||

False | False | True | False | False | ||

False | False | False | False | False |

Now, you can fill in the column for \(( p \land (q \land r)). \) This will only be true when the column for \(p\) is true and the column for \(q \land r\) is true, and will be false everywhere else.

\(p\) | \(q\) | \(r\) | \(q \land r\) | \(p \land q\) | \(( p \land (q \land r)) \) | \( ((p \land q) \land r) \) |

True | True | True | True | True | True | |

True | True | False | False | True | False | |

True | False | True | False | False | False | |

True | False | False | False | False | False | |

False | True | True | True | False | False | |

False | True | False | False | False | False | |

False | False | True | False | False | False | |

False | False | False | False | False | False |

Finally, fill in the column for \( ((p \land q) \land r). \) This will only be true when the column for \(p \land q\) is true and the column for \(r\) is true, and will be false everywhere else.

\(p\) | \(q\) | \(r\) | \(q \land r\) | \(p \land q\) | \(( p \land (q \land r)) \) | \( ((p \land q) \land r) \) |

True | True | True | True | True | True | True |

True | True | False | False | True | False | False |

True | False | True | False | False | False | False |

True | False | False | False | False | False | False |

False | True | True | True | False | False | False |

False | True | False | False | False | False | False |

False | False | True | False | False | False | False |

False | False | False | False | False | False | False |

Since the final two columns are the same, it must be the case that these two propositions are equivalent.
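The table-filling steps above can be reproduced programmatically. This Python sketch (the variable names are my own) builds all eight rows and confirms that the final two columns agree:

```python
from itertools import product

# One tuple per row of the worked table:
# (p, q, r, q AND r, p AND q, p AND (q AND r), (p AND q) AND r)
rows = []
for p, q, r in product((True, False), repeat=3):
    rows.append((p, q, r, q and r, p and q, p and (q and r), (p and q) and r))

for row in rows:
    print(" | ".join(str(v) for v in row))

# The last two columns agree in every row, so the propositions are equivalent.
assert all(row[5] == row[6] for row in rows)
```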

Next, let's look at a similar example but using a logic tree instead.

Prove the distributivity of \(\land\) over \(\lor:\) \(( p \land (q \lor r)) \equiv ((p \land q ) \lor ( p \land r)),\) by showing that their logic trees are identical. The logic tree for \( ((p \land q ) \lor ( p \land r))\) is:

**Solution**

You must find the logic tree for \(( p \land (q \lor r)), \) and show that it is identical to the logic tree given in the question. The first variable to consider is \(p.\) If \(p\) is false, then the whole statement must be false, since it is joined by an 'and'. If \(p\) is true, it is currently impossible to tell whether the whole statement will be true or false, so it must branch onto a decision node for \(q.\) Hence, the first branches of the logic tree will look like this:

Next, consider what happens when \(q\) is true or false. If \(q\) is true, \(q \lor r\) must be true. Since \(p\) is already true in this scenario, the whole statement \(( p \land (q \lor r))\) must be true. If \(q\) is false, it is unclear whether \(q \lor r\) is true, and hence this must branch onto the decision for \(r.\) The next branches of the logic tree will look like this:

Finally, only the value of \(r\) remains. If \(r\) is true, \((q \lor r)\) must be true, meaning \(( p \land (q \lor r))\) must be true. If \(r\) is false, the 'or' statement \((q \lor r)\) is false, meaning the whole statement must also be false, even though \(p\) is true in this scenario. Hence, the complete logic tree for \(( p \land (q \lor r)) \) will be:

Given that \(p\) is true and \(q\) is false, if \(r\) is true, the whole statement must be true. If \(r\) is false, the whole statement must be false.

Since this logic tree is identical to the one given in the question, the two statements must be equivalent.
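The tree just built can be sketched as nested conditionals, one `if` per node, and compared against the right-hand side on every assignment (function names are assumptions, not from the article):

```python
from itertools import product

# The logic tree for p AND (q OR r), one conditional per node of the tree.
def tree_lhs(p, q, r):
    if not p:
        return False      # p false -> the 'and' fails -> 'false' leaf
    if q:
        return True       # p true, q true -> q or r is true -> 'true' leaf
    return r              # p true, q false -> decided by r

def rhs(p, q, r):
    return (p and q) or (p and r)

# The two trees agree on every assignment, so the propositions are equivalent.
assert all(tree_lhs(p, q, r) == rhs(p, q, r)
           for p, q, r in product((True, False), repeat=3))
```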

The first important result in propositional logic is the pair of **De Morgan's Laws**, which state that:

\[ \begin{align} \lnot ( p \land q ) & \equiv (\lnot p \lor \lnot q) \\ \lnot (p \lor q ) & \equiv (\lnot p \land \lnot q). \end{align} \]

De Morgan's laws can be proven by showing that the logic trees or logic tables are the same for the right hand side and left hand side, just like the properties in the previous section.
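As a quick sanity check (a sketch, not the article's proof), both laws can be verified on all four truth assignments:

```python
from itertools import product

# Check both De Morgan's laws on every assignment of p and q.
for p, q in product((True, False), repeat=2):
    assert (not (p and q)) == ((not p) or (not q))
    assert (not (p or q)) == ((not p) and (not q))
print("De Morgan's laws verified")
```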

Another important result within logic is that 'implies' can be written using just the 'not' and 'or' connectives:

\[ p \implies q \equiv \lnot p \lor q.\]

This means that all the examples that you have seen so far can be written just using and, or and not connectives. Again, this can be proven by comparing logic trees or logic tables.
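As a sketch, you can define 'implies' directly from its truth table (false only when the left-hand side is true and the right-hand side is false) and confirm it matches \(\lnot p \lor q\) on every assignment; the function name is my own:

```python
# 'implies' defined directly from its truth table: false only when
# the left-hand side is true and the right-hand side is false.
def implies(p, q):
    if p and not q:
        return False
    return True

# Check the equivalence  p => q  is the same as  (not p) or q.
for p in (True, False):
    for q in (True, False):
        assert implies(p, q) == ((not p) or q)
print("equivalence verified")
```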

Mathematical logic is essential in many subfields of mathematics. Without the use of logic, many results within mathematics could never have been proven. To name a few, mathematical logic is essential in the fields of:

- Proof Theory,

- Set Theory,

- Recursion Theory,

- Model Theory.

Beyond pure mathematics, mathematical logic is essential in computer science, natural sciences, economics, and basically every other field that uses mathematics.

- A **proposition** is a statement that is either true or false. A **connective** is something that joins two propositions together, creating a new proposition. The main 4 connectives are: **and**: \( \land,\) **or**: \( \lor,\) **not**: \( \lnot,\) **implies**: \( \implies.\)

- **Truth Tables** and **Logic Trees** are used to show when more complicated propositions are true or false, based on when their individual components are true or false. Equivalencies within logic can be proven by showing that the truth tables or logic trees are identical.
- Some important properties of the connectives in propositional logic are:
  - Double Negation: \[ \lnot (\lnot p) \equiv p. \]
  - Commutativity: \[ \begin{align} p \land q & \equiv q \land p, \\ p \lor q & \equiv q \lor p. \end{align}\]
  - Associativity: \[ \begin{align} ( p \land (q \land r)) & \equiv ( (p \land q) \land r), \\ ( p \lor (q \lor r)) & \equiv ( ( p \lor q) \lor r). \end{align} \]
  - Distributivity: \[ \begin{align} ( p \land (q \lor r)) & \equiv ((p \land q ) \lor ( p \land r) ), \\ (p \lor (q \land r)) & \equiv ((p \lor q) \land (p \lor r)). \end{align} \]
- Two important results within logic are:
  - De Morgan's Laws: \[ \begin{align} \lnot ( p \land q ) & \equiv (\lnot p \lor \lnot q), \\ \lnot (p \lor q ) & \equiv (\lnot p \land \lnot q). \end{align} \]
  - The equivalency of implies: \[ p \implies q \equiv \lnot p \lor q.\]

There are four parts of mathematical logic:

- Model Theory

- Proof Theory

- Recursion Theory (also known as computability theory)

- Set Theory

An example of logic is deducing that two truths imply a third truth.
