# Independent Events – Explanation & Examples

There are many problems related to probability theory that deal with more than one event. In some cases, the occurrence of one event affects the probability of another event. In others, the probabilities of the events remain unaffected by each other. Such events are termed independent events.

**Two events are said to be independent if the occurrence of one event has no effect on the probability of occurrence of the other event.**

After reading this article, you should understand the following:

- Independent events
- Identifying whether two events are independent
- Solving problems related to independent events
- Various formulae related to probabilities of independent events

To understand the concept of independent events, it is advisable to first refresh the basics of probability theory, such as sample spaces and events.

## What Are Independent Events

Suppose we toss a coin twice. The probability of getting Heads (or Tails) in the second toss is $\frac12$ irrespective of whether we got Heads or Tails in the first toss. Similarly, if we are given the additional information that the second toss resulted in Heads, it does not change the probability of the first toss, which remains $\frac12$ for both Heads and Tails.

Hence, when we toss a coin twice, both tosses are independent of each other. Similarly, if we roll a die $n$ times (or roll $n$ dice together), each roll is independent of the other since the outcome of one roll cannot affect the outcome of other rolls.
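Readers who like to experiment can check this behaviour numerically. The short Python sketch below (an illustrative simulation, not part of the standard treatment) estimates the probability of Heads on the second toss, both unconditionally and conditioned on the first toss landing Heads; for independent tosses the two estimates agree up to sampling noise.

```python
import random

# Estimate P(second toss = Heads) with and without conditioning on the
# first toss. For independent tosses the estimates should agree.
random.seed(0)
trials = 100_000

second_heads = 0
first_heads = 0
both_heads = 0

for _ in range(trials):
    first = random.choice("HT")
    second = random.choice("HT")
    second_heads += (second == "H")
    first_heads += (first == "H")
    both_heads += (first == "H" and second == "H")

print("P(second = H)             ~", second_heads / trials)
print("P(second = H | first = H) ~", both_heads / first_heads)
# Both values come out close to 0.5, as independence predicts.
```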

### How to Tell If Two Events Are Independent

To analyze whether two events are independent or not, we first need to understand the concept of conditional probability.

### Conditional Probability

Conditional probability $P(A|B)$ is the probability of event $A$ given the information that $B$ has already taken place. For any two events $A$ and $B,$ $P(A|B)$ is given as

$P(A|B) = \frac{P(A \cap B)}{P(B)}$

Let’s consider an example.

**Example 1:**

We roll a six-sided fair die. Let $E1$ be the event that the outcome is $3$. Let $E2$ be the event that the outcome is odd. Find $P(E1|E2)$.

Solution:

Note that the sample space $S = \{1,2,3,4,5,6\}$ has six elements. Also, $E1=\{3\}$ and $E2 = \{1,3,5\}$ and $E1 \cap E2 = \{3\}$, hence $P(E2) = \frac36$ and $P(E1 \cap E2) = \frac16$. Therefore,

$P(E1|E2) = \frac{\frac16}{\frac36} = \frac13$.
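As a quick sanity check, here is a minimal Python sketch that reproduces Example 1 by enumerating the six equally likely outcomes; the names `E1` and `E2` mirror the events in the text.

```python
from fractions import Fraction

# Sample space of one fair die roll.
S = {1, 2, 3, 4, 5, 6}
E1 = {3}                           # the outcome is 3
E2 = {s for s in S if s % 2 == 1}  # the outcome is odd

def P(event):
    return Fraction(len(event), len(S))

# P(E1 | E2) = P(E1 ∩ E2) / P(E2)
print(P(E1 & E2) / P(E2))  # prints 1/3
```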

### Conditional Probability and Independence

Now that we have the tool of conditional probability with us, we can easily define the concept of independence mathematically. By definition, $A$ and $B$ are independent if the occurrence of $B$ has no effect on the probability of $A$, and similarly, the occurrence of $A$ does not affect $B$. Using the concept of conditional probability, we can write that $A$ and $B$ are independent if $P(A|B) = P(A)$ and $P(B|A) = P(B)$. Using the formula for conditional probability, we can write

$P(A|B) = \frac{P(A \cap B)}{P(B)} = P(A)$

Hence, if $A$ is independent of $B$, then $P(A \cap B) = P(A)P(B)$.

Similarly, we can show that if $P(A \cap B) = P(A)P(B)$, then $P(B|A) = P(B)$.

### Independent Events Definition

Two events are said to be independent if the occurrence of one event has no effect on the probability of occurrence of the other event.

Mathematically, two events $A$ and $B$ are said to be independent if $P(A \cap B) = P(A)P(B)$.

## How to Solve Independent Events

Let us consider an example to see how to solve independent events using the above definition.

**Example 2:** We roll a die twice. Let us define $E1$ as the event that the first outcome is odd. Let $E2$ be the event that both outcomes are the same. Finally, let $E3$ be the event that the sum of outcomes is even.

1. Are $E1$ and $E2$ independent?

2. Are $E2$ and $E3$ independent?

3. Are $E3$ and $E1$ independent?

Solution:

The sample space for two dice rolls is:

$\{(1,1), (1,2), (1,3), (1,4), (1,5), (1,6),$

$(2,1), (2,2), (2,3), (2,4), (2,5), (2,6),$

$(3,1), (3,2), (3,3), (3,4), (3,5), (3,6),$

$(4,1), (4,2), (4,3), (4,4), (4,5), (4,6),$

$(5,1), (5,2), (5,3), (5,4), (5,5), (5,6),$

$(6,1), (6,2), (6,3), (6,4), (6,5), (6,6)\}$.

The event $E1$ can be written as

$E1 = \{(1,1), (1,2), (1,3), (1,4), (1,5), (1,6),$

$(3,1), (3,2), (3,3), (3,4), (3,5), (3,6),$

$(5,1), (5,2), (5,3), (5,4), (5,5), (5,6)\}$.

$E2$ can be written as

$E2 = \{(1,1), (2,2), (3,3), (4,4), (5,5), (6,6)\}$, and $E3$ is written as

$E3 = \{(1,1), (1,3), (1,5), (2,2), (2,4), (2,6),$

$(3,1), (3,3), (3,5), (4,2), (4,4), (4,6),$

$(5,1), (5,3), (5,5), (6,2), (6,4), (6,6)\}$.

There are $36$ members in the sample space, $18$ members in $E1$, $6$ members in $E2$, and $18$ members in $E3$. Hence,

$P(E1) = \frac{18}{36} = \frac12$

$P(E2) = \frac{6}{36} = \frac16$, and

$P(E3) = \frac{18}{36} = \frac12$.

1)

To check if $E1$ and $E2$ are independent, we first find $E1 \cap E2 = \{(1,1), (3,3), (5,5)\}$. There are three members in $E1 \cap E2$, hence

$P(E1 \cap E2) = \frac{3}{36} = \frac{1}{12}$.

Now $P(E1) \times P(E2) = \frac12 \times \frac16 = \frac{1}{12}$.

This shows that $P(E1 \cap E2) = P(E1)P(E2)$, hence $E1$ and $E2$ are independent events.

2)

We now find $E2 \cap E3 = \{(1,1), (2,2), (3,3), (4,4), (5,5), (6,6) \}$. Hence,

$P(E2 \cap E3) = \frac{6}{36} = \frac16$.

Now $P(E2) \times P(E3) = \frac16 \times \frac12 = \frac{1}{12}$.

We note that $P(E2 \cap E3) \neq P(E2) \times P(E3)$, hence $E2$ and $E3$ are dependent events.

3)

We note that $E1 \cap E3 = \{(1,1), (1,3), (1,5), (3,1), (3,3), (3,5), (5,1), (5,3), (5,5)\}$.

$P(E1 \cap E3) = \frac{9}{36} = \frac14$.

Now, $P(E1) \times P(E3) = \frac12 \times \frac12 =\frac14$.

Since $P(E1 \cap E3) = P(E1)\times P(E3)$, $E1$ and $E3$ are independent events.
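All three parts of Example 2 can also be verified by brute-force enumeration of the $36$ outcomes. The sketch below is one minimal way to do so in Python; the helper `P` simply counts favourable outcomes.

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two die rolls.
S = list(product(range(1, 7), repeat=2))

E1 = {(a, b) for a, b in S if a % 2 == 1}        # first outcome is odd
E2 = {(a, b) for a, b in S if a == b}            # both outcomes are the same
E3 = {(a, b) for a, b in S if (a + b) % 2 == 0}  # sum of outcomes is even

def P(event):
    return Fraction(len(event), len(S))

for name, X, Y in [("E1,E2", E1, E2), ("E2,E3", E2, E3), ("E3,E1", E3, E1)]:
    print(name, "independent:", P(X & Y) == P(X) * P(Y))
# Prints True, False, True -- matching parts 1), 2), and 3).
```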

### Independent Events Formula

If $A$ and $B$ are independent, then the following relations hold:

1. $P(A|B) = P(A)$

2. $P(B|A) = P(B)$

3. $P(A \cap B) = P(A)P(B)$

4. $P(\textrm{NOT} \, A \cap \textrm{NOT}\, B) = P(\textrm{NOT}\, A)P(\textrm{NOT} \,B)$

5. $P(A \cup B) = P(A) + P(B) - P(A)P(B)$
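The sketch below checks relations 3, 4, and 5 on an explicit product space. The particular events (first roll odd, second roll equal to six) are hypothetical choices made only for illustration.

```python
from fractions import Fraction
from itertools import product

# Two independent die rolls as an explicit product space.
S = list(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(S))

A = {(a, b) for a, b in S if a % 2 == 1}  # first roll is odd
B = {(a, b) for a, b in S if b == 6}      # second roll is a six
notA = set(S) - A
notB = set(S) - B

assert P(A & B) == P(A) * P(B)                # relation 3
assert P(notA & notB) == P(notA) * P(notB)    # relation 4
assert P(A | B) == P(A) + P(B) - P(A) * P(B)  # relation 5
print("all three relations hold")
```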

**Example 3:**

A probability exam contains two questions, Q1 and Q2. The probability that a student solves the first question correctly is $60\%$, and the probability of solving the second question correctly is $30\%$. The two questions are independent of each other. Find the probability that

1. A student will solve both questions correctly.

2. A student will solve both questions incorrectly.

3. A student will solve at least one question correctly.

Solution:

Let $E1$, $E2$ be the events that a student correctly solves Q1 and Q2, respectively.

1)

We are interested in the probability $P(E1 \,\textrm{AND}\, E2) = P(E1 \cap E2)$. Since $E1$ and $E2$ are independent, $P(E1 \cap E2) = P(E1)\times P(E2) = 0.6 \times 0.3 = 0.18 = 18\%$

2)

The probability that a student will incorrectly solve the first question is $1 - P(E1) = 0.4$.

The probability that a student will incorrectly solve the second question is $1 - P(E2) = 0.7$.

Since both questions are independent,

$P(\textrm{Both questions incorrect}) = (1 - P(E1)) \times (1 - P(E2)) = 0.4 \times 0.7 = 0.28$.

3)

We are interested in the probability $P(E1 \,\textrm{OR}\, E2) = P(E1 \cup E2)$. Note that, in this context, OR means either $E1$ or $E2$ or both. Using the formulae for the independent events discussed above, we can write

$P(E1 \cup E2) = P(E1) + P(E2) - P(E1)P(E2)$

$P(E1 \cup E2) = 0.6 + 0.3 - (0.6 \times 0.3) = 0.72 = 72\%$
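The three answers can be reproduced with a few lines of arithmetic; the sketch below assumes, as stated, that the two questions are independent.

```python
# Numeric check of Example 3, assuming Q1 and Q2 are independent.
p1, p2 = 0.6, 0.3

both_correct = p1 * p2                # part 1: 0.18
both_incorrect = (1 - p1) * (1 - p2)  # part 2: 0.28
at_least_one = p1 + p2 - p1 * p2      # part 3: 0.72

print(round(both_correct, 2), round(both_incorrect, 2), round(at_least_one, 2))
```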

**Example 4:**

You are travelling from location A to location B using a bus and a train. The probability that the bus will get delayed is $10\%$, and the probability that the train will get delayed is $5\%$. Both events are independent. Find the probability that

1. You will experience a delay during your journey.

2. You will arrive at location B on time.

Solution:

Let $E1$ represent the event that the bus gets delayed, and let $E2$ represent the event that the train gets delayed.

1)

We will experience a delay if either the bus gets delayed or the train gets delayed or both get delayed. So, we are interested in finding the probability $P(E1 \,\textrm{OR}\, E2) = P(E1 \cup E2)$. Using the formulae for the independent events, we can write

$P(E1 \cup E2) = P(E1) + P(E2) - P(E1)P(E2)$

$P(E1 \cup E2) = 0.1 + 0.05 - (0.1)(0.05) = 0.145 = 14.5\%$

2)

To get on time, we need that neither the bus gets delayed nor the train gets delayed. Hence, we need to find the probability

$P(\textrm{NOT} \, E1 \,\textrm{AND}\, \textrm{NOT}\, E2) = P(\textrm{NOT} \,E1 \cap \,\textrm{NOT} \,E2)$.

$P(\textrm{NOT} \,E1) = 1 - P(E1) = 1 - 0.1 = 0.9$.

$P(\textrm{NOT} \,E2) = 1 - P(E2) = 1 - 0.05 = 0.95$.

Since both events are independent,

$P(\textrm{NOT} \,E1 \cap \,\textrm{NOT} \,E2) = P(\textrm{NOT} \,E1) \times P(\textrm{NOT} \,E2)$.

$P(\textrm{NOT} \,E1 \cap \, \textrm{NOT}\, E2) = 0.9 \times 0.95 = 0.855 = 85.5\%$
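A quick numeric check of both answers, assuming independent delays as stated:

```python
# Numeric check of Example 4.
p_bus, p_train = 0.10, 0.05

any_delay = p_bus + p_train - p_bus * p_train  # part 1: 0.145
on_time = (1 - p_bus) * (1 - p_train)          # part 2: 0.855

print(round(any_delay, 3), round(on_time, 3))
```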

**Example 5:**

A message is transmitted from Node-A to Node-B through three intermediate nodes. The message will be successfully transmitted only if all the intermediate nodes are working. The probability that an intermediate node will fail is $1\%$. All nodes are independent of each other. What is the probability that you will not successfully transmit the message?

Solution:

We first find the probability that you will successfully transmit the message. For successful transmission, we need all nodes to be working. The probability that a node will not fail is

$P(\textrm{Node does not fail}) = 1 - P(\textrm{Node fails}) = 1 - 0.01 = 0.99$.

Since all nodes are independent, the probability that node 1 AND node 2 AND node 3 all keep working is $0.99 \times 0.99 \times 0.99 \approx 0.97$.

Hence, the message will be successfully transmitted with a probability of about $97\%$.

Accordingly, $P(\textrm{message is not successful}) = 1 - P(\textrm{message is successful}) = 1 - 0.97 = 0.03 = 3\%$.
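The arithmetic can be checked as follows; note that $0.99^3 = 0.970299$, which the text rounds to $0.97$.

```python
# Numeric check of Example 5: three independent intermediate nodes,
# each failing with probability 0.01.
p_fail = 0.01
p_success = (1 - p_fail) ** 3  # 0.970299, about 97%
p_failure = 1 - p_success      # 0.029701, about 3%

print(round(p_success, 6), round(p_failure, 6))
```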

## Independent Events and Tree Diagrams

There are many scenarios of interest where we repeat the same experiments many times. For instance, tossing a coin three times or rolling a die four times, etc. Tree diagrams offer a useful tool to analyze such events. Let us consider how to make tree diagrams when we know that the multiple attempts or trials of the same experiment are independent of each other.

For simplicity, we consider an experiment with two outcomes only. Let's call them $A$ and $A'$. When the experiment is performed the first time, we can have two possible outcomes, as shown in the tree diagram below. We label the branches of the tree diagram with the probabilities of the events $A1$ and $A'1$, where the subscript $1$ denotes the first attempt.

Now we perform the same experiment again. If the outcome of the first attempt was $A$, then we get a tree diagram as shown below. Note that, in this case, we are drawing the branches of the tree diagram for the case when the first outcome is known to be $A$. Hence the branches are labelled with probabilities $P(A2|A1)$ and $P(A'2|A1)$. However, since the trials are known to be independent, $P(A2|A1) = P(A2)$ and $P(A'2|A1) = P(A'2)$.

Similarly, we draw the branches for the case when the first attempt was $A’$ as shown below:

The overall tree diagram is as follows:
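Since a tree diagram is easiest to read alongside concrete numbers, the sketch below enumerates the four leaves of the two-trial tree; the branch probabilities $P(A) = 0.7$ and $P(A') = 0.3$ are hypothetical values chosen only for illustration.

```python
from itertools import product

# Enumerate the four leaves of the two-trial tree. By independence,
# each leaf probability is the product of the branch probabilities
# along its path, and the four leaf probabilities sum to 1.
probs = {"A": 0.7, "A'": 0.3}  # hypothetical branch probabilities

total = 0.0
for first, second in product(probs, repeat=2):
    p = probs[first] * probs[second]
    total += p
    print(f"{first} then {second}: {p:.2f}")

print("sum of leaf probabilities:", round(total, 10))  # 1.0
```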