In this section, we will first introduce random variables, which will enable us to model diverse random experiments in a unified mathematical framework. We will also learn about probability distributions, which are useful for describing and predicting how likely random events are to occur, and expected values, which summarize distributions as a single value.
Random variables
Sample spaces of random experiments are fairly arbitrary objects (e.g., {heads, tails}, values showing on dice, playing cards). This generality makes probability theory powerful. But unfortunately, such generality is also inconvenient – mathematics works best with concrete definitions.
So how can we develop a uniform framework for coin flips, die rolls, card draws, etc.? We define random variables to represent random outcomes.
A random variable (RV) assigns a real number to each possible outcome of a trial or experiment. Sometimes the RV can be defined in a natural way; for example, for a die roll, the random variable can be defined as the number showing on the die. In other cases, we have to define an arbitrary mapping, such as $\text{heads} \mapsto 0$, $\text{tails} \mapsto 1$ for a coin flip.
With some foresight, we actually represented outcomes with numbers in the last section; we just didn’t mention the term random variable (it’s a good idea to review this table and the examples after it).
Random variables are denoted by capital letters, such as $X$, $Y$, and $Z$. When a particular outcome occurs, the random variable takes the corresponding value. For example, let $X$ be the random variable defined for a coin toss, where we assign $X(\text{heads}) = 0$ and $X(\text{tails}) = 1$. Then if heads shows, we say $X = 0$, while if tails shows, we say $X = 1$.
Random variables allow us to translate general ideas about probability to numeric concepts:
Finite or discrete sample space: finite or discrete range of the random variable (the range of a random variable is the set of values it can take)
Continuous sample space: continuous range of the random variable
Events as sets of outcomes: events as sets of numbers, intervals, or collections of intervals on the real number line
Probability function on events: probability function on intervals, or ranges, on the real number line
A random variable whose range is discrete is called a discrete random variable, and one whose range is continuous is called a continuous random variable.
Define a random variable for
a birthdate (discrete)
a temperature (continuous)
Consider a random variable $X$, defined as the number showing on a die. We can now represent events as membership in a set of numbers. For example, the event that an odd number shows on the die can be written as $X \in \{1, 3, 5\}$ and its probability can be written as $P(X \in \{1, 3, 5\})$.
A die is rolled and the number showing on it is denoted by $X$. Find the probability of the events below.
,
When an experiment has more than one component, for example, when an action is repeated, we can describe the outcomes as a tuple of random variables. For example, suppose a die is rolled twice. Let the result of the first roll be denoted by $X$ and the result of the second roll be denoted by $Y$. We can represent each outcome with a pair of numbers, i.e., $(X, Y)$. For example, $(2, 3)$ represents the first die showing 2 and the second die showing 3.
Random variables can also be functions of other random variables. Continuing the two dice example, we can define a third random variable $Z$ as the sum of $X$ and $Y$, i.e., $Z = X + Y$.
What is the range of the random variable $Z$ defined above (i.e., the set of possible values)? What is the probability of $Z = 3$? (I strongly recommend reviewing this example and also this example.)
The smallest value for $Z$ is 2, which occurs when $(X, Y) = (1, 1)$. Its largest value is 12, which occurs when $(X, Y) = (6, 6)$. So the range of $Z$ is $\{2, 3, \ldots, 12\}$. There are two cases where $Z = 3$, i.e., $(1, 2)$ and $(2, 1)$. So the probability is $P(Z = 3) = \frac{2}{36} = \frac{1}{18}$.
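If you'd like to verify this by brute force, here is a minimal Python sketch (my illustration, standard library only) that enumerates all 36 equally likely outcomes:

```python
from itertools import product
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two die rolls
outcomes = list(product(range(1, 7), repeat=2))
sums = [x + y for x, y in outcomes]

print(sorted(set(sums)))                       # range of Z: [2, 3, ..., 12]
print(Fraction(sums.count(3), len(outcomes)))  # P(Z = 3) = 1/18
```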
A fair coin is flipped twice (all outcomes are equally likely). Let $X$ be the number of heads. What is $P(X = 1)$?
The sample space is $\{HH, HT, TH, TT\}$. Then $P(X = 1) = P(\{HT, TH\}) = \frac{2}{4} = \frac{1}{2}$.
A binary sequence of length 3 is chosen at random with all choices equally likely. Let $X$ be the number of 0s and $Y$ be the number of 1s in this sequence. What is the probability that $X > Y$?
The possibilities are $000, 001, 010, 011, 100, 101, 110, 111$, each with probability $\frac{1}{8}$.
So the event $X > Y$ consists of the sequences $000, 001, 010, 100$, so its probability is $\frac{4}{8} = \frac{1}{2}$. We could have also arrived at this by symmetry. Either $X > Y$ or $X < Y$, as $X = Y$ is not possible (the length is odd). By symmetry, the probability must be $\frac{1}{2}$.
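The same kind of enumeration works here; a small Python sketch, again assuming nothing beyond the standard library:

```python
from itertools import product
from fractions import Fraction

# All 8 equally likely binary sequences of length 3
seqs = [''.join(bits) for bits in product('01', repeat=3)]
event = [s for s in seqs if s.count('0') > s.count('1')]  # the event X > Y

print(event)                             # ['000', '001', '010', '100']
print(Fraction(len(event), len(seqs)))   # P(X > Y) = 1/2
```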
When dealing with more than one random variable, say $X$ and $Y$, we can write $P(X = x, Y = y)$ to indicate the probability of the intersection of the two events $X = x$ and $Y = y$, i.e., $P\big((X = x) \cap (Y = y)\big)$.
Describing probabilities: distributions
Probability over discrete sets
Suppose a die is rolled twice, with all outcomes equally likely. Consider the random variable $Z$ defined as the sum of the two die rolls. The sample space is
$$\{(1,1), (1,2), \ldots, (1,6), (2,1), \ldots, (6,6)\},$$
and the range of the random variable $Z$ is the set $\{2,3,4,5,6,7,8,9,10,11,12\}$. We can find the probability of each of these values by counting the outcomes that produce each sum:
$$P(Z = 2) = \tfrac{1}{36}, \quad P(Z = 3) = \tfrac{2}{36}, \quad P(Z = 4) = \tfrac{3}{36}, \quad \ldots, \quad P(Z = 7) = \tfrac{6}{36}, \quad \ldots, \quad P(Z = 12) = \tfrac{1}{36}.$$
Now that we have these probabilities, we can plot $P(Z = z)$, for $z = 2, 3, \ldots, 12$, which you can see below:
[Plot: the pmf of $Z$, the sum of two die rolls]
The function $p_Z(z) = P(Z = z)$ is the distribution of $Z$. It determines the probability of each value that $Z$ can take. This is useful for a few reasons. First, after we determine the distribution, we no longer need to recalculate the probabilities of outcomes of interest. Second, we can calculate the probabilities of different events by summing the probabilities of the relevant outcomes. Finally, as we will see, it will allow us to define quantities such as the expected value, which help us better understand the behavior of $Z$.
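To make this concrete, here is a short Python sketch (my illustration, standard library only) that computes the entire pmf of $Z$ once; after that, any event probability is just a sum of pmf values:

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# Count how many of the 36 outcomes give each value of Z = X + Y
counts = Counter(x + y for x, y in product(range(1, 7), repeat=2))
pmf = {z: Fraction(c, 36) for z, c in sorted(counts.items())}

for z, p in pmf.items():
    print(z, p)  # 2 1/36, 3 1/18, ..., 7 1/6, ..., 12 1/36
```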
Let $Z$ be the sum of two die rolls as above.
Find .
What is the most likely value?
As we can see in the distribution in the graph, the most likely value is $Z = 7$, with probability $\frac{6}{36} = \frac{1}{6}$.
In this example, our random variable was discrete. The distribution of a discrete random variable is called a probability mass function (pmf). Note that pmfs do not apply to continuous random variables. We will discuss continuous probability distributions later, albeit much more briefly.
For brevity, instead of $P(X = x)$ we may write $p_X(x)$ or $p(x)$.
Let $X$ denote the number showing when a die is rolled. Plot the distribution of $X$.
Since all outcomes are equally likely, $P(X = x) = \frac{1}{6}$ for $x = 1, 2, \ldots, 6$.
[Plot: the pmf of a single die roll, uniform on $\{1, \ldots, 6\}$]
A binary sequence of length 3 is chosen at random with all choices equally likely. Let $X$ be the number of 0s in this sequence.
Find the pmf of $X$.
What are the most likely value(s)?
The sample space is $\{000, 001, 010, 011, 100, 101, 110, 111\}$, with all outcomes equally likely. We thus have
$$P(X = 0) = P(\{111\}) = \tfrac{1}{8}, \quad P(X = 1) = P(\{011, 101, 110\}) = \tfrac{3}{8}, \quad P(X = 2) = P(\{001, 010, 100\}) = \tfrac{3}{8}, \quad P(X = 3) = P(\{000\}) = \tfrac{1}{8}.$$
[Plot: the pmf of $X$, the number of 0s in the sequence]
$X = 1$ and $X = 2$ are the most likely values.
Independence
Suppose we have a fair die, which when rolled has equal probability of showing each of the numbers 1 to 6, and a fair coin with equal probability of heads and tails. Let $X$ denote the number showing on the die and let $Y$ be equal to 0 if heads shows and equal to 1 if tails shows. If you know what $X$ is, does that provide any information about $Y$? It doesn’t seem so. For example, if $X = 4$, that doesn’t affect the probability of $Y = 0$. The latter probability is still $\frac{1}{2}$.
Two random variables $X$ and $Y$ are called independent if the value of one does not affect the other.
We described above what independence intuitively means. What about a mathematical definition?
Two random variables $X$ and $Y$ are called independent if for any possible values $x$ and $y$,
$$P(X = x, Y = y) = P(X = x)\,P(Y = y).$$
This definition may seem strange at first sight, but it agrees with our intuitive understanding of independence. For example, what is the probability $P(X = 4, Y = 0)$, with $X$ and $Y$ defined as above? The mathematical definition of independence says that if $X$ and $Y$ are independent, as we believe they are, then
$$P(X = 4, Y = 0) = P(X = 4)\,P(Y = 0) = \frac{1}{6} \times \frac{1}{2} = \frac{1}{12}.$$
Suppose the experiment is performed many, many times. Would we intuitively expect to see $X = 4, Y = 0$ in a twelfth of the trials? Yes! In $\frac{1}{6}$ of the trials, we would expect to see $X = 4$. Since the coin doesn’t care about the die, among the trials in which $X = 4$, in about half of them the coin should show heads ($Y = 0$). So overall, in $\frac{1}{6} \times \frac{1}{2} = \frac{1}{12}$ of the trials, we would expect to see both $X = 4$ and $Y = 0$.
Let’s give this a try:
Click the button below to simulate 60 die roll and coin flip trials (heads are shown with 🙂 and tails with 🏛️). In what fraction of the trials do you observe a 4? In what fraction do you observe heads? In what fraction do you observe both 4 and heads?
In my experiment, I observed 11 4s, 36 heads, and 7 cases with both 4 and heads. So the equation we need for independence nearly holds: $\frac{7}{60} \approx \frac{11}{60} \times \frac{36}{60}$, i.e., $0.117 \approx 0.110$.
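If you'd rather script the experiment than click, here is a rough Python equivalent of the simulation above (a sketch, not the page's actual code); the seed is arbitrary, so your counts will differ from mine:

```python
import random

random.seed(0)  # arbitrary seed for reproducibility
trials = 60
fours = heads = both = 0
for _ in range(trials):
    x = random.randint(1, 6)   # die roll
    y = random.choice([0, 1])  # coin: 0 = heads, 1 = tails
    fours += (x == 4)
    heads += (y == 0)
    both += (x == 4 and y == 0)

# For independent X and Y we expect both/trials ≈ (fours/trials) * (heads/trials)
print(fours / trials, heads / trials, both / trials)
```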
Suppose two dice are rolled, with all outcomes equally likely. Let $X$ be the first die, $Y$ be the second die, and $Z = X + Y$ be their sum.
Show that $X$ and $Y$ are independent (using the mathematical definition and the fact that all outcomes are equally likely).
Show that $X$ and $Z$ are not independent by considering the probability $P(X = 1, Z = 12)$.
Note that to prove independence you need to prove it for all possible values, but to prove that two random variables are not independent, it suffices to show that the equality in the definition does not hold for one pair of values.
We'll only solve the first part. For each $x, y \in \{1, 2, \ldots, 6\}$,
$$P(X = x, Y = y) = \frac{1}{36},$$
since the 36 outcomes are equally likely. So
$$P(X = x, Y = y) = \frac{1}{36} = \frac{1}{6} \times \frac{1}{6} = P(X = x)\,P(Y = y),$$
proving independence.
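For a computational check of both parts, here is a Python sketch that verifies the product rule exhaustively for $X$ and $Y$, and exhibits a single counterexample for $X$ and $Z$ (the particular counterexample pair is my choice, not the text's):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a predicate on (x, y)) over the 36 outcomes."""
    return Fraction(sum(event(x, y) for x, y in outcomes), len(outcomes))

# X and Y are independent: the product rule holds for every pair of values
ok = all(
    prob(lambda a, b: a == x and b == y)
    == prob(lambda a, b: a == x) * prob(lambda a, b: b == y)
    for x in range(1, 7) for y in range(1, 7)
)
print(ok)  # True

# X and Z = X + Y are not independent: one counterexample suffices
print(prob(lambda a, b: a == 1 and a + b == 12))                   # 0
print(prob(lambda a, b: a == 1) * prob(lambda a, b: a + b == 12))  # 1/216
```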
The definition of independence also extends to events:
Two events $A$ and $B$ are called independent if
$$P(A \cap B) = P(A)\,P(B).$$
Sometimes the components of an experiment are physically independent from each other, for example, when two dice are rolled one by one. In such cases, independence is very natural. But independence may also hold when the two events are physically related or result from a single action. For example, consider a deck of cards from which you draw a card at random. The color of the card could be red or black; its suit can be hearts, diamonds, clubs, or spades; and its rank may be $A, 2, 3, \ldots, 10, J, Q, K$. Let’s check the independence of a few events:
‘heart’ and ‘ace’ are independent: $P(\text{heart} \cap \text{ace}) = \frac{1}{52} = \frac{13}{52} \times \frac{4}{52} = P(\text{heart})\,P(\text{ace})$
‘red’ and ‘heart’ are not independent: $P(\text{red} \cap \text{heart}) = \frac{13}{52} = \frac{1}{4} \ne \frac{26}{52} \times \frac{13}{52} = \frac{1}{8} = P(\text{red})\,P(\text{heart})$
‘black’ and ‘heart’ are not independent: $P(\text{black} \cap \text{heart}) = 0 \ne \frac{26}{52} \times \frac{13}{52} = P(\text{black})\,P(\text{heart})$
‘ace’ and ‘red’ are independent: $P(\text{ace} \cap \text{red}) = \frac{2}{52} = \frac{4}{52} \times \frac{26}{52} = P(\text{ace})\,P(\text{red})$
The examples above show that color and suit are not independent. Since the last example works for any rank, not just ace, and for any color, not just red, rank and color are independent. Similarly, since the first example works for any rank and any suit, rank and suit are independent.
Let $A$ denote the event that it rains and let $B$ denote the event that UVa wins a given basketball game. We can assume $A$ and $B$ are independent, with given values for $P(A)$ and $P(B)$. Find $P(A \cap B)$.
Show that if $A$ and $B$ are independent, then $A$ and $B^c$ are also independent.
$$P(A \cap B^c) = P(A) - P(A \cap B) = P(A) - P(A)\,P(B) = P(A)\,P(B^c),$$
where we have used this fact for the third equality.
Events defined using random variables: If two random variables $X$ and $Y$ are independent, any events defined based on them will also be independent. (This is an important result but we will not prove it.) For example, if $X$ and $Y$ are two independent die rolls, then
$$P(X \le 2, Y \ge 5) = P(X \le 2)\,P(Y \ge 5).$$
Specifically, $X \le 2$ corresponds to the event $X \in \{1, 2\}$ and $Y \ge 5$ corresponds to the event $Y \in \{5, 6\}$.
What about more than two random variables?
We can talk about the independence of any pair of random variables (pairwise independence).
We can talk about independence among any group of random variables: random variables $X_1, X_2, \ldots, X_n$ are independent if, for all possible values $x_1, x_2, \ldots, x_n$,
$$P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n) = P(X_1 = x_1)\,P(X_2 = x_2) \cdots P(X_n = x_n).$$
If an entire group of random variables are independent, we say they are mutually independent.
Mutual independence implies pairwise independence, but the converse is not true.
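A classic counterexample (my illustration, not from the text) uses two fair bits and their XOR: each pair of the three variables is independent, yet the three are not mutually independent. A quick check in Python:

```python
from itertools import product
from fractions import Fraction

# Outcomes: two fair bits (x, y), plus z = x XOR y
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

def prob(event):
    return Fraction(sum(event(*o) for o in outcomes), len(outcomes))

# Each pair satisfies the product rule, e.g., X and Z:
print(prob(lambda x, y, z: x == 0 and z == 0))                      # 1/4
print(prob(lambda x, y, z: x == 0) * prob(lambda x, y, z: z == 0))  # 1/4

# But mutual independence fails: X and Y together determine Z
print(prob(lambda x, y, z: x == 0 and y == 0 and z == 1))           # 0
# while the product of the three marginals is 1/2 * 1/2 * 1/2 = 1/8
```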
Bernoulli and binomial distributions
Bernoulli distribution
The distribution of a random variable that takes only two values, typically 0 and 1, is called a Bernoulli distribution, where the probability of 1 is usually denoted by $p$, i.e., $P(X = 1) = p$. Such a random variable results from an experiment with two outcomes, such as flipping a coin, playing a game (no draw), performing any task that may lead to success or failure, etc. The distribution is determined in full by $p$:
$$P(X = 1) = p, \qquad P(X = 0) = 1 - p.$$
As an example, for a particular choice of $p$, the plot of the pmf is given below.
[Plot: a Bernoulli pmf]
If the distribution of $X$ is Bernoulli with probability of 1 equal to $p$, we write $X \sim \text{Bernoulli}(p)$. The most common case is $p = \frac{1}{2}$, resulting from a fair coin.
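A Bernoulli($p$) variable is easy to simulate; in this Python sketch, $p = 0.3$ is an arbitrary choice for illustration:

```python
import random

def bernoulli(p):
    """One sample from Bernoulli(p): 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

# With many samples, the fraction of 1s should be close to p
p = 0.3
samples = [bernoulli(p) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ≈ 0.3
```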
Binomial distribution
An archer hits the target with probability $p$. She participates in a competition that involves shooting three times, and we can assume each shot is independent from the others. Let $X$ denote the number of times she hits the target. What is the distribution of $X$?
Let’s show each outcome as a binary sequence of length 3, with 1 denoting hitting the target. We have
$$\{X = 3\} = \{111\}, \quad \{X = 2\} = \{011, 101, 110\}, \quad \{X = 1\} = \{001, 010, 100\}, \quad \{X = 0\} = \{000\}.$$
Let us now find the probability of each event. $X = 3$ corresponds to three hits. Since each hit has probability $p$ and they are independent,
$$P(X = 3) = P(111) = p \times p \times p = p^3.$$
Similarly, the probability of $X = 0$ is
$$P(X = 0) = P(000) = (1 - p)^3.$$
The case of $X = 2$ is trickier. There are three outcomes in this event. But they all have the same probability:
$$P(011) = P(101) = P(110) = p^2 (1 - p).$$
So $P(X = 2) = 3p^2(1 - p)$. Similarly, $P(X = 1) = 3p(1 - p)^2$.
Now let us consider the general case, when the archer tries $n$ times. What is the probability of $k$ hits, i.e., $P(X = k)$? The probability of a particular sequence of $k$ hits and $n - k$ misses is $p^k (1 - p)^{n-k}$. But we also need to consider the number of such sequences. Each sequence of hits and misses is equivalent to a binary sequence with $k$ 1s and $n - k$ 0s. There are $\binom{n}{k}$ such sequences. Putting things together,
$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n-k}.$$
This is called the binomial distribution, written as $X \sim \text{Binomial}(n, p)$.
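The formula is simple to compute directly; here is a Python sketch using `math.comb` for $\binom{n}{k}$, checked against the $n = 3$ archer case (the value $p = 0.6$ is an arbitrary illustration):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p = 0.6
print(binomial_pmf(3, 3, p), p**3)                   # both 0.216
print(binomial_pmf(2, 3, p), 3 * p**2 * (1 - p))     # both 0.432
print(sum(binomial_pmf(k, 3, p) for k in range(4)))  # 1.0 (pmf sums to 1)
```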
Below, you can plot the binomial pmf for different values of $n$ and $p$.
[Interactive plot: the Binomial($n$, $p$) pmf with adjustable $n$ and $p$; defaults $n = 50$, $p = 0.5$]
An archer hits the target with probability 0.1. If she shoots 10 times, find the following probabilities: she hits the target once; she hits the target at least once; and she hits the target 5 or 6 times.
Let $X$ denote the number of times that the archer hits the target, so $X \sim \text{Binomial}(10, 0.1)$. We have
$$P(X = 1) = \binom{10}{1}(0.1)(0.9)^9 \approx 0.387,$$
$$P(X \ge 1) = 1 - P(X = 0) = 1 - (0.9)^{10} \approx 0.651,$$
$$P(X = 5 \text{ or } 6) = \binom{10}{5}(0.1)^5(0.9)^5 + \binom{10}{6}(0.1)^6(0.9)^4 \approx 0.0016.$$
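For reference, a quick Python computation of these values (a sketch; the helper name `pmf` is mine):

```python
from math import comb

def pmf(k, n=10, p=0.1):
    """P(X = k) for X ~ Binomial(10, 0.1)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(pmf(1))           # P(X = 1)       ≈ 0.387
print(1 - pmf(0))       # P(X >= 1)      ≈ 0.651
print(pmf(5) + pmf(6))  # P(X = 5 or 6)  ≈ 0.0016
```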
Probability over continuous sets
What if we are interested in the amount of rainfall, blood pressure, etc.? Here the probability is over a continuous set.
The distribution is then shown with a probability density function, or pdf for short, which is a continuous curve. An example is shown below. Intuitively, wherever the pdf is larger, the surrounding region has a higher probability.
[Plot: an example pdf (Gaussian)]
The pdf shown above is for a random variable with a Gaussian distribution, whose formula is
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x - \mu)^2}{2\sigma^2}},$$
where $\mu$ and $\sigma$ control the shape of the distribution (similar to $n$ and $p$ for the binomial distribution). In the figure, $\mu = 0$ and $\sigma = 1$.
If the pdf is given by a function $f$, then we can compute event probabilities using integrals:
$$P(a \le X \le b) = \int_a^b f(x)\,dx.$$
If we are interested in the probability that $X$ is between $-1$ and $1$, we can find it as
$$P(-1 \le X \le 1) = \int_{-1}^{1} f(x)\,dx \approx 0.68,$$
which is the area of the shaded region in the graph below.
[Plot: the Gaussian pdf with the area between $-1$ and $1$ shaded]
Below, we simulate 20 values from this distribution:
13 of them are in the interval of interest, a fraction of $\frac{13}{20} = 0.65$, which is close to the probability $\approx 0.68$.
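You can reproduce this experiment with Python's built-in normal sampler (a sketch, not the page's actual code); the seed is arbitrary, so your count will vary:

```python
import random

random.seed(0)  # arbitrary seed; counts will vary
n = 20
samples = [random.gauss(0, 1) for _ in range(n)]  # mu = 0, sigma = 1
inside = sum(-1 <= x <= 1 for x in samples)
print(inside, inside / n)  # compare with P(-1 <= X <= 1) ≈ 0.68
```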
Our discussion of continuous distributions will for the most part be limited to the above. We won’t need to compute integrals, and the Gaussian distribution is the only one we will consider, although there are many other common continuous distributions.
Write a question that you still have about this section.