ST2131/MA2216: Probability
AY2012/2013, Semester 2, Lecturer: Adrian Roellin, David Chew
Course Coverage:
1. Combinatorial Analysis
2. Axioms of Probability
3. Conditional Probability & Independence
4. Discrete Random Variables & Distributions
5. Continuous Random Variables & Distributions
6. Jointly Distributed Random Variables
7. Properties of Expectations
8. Limiting Theorems
Probability is a cross-listed module offered by both the mathematics and statistics departments. This module focuses on the mathematical properties of combinatorics and distributions, without the statistical tests.
The first half of the module begins with counting, permutations and combinations, followed by the axioms of probability, which form the basis for all statistical analysis. This part is generally straightforward, reintroducing the concept of probability in a formal setting. From the axioms of probability, conditional probabilities, Bayes' formula and the concept of independent events are introduced. Conditional probability examines the probability of an event A given that some event B has occurred. If the events are independent of each other, then the conditional probability of A given B equals the unconditional probability of A.
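That last fact about independence can be checked empirically. The sketch below is my own illustration (not from the course notes): it simulates two fair dice, takes A = "first die shows 6" and B = "second die is even", and estimates both P(A) and P(A | B).

```python
import random

# Hypothetical illustration of independence: for independent events
# A and B, P(A | B) should equal P(A).
# A = "first die shows 6", B = "second die shows an even number".
random.seed(0)
n = 200_000
count_a = count_b = count_ab = 0
for _ in range(n):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    a, b = d1 == 6, d2 % 2 == 0
    count_a += a
    count_b += b
    count_ab += a and b

p_a = count_a / n                 # estimate of P(A)
p_a_given_b = count_ab / count_b  # estimate of P(A | B) = P(A and B) / P(B)
print(round(p_a, 2), round(p_a_given_b, 2))  # both come out near 1/6
```

Both estimates land near 1/6, which is exactly what independence predicts.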
Discrete random variables have ranges of possible values that are finite or countably infinite. This chapter covers the expectation and variance of discrete random variables, as well as distributions that arise from discrete random variables, such as the binomial and Poisson distributions. The Poisson arrival process is an important stochastic process with many applications in decision science and engineering.
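One standard way the Poisson distribution arises from the binomial (a classic result, though this particular sketch is mine, not the lecturer's): Binomial(n, lam/n) probabilities approach Poisson(lam) probabilities as n grows.

```python
import math

# Sketch (my own illustration, not from the notes): the Poisson(lam)
# pmf approximates the Binomial(n, lam/n) pmf when n is large, i.e.
# the Poisson counts rare events among many independent trials.
def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, n, k = 2.0, 1_000, 3
print(round(binom_pmf(n, lam / n, k), 4))  # ≈ 0.18
print(round(poisson_pmf(lam, k), 4))       # ≈ 0.18
```

With n = 1000 the two probabilities already agree to about three decimal places.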
A continuous random variable is a mapping from the sample space to the real numbers. Expectation and variance of random variables are covered together with four main continuous distributions: the uniform, normal, exponential and gamma distributions. The Poisson, exponential and gamma distributions are related in the following way: if arrivals follow a Poisson process, then the interarrival times are exponentially distributed and the cumulative time for n arrivals is gamma distributed. (Honestly, I did not realize this until I took DSC3215.)
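That three-way relationship can be seen in a few lines of simulation. This is my own sketch, not course material: build a Poisson process from exponential interarrival times, then check that the arrival count in [0, 1] has mean lam (Poisson), and that the time of the n-th arrival has mean n/lam (gamma).

```python
import random

# My own sketch of the Poisson/exponential/gamma relationship:
# a rate-lam Poisson process is built from Exp(lam) interarrival
# times; counts in [0, 1] are Poisson(lam) and the n-th arrival
# time is Gamma(n, lam), with mean n / lam.
random.seed(1)
lam, n_arrivals, trials = 3.0, 5, 50_000

counts, nth_times = [], []
for _ in range(trials):
    t, k = 0.0, 0
    while True:  # count arrivals in [0, 1]
        t += random.expovariate(lam)  # exponential interarrival time
        if t > 1.0:
            break
        k += 1
    counts.append(k)
    # time of the n-th arrival = sum of n exponential interarrivals
    nth_times.append(sum(random.expovariate(lam) for _ in range(n_arrivals)))

print(round(sum(counts) / trials, 2))     # ≈ lam = 3.0 (Poisson mean)
print(round(sum(nth_times) / trials, 2))  # ≈ n / lam ≈ 1.67 (gamma mean)
```

Both simulated means match the theoretical values, which is the connection the review describes.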
Towards the end, the properties of expectations are studied. Aside from mean, variance and covariance, this chapter also covers conditional expectations, conditional variance and moment generating functions. The final topic of the semester is the limit theorems, which introduce Markov's and Chebyshev's inequalities to derive the Weak Law of Large Numbers. The Weak Law of Large Numbers states that as the sample size grows, the probability that the sample mean deviates from the population mean by any fixed amount approaches zero. The chapter then moves on to the Central Limit Theorem, one of the most remarkable results in probability because it provides a rigorous basis for the normal approximation of large samples. The last theorem taught was the Strong Law of Large Numbers, which states that the sample mean converges to the population mean with probability one.
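Both limit theorems are easy to see numerically. The sketch below is my own illustration using fair-die rolls (population mean 3.5, population variance 35/12): the standardized sample means average out near zero (law of large numbers), and roughly 68% of them fall within one unit of zero, as the standard normal predicts (Central Limit Theorem).

```python
import random
import statistics

# My own sketch of the limit theorems for fair-die rolls:
# - LLN: sample means concentrate around the population mean 3.5.
# - CLT: standardized sample means behave like a standard normal,
#   so about 68% should land within one standard deviation of 0.
random.seed(2)
n, trials = 1_000, 2_000
mu, sigma = 3.5, (35 / 12) ** 0.5

z_scores = []
for _ in range(trials):
    xbar = sum(random.randint(1, 6) for _ in range(n)) / n
    z_scores.append((xbar - mu) / (sigma / n ** 0.5))

frac_close = sum(abs(z) < 1 for z in z_scores) / trials
print(round(statistics.mean(z_scores), 2))  # ≈ 0 (sample means center on mu)
print(round(frac_close, 2))                 # ≈ 0.68 (normal approximation)
```
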
This module is highly recommended for all undergraduates because any research would require some form of statistics, and this module provides the necessary background in a rigorous manner. Alternatively, ST2334: Probability and Statistics might be a better option because it covers both probability and statistics, which makes it a more applied-oriented module.
Workload: Light
Difficulty: Difficult
Grade: B-