Probability

Probability Basics

Random Variables

Probability Density Function

Normal Distribution and Gaussian Function

Multivariate Distribution

Joint Distribution

If we have two different RVs \(X\) and \(Y\), we can ask for the probability that two particular outcomes \(x \in \mathcal{X}\) and \(y \in \mathcal{Y}\) both occur, which we denote as follows:

\[ p(X=x \text{ and } Y=y) = p(x,y) \]

The function \(p(x,y)\) is called the joint distribution of \(X\) and \(Y\).
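As a minimal sketch of this idea, the snippet below (an illustrative example, not from the text) tabulates the joint distribution of two fair six-sided dice as a dictionary mapping each pair \((x, y)\) to \(p(x,y)\), using exact fractions so the probabilities sum to exactly 1:

```python
from itertools import product
from fractions import Fraction

# Joint distribution of two fair six-sided dice: every pair (x, y)
# is equally likely, so p(x, y) = 1/36 for all 36 pairs.
p_joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# p(X=3 and Y=5)
print(p_joint[(3, 5)])        # 1/36

# A valid joint distribution sums to 1 over all pairs in X x Y.
print(sum(p_joint.values()))  # 1
```

Here the two dice happen to be independent, so each entry is just the product of the marginals; in general, \(p(x,y)\) can encode arbitrary dependence between the two variables.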

Independence

Conditional Probability

Theorem of Total Probability and Marginal Distribution

Bayes Rule

Conditional Independence

Moments of RVs

Entropy

Resources
