In the last two posts we discussed the basics of digital communication systems and channel models. In this post we will discuss probability functions, look at their important characteristics, and list the equivalent MATLAB functions.

**Random Variables:**

Given an experiment having sample space **S** and elements *s* ∈ **S**, we define a function X(s) whose domain is **S** and whose range is a set of numbers on the real line. The function X(s) is called a *random variable*.

**Probability Distribution Function / Cumulative Distribution Function (CDF)**

Given a random variable X, let us consider the event {X ≤ x}, where x is any real number in the interval (-∞, ∞). We write the probability of this event as P(X ≤ x) and denote it simply by F(x), i.e.

F(x) = P(X ≤ x),  -∞ < x < ∞

The function F(x) is called the *Probability Distribution Function*, also known as the *Cumulative Distribution Function (CDF)*.

**Probability Density Function (PDF)**

The derivative of the CDF is known as the probability density function (PDF), denoted *p(x)*. Mathematically:

p(x) = dF(x)/dx,  -∞ < x < ∞

Equivalently,

F(x) = ∫_{-∞}^{x} p(u) du

This holds for continuous random variables; for a discrete random variable taking values x_{i} with probabilities P(X = x_{i}), the PDF is expressed as a sum of weighted impulses:

p(x) = Σ_{i} P(X = x_{i}) δ(x − x_{i})
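The derivative relationship between CDF and PDF can be checked numerically. A minimal sketch, assuming NumPy and SciPy are available, using the standard normal distribution as an example:

```python
# Sketch: verifying p(x) = dF(x)/dx numerically for a standard normal
# random variable, using SciPy's distribution objects.
import numpy as np
from scipy.stats import norm

x = np.linspace(-4, 4, 2001)
F = norm.cdf(x)   # cumulative distribution function F(x)
p = norm.pdf(x)   # probability density function p(x)

# The numerical derivative of the CDF should track the PDF closely,
# up to discretization error.
dF_dx = np.gradient(F, x)
print(np.max(np.abs(dF_dx - p)))  # small (finite-difference error only)
```

The same check works for any continuous distribution object in `scipy.stats`, since each exposes matching `cdf` and `pdf` methods.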

**Statistical Averages of Random Variables:**

The first and second moments of a single random variable, and joint moments such as the correlation and covariance between any pair of random variables in a multidimensional set of random variables, are of particular interest.

For all the cases below, X is a single random variable with PDF p(x).

**Mean or Expected Value of X:**

This is the first moment of the random variable X. *E(·)* denotes expectation (statistical averaging):

m_{x} = E(X) = ∫_{-∞}^{∞} x p(x) dx

The n-th moment is defined as

E(X^{n}) = ∫_{-∞}^{∞} x^{n} p(x) dx

If m_{x} is the mean value of the random variable X, the n-th central moment is defined as

E[(X − m_{x})^{n}] = ∫_{-∞}^{∞} (x − m_{x})^{n} p(x) dx

**Variance:**

When n = 2, the central moment is called the variance of the random variable and is denoted σ_{x}^{2}:

σ_{x}^{2} = E[(X − m_{x})^{2}] = E(X^{2}) − m_{x}^{2}

The variance provides a measure of the dispersion of the random variable X.
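A quick sketch of these moment definitions, estimating the mean, second moment, and variance from samples and checking the identity σ² = E(X²) − m_x² (NumPy assumed; the variable names are illustrative):

```python
# Sketch: sample estimates of the first moment, second moment, and
# variance of a random variable X ~ N(2, 3^2).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=3.0, size=200_000)  # m_x = 2, sigma_x = 3

m_x = x.mean()                  # first moment E(X)
second_moment = np.mean(x**2)   # second moment E(X^2)
var = np.mean((x - m_x)**2)     # second central moment E[(X - m_x)^2]

# Check the identity sigma_x^2 = E(X^2) - m_x^2 (both near 9 here).
print(var, second_moment - m_x**2)
```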

**Joint moment:**

In the case of two random variables X_{1} and X_{2} with joint PDF *p(x1, x2)*, we define the joint moment as

E(X_{1}^{k} X_{2}^{n}) = ∫∫ x_{1}^{k} x_{2}^{n} p(x_{1}, x_{2}) dx_{1} dx_{2}

and the *joint central moment* as

E[(X_{1} − m_{1})^{k} (X_{2} − m_{2})^{n}]

When k = n = 1, we get the correlation and covariance of the random variables X_{1} and X_{2}, i.e. the correlation between X_{i} and X_{j} is given by the joint moment

E(X_{i}X_{j})

and the covariance of X_{i} and X_{j} is

µ_{ij} = E[(X_{i} − m_{i})(X_{j} − m_{j})] = E(X_{i}X_{j}) − m_{i}m_{j}

The n x n matrix with elements µ_{ij} is called the covariance matrix of the random variables X_{i}, i = 1, 2, …, n.

The two random variables are said to be uncorrelated if

E(X_{i}X_{j}) = E(X_{i})E(X_{j}) = m_{i}m_{j}

which means µ_{ij} = 0.

That is, if X_{i} and X_{j} are statistically independent then they are uncorrelated, but the converse is not true.

Two random variables are said to be orthogonal if

E(X_{i}X_{j}) = 0

This condition holds when X_{i} and X_{j} are uncorrelated and either one or both of the random variables have zero mean.
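The classic illustration of "uncorrelated but not independent" is X uniform on (-1, 1) and Y = X², a deterministic function of X. A sampling sketch (NumPy assumed) showing that their covariance is near zero, and that they are also orthogonal here since E(X) = 0:

```python
# Sketch: X ~ Uniform(-1, 1) and Y = X^2 are uncorrelated (covariance ~ 0)
# yet clearly dependent; since E(X) = 0 they are also orthogonal, E(XY) ~ 0.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=500_000)
y = x**2   # a deterministic function of x: dependent on x

cov_xy = np.mean(x * y) - x.mean() * y.mean()  # estimate of mu_ij
print(cov_xy)          # close to 0: uncorrelated
print(np.mean(x * y))  # close to 0: orthogonal (because E(X) = 0 here)
```

Estimates are sampling-based, so both printed values are near zero rather than exactly zero.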

**The equivalent MATLAB functions so far:**
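As a rough sketch (not an exhaustive table), the quantities defined above map to standard built-ins: MATLAB's `mean`, `var`, `cov`, and `corrcoef`, plus the Statistics Toolbox `cdf`/`pdf` functions. The NumPy/SciPy counterparts, with the MATLAB names in comments:

```python
# Sketch: NumPy/SciPy counterparts of the statistical quantities above,
# with the corresponding MATLAB calls noted in comments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=10_000)
y = 0.5 * x + rng.normal(size=10_000)

print(np.mean(x))           # MATLAB: mean(x)       -> E(X)
print(np.var(x, ddof=1))    # MATLAB: var(x)        -> variance (n-1 norm.)
print(np.cov(x, y))         # MATLAB: cov(x, y)     -> covariance matrix
print(np.corrcoef(x, y))    # MATLAB: corrcoef(x,y) -> correlation coeffs
print(stats.norm.cdf(0.0))  # MATLAB: cdf('Normal',0,0,1) -> F(0) = 0.5
print(stats.norm.pdf(0.0))  # MATLAB: pdf('Normal',0,0,1) -> p(0)
```

Note that `np.var` normalizes by n by default; `ddof=1` matches MATLAB's default n−1 normalization.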

