Thursday, December 20, 2012

Independent Random Vectors


A vector captures both a magnitude and a direction. Let s and t be two random vectors. We call s and t independent random vectors if their joint probability distribution factors into the product of their individual distributions; equivalently, knowing the value of one gives no information about the value of the other.

Independent Random Vectors:

Let {X1, X2, . . ., Xn} be a set of n random variables.

It is often convenient to treat this set as a single object x = {X1, X2, . . ., Xn}, called a random vector. This is particularly useful when the set of random variables is subjected to linear transformations.

Just as a random variable is described by its probability distribution, a random vector is described by the joint probability distribution of the n random variables that make it up.
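As a sketch of this idea (using NumPy, with component distributions chosen purely for illustration), a random vector can be represented as a stacked array of draws from its component random variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Component random variables X1, X2, X3; the distributions here are
# hypothetical choices made only for illustration.
X1 = rng.normal(loc=0.0, scale=1.0, size=n_samples)
X2 = rng.uniform(low=-1.0, high=1.0, size=n_samples)
X3 = rng.exponential(scale=2.0, size=n_samples)

# Stack the components into a single object: each column is one draw
# of the random vector x = (X1, X2, X3).
x = np.stack([X1, X2, X3])

print(x.shape)  # (3, 100000)
```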
Expectation of Independent Random Vectors:

The expectation of a random vector is the vector whose components are the expectations of its component random variables.

If we write its components as a column vector:

E(x) = ` [[E[x_1]],[E[x_2]],[vdots],[E[x_n]]] `

where the mean μ = E[x] is the center of gravity of the joint probability distribution of the random variables {X1, X2, . . ., Xn}.
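A minimal Monte Carlo sketch of this componentwise expectation (NumPy, with illustrative component distributions whose means are known in advance):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical components with known means: E[X1] = 3.0, E[X2] = 0.5.
X1 = rng.normal(3.0, 1.0, size=n)
X2 = rng.uniform(0.0, 1.0, size=n)
x = np.stack([X1, X2])

# The expectation of the vector is taken componentwise: the sample
# mean along axis 1 estimates E[x] = (E[X1], E[X2]).
mean = x.mean(axis=1)
print(mean)  # close to [3.0, 0.5]
```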

Linearity of the expectation:

For random vectors, it is immediately verified that the expectation is linear.

If x and y are two independent random vectors, and A and B are two constant matrices, then:

E[Ax + By] = AE[x] + BE[y]

And if b is a constant vector:

E [Ax + b] = AE[x] + b
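These identities can be checked numerically. The sketch below (NumPy, with an illustrative choice of A, b, and component distributions) estimates E[Ax + b] by a sample mean and compares it with AE[x] + b:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Random vector x with E[x] = (1, -2); the distributions are illustrative.
x = np.stack([rng.normal(1.0, 1.0, size=n),
              rng.normal(-2.0, 0.5, size=n)])

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
b = np.array([5.0, -1.0])

lhs = (A @ x + b[:, None]).mean(axis=1)  # sample estimate of E[Ax + b]
rhs = A @ x.mean(axis=1) + b             # A E[x] + b from the same sample

# Linearity makes the two sides agree exactly (up to rounding) for
# sample means, and both are close to the exact value A mu + b = (5, -7).
print(np.allclose(lhs, rhs))  # True
```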



Variance of the independent random vector:

The variance of a random vector is defined in the same way as the variance of a random variable. If we denote by μ the vector E[x], then by definition:

Var(x) = E[(x – μ)(x – μ)']

This is an n × n matrix, usually called the covariance matrix of x: its diagonal entries are the variances of the components, and its off-diagonal entries are their covariances.
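A minimal sketch of estimating this matrix from samples (NumPy, with two illustrative independent components):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300_000

# Two independent components with variances 1 and 4 (illustrative choices).
x = np.stack([rng.normal(0.0, 1.0, size=n),
              rng.normal(0.0, 2.0, size=n)])

mu = x.mean(axis=1, keepdims=True)       # estimate of the mean vector
centered = x - mu
var_x = (centered @ centered.T) / n      # estimate of E[(x - mu)(x - mu)']

# var_x is a 2 x 2 matrix: variances on the diagonal, covariances off it.
# For independent components the off-diagonal entries are near zero.
print(var_x.shape)  # (2, 2)
```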
