cross-covariance
The covariance of complex random variables is defined slightly differently than for real random variables. For complex random variables ( Z_1 ) and ( Z_2 ), it is defined as:
\[\text{Cov}(Z_1, Z_2) = E[(Z_1 - E[Z_1])(Z_2 - E[Z_2])^*] \]where ( E[Z] ) is the expected value of ( Z ), and the asterisk ((^*)) denotes the complex conjugate.
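As a quick numerical illustration of this definition (a minimal sketch, not part of the derivation below; the correlated test signals W, Z1, Z2 and the sample count M are arbitrary choices), the covariance can be estimated from samples in MATLAB:
% Minimal sketch: estimate Cov(Z1, Z2) = E[(Z1 - E[Z1])(Z2 - E[Z2])*] from samples.
% Z1 and Z2 share the common component W, so their covariance is nonzero by construction.
M  = 100000;                                          % number of samples
W  = randn(1, M) + randn(1, M)*1i;                    % common complex Gaussian component, E[W*conj(W)] = 2
Z1 = W + 0.5*(randn(1, M) + randn(1, M)*1i);          % Z1 correlated with W
Z2 = 2*W + 0.5*(randn(1, M) + randn(1, M)*1i);        % Z2 also correlated with W
c  = mean((Z1 - mean(Z1)) .* conj(Z2 - mean(Z2)));    % sample covariance, close to 2*E[W*conj(W)] = 4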
Given ( Y_1 = G_1 + N_1 ) and ( Y_2 = G_2 + N_2 ), where ( G_1 ) and ( G_2 ) are complex constants, and ( N_1 ) and ( N_2 ) are independent complex Gaussian random variables with zero mean and variance ( \sigma^2 ), we can calculate the covariance as follows:
- The expected values ( E[Y_1] ) and ( E[Y_2] ) are ( G_1 ) and ( G_2 ) respectively, since ( N_1 ) and ( N_2 ) have zero mean.
- The covariance between ( Y_1 ) and ( Y_2 ) becomes:
\[\text{Cov}(Y_1, Y_2) = E[(Y_1 - G_1)(Y_2 - G_2)^*] = E[(G_1 + N_1 - G_1)(G_2 + N_2 - G_2)^*] = E[N_1 N_2^*] \]
- Since ( N_1 ) and ( N_2 ) are independent complex Gaussian random variables with zero mean, the expectation of their product factors as ( E[N_1 N_2^*] = E[N_1]E[N_2^*] = 0 ).
Thus, the covariance between ( Y_1 ) and ( Y_2 ) in this complex case is also zero.
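A quick numerical check of this result (a minimal sketch in the spirit of the script in the code section below; the constants ( G_1 = 1 + 2i ) and ( G_2 = 3 - i ) and the sample count are arbitrary choices):
% Sketch: the sample cross-covariance of Y1 = G1 + N1 and Y2 = G2 + N2 should be near zero.
M  = 100000;                                          % number of samples
N1 = randn(1, M) + randn(1, M)*1i;                    % zero-mean complex Gaussian noise
N2 = randn(1, M) + randn(1, M)*1i;                    % independent of N1
Y1 = (1 + 2*1i) + N1;                                 % G1 = 1 + 2i (illustrative constant)
Y2 = (3 - 1i) + N2;                                   % G2 = 3 - 1i (illustrative constant)
c  = mean((Y1 - mean(Y1)) .* conj(Y2 - mean(Y2)));    % sample covariance, close to 0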
auto-covariance
The expression ( E[YY^*] ) represents the expected value of the magnitude squared of the complex random variable ( Y ). Given ( Y = G + N ), where ( G ) is a complex constant and ( N ) is a complex Gaussian random variable with zero mean and variance ( \sigma^2 ), we can calculate ( E[YY^*] ) as follows:
- Expand ( YY^* ):
\[YY^* = (G + N)(G + N)^* = (G + N)(G^* + N^*) \]where ( G^* ) and ( N^* ) are the complex conjugates of ( G ) and ( N ), respectively.
- Expand the product:
\[YY^* = GG^* + GN^* + NG^* + NN^* \]
- Calculate the expected value ( E[YY^*] ):
\[E[YY^*] = E[GG^* + GN^* + NG^* + NN^*] \]Since ( G ) is a constant, ( E[GG^*] = GG^* ). The terms ( E[GN^*] ) and ( E[NG^*] ) are zero because ( N ) has zero mean. The term ( E[NN^*] ) is the variance of ( N ), which is ( \sigma^2 ).
- Combine the results:
\[E[YY^*] = GG^* + \sigma^2 \]
So, ( E[YY^*] ) is the sum of the magnitude squared of the complex constant ( G ) and the variance ( \sigma^2 ) of the complex Gaussian random variable ( N ).
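As a concrete instance using the values from the script below, where ( G = 5 + 6i ) and the real and imaginary parts of ( N ) each have unit variance (so ( \sigma^2 = E[NN^*] = 2 )):
\[E[YY^*] = |5 + 6i|^2 + 2 = 61 + 2 = 63 \]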
MATLAB code
%% cross-covariance
rng(100)                                  % fix the random seed for reproducibility
N = 100000;                               % number of samples
N1 = randn(1, N) + randn(1, N)*1i;        % complex Gaussian noise, E[N1*conj(N1)] = 2
Y1 = (1 + 2*1i) + N1;                     % Y1 = G1 + N1 with G1 = 1 + 2i
E1 = mean(Y1);                            % sample mean, close to G1
N2 = randn(1, N) + randn(1, N)*1i;        % independent complex Gaussian noise
Y2 = (1 + 2*1i) + N2;                     % Y2 = G2 + N2 with G2 = 1 + 2i
E2 = mean(Y2);                            % sample mean, close to G2
cov1 = Y1 * Y2' / N;                      % sample E[Y1*conj(Y2)], close to G1*conj(G2)
cov2 = (1 + 2*1i) * (1 + 2*1i)';          % theoretical G1*conj(G2) = 5; the covariance term itself is zero
%% auto-covariance
N = 100000;                               % number of samples
N1 = randn(1, N) + randn(1, N)*1i;        % complex Gaussian noise, E[N1*conj(N1)] = 2
Y1 = (5 + 6*1i) + N1;                     % Y1 = G + N1 with G = 5 + 6i
autocov = Y1 * Y1' / N;                   % sample E[Y1*conj(Y1)]
autocov2 = (5 + 6*1i) * (5 + 6*1i)' + 2;  % theoretical G*conj(G) + sigma^2 = 61 + 2
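With these parameters, cov1 and cov2 should both come out close to ( (1 + 2i)(1 - 2i) = 5 ), and autocov and autocov2 close to ( 61 + 2 = 63 ), matching the derivations above.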
From: https://www.cnblogs.com/uceec00/p/17913407.html