Taylor expansion of the orthogonality deficiency of a matrix
Assume that ${\mathbf H}$ is an $N \times M$ matrix. The following
parameter, called the orthogonality deficiency, measures how close
the columns of ${\mathbf H}$ are to being orthogonal: $$ od({\mathbf H}) = 1 -
\frac{\det({\mathbf H}^H{\mathbf H})}{\prod_{n=1}^M\|{\mathbf
h}_n\|^2}$$ where ${\mathbf h}_n$ is the $n$th column of the matrix ${\mathbf
H}$. By Hadamard's inequality, $0\leq od({\mathbf H})\leq 1$. If ${\mathbf H}$
is singular then $od({\mathbf H})=1$, and $od({\mathbf H})=0$ exactly when the
columns of the matrix ${\mathbf H}$ are orthogonal. This is indeed a very
useful criterion for matrix orthogonality and has many applications in
communications engineering and signal processing.
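The definition above translates directly into a few lines of NumPy. The sketch below (function name `od` is my own choice, not from the post) computes the Gram matrix ${\mathbf H}^H{\mathbf H}$ and divides its determinant by the product of the squared column norms, then checks the two limiting cases mentioned in the text:

```python
import numpy as np

def od(H):
    """Orthogonality deficiency: 1 - det(H^H H) / prod_n ||h_n||^2."""
    G = H.conj().T @ H                            # Gram matrix H^H H
    col_norms_sq = np.sum(np.abs(H) ** 2, axis=0)  # ||h_n||^2 for each column
    return 1.0 - np.linalg.det(G).real / np.prod(col_norms_sq)

# Orthogonal columns (identity matrix) -> od = 0
print(round(od(np.eye(3)), 10))                   # 0.0

# Singular matrix (linearly dependent columns) -> od = 1
S = np.array([[1.0, 1.0],
              [2.0, 2.0]])
print(round(od(S), 10))                           # 1.0
```

For any matrix with nonzero columns the value lands in $[0, 1]$, consistent with Hadamard's inequality $\det({\mathbf H}^H{\mathbf H}) \leq \prod_n \|{\mathbf h}_n\|^2$.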
My question:
When the matrix ${\mathbf H}$ has the form ${\mathbf H}={\mathbf
A}+e{\mathbf B}$, where $e$ is a scalar with $e \ll 1$, how can we
approximate $od({\mathbf H})$, presumably via a Taylor expansion in $e$?
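This does not answer the question, but as a numerical sanity check one can verify that $od({\mathbf A}+e{\mathbf B})$ behaves smoothly in $e$ near $0$, so a first-order approximation $od({\mathbf A}) + c_1 e$ is at least plausible for generic full-rank ${\mathbf A}$. The sketch below (all matrices are arbitrary random choices of mine, not from the post) estimates the coefficient $c_1$ by central differences and shows it stabilizes as $e$ shrinks:

```python
import numpy as np

def od(H):
    """Orthogonality deficiency: 1 - det(H^H H) / prod_n ||h_n||^2."""
    G = H.conj().T @ H
    return 1.0 - np.linalg.det(G).real / np.prod(np.sum(np.abs(H) ** 2, axis=0))

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # generic full-rank perturbed matrix
B = rng.standard_normal((4, 3))   # arbitrary perturbation direction

# If od(A + eB) = od(A) + c1*e + O(e^2), the central-difference
# quotient converges to c1 as e -> 0.
for e in (1e-2, 1e-3, 1e-4):
    c1 = (od(A + e * B) - od(A - e * B)) / (2 * e)
    print(e, c1)
```

The printed estimates agree to several digits across the three step sizes, which is the behavior one expects if the Taylor expansion sought in the question exists with a well-defined linear term.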