In Character Theory #2 we briefly mentioned how representations decompose; this post treats that topic in detail.
6. Canonical Decomposition of a Representation
Let $\rho: G \rightarrow GL(V)$ be a linear representation of $G$. We are going to define a direct sum decomposition of $V$ which is coarser than the decomposition into irreducible representations. Its advantage is that it is unique.
Let $W_{1}, \ldots, W_{h}$ be the distinct irreducible representations of $G$, with characters $\chi_{1}, \ldots, \chi_{h}$ and degrees $n_{1}, \ldots, n_{h}$. First decompose $V$ into a direct sum of irreducible subrepresentations:
$$ V = U_{1} \oplus \cdots \oplus U_{m} $$
For $1 \le i \le h$, let $V_{i}$ denote the direct sum of those $U_{j}$ which are isomorphic to $W_{i}$. Then
$$ V = V_{1} \oplus \cdots \oplus V_{h} $$
In other words, each $V_{i}$ is a direct sum of irreducible representations, and it collects together the mutually isomorphic ones.
Theorem 8. The projection $p_{i}$ of $V$ onto $V_{i}$ associated with this decomposition is given by the formula:
$$ p_{i} = {n_{i} \over g} \underset{t \in G}{\sum} \chi_{i}(t)^{*} \rho_{t} $$
By Proposition 6, the restriction of $p_{i}$ to an arbitrary irreducible subrepresentation $W$ of $V$ with character $\chi$ and degree $n$ is a homothety of ratio $ (n_{i}/n) \left(\chi_{i} | \chi \right). $
By the orthogonality of irreducible characters, this ratio equals $1$ if $W$ is isomorphic to $W_{i}$ and $0$ otherwise; hence $p_{i}$ is the identity on every irreducible subrepresentation isomorphic to $W_{i}$ and zero on the others. $\therefore$ In view of the definition of $V_{j}$, the restriction of $p_{i}$ to $V_{j}$ is $\delta_{ij} \cdot \mathrm{id}$. $_\Box$
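As a sanity check on Theorem 8 (a minimal sketch of my own, not part of the argument above), the following Python snippet applies the projection formula to the permutation representation of $S_{3}$ on $\mathbb{C}^{3}$, which decomposes as trivial $\oplus$ standard; the helper names are ad hoc choices for this illustration. The projection attached to the trivial character should have rank 1, the one attached to the sign character should be zero, and the one attached to the standard character should have rank 2.

```python
import itertools
import numpy as np

# G = S3 acting on C^3 by permutation matrices: rho_t(e_j) = e_{t(j)}.
perms = list(itertools.permutations(range(3)))
g = len(perms)  # |G| = 6

def rho(p):
    M = np.zeros((3, 3))
    for j in range(3):
        M[p[j], j] = 1
    return M

def sign(p):
    # parity of a permutation via its inversion count
    inv = sum(p[a] > p[b] for a in range(3) for b in range(a + 1, 3))
    return (-1) ** inv

# The three irreducible characters of S3: (degree n_i, chi_i as a function).
chars = {
    "trivial":  (1, lambda p: 1),
    "sign":     (1, sign),
    "standard": (2, lambda p: sum(p[k] == k for k in range(3)) - 1),
}

# Theorem 8: p_i = (n_i / g) * sum_t conj(chi_i(t)) * rho_t.
for name, (n_i, chi) in chars.items():
    p_i = (n_i / g) * sum(np.conj(chi(p)) * rho(p) for p in perms)
    print(f"{name}: rank {np.linalg.matrix_rank(p_i)}")
# trivial: rank 1 (the line spanned by (1,1,1)); sign: rank 0;
# standard: rank 2 (the sum-zero plane).
```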
In general, $V_{i}$ itself need not be irreducible. In the next section we decompose each subrepresentation $V_{i}$ further into a direct sum of irreducible representations, each isomorphic to $W_{i}$:
$$ V_{i} = W_{i} \oplus \cdots \oplus W_{i} $$
This decomposition is called the explicit decomposition, and it is not unique: it is just as arbitrary as the choice of a basis in a vector space.
7. Explicit Decomposition of a Representation
Let us construct a decomposition of $V_{i}$ into a direct sum of subrepresentations isomorphic to $W_{i}$. Let $W_{i}$ be given in matrix form $ (r_{\alpha \beta} (s)) $ with respect to a basis $ (e_{1}, \ldots, e_{n}) $.
Then $\chi_{i} (s) = \sum_{\alpha} r_{\alpha \alpha} (s)$ and $n = n_{i} = \dim W_{i}$.
For each pair of indices $\alpha, \beta$, define a linear map $p_{\alpha \beta}$ on $V$:
$$ p_{\alpha \beta} = {n \over g} \underset{t \in G}{\sum} r_{\beta \alpha} (t^{-1}) \rho_{t}. $$
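To make the construction concrete, here is a sketch (again my own illustration, with hypothetical helper names) that builds the maps $p_{\alpha \beta}$ for the standard degree-2 irreducible representation $W_{i}$ of $S_{3}$, taking for $V$ the permutation representation on $\mathbb{C}^{3}$. The matrix coefficients $r_{\alpha \beta}(s)$ are read off by restricting the permutation matrices to a basis of the invariant sum-zero plane; the indices $\alpha, \beta$ are 1-based in the text but 0-based in the code.

```python
import itertools
import numpy as np

perms = list(itertools.permutations(range(3)))
g = len(perms)  # |G| = 6

def rho(p):
    # permutation matrix on V = C^3: rho_p(e_j) = e_{p(j)}
    M = np.zeros((3, 3))
    for j in range(3):
        M[p[j], j] = 1
    return M

def inverse(p):
    # inverse permutation: argsort(p)[p[j]] == j
    return tuple(np.argsort(p))

# W_i: the standard irreducible of degree n = 2, realized on the sum-zero
# plane of C^3 with basis f1 = e1 - e2, f2 = e2 - e3 (the columns of B).
B = np.array([[1.0, 0.0], [-1.0, 1.0], [0.0, -1.0]])
B_left_inv = np.linalg.pinv(B)  # left inverse; B has full column rank

def r(p):
    # 2x2 matrix (r_{alpha beta}(p)); exact because the plane is invariant
    return B_left_inv @ rho(p) @ B

n = 2
def p_ab(a, b):
    # p_{ab} = (n/g) * sum_t r_{ba}(t^{-1}) * rho_t, as an operator on V
    return (n / g) * sum(r(inverse(p))[b, a] * rho(p) for p in perms)

# quick look at p_{11} (0-based: p_ab(0, 0)); a rank-1 projection here
print(np.round(p_ab(0, 0), 3))
```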
Proposition 8.
$($i$)$ The map $p_{\alpha \alpha}$ is a projection; it is zero on the $V_{j},\ j \neq i$. Its image $V_{i, \alpha}$ is contained in $V_{i}$, and $V_{i}$ is the direct sum of the $V_{i, \alpha}$ for $1 \le \alpha \le n_{i}$. Moreover, $p_{i} = \sum_{\alpha=1}^{n_{i}} p_{\alpha \alpha} $.
$($ii$)$ The linear map $p_{\alpha \beta}$ is zero on the $V_{j},\ j \neq i$, as well as on the $V_{i,\gamma}$ for $\gamma \neq \beta$; it is an isomorphism from $V_{i,\beta}$ onto $V_{i,\alpha}$.
Consequently, $ V_{i} = \underset{\alpha=1}{\overset{n_{i}}{\bigoplus}} V_{i,\alpha} $ with $ V_{i,\alpha} \cong W_{i}. $
It suffices to check all the assertions on each irreducible subrepresentation of $V$, since the $p_{\alpha \beta}$ are defined in terms of $\rho$ alone. On a subrepresentation isomorphic to $W_{i}$, written in the matrix form above, we have $\rho_{t}(e_{\gamma}) = \sum_{\omega} r_{\omega \gamma}(t) e_{\omega}$, so
$$ p_{\alpha \beta}(e_{\gamma}) = {n \over g} \underset{t \in G}{\sum} r_{\beta \alpha}(t^{-1}) \rho_{t}(e_{\gamma}) = \underset{\omega}{\sum} \left[ {n \over g} \underset{t \in G}{\sum} r_{\beta \alpha}(t^{-1}) r_{\omega \gamma}(t) \right] e_{\omega} = \underset{\omega}{\sum} \delta_{\beta \gamma} \delta_{\alpha \omega} e_{\omega} = \delta_{\beta \gamma} e_{\alpha}, $$
where the third equality uses the orthogonality relations for the matrix coefficients of an irreducible representation. Hence $p_{\alpha \beta}$ sends $e_{\beta}$ to $e_{\alpha}$ and annihilates the other basis vectors. In particular $p_{\alpha \alpha}(e_{\alpha}) = e_{\alpha}$, so $\sum_{\alpha=1}^{n_{i}} p_{\alpha \alpha}$ is the identity map of $W_{i}$, since it fixes every basis vector. On an irreducible subrepresentation $W_{j}$ with $j \neq i$, the orthogonality relations from Corollary 2 of Proposition 4 give $p_{\alpha \beta} = 0$. All the assertions follow. $_\Box$
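Continuing the sketch above (reusing the hypothetical helpers `p_ab`, `rho`, `r`, `perms`, `n`, `g`), Proposition 8 can be checked numerically for the $S_{3}$ example: each $p_{\alpha \alpha}$ is idempotent, together they sum to the projection $p_{i}$ of Theorem 8, each image $V_{i,\alpha}$ is a line (since $W_{i}$ appears in $V$ with multiplicity 1), and $p_{\alpha \beta}$ with $\alpha \neq \beta$ squares to zero because it moves $V_{i,\beta}$ into $V_{i,\alpha}$ and kills everything else.

```python
# chi_i(t) = trace of r(t); p_i as in Theorem 8, acting on V = C^3
chi = lambda p: np.trace(r(p))
p_i = (n / g) * sum(np.conj(chi(p)) * rho(p) for p in perms)

p11, p22, p12 = p_ab(0, 0), p_ab(1, 1), p_ab(0, 1)

assert np.allclose(p11 @ p11, p11)      # p_{11} is a projection
assert np.allclose(p22 @ p22, p22)      # so is p_{22}
assert np.allclose(p11 + p22, p_i)      # p_i = sum_alpha p_{alpha alpha}
assert np.linalg.matrix_rank(p11) == 1  # dim V_{i,1} = multiplicity of W_i
assert np.allclose(p12 @ p12, 0)        # p_{12} maps V_{i,2} to V_{i,1} only
print("Proposition 8 checks out for this example")
```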