1206.1703 (E. B. Davies)
E. B. Davies
This paper considers $N\times N$ matrices of the form $A_\gamma = A + \gamma B$, where $A$ is self-adjoint, $\gamma \in \mathbb{C}$, and $B$ is a non-self-adjoint perturbation of $A$. We obtain some monodromy-type results relating the spectral behaviour of such matrices in the two asymptotic regimes $|\gamma|\to\infty$ and $|\gamma|\to 0$ under certain assumptions on $B$. We also explain some properties of the spectrum of $A_\gamma$ for intermediate-sized $\gamma$ by considering the limit $N\to\infty$, concentrating on properties that have no self-adjoint analogue. A substantial number of the results extend to operators on infinite-dimensional Hilbert spaces.
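The basic setup can be explored numerically. The sketch below is not from the paper: the particular choices of $A$ (a random real symmetric matrix) and $B$ (a nilpotent Jordan shift, a standard toy perturbation) are illustrative assumptions, and the paper's hypotheses on $B$ may differ. It simply shows how the spectrum of $A_\gamma = A + \gamma B$, real at $\gamma = 0$, can migrate into the complex plane as $|\gamma|$ grows.

```python
import numpy as np

N = 6
rng = np.random.default_rng(0)

# Self-adjoint part: a random real symmetric matrix (illustrative choice).
M = rng.standard_normal((N, N))
A = (M + M.T) / 2

# Non-self-adjoint perturbation: an upper-triangular nilpotent shift,
# a common toy example; the paper's assumptions on B may be different.
B = np.diag(np.ones(N - 1), k=1)

def spectrum(gamma):
    """Eigenvalues of A_gamma = A + gamma * B."""
    return np.linalg.eigvals(A + gamma * B)

# At gamma = 0 the matrix is self-adjoint, so all eigenvalues are real;
# for large |gamma| they generically spread into the complex plane.
print(np.max(np.abs(spectrum(0.0).imag)))   # essentially 0
print(np.max(np.abs(spectrum(10.0).imag)))
```

Tracking the eigenvalue trajectories as $\gamma$ traverses a loop in $\mathbb{C}$ is the kind of computation behind the monodromy-type statements in the abstract.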
View original:
http://arxiv.org/abs/1206.1703