Random Matrix Theory

  • Eytan Katzav added an answer:
    Is there any internal relation between the circle law and the semicircle law?

    In random matrix theory, let $A$ be a random $n \times n$ matrix whose entries are i.i.d. with expectation 0 and variance 1, and let $F$ be the limiting spectral distribution (LSD) of $A/\sqrt{n}$; then $F$ is the uniform distribution over the unit disk (the circular law). The marginal distribution of $F$ is a non-standard semicircle law, which is the LSD of a Wigner matrix $B$. Is there any relationship between the circle law and the semicircle law? That is, if we know one of the two laws, can we prove the other by means of matrix analysis?
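
    For intuition, here is a minimal numerical sketch (assuming NumPy; the matrix size and binning are illustrative) that samples the eigenvalues of $A/\sqrt{n}$, checks that they roughly fill the unit disk uniformly, and compares the empirical distribution of their real parts with the semicircle density $\tfrac{2}{\pi}\sqrt{1-x^2}$:

    import numpy as np

    # Sketch: circular law for A/sqrt(n) and its semicircle-shaped marginal.
    rng = np.random.default_rng(0)
    n = 1000

    # A has i.i.d. entries with mean 0 and variance 1.
    A = rng.standard_normal((n, n))
    eig = np.linalg.eigvals(A / np.sqrt(n))

    # Circular law: eigenvalues approximately uniform on the unit disk.
    print("max |eigenvalue| (should be close to 1):", np.abs(eig).max())
    print("fraction with |z| < 0.5 (should be ~ 0.25):", np.mean(np.abs(eig) < 0.5))

    # Marginal of the uniform disk: density 2*sqrt(1 - x^2)/pi on [-1, 1].
    hist, edges = np.histogram(eig.real, bins=20, range=(-1, 1), density=True)
    mid = 0.5 * (edges[:-1] + edges[1:])
    print("empirical vs. semicircle density at x ~ 0:",
          hist[10], "vs.", 2 * np.sqrt(1 - mid[10] ** 2) / np.pi)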

    Eytan Katzav · Hebrew University of Jerusalem

    The reply of James Peters is to the point and includes many references and explanations.

    Let me take a different angle on things - both the circle law and the semicircle law are a kind of law of large numbers, namely they capture a universal average behaviour of a large collection of random numbers (in this case organized as a matrix). The eigenvalues of those random matrices converge to one of the above-mentioned laws under quite general conditions (i.e., even if the individual entries are not Gaussian or uniformly distributed - essentially any distribution with finite variance will do the job - Tim Rogers' PhD thesis discusses this, for example).

    The main difference between the laws is that in the case of the circle law the eigenvalues are complex, while in Wigner's semicircle law the eigenvalues are real - and the reason is simple. In the first case the random matrices are non-Hermitian, while in the second they are Hermitian (or even real symmetric).

    Obtaining one from the other is non-trivial, but there are techniques that take the non-Hermitian ensemble, build from it an ensemble of larger Hermitian matrices (double the size), and then use results for those Hermitian matrices (such as the semicircle law) to infer results for the non-Hermitian ones.
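
    One common version of this doubling trick (a sketch of the standard "hermitization" construction; the answer above may have a different variant in mind) pairs the non-Hermitian $n \times n$ matrix $A$ with the $2n \times 2n$ Hermitian matrix

    \[
    H(z) \;=\; \begin{pmatrix} 0 & A - zI \\ (A - zI)^{\ast} & 0 \end{pmatrix},
    \]

    whose eigenvalues are $\pm$ the singular values of $A - zI$. Hermitian techniques (resolvents, the semicircle law and its relatives) can be applied to $H(z)$, and the resulting information about the singular values of $A - zI$, as $z$ varies over the complex plane, determines the eigenvalue distribution of $A$.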

  • Alexander Wickstrom added an answer:
    What sort of mathematical background would best prepare a neuroscientist to really understand cortical networks?

    Structurally, both micro- and macro-scale cortical networks are highly complex.

    When analyzed functionally in real time, whether locally or globally, the complexity of these networks really challenges my comprehension. It seems that being able to manipulate mathematical models of brain networks should prepare a scientist to better conceptualize the dynamic function of in vivo networks.

    I've looked at mathematical concepts such as random matrix theory, Bayesian networks, hidden-hierarchical Markov chains, chaos and nonlinear dynamics; are any of these better suited than the others, or is a combined approach utilizing techniques from various disciplines more appropriate? Additionally, what kind of relevant course work should a hopeful neuroscientist invest in to perform these kinds of operations?

    Alexander Wickstrom · University of Wisconsin - Milwaukee

    I appreciate all of these great answers. Looks like I still have a lot of work to do! 

  • I. V. Fialkovsky added an answer:
    Can someone suggest any references on diagrammatics and renormalization for a QFT model with a non-trivial tadpole?

    See above.

    I. V. Fialkovsky · Universidade Federal do ABC (UFABC)

    Thanks a lot for your answers!
    I also found this work helpful:

    Steven Weinberg, "Perturbative Calculations of Symmetry Breaking", Phys. Rev. D 7, 2887 (1973).

  • Alexander Yurkin added an answer:
    Could the distribution of the prime numbers be related to a physical system?
    It seems that the distribution of eigenvalues in quantum chaotic systems obeys the same statistics as the eigenvalues of random matrices; see, e.g.,
    http://www.researchgate.net/publication/51969774_Generalized_random_matrix_conjecture_for_chaotic_systems

    It has also been shown that the distribution of the critical zeros of the Riemann zeta-function can be related to the distribution of eigenvalues of certain random matrices:
    http://www.researchgate.net/publication/51940315_Random_matrices_and_Riemann_hypothesis

    So this would seem to suggest a statistical relationship between the zeros of the zeta function and the energy levels of some quantum chaotic system. Can any relationship be drawn between the placement of the prime numbers and a dynamical system? Are there any other implications of this connection?
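
    For a quick numerical feel for the statistics being referred to, here is a minimal sketch (assuming NumPy; the matrix size, the crude unfolding, and the binning are illustrative) comparing the nearest-neighbour level spacings of a GUE random matrix with the Wigner surmise for the unitary ensemble, $p(s) = \frac{32}{\pi^2} s^2 e^{-4s^2/\pi}$:

    import numpy as np

    # Sketch: GUE nearest-neighbour spacing statistics vs. the Wigner surmise.
    rng = np.random.default_rng(1)
    n = 1000

    # GUE matrix: H = (X + X^*)/2 with i.i.d. complex Gaussian entries in X.
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    H = (X + X.conj().T) / 2
    ev = np.sort(np.linalg.eigvalsh(H))

    # Crude unfolding: keep the bulk of the spectrum, rescale to unit mean spacing.
    bulk = ev[n // 4: 3 * n // 4]
    s = np.diff(bulk)
    s /= s.mean()

    # Wigner surmise for the unitary ensemble: p(s) = (32/pi^2) s^2 exp(-4 s^2/pi).
    hist, edges = np.histogram(s, bins=15, range=(0.0, 3.0), density=True)
    mid = 0.5 * (edges[:-1] + edges[1:])
    surmise = (32 / np.pi ** 2) * mid ** 2 * np.exp(-4 * mid ** 2 / np.pi)
    for m, h, w in zip(mid[:5], hist[:5], surmise[:5]):
        print(f"s = {m:.2f}: empirical {h:.3f}, surmise {w:.3f}")
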
    Alexander Yurkin · Russian Academy of Sciences

    I am thinking about probability theory. Nevertheless, M. Planck, A. Einstein and E. Schrödinger were right! They were fully developed people. M. Born and L. Landau, on the other hand, were not right, because their right cerebral hemisphere (inside the skull) was underdeveloped. That is what theory and experiment have shown.

  • Parviz Parvin added an answer:
    Why doesn't random lasing in highly scattering media occur under CW optical pumping?
    Random lasing has been observed in several kinds of optically active, highly scattering media. Is there a way to make an optically CW-pumped random laser?
    Parviz Parvin · Amirkabir University of Technology
    Here, thermal effects may induce high ASE and high losses together with a lower population inversion (small gain) at the same time.
  • Camille Male added an answer:
    Order in N of Cov(u_ij, u_ik) and Var(u_ij), for i, j, k distinct and U = (u_ij) an N-by-N Haar matrix?
    What is the order in N of Cov(u_ij, u_ik) and of Var(u_ij), for i, j, k distinct and (u_ij) the entries of a matrix distributed according to the Haar measure on the unitary group?
    It is clear that the formulas given by the Weingarten calculus (e.g. Benoit's) answer this question in greater generality, but I cannot find a reference with this simple application. Do you know of any?
    Camille Male · Paris Diderot University
    Thanks, James, for your answer by private message. Let me share it and comment on it in the context of traffics.
    __________
    By James Mingo: "First regarding your post: according to Benoit's formula
    \[
    \mathrm{E}\bigl( u_{i_1 j_1} \cdots u_{i_n j_n}\,
    \overline{u_{i_1' j_1'}} \cdots \overline{u_{i_n' j_n'}} \bigr)
    = \sum_{\pi, \sigma \in S_n}
    \delta_{i_1 i_{\pi(1)}'} \cdots \delta_{i_n i_{\pi(n)}'}\,
    \delta_{j_1 j_{\sigma(1)}'} \cdots \delta_{j_n j_{\sigma(n)}'}\,
    \mathrm{Wg}(\pi^{-1}\sigma)
    \]
    We have (because there are no complex conjugates)
    E(u_{ij} u_{ik}) = 0, as well as E(u_{ij}) = 0.
    Thus the covariance is 0. If we replace U by O (the
    orthogonal group) we still get E(o_{ij} o_{ik}) = 0 for j and
    k not equal, and E(o_{ij}^2) = 1/N, where N is the size of the
    matrix."
    ___________
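
    As a quick numerical sanity check of the quoted statements, here is a minimal Monte Carlo sketch (assuming NumPy and SciPy's scipy.stats.unitary_group; the sample sizes are illustrative) estimating E(u_11 u_12) and E(|u_11|^2) for Haar unitary matrices:

    import numpy as np
    from scipy.stats import unitary_group

    # Sketch: E(u_11 u_12) should vanish and E(|u_11|^2) should equal 1/N.
    N, samples = 10, 10000
    u11_u12 = np.empty(samples, dtype=complex)
    abs_u11_sq = np.empty(samples)
    for t in range(samples):
        U = unitary_group.rvs(N)   # one Haar-distributed N x N unitary matrix
        u11_u12[t] = U[0, 0] * U[0, 1]
        abs_u11_sq[t] = abs(U[0, 0]) ** 2

    print("E(u_11 u_12)  (should be ~ 0):  ", u11_u12.mean())
    print("E(|u_11|^2)   (should be ~ 1/N):", abs_u11_sq.mean(), "vs", 1 / N)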

    I have since realized a simple but interesting fact: a simple way to observe the difference between the limiting distributions of permutation matrices and of Haar matrices from the classical compact groups.

    For any noncommutative random variable "a" in a space of traffics, thanks to a certain operad construction that I will present soon in the revised version of [1], I can define the operator "deg(a)", which corresponds, for a matrix "A = (A_{i,j})_{i,j=1..N}", to the diagonal matrix
    "deg(A) = diag( \sum_{j=1..N} A_{i,j} )_{i=1..N}".

    If "u" is the limit in distribution of traffics of a uniform permutation matrix (or any permutation matrix with small density of cycle of given size), then "deg(u)" is the identity "\mbf 1" in the underlying *-probability space.
    Nevertheless, if "u" is the limit of a Haar matrix on the unitary group, then "deg(u)" has the distribution of "\theta \mbf 1", where "\theta" is a complex gaussian random variable.
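
    To see this difference concretely, here is a minimal numerical sketch (assuming NumPy and SciPy; "deg" below is simply the vector of row sums, as in the definition above). For a permutation matrix every row sum equals 1, so deg(P) is the identity, while the row sums of a Haar unitary matrix are the coordinates of a uniformly random vector of norm \sqrt{N} and look like complex Gaussian samples:

    import numpy as np
    from scipy.stats import unitary_group

    rng = np.random.default_rng(3)
    N = 500

    # Uniform permutation matrix: deg(P) = identity (all row sums equal 1).
    perm = rng.permutation(N)
    P = np.zeros((N, N))
    P[np.arange(N), perm] = 1.0
    print("distinct row sums of the permutation matrix:", np.unique(P.sum(axis=1)))

    # Haar unitary matrix: row sums = (U 1)_i, approximately complex Gaussian.
    U = unitary_group.rvs(N)
    row_sums = U.sum(axis=1)
    print("mean of the row sums (~ 0):", row_sums.mean())
    print("mean of |row sum|^2  (~ 1):", np.mean(np.abs(row_sums) ** 2))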

About Random Matrix Theory

In probability theory and mathematical physics, a random matrix is a matrix-valued random variable.
