Generating correlated random deviates

Discussion in 'Math' started by someonesdad, Jul 19, 2010.

  1. someonesdad

    Thread Starter Senior Member

    Jul 7, 2009
    1,585
    141
    Here's a problem that I'd like to solve to help me with some Monte Carlo modeling. Suppose I have n random variables Xi (indexed by i, i = 1, ..., n). The distribution of each of these random variables is arbitrary; practically, though, I'll be working in python with numpy, so I'll use the discrete and continuous distributions that numpy supports.

    Now, suppose I also have an n x n correlation matrix R for these random variables: a symmetric matrix with all 1's on the diagonal, whose off-diagonal elements are each less than or equal to 1 in absolute value. I want to generate m random vectors with components Xi such that the random samples in each dimension have the correlations implied by R.

    Do any of you folks know how to do this? A reference to the literature on how to do it is fine too. An ideal answer would work within the constraints of python/numpy.
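
    Before anything else, it's worth checking that the given R really is a valid correlation matrix: besides being symmetric with unit diagonal, it must be positive semidefinite, or no set of random variables can have it as their correlation matrix. A minimal numpy sketch (the 3x3 matrix here is made up for illustration, not from the thread):

```python
import numpy as np

# Hypothetical 3-variable correlation matrix (values chosen for illustration).
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])

# A valid correlation matrix is symmetric, has all 1's on the diagonal,
# and is positive semidefinite (all eigenvalues >= 0).
assert np.allclose(R, R.T)
assert np.allclose(np.diag(R), 1.0)
eigenvalues = np.linalg.eigvalsh(R)
assert np.all(eigenvalues >= -1e-12)
print(eigenvalues)
```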
     
  2. Tesla23

    Active Member

    May 10, 2009
    318
    67
    Diagonalise the correlation matrix and transform to the basis of its eigenvectors. In that representation the components are uncorrelated, so generate your samples there, then transform back.

    For more info, look up the Karhunen-Loève transform.
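
    In numpy terms, the recipe above might look like the following sketch for the Gaussian case (where it works exactly). The matrix R, the seed, and the sample count are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)  # example seed
n, m = 3, 100_000              # dimensions and number of sample vectors

# Hypothetical target correlation matrix.
R = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])

# Diagonalise R = P D P^T; the columns of P are the eigenvectors.
eigvals, P = np.linalg.eigh(R)

# Generate uncorrelated standard-normal samples in the eigenbasis,
# scale each eigendirection by sqrt(eigenvalue), then rotate back.
Z = rng.standard_normal((n, m))
X = P @ (np.sqrt(eigvals)[:, None] * Z)

# The empirical correlation of the rows should be close to R.
print(np.corrcoef(X))
```

    The covariance of X is P diag(eigvals) P^T = R by construction, so for standard-normal inputs the rows come out both correlated as requested and still normal.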
     
  3. someonesdad

    Thread Starter Senior Member

    Cool! I'd never heard of that transform, but the method makes intuitive sense. I'll give it a try.
     
  4. someonesdad

    Thread Starter Senior Member

    Tesla23:

    I haven't played with this yet, but I'll get to it. My only concern is as follows. Suppose I generate a column vector X whose elements are samples from a chosen group of distributions (each row will correspond to an independent variable in the Monte Carlo simulation). Suppose P is the matrix of eigenvectors of the correlation matrix. I transform X to get Y = P^{-1}X. Doing this for each sample gives me a set of m transformed vectors Yi; I take these Yi and put them in as columns of a matrix A. The rows of this matrix now form the random samples of the chosen distributions. These rows should have the desired covariance matrix.

    Does this diagonalizing similarity transformation change the stochastic properties of the rows of matrix A? In other words, are these rows still random samples from the originally-chosen distributions?

    I'll find this out experimentally when I try the technique out, but since you obviously know this stuff, you could short-circuit my work.
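
    That experiment is quick to sketch in numpy. Here's one version (a 2x2 matrix R and seed made up for illustration) using Uniform(0,1) inputs: the linear transform reproduces the target correlation regardless of the input distribution, and the check on the range of the output hints at what happens to the marginals:

```python
import numpy as np

rng = np.random.default_rng(1)
m = 100_000

# Hypothetical 2-variable target correlation matrix.
R = np.array([[1.0, 0.8],
              [0.8, 1.0]])
eigvals, P = np.linalg.eigh(R)

# Start from uncorrelated Uniform(0,1) samples.
U = rng.uniform(0.0, 1.0, size=(2, m))

# Apply the same linear transform as in the Gaussian recipe.
X = P @ (np.sqrt(eigvals)[:, None] * U)

# The rows of X are correlated per R (a linear map fixes the
# correlation independently of the input distribution)...
print(np.corrcoef(X))

# ...but each row is a weighted sum of two uniforms, so its support
# is no longer [0, 1] -- the marginal distribution has changed.
print(X.min(), X.max())
```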
     