[R] Positive Definite Matrix

Martin Maechler maechler at stat.math.ethz.ch
Sat Jan 29 23:40:10 CET 2011


>>>>> Prof Brian Ripley <ripley at stats.ox.ac.uk>
>>>>>     on Sat, 29 Jan 2011 21:00:19 +0000 (GMT) writes:

    > On Sat, 29 Jan 2011, David Winsemius wrote:
    >> 
    >> On Jan 29, 2011, at 12:17 PM, Prof Brian Ripley wrote:
    >> 
    >>> On Sat, 29 Jan 2011, David Winsemius wrote:
    >>> 
    >>>> 
    >>>> On Jan 29, 2011, at 10:11 AM, David Winsemius wrote:
    >>>> 
    >>>>> On Jan 29, 2011, at 9:59 AM, John Fox wrote:
>>>>> Dear David and Alex, I'd be a little careful about testing exact
>>>>> equality as in all(M == t(M)) and careful as well about a test
>>>>> such as all(eigen(M)$values > 0), since real arithmetic on a
>>>>> computer can't be counted on to be exact.
    >>>>> Which was why I pointed to that thread from 2005 and the
    >>>>> existing work that had been put into packages. If you want
    >>>>> to substitute all.equal for all, there might be fewer
    >>>>> numerical false alarms, but I would think there could be
    >>>>> other potential problems that might deserve warnings.
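A tolerance-based variant of such a check might look like the following sketch
(the helper name isPosDefTol and the default tol are made up here; the
tolerance should be adapted to the scale of M):

isPosDefTol <- function(M, tol = 1e-8) {
    if (!isSymmetric(M, tol = tol)) return(FALSE)   # symmetry up to numerical fuzz
    ev <- eigen(M, symmetric = TRUE, only.values = TRUE)$values
    all(ev > tol * max(abs(ev)))                    # positive relative to the largest
}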
    >>>> 
    >>>> In addition to the two "is." functions cited earlier, there is
    >>>> also a "posdefify" function by Maechler in the sfsmisc
    >>>> package: "Description: From a matrix m, construct a 'close'
    >>>> positive definite one."
    >>> 
    >>> But again, that is not usually what you want.  There is no
    >>> guarantee that the result is positive-definite enough that the
    >>> Cholesky decomposition will work.
    >> 
    >> I don't see a Cholesky decomposition being used in that
    >> function. To my reading it appears to be based on an
    >> eigendecomposition.

    > Correct, but my point is that one does not usually want a
    > "close" positive definite one, but a 'square root'.
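A small illustration of that point (a sketch; the matrix A below is a made-up
example): a matrix whose smallest eigenvalue has merely been clipped to zero
is only positive *semi*-definite, and chol() will typically still refuse it,
whereas the pivoted form can cope:

A <- diag(c(1, 1, 0))   # "repaired" matrix: smallest eigenvalue clipped to exactly zero
try(chol(A))            # typically fails: A is not strictly positive definite
chol(A, pivot = TRUE)   # pivoted Cholesky copes with the semi-definite case (may warn)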

Probably, but the more "unusual" case is still not at all uncommon,
and the help page of posdefify, which David mentioned, has some
relatively recent literature references.
Further, it has had the following note for a while:

>>  Note:
>> 
>>       As we found out, there are more sophisticated algorithms to solve
>>       this and related problems.  See the references and the ‘nearPD()’
>>       function in the ‘Matrix’ package.

and when you look at the nearPD() function in the Matrix package, and
notably its help page, you see that it is based on relatively recent
published research and algorithms from the numerical analysis community.
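For completeness, a minimal sketch of how nearPD() is typically used (the input
matrix C here is a made-up, not-quite-positive-definite "correlation" matrix):

library(Matrix)
C <- matrix(c(1.0, 0.9, 0.7,
              0.9, 1.0, 0.3,
              0.7, 0.3, 1.0), nrow = 3)  # has one slightly negative eigenvalue
res <- nearPD(C, corr = TRUE)            # nearest positive definite correlation matrix
res$mat                                  # the repaired matrix
res$eigenvalues                          # its eigenvalues, all positive after the repair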

Martin


    >> 
    >> -- 
    >> David.
    >> 
    >> 
    >>> Give up on Cholesky factors unless you have a matrix you know
    >>> must be symmetric and strictly positive definite, and use the
    >>> eigendecomposition instead (setting negative eigenvalues to
    >>> zero).  You can then work with the factorization to ensure
    >>> that (for example) variances are always non-negative because
    >>> they are always computed as sums of squares.
    >>> 
    >>> This sort of thing is done in many of the multivariate
    >>> analysis calculations in R (e.g. cmdscale) and in
    >>> well-designed packages.
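A minimal sketch of that eigendecomposition approach (the helper name psdSqrt
is made up): clip negative eigenvalues to zero and keep a factor L with
L %*% t(L) equal to the repaired matrix, so that any quadratic form x' M x
computed through L is a sum of squares and hence non-negative:

psdSqrt <- function(M) {
    e <- eigen((M + t(M)) / 2, symmetric = TRUE)  # symmetrise, then decompose
    lam <- pmax(e$values, 0)                      # set negative eigenvalues to zero
    e$vectors %*% diag(sqrt(lam), length(lam))    # L with L %*% t(L) == repaired M
}
## a "variance" x' M x is then computed as  sum((t(psdSqrt(M)) %*% x)^2)  >= 0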
    >>> 
    >>>> 
    >>>> -- 
    >>>> David.
    >>>>>>> On Jan 29, 2011, at 7:58 AM, David Winsemius wrote:
    >>>>>>>> On Jan 29, 2011, at 7:22 AM, Alex Smith wrote:
    >>>>>>>>> Hello, I am trying to determine whether a given matrix is symmetric and
    >>>>>>>>> positive definite. The matrix has real-valued elements.
    >>>>>>>>> I have been reading about the Cholesky method, and another method is
    >>>>>>>>> to find the eigenvalues. I can't understand how to implement either of
    >>>>>>>>> the two. Can someone point me in the right direction? I have used
    >>>>>>>>> ?chol to see the help, but if the matrix is not positive definite it
    >>>>>>>>> comes up with an error. I know how to get the eigenvalues, but how can
    >>>>>>>>> I then put this into a program to check them, as they just come up as
    >>>>>>>>> $values.
    >>>>>>>>> Is checking that the eigenvalues are positive enough to determine
    >>>>>>>>> whether the matrix is positive definite?
    >>>>>>>> That is a fairly simple linear algebra fact that googling or pulling
    >>>>>>>> out a standard reference should have confirmed.
    >>>>>>> Just to be clear (since on the basis of some off-line communications it
    >>>>>>> did not seem to be clear): a real, symmetric matrix is Hermitian (and
    >>>>>>> therefore all of its eigenvalues are real). Further, it is
    >>>>>>> positive-definite if and only if its eigenvalues are all positive.
    >>>>>>> qwe <- c(2, -1, 0, -1, 2, -1, 0, 1, 2)
    >>>>>>> q <- matrix(qwe, nrow = 3)
    >>>>>>> isPosDef <- function(M) {
    >>>>>>>     if (all(M == t(M))) {            # first test symmetry
    >>>>>>>         all(eigen(M)$values > 0)     # then test for positive eigenvalues
    >>>>>>>     } else {
    >>>>>>>         FALSE                        # not symmetric
    >>>>>>>     }
    >>>>>>> }
    >>>>>>>> isPosDef(q)
    >>>>>>> [1] FALSE
    >>>>>>>>> m
    >>>>>>>>> [,1] [,2] [,3] [,4] [,5]
    >>>>>>>>> [1,]  1.0  0.0  0.5 -0.3  0.2
    >>>>>>>>> [2,]  0.0  1.0  0.1  0.0  0.0
    >>>>>>>>> [3,]  0.5  0.1  1.0  0.3  0.7
    >>>>>>>>> [4,] -0.3  0.0  0.3  1.0  0.4
    >>>>>>>>> [5,]  0.2  0.0  0.7  0.4  1.0
    >>>>>>>> isPosDef(m)
    >>>>>>> [1] TRUE
    >>>>>>> You might want to look at prior postings by people more knowledgeable 
    >>>>>>> than
    >>>>>>> me:
    >>>>>>> http://finzi.psych.upenn.edu/R/Rhelp02/archive/57794.html
    >>>>>>> Or look at what are probably better solutions in available packages:
    >>>>>>> http://finzi.psych.upenn.edu/R/library/corpcor/html/rank.condition.html
    >>>>>>> http://finzi.psych.upenn.edu/R/library/matrixcalc/html/is.positive.definite.html
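A sketch of how those package functions are typically called (assuming the
corpcor and matrixcalc packages are installed; m is the 5 x 5 matrix quoted
below):

matrixcalc::is.positive.definite(m)  # eigenvalue test with a numerical tolerance
corpcor::rank.condition(m)           # rank and condition number of m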
    >>>>>>> --
    >>>>>>> David.
    >>>>>>>>> this is the matrix that I know is positive definite.
    >>>>>>>>> eigen(m)
    >>>>>>>>> $values
    >>>>>>>>> [1] 2.0654025 1.3391291 1.0027378 0.3956079 0.1971228
    >>>>>>>>> $vectors
    >>>>>>>>> [,1]        [,2]         [,3]        [,4]        [,5]
    >>>>>>>>> [1,] -0.32843233  0.69840166  0.080549876  0.44379474  0.44824689
    >>>>>>>>> [2,] -0.06080335  0.03564769 -0.993062427 -0.01474690  0.09296096
    >>>>>>>>> [3,] -0.64780034  0.12089168 -0.027187620  0.08912912 -0.74636235
    >>>>>>>>> [4,] -0.31765040 -0.68827876  0.007856812  0.60775962  0.23651023
    >>>>>>>>> [5,] -0.60653780 -0.15040584  0.080856897 -0.65231358  0.42123526
    >>>>>>>>> and these are the eigenvalues and eigenvectors.
    >>>>>>>>> I thought of using
    >>>>>>>>> eigen(m,only.values=T)
    >>>>>>>>> $values
    >>>>>>>>> [1] 2.0654025 1.3391291 1.0027378 0.3956079 0.1971228
    >>>>>>>>> $vectors
    >>>>>>>>> NULL
    >>>>>>>>> m <- matrix(scan(textConnection("
    >>>>>>>> 1.0  0.0  0.5 -0.3  0.2
    >>>>>>>> 0.0  1.0  0.1  0.0  0.0
    >>>>>>>> 0.5  0.1  1.0  0.3  0.7
    >>>>>>>> -0.3  0.0  0.3  1.0  0.4
    >>>>>>>> 0.2  0.0  0.7  0.4  1.0
    >>>>>>>> ")), 5, byrow=TRUE)
    >>>>>>>> #Read 25 items
    >>>>>>>>> m
    >>>>>>>> [,1] [,2] [,3] [,4] [,5]
    >>>>>>>> [1,]  1.0  0.0  0.5 -0.3  0.2
    >>>>>>>> [2,]  0.0  1.0  0.1  0.0  0.0
    >>>>>>>> [3,]  0.5  0.1  1.0  0.3  0.7
    >>>>>>>> [4,] -0.3  0.0  0.3  1.0  0.4
    >>>>>>>> [5,]  0.2  0.0  0.7  0.4  1.0
    >>>>>>>> all( eigen(m)$values >0 )
    >>>>>>>> #[1] TRUE
    >>>>>>>>> Then I thought of using a logical expression to determine if there are
    >>>>>>>>> negative eigenvalues, but it didn't work. I don't know what this error is:
    >>>>>>>>> b<-(a<0)
    >>>>>>>>> Error: (list) object cannot be coerced to type 'double'
    >>>>>>>> ??? where did "a" and "b" come from?
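The error quoted just above is presumably because 'a' was the whole list
returned by eigen(), not the numeric vector of eigenvalues; a sketch of what
was probably intended:

a <- eigen(m)        # a list with components $values and $vectors
## a < 0             # comparing the whole list with 0 fails
b <- a$values < 0    # compare the numeric eigenvalue vector instead
any(b)               # TRUE if any eigenvalue is negative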
    >>> 
    >> 
    >> David Winsemius, MD
    >> West Hartford, CT

    > -- 
    > Brian D. Ripley,                  ripley at stats.ox.ac.uk
    > Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
    > University of Oxford,             Tel:  +44 1865 272861 (self)
    > 1 South Parks Road,                     +44 1865 272866 (PA)
    > Oxford OX1 3TG, UK                Fax:  +44 1865 272595



