Nonius

Founding Member Nonius Unbound

Total Posts: 12716 
Joined: Mar 2004 


I guess you are right that the det-equals-zero condition will work. But I want to do something a little more complex. I want to understand the structure of rank 1, rank 2, ..., rank n matrices, and how they all fit together in a nice stratification. This has applications, for example, in PCA, where you say fuck it, I'm gonna chop off all eigenvalues less than epsilon. Well, those lower-rank matrices given by chopping off live in a space that borders the space of the next highest rank. Anyway..... 
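The eigenvalue-chopping picture is easy to play with in NumPy; the matrix and the threshold epsilon below are illustrative, not anything from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# a random symmetric positive definite 5x5 matrix (full rank)
A = rng.standard_normal((5, 5))
C = A @ A.T

# PCA-style truncation: chop off every eigenvalue below epsilon
eps = 1.0
w, V = np.linalg.eigh(C)
C_low = V @ np.diag(np.where(w < eps, 0.0, w)) @ V.T

# the truncated matrix has lower (or equal) rank; as eps shrinks,
# C_low moves back toward C -- the lower-rank stratum sits on the
# boundary of the higher-rank one
print(np.linalg.matrix_rank(C), np.linalg.matrix_rank(C_low, tol=1e-8))
print(np.linalg.norm(C - C_low))
```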
Chiral is Tyler Durden 




Hi,
I see your point and I can figure out what you mean by stratification (like a Russian doll, to me). Nevertheless, what do you think about my little calculation for the 3-dim case (with the two spheres)?
Moreover, what do you think about zero eigenvalues as possible clues to the existence of arbitrage?
Regards 
I am the dark one, the widower; the unconsoled,
The prince of Aquitaine at his stricken tower:
My sole star is dead, and my constellated lute
Bears the black sun of Melancholia.



pj


Total Posts: 3362 
Joined: Jun 2004 


Revenons à nos moutons (back to the subject)... (missing matrix correlation values)
Has anybody studied the article "Positive definite completions and determinant maximization" by Glunt, Hayden, Johnson, and Tarazaga, Linear Algebra and its Applications 288 (1999)? Especially the algorithm proposed there?

They fuck us, and we grow stronger... 




Do you have it and if so could you post it? 



pj


Total Posts: 3362 
Joined: Jun 2004 


I'm not sure that I can post it. (Check your mail.) 




Thanks
I'm going to have a look 



Nonius

Founding Member Nonius Unbound

Total Posts: 12716 
Joined: Mar 2004 


El Desdichado,
Not sure I understand your calculation.
But on arbitrage, I am not sure one could do anything with this. A matrix with a zero eigenvalue, in just about any interesting measure on the space of positive definite matrices, is going to lie in a set of measure zero. On the other hand, just about every low-dimensional multi-factor model for rates that I know of implicitly assumes that there are a lot of zero eigenvalues. If you believed, with a high degree of certainty, that your "random" correlation matrix could not have such a property, then you would believe that average correlation is too high, and thus that swaptions are overpriced compared with caps. The trade would be short swaptions, long caps (the analog of shorting index vol and going long constituent vol; can't remember if they call that long or short dispersion), which, incidentally, is the opposite direction that people traded about 6 to 7 years ago. 




If you have a zero eigenvalue then you have a nonzero vector v with Cov_Matrix * v = 0, so you have a portfolio (the vector) with no risk (no variance under a specific measure). So the return of this portfolio must be the return of the riskless asset under this specific measure, unless you have an arbitrage. Am I wrong? 
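The point can be checked numerically: a singular covariance matrix yields a nonzero portfolio with zero variance. The 3-asset factor structure below is made up for illustration:

```python
import numpy as np

# hypothetical factor loadings: asset 2's return is exactly the
# average of assets 0 and 1, so the covariance matrix is singular
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
cov = B @ B.T                    # 3x3, rank 2 => one zero eigenvalue

w, V = np.linalg.eigh(cov)       # eigenvalues in ascending order
v0 = V[:, 0]                     # eigenvector of the zero eigenvalue

# the portfolio with weights v0 has variance v0' cov v0 = 0:
# riskless, so it must earn the riskless rate or admit arbitrage
print(w[0], v0 @ cov @ v0)       # both ~0 up to rounding
```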



pj


Total Posts: 3362 
Joined: Jun 2004 


I've just decided to share my joy.
Forget Rebonato and Nonius...
"Positive definite completions and determinant maximization" by Glunt, Hayden, Johnson, and Tarazaga, Linear Algebra and its Applications 288 (1999)
rulez




Nonius

Founding Member Nonius Unbound

Total Posts: 12716 
Joined: Mar 2004 


dooood, what, exactly, is my method? 


pj


Total Posts: 3362 
Joined: Jun 2004 


Monsieur,
You were proposing something on this very thread. I am not capable of explaining what,
but it had something to do with numerically solving algebraic equations of nth degree.
BTW, wish me luck; tomorrow I'm off to Dublin. 



Nonius

Founding Member Nonius Unbound

Total Posts: 12716 
Joined: Mar 2004 


very special polynomials, like det A = 0, but with a lot of values plugged into it. 


mtsm


Total Posts: 210 
Joined: Dec 2010 


Do you mind me bouncing this up a few years and asking what you ended up doing, pj?
Did you collect and try various methods?
Thanks,
mtsm 




mtsm


Total Posts: 210 
Joined: Dec 2010 


Hi pj,
thanks for the paper you sent. I was spending a little time on it, wondering if and how fast I could implement it.
Have you given much thought to the implications of maximizing the determinant of the matrix? I mean, what are the implications of such a condition?
I have not spent much time thinking about positive semidefinite matrices, but in the case of rates, for example, I imagine that many historical correlation matrices have quite low rank, so are fairly degenerate. Wouldn't the procedure described in the paper tend to result in full-rank matrices?
Thanks 



pj


Total Posts: 3362 
Joined: Jun 2004 


Hi mtsm,
> Wouldn't the procedure described in the paper tend to result in full-rank matrices?
Err, yes. The authors seek to maximise the determinant, which is the product of the eigenvalues, which can be interpreted as maximising the volume of the hyperellipsoid... Thus the variables will be kept as independent as possible.
I haven't thought about the degenerate cases, which are the ones of interest to you, though. 
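The "as independent as possible" point is easiest to see in the 2x2 case, where det = 1 - r^2 is maximised at r = 0; a quick illustrative check:

```python
import numpy as np

# For the 2x2 correlation matrix [[1, r], [r, 1]], det = 1 - r^2:
# the determinant is largest at r = 0, i.e. at independence.
for r in (0.0, 0.5, 0.9):
    C = np.array([[1.0, r], [r, 1.0]])
    w = np.linalg.eigvalsh(C)                # eigenvalues 1 - r and 1 + r
    # the determinant equals the product of the eigenvalues
    print(r, np.linalg.det(C), np.prod(w))   # det = 1, 0.75, 0.19
```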
Job vacancy: "Programmer, Psychological Support Service"
"Hello! We've got an error! Nothing works, blah blah blah."
"Would you like to talk about it?" 



Nonius

Founding Member Nonius Unbound

Total Posts: 12716 
Joined: Mar 2004 


sorry, but that sounds pretty trivial. 


gnarsed


Total Posts: 87 
Joined: Feb 2008 


How about doing a low-rank approximation to the correlation matrix? Basically, you'd be looking for a PSD low-rank matrix which minimizes a loss (usually squared error) over the observed correlation entries. The rank constraint can be approximated using trace-norm (sum of singular values, aka nuclear norm) regularization, leading to a convex optimization problem. Then you can fit an entire regularization path and get a sequence of solutions with increasing ranks (à la lasso).
There is a fair amount of published work on such methods. 
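A toy sketch of the shrinkage step behind such methods: singular value soft-thresholding, the proximal operator of the nuclear norm. A real completion method would also restrict the squared-error loss to the observed entries and enforce PSD; this sketch, on made-up data, only shows the rank decreasing as the regularization tau grows:

```python
import numpy as np

def svt(M, tau):
    # soft-threshold the singular values of M by tau (prox of the
    # nuclear norm): small singular values go to zero, giving a
    # low-rank approximation whose rank grows as tau shrinks
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(1)
B = rng.standard_normal((6, 2))
C = B @ B.T                          # rank-2 Gram matrix
d = np.sqrt(np.diag(C))
C = C / np.outer(d, d)               # normalize to a correlation matrix
N = 0.05 * rng.standard_normal((6, 6))
C_obs = C + (N + N.T) / 2            # noisy observed correlations

# sweeping tau traces a crude regularization path: larger tau, lower rank
for tau in (3.0, 1.0, 0.0):
    print(tau, np.linalg.matrix_rank(svt(C_obs, tau), tol=1e-8))
```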




rod


Total Posts: 375 
Joined: Nov 2006 


Correlation matrices:
- are symmetric and positive semidefinite
- have 1's on their main diagonal.
Filling in missing values is known as the positive semidefinite matrix completion problem, which boils down to a semidefinite program (SDP) with equality constraints of the form $x_{ii} = 1$. This is easy. Things get interesting when one starts thinking about the rank. Here's one paper on it:
Marianna Eisenberg-Nagy, Monique Laurent, Antonios Varvitsiotis, Complexity of the positive semidefinite matrix completion problem with a rank constraint, 2012.
Adding to Nonius's 2005 posts on this thread, here's the 3D elliptope, which corresponds to the set of 3 x 3 correlation matrices:
[image: the 3D elliptope; not preserved]
If I add the constraint $x_{13} = 0$, then I am intersecting the elliptope with a plane, which produces a 2D set in 3D space.
This image shows that using SDP with a nonzero objective function, we will always hit the boundary, where the determinant is zero. If we hit one of the 4 pointy vertices, then the rank is 1. 
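The elliptope description translates into a one-line membership test: a point (x12, x13, x23) is in the elliptope iff the corresponding 3x3 matrix is PSD. The sample points below are illustrative:

```python
import numpy as np

def in_elliptope(x12, x13, x23):
    # is [[1,x12,x13],[x12,1,x23],[x13,x23,1]] positive semidefinite,
    # i.e. a valid 3x3 correlation matrix (a point of the elliptope)?
    C = np.array([[1.0, x12, x13],
                  [x12, 1.0, x23],
                  [x13, x23, 1.0]])
    return np.linalg.eigvalsh(C)[0] >= -1e-12

print(in_elliptope(0.9, 0.9, 0.9))    # True: interior point
print(in_elliptope(0.9, -0.9, 0.9))   # False: outside (det < 0)
print(in_elliptope(1.0, 1.0, 1.0))    # True: a rank-1 "pointy vertex"
```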



aickley


Total Posts: 37 
Joined: Oct 2008 


I was having the same question and also thought about using a general SDP solver.
Then I found this. Here the author claims his (more complex) method converges for 3000x3000 matrices in about half a minute. The MATLAB and C code is there.
The papers are (to make this post future-proof):
1. Nicholas J. Higham, Computing the Nearest Correlation Matrix—A Problem from Finance, IMA J. Numer. Anal. 22, 329–343, 2002.
2. Hou-Duo Qi and Defeng Sun, A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix, SIAM J. Matrix Anal. Appl. 28, 360–385, 2006.
3. Ruediger Borsdorf and Nicholas J. Higham, A Preconditioned Newton Algorithm for the Nearest Correlation Matrix, IMA J. Numer. Anal. 30, 94–107, 2010.
4. Ruediger Borsdorf, Nicholas Higham and Marcos Raydan, Computing a Nearest Correlation Matrix with Factor Structure, SIAM J. Matrix Anal. Appl. 31, 2603–2622, 2010.
All this stuff seems to be implemented in the NAG library (I did not use it). 




aickley


Total Posts: 37 
Joined: Oct 2008 


Then I found this review of specialized iterative methods. The simplest method is to alternate between projecting onto the cone of positive semidefinite matrices and projecting onto the set of matrices with unit diagonal.
[Cannot put this paragraph in the previous post] 
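The simple alternating scheme described above can be sketched as follows. Note this plain version omits the Dykstra correction that Higham's 2002 paper uses to guarantee convergence to the *nearest* correlation matrix; it is only the bare alternate-projections idea, on a made-up indefinite input:

```python
import numpy as np

def alternating_projections(A, n_iter=500):
    # alternate between (1) the affine set of symmetric matrices with
    # unit diagonal and (2) the PSD cone (zero out negative eigenvalues)
    X = (A + A.T) / 2
    for _ in range(n_iter):
        np.fill_diagonal(X, 1.0)                     # unit-diagonal projection
        w, V = np.linalg.eigh(X)
        X = V @ np.diag(np.maximum(w, 0.0)) @ V.T    # PSD projection
    return X

# a symmetric matrix with unit diagonal that is NOT positive semidefinite
A = np.array([[1.0,  0.9,  0.7],
              [0.9,  1.0, -0.9],
              [0.7, -0.9,  1.0]])
X = alternating_projections(A)
print(np.linalg.eigvalsh(X).min())   # PSD by construction (last step)
print(np.diag(X))                    # close to, but not exactly, all ones
```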



mtsm


Total Posts: 210 
Joined: Dec 2010 


There has been a massive massive massive amount of work on this and related subjects. A recent book in this space is https://www.amazon.com/Generalized-Principal-Component-Interdisciplinary-Mathematics/dp/0387878106/ref=sr_1_1?s=books&ie=UTF8&qid=1477871420&sr=1-1&keywords=generalized+principal+component+analysis 



