Forums  > Off-Topic  > Advances in linear algebra, information theory, signal processing, optimal control and statistics  


Total Posts: 1251
Joined: Jun 2007
Posted: 2020-09-07 13:59
A bit of a confused question.

I read and hear from people I respect a lot (or borderline admire), some of them from this board, about "interesting advances in statistics".

I would love to get pointers to books, papers, blog posts, or even just the buzzwords of rather recent or newly rediscovered topics and results in:
- statistics
- optimal control
- optimization in general
- linear algebra
- information theory
- signal processing

I think that would be interesting for many on this board. Anything you find interesting, or know or suspect might be of practical value.

I excluded ML here on purpose. I will probably open a new topic on that one.

I came here and saw you and your people smiling, and said to myself: Maggette, screw the small talk, let your fists do the talking...


Total Posts: 89
Joined: Jul 2018
Posted: 2020-09-07 15:18
I think on the statistics side there's a lot of interesting work emerging in high dimensional statistics motivated by random matrix theory and geometry. I think the work I'm going to reference can be seen as the intersection of all the topics you mentioned!

On the RMT side there's a lot of nice work related to covariance matrices. Ledoit and Wolf extended their famous linear shrinkage to nonlinear shrinkage. The results are backed by some solid theory and impressive empirical results. The legendary Donoho and co have pinned down optimal singular value shrinkage, an old topic but one for which they have laid down some seriously solid theory. For a good introduction to this literature see:
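To make the shrinkage idea concrete, here is a toy numpy sketch of plain linear shrinkage toward a scaled identity. Note this is not Ledoit-Wolf's estimator: their real contribution is the optimal data-driven choice of the intensity (and, in the nonlinear version, per-eigenvalue shrinkage); here the intensity is just fixed by hand to show the mechanics.

```python
import numpy as np

def linear_shrinkage(X, alpha):
    """Shrink the sample covariance of X (n_obs x n_assets) toward a
    scaled identity: (1 - alpha) * S + alpha * mu * I, where mu is the
    average sample variance.  alpha is fixed by hand in this sketch;
    Ledoit-Wolf derive an optimal alpha from the data."""
    S = np.cov(X, rowvar=False)
    mu = np.trace(S) / S.shape[0]
    return (1.0 - alpha) * S + alpha * mu * np.eye(S.shape[0])

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))   # n << p: the sample covariance is singular
S_shrunk = linear_shrinkage(X, alpha=0.5)
# shrinkage lifts the zero eigenvalues of the singular sample covariance
print(np.linalg.eigvalsh(S_shrunk).min())
```

With 50 observations of 200 variables the raw sample covariance is rank-deficient; even this naive shrinkage pulls the smallest eigenvalues away from zero, which is what makes the estimate invertible for portfolio-style computations.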

Everyone knows about the 'low-dimensional manifold' hypothesis, but not a lot of people go further than stating or assuming it. I think the signal processing guys are doing some great work with it; take a look: Coifman is a signal processing legend and the idea is very neat (it utilises a bit of information geometry, which is cool if you haven't seen it before).

did you use VWAP or triple-reinforced GAN execution?


Total Posts: 367
Joined: Mar 2018
Posted: 2020-09-07 15:47
@doomanx This is great stuff! I personally have difficulty finding the threads of interest in the sea of arxiv.


Total Posts: 1176
Joined: Jun 2005
Posted: 2020-09-07 19:59
"Noise invariance" fascinates me.


Total Posts: 1251
Joined: Jun 2007
Posted: 2020-09-08 10:10
So far exactly what I hoped for.

Like Jurassic already said: it helps to filter out interesting stuff.

I recently got interested in the Matrix Profile. I wanted to do some experiments, but didn't have the time.

Look at the slides attached to this one:

UCR Matrix Profile site

site by Yue Lue
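For anyone who wants to see what the matrix profile actually is before reading the slides: for each length-m subsequence it records the z-normalised distance to that subsequence's nearest non-trivial neighbour. Below is a brute-force sketch of that definition only; the real algorithms on the UCR site (STAMP/STOMP/SCRIMP) use FFT tricks to make this fast.

```python
import numpy as np

def naive_matrix_profile(T, m):
    """Brute-force matrix profile: for every length-m subsequence of T,
    the z-normalised Euclidean distance to its nearest non-overlapping
    neighbour.  O(n^2 m) by construction; for illustration only."""
    n = len(T) - m + 1
    subs = np.array([T[i:i + m] for i in range(n)])
    # z-normalise each subsequence so matches are shape-based
    subs = (subs - subs.mean(axis=1, keepdims=True)) / subs.std(axis=1, keepdims=True)
    profile = np.full(n, np.inf)
    excl = m // 2                                   # exclusion zone
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - excl):i + excl + 1] = np.inf   # skip trivial matches
        profile[i] = d.min()
    return profile

# a motif planted twice shows up as near-zero entries in the profile
T = np.concatenate([np.random.default_rng(1).standard_normal(100),
                    np.sin(np.linspace(0, 4 * np.pi, 50)),
                    np.random.default_rng(2).standard_normal(100),
                    np.sin(np.linspace(0, 4 * np.pi, 50))])
mp = naive_matrix_profile(T, m=50)
print(int(mp.argmin()))   # index of one half of the best motif pair
```

The planted sine motif sits at indices 100 and 250, so the profile dips to zero there: that dip is exactly what motif-discovery applications look for.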




Total Posts: 71
Joined: Feb 2018
Posted: 2020-09-08 10:27
You asked about control theory.

One obvious use case:


Total Posts: 89
Joined: Jul 2018
Posted: 2020-09-08 23:23
@Maggette someone was talking to me about the Matrix Profile the other day. I have found that, in general, *machine learning* time series methodology (usually oriented toward social media companies and other large time-series database mining tasks) does not transfer over well to financial applications, for all the usual reasons, but some of it is interesting from a methodology standpoint. The time series workshop at NIPS is actually pretty good; you can find a lot of good ideas there.



Total Posts: 1251
Joined: Jun 2007
Posted: 2020-09-09 10:03
I am also pretty skeptical about most "ML for time series in finance" stuff, especially in the non-stationary, bad signal-to-noise setting of finance.

I still think that ML for time series is an underrepresented and under-researched topic. Almost all business and many scientific and engineering problems have a temporal component.

I am still excited though, and at the beginning of my ML for time series experiments. To this point I have exactly one "ML for time series" application in production, but lots of ideas and stuff to try out. Most of the blog posts/Medium entries on the topic are pretty useless though.

Thanks for the NIPS pointer.



Total Posts: 1221
Joined: Feb 2007
Posted: 2020-09-09 11:47
Very good suggestions by Doomanx.

I've shared this opinion already, but be very wary of anything sold by Keogh after he wrote this. He excels at marketing. His other results have been pretty marginal: using edit distance and a histogram (aka SAX) isn't exactly a world-historical innovation either, but the way it is spoken of you'd think the man had discovered fire rather than published a particular and undistinguished implementation of an ancient idea. Or I could be wrong and matrix profiles really are amazing like the HN weebs tell me.
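For reference, the "histogram" recipe referred to (SAX) really is as simple as it sounds: z-normalise, average over equal-width segments, then bucket each average into a symbol. A minimal sketch with a 4-letter alphabet (the alphabet size and demo input are just illustrative choices):

```python
import numpy as np

def sax4(ts, n_segments):
    """Minimal SAX with a 4-letter alphabet: z-normalise, average over
    equal-width segments (the PAA step), then bucket each segment mean
    using the N(0,1) quartile breakpoints -0.6745, 0, 0.6745."""
    ts = np.asarray(ts, dtype=float)
    ts = (ts - ts.mean()) / ts.std()
    paa = ts.reshape(n_segments, -1).mean(axis=1)
    return "".join("abcd"[i] for i in np.searchsorted([-0.6745, 0.0, 0.6745], paa))

word = sax4(np.concatenate([np.zeros(8), np.ones(8)]), n_segments=4)
print(word)   # a step function discretises to low-low-high-high
```

The resulting symbol strings are what the distance measures (edit distance and friends) are then run on.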

I think the basket of ideas in conformal prediction is a genuine statistical innovation. I also think Dempster-Shafer might be for signal processing, but I don't understand it well enough to really opine (it was someone on here who mentioned it; Shafer also helped invent conformal prediction). The same guys have done some interesting work using approximations of Kolmogorov entropy (aka classical compression a la LZW) to construct hypothesis tests on wild data. I don't know if these are true innovations or not, but I like the flavor of them anyway.
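A toy illustration of the compression-as-entropy flavor mentioned above (this is just the underlying intuition that compressed size is a practical stand-in for Kolmogorov complexity, not the actual hypothesis tests the post refers to):

```python
import zlib
import numpy as np

def compression_ratio(data: bytes) -> float:
    """Compressed size over raw size: a crude, practical proxy for
    entropy.  Structured data compresses well; data that looks
    iid-uniform does not."""
    return len(zlib.compress(data, 9)) / len(data)

rng = np.random.default_rng(0)
noise = rng.integers(0, 256, size=10_000, dtype=np.uint8).tobytes()
structured = np.tile(np.arange(256, dtype=np.uint8), 40)[:10_000].tobytes()
r_noise, r_structured = compression_ratio(noise), compression_ratio(structured)
print(r_noise, r_structured)   # near 1 for noise, near 0 for structure
```

The gap between the two ratios is what a compression-based test statistic exploits: under a "pure noise" null the data should be incompressible, so a ratio well below 1 is evidence of structure.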

Linear algebra has had numerous innovations in the last couple of decades; I've lost track of them all, but they do get applied in large-scale ML techniques. I don't know why non-negative matrix factorization is considered "ML", but that's the world we live in. Similarly, there are a bunch of approximate low-rank factorization techniques based on probabilistic or online approaches which make "very beeeg linear algebra" possible. I think some of them are in Vowpal Wabbit and similar gizmos, but they've definitely been inadequately exploited and most people are not aware of them at all.
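One concrete example of the probabilistic approaches mentioned: the randomized range-finder style of low-rank SVD (a minimal sketch in the spirit of Halko-Martinsson-Tropp, not a production implementation). The expensive SVD is done only on a small projected matrix:

```python
import numpy as np

def randomized_svd(A, rank, n_oversample=10, seed=0):
    """Randomised low-rank SVD sketch: project A onto a random
    subspace, orthonormalise to capture (approximately) its range,
    then do the dense SVD on the small projected matrix only."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], rank + n_oversample))
    Q, _ = np.linalg.qr(A @ Omega)          # approximate range of A
    B = Q.T @ A                             # small (rank + p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

# on an exactly rank-5 matrix the sketch recovers it to machine precision
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 300))
U, s, Vt = randomized_svd(A, rank=5)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
print(err)
```

The only pass over the full matrix is the two matrix products, which is why these methods scale to matrices that never fit a dense SVD.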

Column subset selection and CUR decompositions are ideas which absolutely blew my mind the first time I saw them: they're quite recent. If people had thought of these before the SVD or eigenvectors, everything from quantum mechanics to quantitative finance would look completely different. Of course they don't make any sense unless you're dealing with very large matrices, but nowadays we do actually live in a world where most interesting matrices are large.
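A minimal sketch of a leverage-score CUR, just to show the shape of the idea (the sampling scheme below is the simplest one; the literature has better ones). The point that makes it mind-blowing: C and R are actual columns and rows of the data, so the factors stay interpretable in a way SVD factors never are.

```python
import numpy as np

def cur(A, k, c, r, seed=0):
    """Toy CUR: sample c columns / r rows of A with probability given
    by their rank-k leverage scores, then solve the small core
    U = C^+ A R^+ so that C @ U @ R approximates A."""
    rng = np.random.default_rng(seed)
    Usvd, _, Vt = np.linalg.svd(A, full_matrices=False)
    col_p = (Vt[:k] ** 2).sum(axis=0) / k        # column leverage scores
    row_p = (Usvd[:, :k] ** 2).sum(axis=1) / k   # row leverage scores
    cols = rng.choice(A.shape[1], size=c, replace=False, p=col_p)
    rows = rng.choice(A.shape[0], size=r, replace=False, p=row_p)
    C, R = A[:, cols], A[rows, :]
    return C, np.linalg.pinv(C) @ A @ np.linalg.pinv(R), R

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 80))  # exactly rank 3
C, Ucore, R = cur(A, k=3, c=6, r=6)
err = np.linalg.norm(A - C @ Ucore @ R) / np.linalg.norm(A)
print(err)
```

On an exactly low-rank matrix, any column/row sample that spans the column/row space reconstructs A perfectly; on real data the leverage scores are what keep the sampled subset informative.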

A friend of mine tried to rope me into going through a chunk of these new ideas and writing a practical textbook on it. I might actually do this one day if nobody else does (nobody else has in the 6 years since he suggested it). Of course neither one of us has any academic chops, and my friend (who knows vastly more about it than me) doesn't even have a Ph.D., meaning nobody would force students to purchase our book, and so we'd never make any useful amount of money doing it. I have no interest in adding to my glory, so it would be written for purely selfish motives; maybe recreational math to keep me from going senile as I decay into my elementary hydrocarbons. Frankly my misanthropy levels are so high at this point I'd write it in some un-google-translatable dead language like Crimean Gothic or Medieval Latin to deny it to most of the people who could benefit from it. Meanwhile I have useful work to do.

"Learning, n. The kind of ignorance distinguishing the studious."


Total Posts: 44
Joined: Sep 2009
Posted: 2020-09-09 12:19
Ivan Oseledets has a forthcoming book in this area called "Numerical Tensor Methods" which I am very much looking forward to.


Total Posts: 355
Joined: Feb 2014
Posted: 2020-09-09 12:37
@jslade: you should definitely do it and release that knowledge for humanity! Taleb shows that a good strategy is to call out a lot of overhyped academic folks who are doing it wrong (or don't understand it); that increases the sales.

First Commander of the USS Enterprise


Total Posts: 367
Joined: Mar 2018
Posted: 2020-09-09 13:29
@Maggette (and others): isn't the hunch, though, that Renaissance Technologies works on some kind of ML applied to financial markets? It would be strange for speech recognition guys not to have tried something similar. Is it possible that they have made a breakthrough in this area? If they had, wouldn't there be hints from other authors of papers on arxiv?


Total Posts: 446
Joined: Jan 2015
Posted: 2020-09-09 14:34
My vote would be for compressive sensing. In terms of finance applications, I think there's a lot you can do with sparse factor decomposition.

When it comes to equity models, we know that traditional PCAs crap out after four or five eigenvectors. Most of the remaining covariance structure is probably sparse, which isn't suited for traditional decomposition techniques. So we hand-encode factors like industry and country exposure. I think compressive sensing allows you to recover much more of this structure in a fully systematic way.
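To illustrate the kind of recovery compressive sensing buys you, here is a small sketch using orthogonal matching pursuit, a greedy stand-in for the l1 solvers usually used (all dimensions and signals below are made up for the demo, not an equity model):

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Orthogonal matching pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit least squares on the
    support chosen so far."""
    residual, support = y.copy(), []
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# recover a 5-sparse signal in 300 dims from only 60 random measurements
rng = np.random.default_rng(0)
n, p, s = 60, 300, 5
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
idx = rng.choice(p, s, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], s) * rng.uniform(1, 3, s)
y = A @ x_true
x_hat = omp(A, y, n_nonzero=s)
print(np.linalg.norm(x_hat - x_true))
```

Sixty measurements of a 300-dimensional vector would be hopeless for least squares, but with 5-sparse structure the recovery is essentially exact; that asymmetry is the whole compressive sensing story.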

Along these lines, another cool thing is the ability to recover sparse factors from much smaller datasets. You could potentially build a trading strategy around detecting short-lived cointegrations caused by large-scale portfolios rebalancing their positions.

Good questions outrank easy answers. -Paul Samuelson


Total Posts: 1176
Joined: Jun 2005
Posted: 2020-09-09 15:10

It is either
1. short-lived x large volume (detectable as excess liquidity), or

2. low volume x long-lived (more significant statistically?)

On 1: if there is a cash account in between, then it is just a usual whale trade (asset A to cash), because the link to another whale trade (cash to asset B) is impossible to establish, as it may happen on another day.

On 2: perhaps this is what you mean? But then it is "usual" stuff.


Total Posts: 89
Joined: Jul 2018
Posted: 2020-09-09 15:15
I've done some work on sparse factor decompositions. I agree that it's very hard to make it work with a PCA-type approach, as in the bulk of the spectrum (where in finance these sparse factors would lie) it's hard to disentangle the Marcenko-Pastur noise from just-smaller eigenvalues. Also, the ordering of the bulk spectrum changes a lot in time, so it can be hard to identify factors between timesteps. That's why you need these various shrinkage methods (some of which I referenced) and identification schemes (the simplest idea is identifying an eigenvector at time t with the eigenvector it has maximal inner product with at time t+1). Compressed sensing is another hat tip to Donoho too.
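The identification scheme in the parenthesis is a one-liner; a small sketch (the permutation-and-sign demo data below is just illustrative, standing in for eigenvectors at two timesteps):

```python
import numpy as np

def match_eigenvectors(V_prev, V_next):
    """Identify each eigenvector (column) at time t with the t+1
    eigenvector it has maximal |inner product| with.  Also returns a
    sign so the matched vector points the same way, since eigenvectors
    are only defined up to sign."""
    overlaps = V_prev.T @ V_next
    match = np.abs(overlaps).argmax(axis=1)
    signs = np.sign(overlaps[np.arange(len(match)), match])
    return match, signs

# demo: permute and flip some columns; matching undoes the relabelling
rng = np.random.default_rng(0)
V, _ = np.linalg.qr(rng.standard_normal((10, 4)))   # orthonormal columns
perm = np.array([2, 0, 3, 1])
V_next = V[:, perm] * np.array([1, -1, 1, -1])
match, signs = match_eigenvectors(V, V_next)
print(match, signs)
```

On real covariance spectra the overlaps are nowhere near ±1 in the bulk, which is exactly the identification difficulty described above; the scheme is only as good as the stability of the eigenvectors themselves.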

Choosing factors is just choosing a new basis for the data (if it's lower rank, it can be seen as thresholding some coefficients to zero). Eigenvectors of the covariance matrix are one way of doing this (it is the low-rank factorisation with minimal MSE), but it's not the only conceivable basis you could use to represent the data, and some of these might be naturally sparser than others.

In signal processing they start with what they call a 'dictionary' of basis elements (in classical settings these would be some kind of wavelet basis) that is usually overcomplete (as in there are more basis elements than dimensions in the data) and they try to find a sparse coefficient representation in this basis. This is akin to high dimensional regression where you can get a solution in terms of minimal l1 norm. It's called basis pursuit, invented by Tukey and brought to signal processing by Donoho. You might be interested to look at the organisation of the first author, who was his PhD student.

Edit: just so it's easier to understand: in signal processing papers the signal is usually assumed to be univariate, and the "dimensions" are in the time domain, i.e. the number of measurements of the univariate signal you have. But the methodology is not unique to this view.
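A sketch of the basis-pursuit-denoising flavor of this, via plain iterative soft-thresholding (ISTA) on a made-up overcomplete dictionary; real solvers (FISTA, LARS, interior-point methods) are faster, but the moving parts are the same:

```python
import numpy as np

def ista(D, y, lam, n_iter=2000):
    """Iterative soft-thresholding for the lasso / basis-pursuit-
    denoising problem min_x 0.5*||Dx - y||^2 + lam*||x||_1, with D an
    overcomplete dictionary (more atoms than signal dimensions)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = x - (D.T @ (D @ x - y)) / L                          # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)    # shrink
    return x

# overcomplete dictionary: 40-dim signal, 120 atoms, 3 of them active
rng = np.random.default_rng(0)
D = rng.standard_normal((40, 120))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
x_true = np.zeros(120)
x_true[[5, 50, 100]] = [2.0, -1.5, 1.0]
y = D @ x_true
x_hat = ista(D, y, lam=0.01)
print(np.sum(np.abs(x_hat) > 0.1))         # roughly the 3 active atoms
```

The l1 penalty is what forces most of the 120 coefficients to exactly zero, which is the sense in which the overcomplete representation is "sparse".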



Total Posts: 1251
Joined: Jun 2007
Posted: 2020-09-09 19:41
THX to everybody so far!!

@jslade: THX for the input. And of course, I would buy it :). Like svisstack pointed out beautifully, I guess ninja marketing like bashing Yann LeCun or something would be a great strategy :).

@Jurassic: probably bad wording on my side. Of course ML is used in finance; I do know people who use it. Even Matt Hurt talked about random forests in the 90s or something.

I was more talking about the stuff that is published in papers or books on the application in finance.
