I've spent the past few days reviewing statistics-related material: information theory, probability estimation, and so on. Many useful methods in PR and CV are based on information theory, e.g. HMMs, MRFs, the EM algorithm, ICA, and GMMs.
While googling, I found that mutual information is also used for feature selection, in what is called the minimum-Redundancy Maximum-Relevance (mRMR) method. Perhaps it could also be used for feature extraction, like LDA, except that it works in a "probability-based" space, adopting mutual information or the symmetric KL divergence as the metric.
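To make the idea concrete, here is a minimal sketch of mRMR-style selection, not taken from any particular paper: relevance is the mutual information I(f; y) between a feature and the label, redundancy is the average mutual information between a candidate feature and those already chosen, and features are picked greedily. The plug-in entropy estimator, the function names, and the toy data are all illustrative assumptions.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in nats for discrete 1-D arrays."""
    x_vals, xi = np.unique(x, return_inverse=True)
    y_vals, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(x_vals), len(y_vals)))
    np.add.at(joint, (xi, yi), 1.0)           # empirical joint counts
    joint /= joint.sum()                      # joint distribution p(x, y)
    px = joint.sum(axis=1, keepdims=True)     # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)     # marginal p(y)
    nz = joint > 0                            # skip zero cells (0 log 0 = 0)
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px * py)[nz])))

def mrmr_select(X, y, k):
    """Greedily pick k feature columns of X, maximizing
    I(f; y) minus the mean I(f; g) over already-selected g."""
    relevance = [mutual_information(X[:, f], y) for f in range(X.shape[1])]
    selected = [int(np.argmax(relevance))]    # seed with the most relevant
    while len(selected) < k:
        scores = {}
        for f in range(X.shape[1]):
            if f in selected:
                continue
            redundancy = np.mean([mutual_information(X[:, f], X[:, g])
                                  for g in selected])
            scores[f] = relevance[f] - redundancy
        selected.append(max(scores, key=scores.get))
    return selected

# Hypothetical toy data: binary label, four discrete features.
rng = np.random.default_rng(0)
n = 500
y = rng.integers(0, 2, size=n)
flip = lambda p: rng.choice([0, 1], size=n, p=[1 - p, p])
X = np.column_stack([
    y ^ flip(0.1),               # strongly informative
    y ^ flip(0.1),               # redundant with the first
    rng.integers(0, 3, size=n),  # pure noise
    y ^ flip(0.4),               # weakly informative
])
print(mrmr_select(X, y, k=2))    # indices of the two selected features
```

The redundancy penalty is what distinguishes this from simply ranking features by I(f; y): a near-copy of an already-chosen feature scores poorly even if it is highly relevant on its own.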
In the next few days, I hope to have time to review affine geometry and functional analysis. I really can't afford to waste a second. Of course, it is always a good habit to keep thinking :-)
Monday, March 19, 2007