2014-07-28
* GMM initialization is now safer and provides a working GMM when constructed
with only the dimensionality and number of Gaussians (301).
* Check for division by 0 in Forward-Backward Algorithm in HMMs (301).
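  To illustrate the class of guard this fix adds (the function name and
  fallback behavior here are hypothetical, not mlpack's actual code): when a
  forward-probability column sums to zero, normalizing it would divide by
  zero, so a safe implementation detects that case first.

  ```cpp
  #include <cstddef>
  #include <vector>

  // Normalize one column of forward probabilities in place. If the column
  // sums to zero (every path is impossible under the model), fall back to a
  // uniform distribution instead of dividing by zero. Illustrative sketch
  // only; not mlpack's implementation.
  inline double SafeNormalize(std::vector<double>& column)
  {
    double sum = 0.0;
    for (double v : column)
      sum += v;

    if (sum == 0.0)
    {
      const double uniform = 1.0 / column.size();
      for (double& v : column)
        v = uniform;
      return 0.0; // Caller can detect the degenerate step.
    }

    for (double& v : column)
      v /= sum;
    return sum; // Scaling factor, used to accumulate the log-likelihood.
  }
  ```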
* Fix MaxVarianceNewCluster (used when re-initializing clusters for k-means)
(301).
* Fixed implementation of Viterbi algorithm in HMM::Predict() (303).
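  For reference, the algorithm in question can be sketched as follows; this
  is a generic discrete-emission Viterbi decoder, not mlpack's
  HMM::Predict() itself.

  ```cpp
  #include <cstddef>
  #include <vector>

  // Minimal Viterbi decoder for a discrete-emission HMM: returns the most
  // probable hidden-state sequence for an observation sequence.
  std::vector<size_t> Viterbi(
      const std::vector<double>& initial,                // pi[state]
      const std::vector<std::vector<double>>& trans,     // trans[from][to]
      const std::vector<std::vector<double>>& emis,      // emis[state][symbol]
      const std::vector<size_t>& obs)
  {
    const size_t states = initial.size();
    const size_t T = obs.size();
    std::vector<std::vector<double>> delta(T, std::vector<double>(states, 0.0));
    std::vector<std::vector<size_t>> back(T, std::vector<size_t>(states, 0));

    for (size_t s = 0; s < states; ++s)
      delta[0][s] = initial[s] * emis[s][obs[0]];

    for (size_t t = 1; t < T; ++t)
    {
      for (size_t s = 0; s < states; ++s)
      {
        double best = -1.0;
        size_t bestPrev = 0;
        for (size_t p = 0; p < states; ++p)
        {
          const double score = delta[t - 1][p] * trans[p][s];
          if (score > best) { best = score; bestPrev = p; }
        }
        delta[t][s] = best * emis[s][obs[t]];
        back[t][s] = bestPrev;
      }
    }

    // Backtrack from the best final state.
    std::vector<size_t> path(T);
    size_t bestEnd = 0;
    for (size_t s = 1; s < states; ++s)
      if (delta[T - 1][s] > delta[T - 1][bestEnd]) bestEnd = s;
    path[T - 1] = bestEnd;
    for (size_t t = T - 1; t > 0; --t)
      path[t - 1] = back[t][path[t]];
    return path;
  }
  ```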
* Significant speedups for dual-tree algorithms using the cover tree (235,
314) including a faster implementation of FastMKS.
* Fix for LRSDP optimizer so that it compiles and can be used (312).
* CF (collaborative filtering) now expects users and items to be zero-indexed,
not one-indexed (311).
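  Datasets that use the old one-indexed convention must be shifted before
  use; a hypothetical helper (the Rating type and function name are
  illustrative, not part of mlpack) makes the change concrete:

  ```cpp
  #include <vector>

  struct Rating { int user, item; double value; };

  // Shift one-indexed (user, item) pairs down to the zero-indexed
  // convention that CF now expects.
  inline void ToZeroIndexed(std::vector<Rating>& ratings)
  {
    for (Rating& r : ratings)
    {
      --r.user;
      --r.item;
    }
  }
  ```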
* CF::GetRecommendations() API change: now requires the number of
recommendations as the first parameter. The number of users in the local
neighborhood should be specified with CF::NumUsersForSimilarity().
* Removed incorrect PeriodicHRectBound (58).
* Refactor LRSDP into LRSDP class and standalone function to be optimized
(305).
* Fix for centering in kernel PCA (337).
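  The centering step kernel PCA requires is double-centering of the kernel
  matrix, so the implicit feature-space data has zero mean:
  Kc = K - 1n*K - K*1n + 1n*K*1n, with 1n = (1/n) * ones. The sketch below
  uses the equivalent, cheaper form (subtract row and column means, add
  back the grand mean); it illustrates the operation, not mlpack's code.

  ```cpp
  #include <cstddef>
  #include <vector>

  // Double-center a symmetric kernel matrix K.
  std::vector<std::vector<double>> CenterKernel(
      const std::vector<std::vector<double>>& K)
  {
    const size_t n = K.size();
    std::vector<double> rowMean(n, 0.0), colMean(n, 0.0);
    double grand = 0.0;
    for (size_t i = 0; i < n; ++i)
      for (size_t j = 0; j < n; ++j)
      {
        rowMean[i] += K[i][j] / n;
        colMean[j] += K[i][j] / n;
        grand += K[i][j] / (n * n);
      }

    std::vector<std::vector<double>> Kc(n, std::vector<double>(n));
    for (size_t i = 0; i < n; ++i)
      for (size_t j = 0; j < n; ++j)
        Kc[i][j] = K[i][j] - rowMean[i] - colMean[j] + grand;
    return Kc; // Every row and column of Kc now sums to zero.
  }
  ```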
* Added simulated annealing (SA) optimizer, contributed by Zhihao Lou.
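  The technique in a nutshell (a generic 1-D sketch of simulated annealing,
  not mlpack's SA optimizer; the schedule and constants are illustrative):
  propose a random move, always accept improvements, accept worse moves
  with probability exp(-delta / temperature), and cool the temperature
  geometrically.

  ```cpp
  #include <cmath>
  #include <random>

  // Minimal simulated annealing on a 1-D objective f, starting from x0.
  template<typename Objective>
  double Anneal(Objective f, double x0, unsigned seed)
  {
    std::mt19937 gen(seed);
    std::normal_distribution<double> step(0.0, 1.0);
    std::uniform_real_distribution<double> unit(0.0, 1.0);

    double x = x0;
    double fx = f(x);
    double temperature = 1.0;

    for (int i = 0; i < 20000; ++i)
    {
      const double candidate = x + temperature * step(gen);
      const double fc = f(candidate);
      const double delta = fc - fx;
      // Accept improvements always; accept worse moves with
      // probability exp(-delta / temperature).
      if (delta < 0.0 || unit(gen) < std::exp(-delta / temperature))
      {
        x = candidate;
        fx = fc;
      }
      temperature *= 0.9995; // Geometric cooling schedule.
    }
    return x;
  }
  ```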
* HMMs now support initial state probabilities; these can be set in the
constructor, trained, or set manually with HMM::Initial() (302).
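  Where the initial state probabilities enter the math: they weight the
  first column of the forward recursion, alpha[0][s] = pi[s] * emis[s][obs[0]].
  A self-contained forward-algorithm sketch (not mlpack's HMM class) shows
  this:

  ```cpp
  #include <cstddef>
  #include <vector>

  // Likelihood of an observation sequence under a discrete-emission HMM,
  // via the forward algorithm; `initial` holds the initial state
  // probabilities pi.
  double SequenceLikelihood(
      const std::vector<double>& initial,
      const std::vector<std::vector<double>>& trans,
      const std::vector<std::vector<double>>& emis,
      const std::vector<size_t>& obs)
  {
    const size_t states = initial.size();
    std::vector<double> alpha(states);
    for (size_t s = 0; s < states; ++s)
      alpha[s] = initial[s] * emis[s][obs[0]]; // pi enters here.

    for (size_t t = 1; t < obs.size(); ++t)
    {
      std::vector<double> next(states, 0.0);
      for (size_t s = 0; s < states; ++s)
      {
        for (size_t p = 0; p < states; ++p)
          next[s] += alpha[p] * trans[p][s];
        next[s] *= emis[s][obs[t]];
      }
      alpha = next;
    }

    double total = 0.0;
    for (double a : alpha)
      total += a;
    return total;
  }
  ```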
* Added Nyström method for kernel matrix approximation by Marcus Edel.
* Kernel PCA now supports using Nyström method for approximation.
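  The Nyström idea: approximate the full kernel matrix as
  K_hat = C * W^{-1} * C^T, where C holds kernel values between all points
  and a few landmark points, and W is the landmark-landmark block. A toy
  1-D sketch with an RBF kernel and two landmarks (so W is 2x2 and can be
  inverted in closed form); this illustrates the idea, not mlpack's
  implementation.

  ```cpp
  #include <cmath>
  #include <cstddef>
  #include <vector>

  // Nyström approximation of the RBF kernel matrix of 1-D points x,
  // using x[0] and x[1] as landmarks.
  std::vector<std::vector<double>> NystromRBF(const std::vector<double>& x)
  {
    auto k = [](double a, double b) { double d = a - b; return std::exp(-d * d); };
    const size_t n = x.size();

    // C: kernel evaluations against the two landmark points.
    std::vector<std::vector<double>> C(n, std::vector<double>(2));
    for (size_t i = 0; i < n; ++i)
    {
      C[i][0] = k(x[i], x[0]);
      C[i][1] = k(x[i], x[1]);
    }

    // W: 2x2 landmark block, inverted in closed form.
    const double a = k(x[0], x[0]), b = k(x[0], x[1]);
    const double c = k(x[1], x[0]), d = k(x[1], x[1]);
    const double det = a * d - b * c;
    const double Winv[2][2] = { { d / det, -b / det }, { -c / det, a / det } };

    // K_hat = C * W^{-1} * C^T; exact on the landmark rows and columns.
    std::vector<std::vector<double>> Khat(n, std::vector<double>(n, 0.0));
    for (size_t i = 0; i < n; ++i)
      for (size_t j = 0; j < n; ++j)
        for (int p = 0; p < 2; ++p)
          for (int q = 0; q < 2; ++q)
            Khat[i][j] += C[i][p] * Winv[p][q] * C[j][q];
    return Khat;
  }
  ```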
* Ball trees now work with dual-tree algorithms, via the BallBound<> bound
  structure (307); contributed by Yash Vadalia.
* The NMF class has been generalized into AMF<> and supports far more types
  of factorizations; contributed by Sumedh Ghaisas.
* A QUIC-SVD implementation has returned, written by Siddharth Agrawal and
based on older code from Mudit Gupta.
* Added the perceptron and decision stump, contributed by Udit Saxena (these
  are weak learners for an eventual AdaBoost class).
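  As a reminder of what such a weak learner looks like, here is a minimal
  2-D perceptron with a bias term and the classic mistake-driven update on
  {-1, +1} labels; an illustrative sketch, not mlpack's perceptron class.

  ```cpp
  #include <cstddef>
  #include <vector>

  struct Perceptron
  {
    double w0 = 0.0, w1 = 0.0, bias = 0.0;

    int Classify(double x0, double x1) const
    {
      return (w0 * x0 + w1 * x1 + bias >= 0.0) ? 1 : -1;
    }

    void Train(const std::vector<std::vector<double>>& X,
               const std::vector<int>& y, int epochs)
    {
      for (int e = 0; e < epochs; ++e)
        for (size_t i = 0; i < X.size(); ++i)
          if (Classify(X[i][0], X[i][1]) != y[i])
          {
            // Mistake: nudge the hyperplane toward the misclassified point.
            w0 += y[i] * X[i][0];
            w1 += y[i] * X[i][1];
            bias += y[i];
          }
    }
  };
  ```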
* Sparse autoencoder added by Siddharth Agrawal.