30 June 2018
Singular-value decomposition (SVD)
Principal component analysis (PCA) has made singular-value decomposition (SVD) quite a hot topic now. There is a large number of resources about SVD available on the Internet. This week Ilya Gvozdetsky and I reviewed some of them:
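To make the PCA/SVD connection concrete, here is a minimal numpy sketch (my own illustration, not from any of the reviewed resources): center the data, take its SVD, and read the principal directions off the rows of Vt.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # 100 samples, 3 features (made-up data)
Xc = X - X.mean(axis=0)              # PCA requires centred data

# SVD of the centred data: Xc = U @ diag(S) @ Vt
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; S**2 / (n - 1) are the
# variances explained by each component, in descending order.
components = Vt
explained_variance = S ** 2 / (Xc.shape[0] - 1)

# Project the data onto the first two principal components.
X_pca = Xc @ components[:2].T
```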
The Elements of Statistical Learning: Data Mining, Inference, and Prediction has a chapter devoted to PCA. It looked very promising at first, but the way it presents the material is probably accessible mostly to experts in the area.
Mining of Massive Datasets book with video lectures. Chapter 11, Dimensionality Reduction, explains SVD in detail. It is a very good explanation, covering the Power Iteration method for computing eigenvectors and eigenvalues and a way to use it for SVD. One piece is missing, though: the Power Iteration algorithm comes with no recommendation on how to choose the starting vector so as to reduce the chance that it is orthogonal to the eigenvector (use random vectors).
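A minimal sketch of the power-iteration method mentioned above, assuming a symmetric matrix. The starting vector is drawn at random, which is exactly the point about orthogonality: a random vector is almost surely not orthogonal to the dominant eigenvector.

```python
import numpy as np

def power_iteration(A, num_iters=1000, rng=None):
    """Dominant eigenvalue/eigenvector of a symmetric matrix A."""
    rng = np.random.default_rng() if rng is None else rng
    # Random start: almost never orthogonal to the dominant eigenvector.
    v = rng.normal(size=A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = A @ v                     # multiply and renormalize
        v = w / np.linalg.norm(w)
    eigenvalue = v @ A @ v            # Rayleigh quotient
    return eigenvalue, v

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # eigenvalues are 3 and 1
lam, v = power_iteration(A, rng=np.random.default_rng(42))
```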
AMATH 301 Lecture: The Singular Value Decomposition (SVD) is a fair lecture about SVD. Some important points are stressed, although I got the impression that the lecturer did not always fully understand what he was talking about (e.g. when he mixed up the shapes of the matrices).
Our implementation of SVD using numpy: svd.py.
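For reference, here is a sketch of one common way such an implementation can work (an assumption on my part, not necessarily what svd.py does): take the eigendecomposition of A.T @ A to get V and the singular values, then recover the columns of U as A @ v_i / sigma_i.

```python
import numpy as np

def simple_svd(A):
    """Thin SVD of A (m x n, m >= n, full column rank) via eigh(A.T @ A)."""
    eigvals, V = np.linalg.eigh(A.T @ A)     # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]        # re-sort descending
    eigvals, V = eigvals[order], V[:, order]
    sigma = np.sqrt(np.clip(eigvals, 0, None))
    U = (A @ V) / sigma                      # column i of U = A v_i / sigma_i
    return U, sigma, V.T

A = np.array([[3.0, 0.0],
              [4.0, 5.0],
              [0.0, 2.0]])
U, S, Vt = simple_svd(A)
```

Note this route squares the condition number of A (it works on A.T @ A), so production SVDs use more careful algorithms.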
There are quite a few choices for the first option (TypeScript, Elm, PureScript, etc.) and a very limited number of choices for the second option: e.g. AssemblyScript, which compiles a subset of TypeScript to WebAssembly.
ETCD is a distributed key-value store (like Redis); one cool thing about it is its use of the Raft consensus algorithm. Nice to know that more products are adopting Raft.
Charts for web pages
- elm-plot https://terezka.github.io/elm-plot/