Thursday, January 19, 2017

k-means and pca against mnist

Recently I've implemented SVD low-rank approximation for image compression, and then it struck me: the low-rank approximation actually exposes the directions of highest variance... which means we could use the singular-value spectrum to estimate how many clusters we should pick before training K-Means...
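A minimal sketch of the idea, on synthetic data rather than MNIST (the data and variable names here are my own, not from any finished script): center the data, take the singular values, and look at how much variance each direction explains.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data: 3 well-separated clusters in 10 dimensions
centers = rng.normal(scale=10.0, size=(3, 10))
X = np.vstack([c + rng.normal(size=(100, 10)) for c in centers])

Xc = X - X.mean(axis=0)            # center before SVD (this is PCA)
s = np.linalg.svd(Xc, compute_uv=False)
explained = s**2 / np.sum(s**2)    # variance share per direction
print(np.round(explained, 3))
# a sharp drop-off after the first few components hints at how many
# dominant directions (and roughly how many clusters) the data has
```

Three centers span at most a 2-D subspace after centering, so the first two components should carry almost all the variance; for MNIST you'd flatten the images into rows of X and read the spectrum the same way.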

Saturday, January 14, 2017

Octave online

Guys, have you seen Octave Online? Quite an impressive thing: you can store(!) your scripts, it can plot, and it does symbolic evaluation! Basic Machine Learning, Linear Algebra, Convex Optimization, etc. courses can be done without any investment in expensive software (hello, Matlab basic bundle for $5k). w00h00

http://octave-online.net/

Wednesday, January 11, 2017

SVD low rank approximation for pictures (SVD image compression on Python3)

Well, tons of posts have been written about it, and this won't be "yet another one". But I implemented it anyway, just for fun. I won't explain the theoretical part, but I'll suggest some links about it.

A good interactive demo:
http://timbaumann.info/svd-image-compression-demo/

Some good docs (thx Berkeley). The book is abandoned now, but it's good reading and worth keeping the links for future reference.
http://inst.eecs.berkeley.edu/~ee127a/book/login/l_svd_low_rank.html
http://inst.eecs.berkeley.edu/~ee127a/book/login/l_svd_apps_image.html

And a nice presentation about PCA and SVD:
http://math.arizona.edu/~brio/VIGRE/ThursdayTalk.pdf

And the source: svd-img-compression.py

The magic happens in np.dot(): we keep just the top approx_rank singular values and discard the rest, which is where the compression comes from.
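Since the full script isn't reproduced here, this is a hedged sketch of that step (function and variable names are mine, not necessarily those in svd-img-compression.py): truncate the SVD factors to approx_rank columns and multiply them back together.

```python
import numpy as np

def low_rank(img, approx_rank):
    """Rank-approx_rank approximation of a 2-D greyscale image array."""
    U, s, Vt = np.linalg.svd(img, full_matrices=False)
    # the np.dot "magic": rebuild from only the top approx_rank
    # singular values, discarding the rest
    return U[:, :approx_rank] @ np.diag(s[:approx_rank]) @ Vt[:approx_rank]

gray = np.random.rand(64, 48)        # stand-in for a loaded greyscale image
approx = low_rank(gray, 10)
print(approx.shape)                  # same shape, but rank at most 10
```

The compression comes from storage: a rank-k approximation needs only k*(m + n + 1) numbers instead of m*n for the full image.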

The result is below: on the left is the original greyscale picture with rank 440; on the right is the low-rank approximation with rank 50.