Monday, November 7, 2016

Tensorflow in Gentoo virtualenv

Just tried to install TensorFlow in a Gentoo virtualenv with Python 3.4, and it failed.
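For the record, the attempt went roughly like this (the exact pip incantation is an assumption from memory; at the time TensorFlow wheels were often installed from an explicit URL):

$ virtualenv -p python3.4 ~/tf-env
$ source ~/tf-env/bin/activate
(tf-env) $ pip install tensorflow    # or a direct wheel URL; either way, no luck with python 3.4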

Tuesday, September 20, 2016

k-means and MNIST dataset

Just thought a few days ago about how an average 0, or 1, or 9, or any digit should look. So I downloaded the MNIST dataset and implemented the k-means algorithm (I could have just calculated the average vector in every cluster, but that wouldn't be interesting).
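The whole thing fits in a few lines of numpy; a minimal sketch of the k-means loop I mean (function and parameter names here are illustrative, not the actual code I wrote):

import numpy as np

def kmeans(X, k=10, iters=20, seed=0):
    # X: (n_samples, 784) flattened MNIST images; k: number of clusters
    X = np.asarray(X, dtype=np.float64)
    rng = np.random.RandomState(seed)
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        # squared distance from every sample to every center, shape (n_samples, k)
        d = (X * X).sum(axis=1)[:, None] - 2.0 * X.dot(centers.T) + (centers * centers).sum(axis=1)[None, :]
        labels = d.argmin(axis=1)
        # move every center to the mean of the samples assigned to it
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# an "average digit" is then just a center reshaped back to an image:
# avg_img = centers[0].reshape(28, 28)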

Sunday, August 21, 2016

Back to the Linux or how I debugged silent crash

Just got tired of the Mac and decided to move back to my old Gentoo Linux. After recompiling world (oh yeah, everything from sources) I got stuck with a SIGSEGV from the plot command in Octave (a Matlab-like math package).

When I tried to plot anything, it crashed with SIGSEGV for no obvious reason, so I had to dig deeper to fix it. Gentoo supports nice debug options; all one needs to do is enable them:
https://wiki.gentoo.org/wiki/Debugging
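In short, per that wiki page it boils down to something like this in /etc/portage/make.conf (the exact flags below are what I'd expect on a box like mine, so take them as an assumption) and then re-emerging the affected packages:

CFLAGS="-O2 -ggdb -pipe"
CXXFLAGS="${CFLAGS}"
FEATURES="splitdebug compressdebug"

# emerge --oneshot octave

The splitdebug feature is what puts the separate .debug files under /usr/lib/debug, which is where gdb will look below.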

After that, a simple
# ulimit -c unlimited    (ulimit is a shell builtin, so sudo isn't needed)
# octave
octave:1> plot(1)

then the crash leaves a core file in the current folder. Now one needs to unwind the stack:
# gdb /usr/lib/debug/usr/bin/octave-cli-3.8.2.debug core
(gdb) bt
#0  fl_create_gl_context (g=0x0) at Fl_Gl_Choice.H:102
#1  Fl_Gl_Window::make_current (this=0xc12430) at Fl_Gl_Window.cxx:168
#2  0x00007fc1a5706447 in plot_window::show_canvas (this=0xc11bb0,
    this=0xc11bb0) at dldfcn/__init_fltk__.cc:935
....
#88 0x00007fc1b715a620 in __libc_start_main () from /lib64/libc.so.6
#89 0x0000000000400979 in _start ()

So either I hadn't recompiled the nvidia drivers, or my OpenGL config was bad. It turned out to be the OpenGL config: I repointed it with eselect opengl and restarted X11.
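For the record, it was along these lines (the implementation name depends on the driver you actually have installed):

# eselect opengl list
# eselect opengl set nvidia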

Problem solved.

Wednesday, July 27, 2016

Two good books about neural networks

Long time no see :) A short post about machine learning. I wanted to go deeper into this field and tried to find a good book, not something like "Machine Learning in 24 Hours" but something grad-level with an acceptable science payload. So I've read ~30% of Simon Haykin's "Neural Networks and Learning Machines (3rd Edition)". The book is good, but there aren't enough naive examples to jump in, like one neuron with two inputs; the science payload, though, is nice: a gazillion links to articles and books, so the book is useful when you already know what you want to do. Then I found a nice complementary book with naive examples: "Neural Network Design (2nd Edition)" by Hagan et al. It's also really cheap on Amazon, about $25, and it can be downloaded for free from the official site. So have fun.

Meanwhile, I started an uber-small project: neural networks without any special library. I use only numpy for the matrix calculations. Everything is slow, but primitive and self-explanatory. As a data set I'm using the MNIST data set; it has 60k training samples and 10k test samples.

A simple implementation of the Widrow-Hoff perceptron is here (~70% success rate after training):
https://github.com/venik/simple_neuro_networks/tree/master/src/one_layer_mnist
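The heart of it is the Widrow-Hoff (LMS) delta rule; a rough numpy-only sketch of that rule (names and hyperparameters below are illustrative, the real code is in the repo):

import numpy as np

def train_lms(X, labels, epochs=10, eta=0.01, seed=0):
    # X: (n, 784) MNIST images scaled to [0, 1]; labels: (n,) digits 0..9
    rng = np.random.RandomState(seed)
    W = rng.randn(10, 784) * 0.01           # one linear neuron per digit
    T = np.eye(10)[labels]                  # one-hot targets, shape (n, 10)
    for _ in range(epochs):
        for x, t in zip(X, T):
            y = W.dot(x)                    # outputs of the 10 linear neurons
            W += eta * np.outer(t - y, x)   # Widrow-Hoff / LMS update
    return W

def predict(W, X):
    return X.dot(W.T).argmax(axis=1)        # the winning neuron gives the digit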

And the MNIST reader is here:
https://github.com/venik/simple_neuro_networks/blob/master/lib/mnist/mnist.py
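If you just want the gist of the format: MNIST files are in the IDX format, a big-endian header followed by raw unsigned bytes. A minimal reader sketch (assuming the gzipped files as downloaded; the repo's reader is more complete):

import gzip
import struct
import numpy as np

def read_idx_images(path):
    # e.g. path = 'train-images-idx3-ubyte.gz'
    with gzip.open(path, 'rb') as f:
        magic, n, rows, cols = struct.unpack('>IIII', f.read(16))
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows * cols)

def read_idx_labels(path):
    # e.g. path = 'train-labels-idx1-ubyte.gz'
    with gzip.open(path, 'rb') as f:
        magic, n = struct.unpack('>II', f.read(8))
        return np.frombuffer(f.read(), dtype=np.uint8)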

Also, you probably want to see how the MNIST samples look on screen:
https://github.com/venik/simple_neuro_networks/tree/master/utils/mnist_reader
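Something like this is enough to put a sample on screen (assuming the images come out of the reader as flat 784-element vectors; the utility in the repo may differ):

import numpy as np
import matplotlib.pyplot as plt

def show_digit(image, label):
    # image: flat 784-vector (or 28x28 array); label: the digit it represents
    plt.imshow(np.asarray(image).reshape(28, 28), cmap='gray')
    plt.title('label: %d' % label)
    plt.show()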

How to download MNIST is described here:
https://github.com/venik/simple_neuro_networks/tree/master/data_set/mnist
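If you'd rather grab the files directly, they come from Yann LeCun's site (URLs as of this writing):

wget http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz
wget http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz
wget http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz
wget http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz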