In the beginning, there was motivation
This was my first contact ever! Armed with a will to learn and inspiration gained from Tambet Matiisen's presentation at Helmes Hack Day, I was ready to take on the black magic of artificial neural networks!
Before you proceed, take a look at DeepMind learning to play Atari Breakout.
The first contact
Since I had no idea whatsoever how an Artificial Neural Network (ANN) operates, or what I would even need to run one, I had to do some background studying. I also looked at "ready to use" offerings such as Caffe and Project Oxford, but stumbled on something called the Fast Artificial Neural Network library (FANN). It had a long history, and the latest commits on GitHub were only a few days old. I decided to use FANN and started reading more about it on the FANN homepage.
ANNs are basically systems that are shown input data together with the desired results; after training on enough of these pairs, they can make predictions about new, unseen data. Read more about ANNs at Wikipedia.
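To make the train-then-predict idea concrete, here is a minimal from-scratch sketch: a single perceptron (far simpler than anything FANN builds) shown inputs together with the desired outputs, nudging its weights after each mistake, then queried again. This is an illustration of the principle, not FANN itself.

```python
def step(x):
    """Threshold activation: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def train(samples, epochs=20, lr=0.1):
    """Show the perceptron inputs + desired outputs, adjust weights on error."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            err = target - step(w[0] * x1 + w[1] * x2 + b)
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Truth table for OR: inputs paired with the desired result.
samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(samples)
predictions = [step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in samples]
print(predictions)  # → [0, 1, 1, 1], matching the targets
```

After training, the same weights can score inputs the network was never shown; that is exactly the property exploited later for drink suggestions.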
Installing the Skynet
After the initial research I was ready to install my very first, very own ANN! This was a process consisting of the following procedure:
# apt-get install libfann-dev
By this point the core system was ready to go! Since the systems I planned to plug it into were written in PHP, I needed a language binding. This process was just as heavy as the previous one:
# pecl install fann
The PHP binding's GitHub page contained an XOR switch example I could use to test whether my system was set up successfully. It was!
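The XOR training data behind that example is tiny. FANN reads training sets from a plain-text file whose first line states the number of pairs, inputs, and outputs, followed by alternating input and output lines. A sketch that generates such a file (the filename is my own choice) in Python:

```python
# XOR truth table: each pair is (inputs, desired outputs).
xor_pairs = [((0, 0), (0,)), ((0, 1), (1,)), ((1, 0), (1,)), ((1, 1), (0,))]

# Header line: num_pairs num_inputs num_outputs.
lines = ["%d %d %d" % (len(xor_pairs), 2, 1)]
for inputs, outputs in xor_pairs:
    lines.append(" ".join(str(v) for v in inputs))   # input line
    lines.append(" ".join(str(v) for v in outputs))  # output line

with open("xor.data", "w") as f:
    f.write("\n".join(lines) + "\n")
```

FANN (and its PHP binding) can then train directly on `xor.data`.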
Training the AI to suggest, not to kill
All I needed now was to train the ANN for my own requirements. For this I had the perfect guinea pig: DrinkPlanet. It is a site somewhat similar to RateBeer, but it has a taste pattern for each drink. I used those patterns to generate a training file for each user; this data is then used to suggest drinks the user has not tried yet! To do so, I had to divide all my values by 100, so the inputs fall between -1 and 1. When my ANN scores the drinks, I keep only those with a probability greater than 0.9 as suggestions!
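The data preparation described above can be sketched like so; the taste-pattern dimensions and drink names are made-up examples, and the 0.9 cutoff comes from the text. Raw taste values run 0–100, so dividing by 100 scales them into [0, 1], inside the range FANN's activations expect.

```python
THRESHOLD = 0.9  # only very confident scores become suggestions

def to_training_row(taste_pattern):
    """Scale a 0-100 taste pattern into [0, 1] network inputs."""
    return [v / 100.0 for v in taste_pattern]

def suggest(scored_drinks):
    """Keep only untried drinks the network scored above the threshold."""
    return [name for name, score in scored_drinks if score > THRESHOLD]

# Hypothetical dimensions: bitterness, sweetness, body.
row = to_training_row([80, 25, 60])
picks = suggest([("Imperial Stout", 0.95), ("Light Lager", 0.42)])
print(row, picks)  # → [0.8, 0.25, 0.6] ['Imperial Stout']
```

One row per rated drink, plus the user's rating as the desired output, is enough to build a per-user FANN training file in the format shown earlier.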
In conclusion, I'd like to emphasize that getting started is quite easy; the next steps are harder. I still have to figure out the optimal number of training cycles, the number of layers, hidden neurons, and so on. But until then I have 10+ beers to taste, so I can find out whether my current parameters are any good!
- Neural Networks Made Simple
- Implementation of a Fast Artificial Neural Network Library (FANN)
- Training an ANN