Home Page

28 April 2019 – update. Since putting these pages together I have realized:
1. I should probably have a blog page to log over time what I am doing and finding – hence the new link from the main menu to the blog.
2. The site name NN4 perhaps should not stay as it is, since NN6 is the current code base, with much-evolved code and far more rigour in the experiments – as said above, the blog pages cover this better.
In any case I will leave the pages that can be reached from this menu in place, as they are a historical trace of how NNx came to be.


I remember coming across neural networks for the first time in 1989 and being fascinated by the explanation of what they are all about. Then life took over my focus. Now, years later, I have some free time (kids growing up and all that) and the interest is back. After much time spent, on and off, on research and reading, I have put together a generic Neural Network application in Free Pascal.

I have a general interest in Artificial Intelligence (AI), neural networks and machine learning, and more broadly in simulators and decision-assisting tools. In the past I have written simulators for worker-to-task distribution and prediction, for how ants find food and leave a pheromone trail, and some other smaller projects to satisfy my curiosity about what is inside the black box.

The pages on this site contain a description of NN4, distilled knowledge of various formulas, and the results of various experiments. As of April 2019 the pages are still messy, as I am documenting and putting in some finishing touches in parallel.

NN4 is a compiled program written in Pascal (Free Pascal + Lazarus IDE). It is driven partly by the command line and partly by a configuration file, and uses data files for learning and testing data. NN4 is generic in the sense that it builds an in-memory neural network model based on parameters found in the configuration file. Depending on the command-line "command", it will either initialize (randomize) the weights for a model (and save the weights to file), or "run", meaning it will do a forward pass through the neural network with the given weights for every line found in the data file and save the outputs along with their errors. With the command "learn", NN4 will iterate, line by line through the data file, using back propagation, until it reaches the maximum number of cycles or the error improves past the target; the changed weights are written back to the configuration file. NN4 is easy to drive with batch files, and multiple instances can be run in multiple OS windows.
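To make the "run" and "learn" behaviour concrete, here is a minimal sketch in Python (NN4 itself is Pascal) of a forward pass and one back-propagation step for a tiny fully connected network with one hidden layer and sigmoid activation. All names (`forward`, `learn_one`, the layer sizes) are illustrative, not NN4's actual identifiers, and bias terms are left out for brevity.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(weights, inputs):
    """Forward pass ("run"): weights is (hidden_w, output_w), lists of per-node rows."""
    hidden_w, output_w = weights
    hidden = [sigmoid(sum(w * i for w, i in zip(row, inputs))) for row in hidden_w]
    outputs = [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in output_w]
    return hidden, outputs

def learn_one(weights, inputs, targets, rate=0.5):
    """One back-propagation step on a single data line (the body of a "learn" cycle)."""
    hidden_w, output_w = weights
    hidden, outputs = forward(weights, inputs)
    # Output-node deltas: error times the sigmoid derivative o*(1-o)
    out_delta = [(t - o) * o * (1 - o) for o, t in zip(outputs, targets)]
    # Hidden-node deltas: output deltas propagated back through the output weights
    hid_delta = [h * (1 - h) * sum(output_w[k][j] * out_delta[k]
                                   for k in range(len(out_delta)))
                 for j, h in enumerate(hidden)]
    # Weight updates (gradient step scaled by the learning rate)
    for k, row in enumerate(output_w):
        for j in range(len(row)):
            row[j] += rate * out_delta[k] * hidden[j]
    for j, row in enumerate(hidden_w):
        for i in range(len(row)):
            row[i] += rate * hid_delta[j] * inputs[i]
    return weights

# "Initialize": randomize weights for a 2-input, 3-hidden, 1-output model,
# then "learn" repeatedly on a single example.
random.seed(1)
w = ([[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)],
     [[random.uniform(-1, 1) for _ in range(3)] for _ in range(1)])
for _ in range(1000):
    learn_one(w, [1.0, 0.0], [1.0])
print(forward(w, [1.0, 0.0])[1])  # the output climbs toward the target 1.0
```

In NN4 the same loop runs over every line of the data file per cycle, and the final weights go back to the configuration file instead of staying in memory.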

NN4 is a modest home experiment, so there are no fancy user interfaces (UI); instead I prepare the data using Excel. The results NN4 can achieve are impressive – or rather, the concept of neural networks is impressive overall!

This exercise was about me wanting to know how things work under the hood. I could have used R or Python with ready-made libraries, but I wanted to first understand the guts and maths of neural nets. This set of pages is about distilling the key points about NNs, e.g. activation functions, derivatives, error calculations at the nodes, and forward and back propagation, all in the context of making some code work.

The pages are focused on implementing the guts of an ordinary fully connected neural network. (Maybe in the future I will give convolutional and recurrent networks a try.) If you are after theory, maths, or the latest research on activation functions etc., please google.

There is a large amount of information already on the internet about everything I write here, a lot of it very useful. No single article will give you the full view; there is no substitute for putting in the time and reading many different articles to gain full insight. If you feel NN4 can be useful to you and your experiments, contact me – you are welcome to it.

29 March – (as trivia: I used sigmoid activation and approximately 3 million iterations to predict the minimum and maximum daily temperatures for Oxford for the month of March 2019. The model said the minimum would be somewhere between 3.1 and 3.4 and the maximum between 10.4 and 11.2 🙂 – let's see what happens when the figures are published in a few days' time! Basic Oxford weather data for the last 150 years is freely available here:
https://www.metoffice.gov.uk/public/weather/climate-historic/#?tab=climateHistoric ).

Copyright (C) Zoran Zmajkovic 2019