Wednesday, July 11, 2018

Introducing Myself to Deep Learning

Overview

It has been some time since my last blog post. Actually, it has been more than two years. The main reason is that something changed in my life when I started training a neural network. Her name is Alexa.

Just to avoid confusion, let me clarify that I am not a member of the Amazon Alexa team. Alexa is the feminine version of my own name, and it is the name of my little girl ;)

She is one of the reasons for this deep learning journey.

How did the journey begin?

Every journey has a motivation, and this one is no exception. It started on September 19, 2016, when I held her for the very first time. After watching her for a while, I asked myself: how is it possible that she will be able to learn anything at all?

Months passed, and I watched her learn fast and effortlessly, without too much "computational power" (apparently). She learned to walk, to dance, to almost talk, to solve puzzles, to play basketball, to solve the pyramid ring-stacking puzzle, and even to scramble a Rubik's cube ;)

Solving a puzzle
Defending vs. Elizabeth
Scrambling a Rubik's cube
Beyond the obvious answer that she is built to learn by design, there is a lot of trial and error in her learning process: several attempts to find the best fit before she masters something. I love to see her "computing the distance" from the expected result in her attempts at the pyramid ring-stacking puzzle, removing a wrong ring and replacing it with the right one.

Dad is trying to build something with blocks, but I'm interested in the neighbor's dog ;)
What really happened is that her learning process motivated me to explore something "new": a discipline that is destined to become (if it has not already) part of the toolset of every single software developer. It turns out that the new hobby came with a practitioner's approach, but it had to be found first.

What happened next?

At this point, I started to watch Andrew Ng's machine learning course videos through alternative sources. Cubans (those living in Cuba) are not able to access the certification program at Coursera. In a way, the USA embargo against Cuba (specifically, the USA export laws) also affects the global democratization of deep learning.

However, that doesn't worry me too much; it actually never does. I can't get the certification, but I can get the knowledge. Andrew's course is truly motivating and didactically unbeatable. It brought back to life some math and linear algebra that I thought was dead in my brain, and it made me feel very comfortable implementing a vectorized version of the Stochastic Gradient Descent algorithm in Octave.
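To give a taste of what "vectorized" means here, below is a minimal sketch of the (batch) gradient descent update for linear regression, ported from Octave to ND4J, the numerical library underneath DL4J, so everything stays on the JVM. The toy data, learning rate, and iteration count are made-up values for illustration, not anything from the course.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class GradientDescentSketch {
    public static void main(String[] args) {
        // Toy dataset: y = 1 + 2x, with a bias column of ones prepended to X.
        INDArray X = Nd4j.create(new double[][]{{1, 1}, {1, 2}, {1, 3}, {1, 4}});
        INDArray y = Nd4j.create(new double[][]{{3}, {5}, {7}, {9}});
        INDArray theta = Nd4j.zeros(2, 1);   // parameters to learn

        double alpha = 0.05;                 // learning rate
        int m = X.rows();                    // number of training examples

        for (int i = 0; i < 2000; i++) {
            INDArray error = X.mmul(theta).sub(y);             // h(X) - y
            INDArray grad = X.transpose().mmul(error).div(m);  // (1/m) * X' * (h(X) - y)
            theta.subi(grad.mul(alpha));                       // theta := theta - alpha * grad
        }

        System.out.println(theta);           // should approach [1, 2]
    }
}

The whole pass over the training set is a couple of matrix operations; no explicit loop over examples, which is exactly the point the course drives home.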


After understanding how all of this works ("almost", just like Andrew used to say), and what kinds of problems it can solve, I just wanted to put it into practice at the production level. Some researchers (friends of mine) sold me TensorFlow as the Holy Grail, but I had some doubts about Python performance (and I still have them).

The new hobby comes with the practitioner's approach
After some research, I found exactly what I was looking for: a deep learning library for the JVM. Deeplearning4j (DL4J) is an excellent library, and it comes with an excellent book, Deep Learning: A Practitioner's Approach, by Adam Gibson and Josh Patterson. I only needed to read the preface to identify myself as a deep learning practitioner. It wasn't too hard to notice that it was the right book for me. I'm pretty sure it is the right book for you too.

DL4J also comes with a lot of helpful features and tools to assist the training process, including the Training UI, DataVec, early stopping, and even GPU support, but we can talk about those in forthcoming posts (a quick taste of the Training UI setup follows below).
DL4J performance: examples/sec on an Intel(R) Core(TM) i7-4770 CPU @ 3.40GHz (no GPU required in this case)
Nice normal distribution shape for the weights in the layer parameters histogram (so, no regularization issues)
Some network details
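Screenshots aside, getting the Training UI up and running takes surprisingly little code. Here is a minimal sketch against the DL4J API of that era; the two-layer network is a placeholder, and the sizes and seed are arbitrary:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.ui.api.UIServer;
import org.deeplearning4j.ui.stats.StatsListener;
import org.deeplearning4j.ui.storage.InMemoryStatsStorage;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class TrainingUiSketch {
    public static void main(String[] args) {
        // A placeholder two-layer classifier; layer sizes are illustrative only.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(100).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork model = new MultiLayerNetwork(conf);
        model.init();

        // Start the Training UI (by default at http://localhost:9000)
        // and stream training statistics to it.
        InMemoryStatsStorage statsStorage = new InMemoryStatsStorage();
        UIServer.getInstance().attach(statsStorage);
        model.setListeners(new StatsListener(statsStorage));

        // Any subsequent model.fit(...) call will now report scores,
        // parameter histograms, and update ratios to the UI.
    }
}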
Recently, I also started working on a proof of concept of an anomaly detection system built on top of DL4J, specifically using autoencoder networks, with promising results, but I can't give you anything in advance yet (just wait for it).

Autoencoders are supported by DL4J
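I can't show the PoC itself, but the general recipe is no secret: train an autoencoder on "normal" data only (using the input as its own label), and flag any example whose reconstruction error is unusually high. A rough sketch of that idea with a plain dense autoencoder follows; the layer sizes and the threshold are invented for illustration:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class AutoencoderAnomalySketch {

    static MultiLayerNetwork buildAutoencoder(int numFeatures) {
        // Symmetric encoder/decoder; the output layer reconstructs the input (MSE loss).
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)
                .list()
                .layer(0, new DenseLayer.Builder().nIn(numFeatures).nOut(32)
                        .activation(Activation.RELU).build())
                .layer(1, new DenseLayer.Builder().nIn(32).nOut(8)   // bottleneck
                        .activation(Activation.RELU).build())
                .layer(2, new DenseLayer.Builder().nIn(8).nOut(32)
                        .activation(Activation.RELU).build())
                .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                        .nIn(32).nOut(numFeatures).activation(Activation.IDENTITY).build())
                .build();
        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        return net;
    }

    static boolean looksAnomalous(MultiLayerNetwork net, INDArray example, double threshold) {
        // The network is trained with the features as their own labels, so the
        // per-example score is the MSE between the input and its reconstruction.
        INDArray scores = net.scoreExamples(new DataSet(example, example), false);
        return scores.getDouble(0) > threshold;
    }
}

Training would simply be net.fit(new DataSet(features, features)) over normal data only, and the threshold is typically calibrated on a held-out set of normal examples.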

Conclusions

I still have a lot to learn about deep learning, but the journey has already begun, and I intend to share it with you.

By the way, if you are not motivated to go "deep" with machine learning yet, become a father (or mother) first, and then let me know. I'm sure you will find the same "biological" inspiration when you witness what "training a neural network" looks like ;)

Alexa talking to a large stone cat

