The concept behind neural networks, explained from a different angle, in 10 lines of code.

A few words before we start: this post was inspired by Milo Harper's post on Medium, and all the code below is written in Python.

The human brain has hundreds of billions of neuron cells. These cells are connected to each other by synapses, and it is these connections that allow us to think. A neural network is an approach that lets machines "think" in a similar way. It is what artificial intelligence and machine learning run on: neural networks.

Let's begin by importing the natural exponential function, the array constructor, the dot product function, and the random module from the core dependency **NumPy**.

`from numpy import exp, array, random, dot`

Now, our goal is to make a neural network that can predict the digit in the "?" place.

Input | Output
---|---
0 0 1 | 0
1 1 1 | 1
1 0 1 | 1
0 1 1 | 0
1 0 0 | ?

So, first of all, let's recreate the table above in Python using NumPy arrays.

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
```

As simple as that, we have made two matrices: one holding the inputs and the other holding the outputs. You may also have noticed the `.T` at the end. It transposes the matrix from horizontal to vertical, turning the single row of outputs into a column.
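To make the transpose concrete, here is a minimal illustration (not part of the network itself) of what `.T` does to the shape of the outputs array:

```python
from numpy import array

row = array([[0, 1, 1, 0]])   # shape (1, 4): one row, four columns
column = row.T                # shape (4, 1): four rows, one column

print(row.shape)     # (1, 4)
print(column.shape)  # (4, 1)
```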

Now let's seed the random number generator before starting to work on the "core functionality". Seeding makes the "random" numbers the same on every run, so the results are reproducible.

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
```

Weights! In other words, the thing that we will train. To begin, we simply set them to a matrix of random numbers between -1 and 1.

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
synaptic_weights = 2 * random.random((3, 1)) - 1
```

Time to begin the training. First of all, let's make a `for` loop, which will be used to train our neural network. It should iterate 10,000 times.

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
synaptic_weights = 2 * random.random((3, 1)) - 1
for iteration in range(10000):
    pass  # the loop body is filled in below
```

Afterwards, we need to find the output that our neural network produces after each training session (yup, there will be 10,000 training sessions). To do so, let's use a *Sigmoid function*. It allows us to normalize the result to a value between 0 and 1. In mathematical form, the function looks like this: σ(x) = 1 / (1 + e^(-x)).
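As a quick sanity check (separate from the main snippet), here is the Sigmoid on its own, showing how it squashes any input into the (0, 1) range:

```python
from numpy import exp, array

def sigmoid(x):
    # Maps any real number into the open interval (0, 1)
    return 1 / (1 + exp(-x))

print(sigmoid(array([-10, 0, 10])))  # near 0, exactly 0.5, near 1
```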

To calculate the output, we take the dot product of the inputs and the synaptic weights we made before, then pass the result through the Sigmoid function. So, let's do that in the code.

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
synaptic_weights = 2 * random.random((3, 1)) - 1
for iteration in range(10000):
    output = 1 / (1 + exp(-(dot(training_set_inputs, synaptic_weights))))
```

After doing so, let's adjust the weights, and the training step is done. The adjustment formula is simple: input × error × Sigmoid curve gradient. The error is the training output minus the computed output, and the Sigmoid curve's gradient comes from its derivative, which works out to output × (1 − output). At the end, the code should look like this.

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
synaptic_weights = 2 * random.random((3, 1)) - 1
for iteration in range(10000):
    output = 1 / (1 + exp(-(dot(training_set_inputs, synaptic_weights))))
    synaptic_weights += dot(training_set_inputs.T, (training_set_outputs - output) * output * (1 - output))
```
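As an aside, we can numerically check that output × (1 − output) really is the Sigmoid's gradient. The helper names below are mine, not from the snippet above; the check compares the derivative formula against a finite-difference approximation:

```python
from numpy import exp

def sigmoid(x):
    return 1 / (1 + exp(-x))

def sigmoid_derivative(output):
    # For y = sigmoid(x), dy/dx = y * (1 - y)
    return output * (1 - output)

# Finite-difference approximation of the slope at x = 0.5
x, h = 0.5, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
analytic = sigmoid_derivative(sigmoid(x))
print(abs(numeric - analytic) < 1e-8)  # True
```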

Last, but not least, let's ask our neural network to predict the "?" digit!

```
from numpy import exp, array, random, dot
training_set_inputs = array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_set_outputs = array([[0, 1, 1, 0]]).T
random.seed(1)
synaptic_weights = 2 * random.random((3, 1)) - 1
for iteration in range(10000):
    output = 1 / (1 + exp(-(dot(training_set_inputs, synaptic_weights))))
    synaptic_weights += dot(training_set_inputs.T, (training_set_outputs - output) * output * (1 - output))
print(1 / (1 + exp(-(dot(array([1, 0, 0]), synaptic_weights)))))
```

We're done! Now, if we run the code, it should return something like 0.99999704. The number is extremely close to 1, which means our neural network predicts the right digit: the output follows the first input value, so for the input 1 0 0 the answer is 1.
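For readers who want to reuse this, here is the same network packaged into two small helpers. The function names `train` and `predict` are mine, not from the original snippet; the logic is identical:

```python
from numpy import exp, array, random, dot

def train(inputs, outputs, iterations=10000, seed=1):
    # Same single-layer network as above, packaged for reuse
    random.seed(seed)
    weights = 2 * random.random((3, 1)) - 1
    for _ in range(iterations):
        output = 1 / (1 + exp(-dot(inputs, weights)))
        weights += dot(inputs.T, (outputs - output) * output * (1 - output))
    return weights

def predict(weights, new_input):
    return 1 / (1 + exp(-dot(new_input, weights)))

weights = train(
    array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]]),
    array([[0, 1, 1, 0]]).T,
)
print(predict(weights, array([1, 0, 0])))  # close to 1
```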

Thanks everyone for reading, and hope to see you soon in another blog post from us.