My Math Notes
  • Home
  • Calculus Notebooks
    • Backpropagation
    • Optimizing Functions of One Variable: Cost Minimization
    • Differentiation in Python: Symbolic, Numerical and Automatic
    • Optimization Using Gradient Descent: Linear Regression
    • Optimization Using Gradient Descent in One Variable
    • Optimization Using Gradient Descent in Two Variables
    • Neural Network with Two Layers
    • Regression with Perceptron
    • Classification with Perceptron
    • Optimization Using Newton's Method
    • Fitting the distribution of heights data
    • Gradient descent in a sandpit
    • The Sandpit - Part 2
    • The Sandpit
  • Deeplearning.ai
    • Calculus
    • Linear Algebra
    • Probability and Statistics
  • Imperial College
    • Linear Algebra
    • Multivariate Calculus
    • PCA
  • Linear Algebra Notebooks
    • Single Perceptron Neural Networks for Linear Regression
    • Single Perceptron Neural Networks for Linear Regression
    • Eigenvalues and Eigenvectors
    • Gram-Schmidt process
    • Identifying special matrices
    • PageRank
    • Reflecting Bear
    • Linear Transformations
    • Matrix Multiplication
    • Solving Linear Systems: 2 variables
    • Vector Operations: Scalar Multiplication, Sum and Dot Product of Vectors
  • PCA notebooks
    • Distances and Angles between Images
    • K nearest Neighbors Algorithm
    • Week 1: Mean/Covariance of a data set and effect of a linear transformation
    • Orthogonal Projections
    • Principal Component Analysis (PCA)
    • Demonstration of PCA
  • Previous
  • Next
  • Neural Network with Two Layers
  • Table of Contents
    • Packages
    • 1 - Classification Problem
    • 2 - Neural Network Model with Two Layers
    • 3 - Implementation of the Neural Network Model with Two Layers

Neural Network with Two Layers¶

Welcome to your week three programming assignment. You are ready to build a neural network with two layers and train it to solve a classification problem.

After this assignment, you will be able to:

  • Implement a neural network with two layers to a classification problem
  • Implement forward propagation using matrix multiplication
  • Perform backward propagation

Table of Contents¶

  • 1 - Classification Problem
  • 2 - Neural Network Model with Two Layers
    • 2.1 - Neural Network Model with Two Layers for a Single Training Example
    • 2.2 - Neural Network Model with Two Layers for Multiple Training Examples
    • 2.3 - Cost Function and Training
    • 2.4 - Dataset
    • 2.5 - Define Activation Function
      • Exercise 1
  • 3 - Implementation of the Neural Network Model with Two Layers
    • 3.1 - Defining the Neural Network Structure
      • Exercise 2
    • 3.2 - Initialize the Model's Parameters
      • Exercise 3
    • 3.3 - The Loop
      • Exercise 4
      • Exercise 5
      • Exercise 6
    • 3.4 - Integrate parts 3.1, 3.2 and 3.3 in nn_model()
      • Exercise 7
      • Exercise 8
  • 4 - Optional: Other Dataset

Packages¶

First, import all the packages you will need during this assignment.

In [1]:
graded
Copied!
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import colors
# A function to create a dataset.
from sklearn.datasets import make_blobs

# Output of plotting commands is displayed inline within the Jupyter notebook.
%matplotlib inline 

# Set a seed so that the results are consistent.
np.random.seed(3)
import numpy as np import matplotlib.pyplot as plt from matplotlib import colors # A function to create a dataset. from sklearn.datasets import make_blobs # Output of plotting commands is displayed inline within the Jupyter notebook. %matplotlib inline # Set a seed so that the results are consistent. np.random.seed(3)

Load the unit tests defined for this notebook.

In [2]:
Copied!
import w3_unittest
import w3_unittest

1 - Classification Problem¶

In one of the labs this week, you trained a neural network with a single perceptron, performing forward and backward propagation. That simple structure was enough to solve a "linear" classification problem - finding a straight line in a plane that would serve as a decision boundary to separate two classes.

Imagine that now you have a more complicated problem: you still have two classes, but one line will not be enough to separate them.

In [3]:
graded
Copied!
fig, ax = plt.subplots()
xmin, xmax = -0.2, 1.4
x_line = np.arange(xmin, xmax, 0.1)
# Data points (observations) from two classes.
ax.scatter(0, 0, color="r")
ax.scatter(0, 1, color="b")
ax.scatter(1, 0, color="b")
ax.scatter(1, 1, color="r")
ax.set_xlim([xmin, xmax])
ax.set_ylim([-0.1, 1.1])
ax.set_xlabel('$x_1$')
ax.set_ylabel('$x_2$')
# Example of the lines which can be used as a decision boundary to separate two classes.
ax.plot(x_line, -1 * x_line + 1.5, color="black")
ax.plot(x_line, -1 * x_line + 0.5, color="black")
plt.plot()
fig, ax = plt.subplots() xmin, xmax = -0.2, 1.4 x_line = np.arange(xmin, xmax, 0.1) # Data points (observations) from two classes. ax.scatter(0, 0, color="r") ax.scatter(0, 1, color="b") ax.scatter(1, 0, color="b") ax.scatter(1, 1, color="r") ax.set_xlim([xmin, xmax]) ax.set_ylim([-0.1, 1.1]) ax.set_xlabel('$x_1$') ax.set_ylabel('$x_2$') # Example of the lines which can be used as a decision boundary to separate two classes. ax.plot(x_line, -1 * x_line + 1.5, color="black") ax.plot(x_line, -1 * x_line + 0.5, color="black") plt.plot()
Out[3]:
[]
No description has been provided for this image

This logic can appear in many applications. For example, if you train a model to predict whether you should buy a house knowing its size and the year it was built. A big new house will not be affordable, while a small old house will not be worth buying. So, you might be interested in either a big old house, or a small new house.

The one perceptron neural network is not enough to solve such classification problem. Let's look at how you can adjust that model to find the solution.

In the plot above, two lines can serve as a decision boundary. Your intuition might tell you that you should also increase the number of perceptrons. And that is absolutely right! You need to feed your data points (coordinates $x_1$, $x_2$) into two nodes separately and then unify them somehow with another one to make a decision.

Now let's figure out the details, build and train your first multi-layer neural network!

2 - Neural Network Model with Two Layers¶

2.1 - Neural Network Model with Two Layers for a Single Training Example¶

No description has been provided for this image

The input and output layers of the neural network are the same as for one perceptron model, but there is a hidden layer now in between them. The training examples $x^{(i)}=\begin{bmatrix}x_1^{(i)} \\ x_2^{(i)}\end{bmatrix}$ from the input layer of size $n_x = 2$ are first fed into the hidden layer of size $n_h = 2$. They are simultaneously fed into the first perceptron with weights $W_1^{[1]}=\begin{bmatrix}w_{1,1}^{[1]} & w_{2,1}^{[1]}\end{bmatrix}$, bias $b_1^{[1]}$; and into the second perceptron with weights $W_2^{[1]}=\begin{bmatrix}w_{1,2}^{[1]} & w_{2,2}^{[1]}\end{bmatrix}$, bias $b_2^{[1]}$. The integer in the square brackets $^{[1]}$ denotes the layer number, because there are two layers now with their own parameters and outputs, which need to be distinguished.

\begin{align} z_1^{[1](i)} &= w_{1,1}^{[1]} x_1^{(i)} + w_{2,1}^{[1]} x_2^{(i)} + b_1^{[1]} = W_1^{[1]}x^{(i)} + b_1^{[1]},\\ z_2^{[1](i)} &= w_{1,2}^{[1]} x_1^{(i)} + w_{2,2}^{[1]} x_2^{(i)} + b_2^{[1]} = W_2^{[1]}x^{(i)} + b_2^{[1]}.\tag{1} \end{align}

These expressions for one training example $x^{(i)}$ can be rewritten in a matrix form :

$$z^{[1](i)} = W^{[1]} x^{(i)} + b^{[1]},\tag{2}$$

where

    $z^{[1](i)} = \begin{bmatrix}z_1^{[1](i)} \\ z_2^{[1](i)}\end{bmatrix}$ is vector of size $\left(n_h \times 1\right) = \left(2 \times 1\right)$;

    $W^{[1]} = \begin{bmatrix}W_1^{[1]} \\ W_2^{[1]}\end{bmatrix} = \begin{bmatrix}w_{1,1}^{[1]} & w_{2,1}^{[1]} \\ w_{1,2}^{[1]} & w_{2,2}^{[1]}\end{bmatrix}$ is matrix of size $\left(n_h \times n_x\right) = \left(2 \times 2\right)$;

    $b^{[1]} = \begin{bmatrix}b_1^{[1]} \\ b_2^{[1]}\end{bmatrix}$ is vector of size $\left(n_h \times 1\right) = \left(2 \times 1\right)$.

Next, the hidden layer activation function needs to be applied for each of the elements in the vector $z^{[1](i)}$. Various activation functions can be used here and in this model you will take the sigmoid function $\sigma\left(x\right) = \frac{1}{1 + e^{-x}}$. Remember that its derivative is $\frac{d\sigma}{dx} = \sigma\left(x\right)\left(1-\sigma\left(x\right)\right)$. The output of the hidden layer is a vector of size $\left(n_h \times 1\right) = \left(2 \times 1\right)$:

$$a^{[1](i)} = \sigma\left(z^{[1](i)}\right) = \begin{bmatrix}\sigma\left(z_1^{[1](i)}\right) \\ \sigma\left(z_2^{[1](i)}\right)\end{bmatrix}.\tag{3}$$

Then the hidden layer output gets fed into the output layer of size $n_y = 1$. This was covered in the previous lab, the only difference are: $a^{[1](i)}$ is taken instead of $x^{(i)}$ and layer notation $^{[2]}$ appears to identify all parameters and outputs:

$$z^{[2](i)} = w_1^{[2]} a_1^{[1](i)} + w_2^{[2]} a_2^{[1](i)} + b^{[2]}= W^{[2]} a^{[1](i)} + b^{[2]},\tag{4}$$

    $z^{[2](i)}$ and $b^{[2]}$ are scalars for this model, as $\left(n_y \times 1\right) = \left(1 \times 1\right)$;

    $W^{[2]} = \begin{bmatrix}w_1^{[2]} & w_2^{[2]}\end{bmatrix}$ is vector of size $\left(n_y \times n_h\right) = \left(1 \times 2\right)$.

Finally, the same sigmoid function is used as the output layer activation function:

$$a^{[2](i)} = \sigma\left(z^{[2](i)}\right).\tag{5}$$

Mathematically the two layer neural network model for each training example $x^{(i)}$ can be written with the expressions $(2) - (5)$. Let's rewrite them next to each other for convenience:

\begin{align} z^{[1](i)} &= W^{[1]} x^{(i)} + b^{[1]},\\ a^{[1](i)} &= \sigma\left(z^{[1](i)}\right),\\ z^{[2](i)} &= W^{[2]} a^{[1](i)} + b^{[2]},\\ a^{[2](i)} &= \sigma\left(z^{[2](i)}\right).\\ \tag{6} \end{align}

Note, that all of the parameters to be trained in the model are without $^{(i)}$ index - they are independent on the input data.

Finally, the predictions for some example $x^{(i)}$ can be made taking the output $a^{[2](i)}$ and calculating $\hat{y}$ as: $\hat{y} = \begin{cases} 1 & \mbox{if } a^{[2](i)} > 0.5, \\ 0 & \mbox{otherwise }. \end{cases}$.

2.2 - Neural Network Model with Two Layers for Multiple Training Examples¶

Similarly to the single perceptron model, $m$ training examples can be organised in a matrix $X$ of a shape ($2 \times m$), putting $x^{(i)}$ into columns. Then the model $(6)$ can be rewritten in terms of matrix multiplications:

\begin{align} Z^{[1]} &= W^{[1]} X + b^{[1]},\\ A^{[1]} &= \sigma\left(Z^{[1]}\right),\\ Z^{[2]} &= W^{[2]} A^{[1]} + b^{[2]},\\ A^{[2]} &= \sigma\left(Z^{[2]}\right),\\ \tag{7} \end{align}

where $b^{[1]}$ is broadcasted to the matrix of size $\left(n_h \times m\right) = \left(2 \times m\right)$ and $b^{[2]}$ to the vector of size $\left(n_y \times m\right) = \left(1 \times m\right)$. It would be a good exercise for you to have a look at the expressions $(7)$ and check that sizes of the matrices will actually match to perform required multiplications.

You have derived expressions to perform forward propagation. Time to evaluate your model and train it.

2.3 - Cost Function and Training¶

For the evaluation of this simple neural network you can use the same cost function as for the single perceptron case - log loss function. Originally initialized weights were just some random values, now you need to perform training of the model: find such set of parameters $W^{[1]}$, $b^{[1]}$, $W^{[2]}$, $b^{[2]}$, that will minimize the cost function.

Like in the previous example of a single perceptron neural network, the cost function can be written as:

$$\mathcal{L}\left(W^{[1]}, b^{[1]}, W^{[2]}, b^{[2]}\right) = \frac{1}{m}\sum_{i=1}^{m} L\left(W^{[1]}, b^{[1]}, W^{[2]}, b^{[2]}\right) = \frac{1}{m}\sum_{i=1}^{m} \large\left(\small - y^{(i)}\log\left(a^{[2](i)}\right) - (1-y^{(i)})\log\left(1- a^{[2](i)}\right) \large \right), \small\tag{8}$$

where $y^{(i)} \in \{0,1\}$ are the original labels and $a^{[2](i)}$ are the continuous output values of the forward propagation step (elements of array $A^{[2]}$).

To minimize it, you can use gradient descent, updating the parameters with the following expressions:

\begin{align} W^{[1]} &= W^{[1]} - \alpha \frac{\partial \mathcal{L} }{ \partial W^{[1]} },\\ b^{[1]} &= b^{[1]} - \alpha \frac{\partial \mathcal{L} }{ \partial b^{[1]} },\\ W^{[2]} &= W^{[2]} - \alpha \frac{\partial \mathcal{L} }{ \partial W^{[2]} },\\ b^{[2]} &= b^{[2]} - \alpha \frac{\partial \mathcal{L} }{ \partial b^{[2]} },\\ \tag{9} \end{align}

where $\alpha$ is the learning rate.

To perform training of the model you need to calculate now $\frac{\partial \mathcal{L} }{ \partial W^{[1]}}$, $\frac{\partial \mathcal{L} }{ \partial b^{[1]}}$, $\frac{\partial \mathcal{L} }{ \partial W^{[2]}}$, $\frac{\partial \mathcal{L} }{ \partial b^{[2]}}$.

Let's start from the end of the neural network. You can rewrite here the corresponding expressions for $\frac{\partial \mathcal{L} }{ \partial W }$ and $\frac{\partial \mathcal{L} }{ \partial b }$ from the single perceptron neural network:

\begin{align} \frac{\partial \mathcal{L} }{ \partial W } &= \frac{1}{m}\left(A-Y\right)X^T,\\ \frac{\partial \mathcal{L} }{ \partial b } &= \frac{1}{m}\left(A-Y\right)\mathbf{1},\\ \end{align}

where $\mathbf{1}$ is just a ($m \times 1$) vector of ones. Your one perceptron is in the second layer now, so $W$ will be exchanged with $W^{[2]}$, $b$ with $b^{[2]}$, $A$ with $A^{[2]}$, $X$ with $A^{[1]}$:

\begin{align} \frac{\partial \mathcal{L} }{ \partial W^{[2]} } &= \frac{1}{m}\left(A^{[2]}-Y\right)\left(A^{[1]}\right)^T,\\ \frac{\partial \mathcal{L} }{ \partial b^{[2]} } &= \frac{1}{m}\left(A^{[2]}-Y\right)\mathbf{1}.\\ \tag{10} \end{align}

Let's now find $\frac{\partial \mathcal{L} }{ \partial W^{[1]}} = \begin{bmatrix} \frac{\partial \mathcal{L} }{ \partial w_{1,1}^{[1]}} & \frac{\partial \mathcal{L} }{ \partial w_{2,1}^{[1]}} \\ \frac{\partial \mathcal{L} }{ \partial w_{1,2}^{[1]}} & \frac{\partial \mathcal{L} }{ \partial w_{2,2}^{[1]}} \end{bmatrix}$. It was shown in the videos that $$\frac{\partial \mathcal{L} }{ \partial w_{1,1}^{[1]}}=\frac{1}{m}\sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_1^{[2]} \left(a_1^{[1](i)}\left(1-a_1^{[1](i)}\right)\right) x_1^{(i)}\right)\tag{11}$$

If you do this accurately for each of the elements $\frac{\partial \mathcal{L} }{ \partial W^{[1]}}$, you will get the following matrix:

$$\frac{\partial \mathcal{L} }{ \partial W^{[1]}} = \begin{bmatrix} \frac{\partial \mathcal{L} }{ \partial w_{1,1}^{[1]}} & \frac{\partial \mathcal{L} }{ \partial w_{2,1}^{[1]}} \\ \frac{\partial \mathcal{L} }{ \partial w_{1,2}^{[1]}} & \frac{\partial \mathcal{L} }{ \partial w_{2,2}^{[1]}} \end{bmatrix}$$ $$= \frac{1}{m}\begin{bmatrix} \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_1^{[2]} \left(a_1^{[1](i)}\left(1-a_1^{[1](i)}\right)\right) x_1^{(i)}\right) & \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_1^{[2]} \left(a_1^{[1](i)}\left(1-a_1^{[1](i)}\right)\right) x_2^{(i)}\right) \\ \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_2^{[2]} \left(a_2^{[1](i)}\left(1-a_2^{[1](i)}\right)\right) x_1^{(i)}\right) & \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_2^{[2]} \left(a_2^{[1](i)}\left(1-a_2^{[1](i)}\right)\right) x_2^{(i)}\right)\end{bmatrix}\tag{12}$$

Looking at this, you can notice that all terms and indices somehow are very consistent, so it all can be unified into a matrix form. And that's true! $\left(W^{[2]}\right)^T = \begin{bmatrix}w_1^{[2]} \\ w_2^{[2]}\end{bmatrix}$ of size $\left(n_h \times n_y\right) = \left(2 \times 1\right)$ can be multiplied with the vector $A^{[2]} - Y$ of size $\left(n_y \times m\right) = \left(1 \times m\right)$, resulting in a matrix of size $\left(n_h \times m\right) = \left(2 \times m\right)$:

$$\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)= \begin{bmatrix}w_1^{[2]} \\ w_2^{[2]}\end{bmatrix} \begin{bmatrix}\left(a^{[2](1)} - y^{(1)}\right) & \cdots & \left(a^{[2](m)} - y^{(m)}\right)\end{bmatrix} =\begin{bmatrix} \left(a^{[2](1)} - y^{(1)}\right) w_1^{[2]} & \cdots & \left(a^{[2](m)} - y^{(m)}\right) w_1^{[2]} \\ \left(a^{[2](1)} - y^{(1)}\right) w_2^{[2]} & \cdots & \left(a^{[2](m)} - y^{(m)}\right) w_2^{[2]} \end{bmatrix}$$.

Now taking matrix $A^{[1]}$ of the same size $\left(n_h \times m\right) = \left(2 \times m\right)$,

$$A^{[1]} =\begin{bmatrix} a_1^{[1](1)} & \cdots & a_1^{[1](m)} \\ a_2^{[1](1)} & \cdots & a_2^{[1](m)} \end{bmatrix},$$

you can calculate:

$$A^{[1]}\cdot\left(1-A^{[1]}\right) =\begin{bmatrix} a_1^{[1](1)}\left(1 - a_1^{[1](1)}\right) & \cdots & a_1^{[1](m)}\left(1 - a_1^{[1](m)}\right) \\ a_2^{[1](1)}\left(1 - a_2^{[1](1)}\right) & \cdots & a_2^{[1](m)}\left(1 - a_2^{[1](m)}\right) \end{bmatrix},$$

where "$\cdot$" denotes element by element multiplication.

With the element by element multiplication,

$$\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)=\begin{bmatrix} \left(a^{[2](1)} - y^{(1)}\right) w_1^{[2]}\left(a_1^{[1](1)}\left(1 - a_1^{[1](1)}\right)\right) & \cdots & \left(a^{[2](m)} - y^{(m)}\right) w_1^{[2]}\left(a_1^{[1](m)}\left(1 - a_1^{[1](m)}\right)\right) \\ \left(a^{[2](1)} - y^{(1)}\right) w_2^{[2]}\left(a_2^{[1](1)}\left(1 - a_2^{[1](1)}\right)\right) & \cdots & \left(a^{[2](m)} - y^{(m)}\right) w_2^{[2]} \left(a_2^{[1](m)}\left(1 - a_2^{[1](m)}\right)\right) \end{bmatrix}.$$

If you perform matrix multiplication with $X^T$ of size $\left(m \times n_x\right) = \left(m \times 2\right)$, you will get matrix of size $\left(n_h \times n_x\right) = \left(2 \times 2\right)$:

$$\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)X^T = \begin{bmatrix} \left(a^{[2](1)} - y^{(1)}\right) w_1^{[2]}\left(a_1^{[1](1)}\left(1 - a_1^{[1](1)}\right)\right) & \cdots & \left(a^{[2](m)} - y^{(m)}\right) w_1^{[2]}\left(a_1^{[1](m)}\left(1 - a_1^{[1](m)}\right)\right) \\ \left(a^{[2](1)} - y^{(1)}\right) w_2^{[2]}\left(a_2^{[1](1)}\left(1 - a_2^{[1](1)}\right)\right) & \cdots & \left(a^{[2](m)} - y^{(m)}\right) w_2^{[2]} \left(a_2^{[1](m)}\left(1 - a_2^{[1](m)}\right)\right) \end{bmatrix} \begin{bmatrix} x_1^{(1)} & x_2^{(1)} \\ \cdots & \cdots \\ x_1^{(m)} & x_2^{(m)} \end{bmatrix}$$ $$=\begin{bmatrix} \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_1^{[2]} \left(a_1^{[1](i)}\left(1 - a_1^{[1](i)}\right) \right) x_1^{(i)}\right) & \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_1^{[2]} \left(a_1^{[1](i)}\left(1-a_1^{[1](i)}\right)\right) x_2^{(i)}\right) \\ \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_2^{[2]} \left(a_2^{[1](i)}\left(1-a_2^{[1](i)}\right)\right) x_1^{(i)}\right) & \sum_{i=1}^{m} \left( \left(a^{[2](i)} - y^{(i)}\right) w_2^{[2]} \left(a_2^{[1](i)}\left(1-a_2^{[1](i)}\right)\right) x_2^{(i)}\right)\end{bmatrix}$$

This is exactly like in the expression $(12)$! So, $\frac{\partial \mathcal{L} }{ \partial W^{[1]}}$ can be written as a mixture of multiplications:

$$\frac{\partial \mathcal{L} }{ \partial W^{[1]}} = \frac{1}{m}\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)X^T\tag{13},$$

where "$\cdot$" denotes element by element multiplications.

Vector $\frac{\partial \mathcal{L} }{ \partial b^{[1]}}$ can be found very similarly, but the last terms in the chain rule will be equal to $1$, i.e. $\frac{\partial z_1^{[1](i)}}{ \partial b_1^{[1]}} = 1$. Thus,

$$\frac{\partial \mathcal{L} }{ \partial b^{[1]}} = \frac{1}{m}\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)\mathbf{1},\tag{14}$$

where $\mathbf{1}$ is a ($m \times 1$) vector of ones.

Expressions $(10)$, $(13)$ and $(14)$ can be used for the parameters update $(9)$ performing backward propagation:

\begin{align} \frac{\partial \mathcal{L} }{ \partial W^{[2]} } &= \frac{1}{m}\left(A^{[2]}-Y\right)\left(A^{[1]}\right)^T,\\ \frac{\partial \mathcal{L} }{ \partial b^{[2]} } &= \frac{1}{m}\left(A^{[2]}-Y\right)\mathbf{1},\\ \frac{\partial \mathcal{L} }{ \partial W^{[1]}} &= \frac{1}{m}\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)X^T,\\ \frac{\partial \mathcal{L} }{ \partial b^{[1]}} &= \frac{1}{m}\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)\mathbf{1},\\ \tag{15} \end{align}

where $\mathbf{1}$ is a ($m \times 1$) vector of ones.

So, to understand deeply and properly how neural networks perform and get trained, you do need knowledge of linear algebra and calculus joined together! But do not worry! All together it is not that scary if you do it step by step accurately with understanding of maths.

Time to implement this all in the code!

2.2 - Dataset¶

First, let's get the dataset you will work on. The following code will create $m=2000$ data points $(x_1, x_2)$ and save them in the NumPy array X of a shape $(2 \times m)$ (in the columns of the array). The labels ($0$: blue, $1$: red) will be saved in the NumPy array Y of a shape $(1 \times m)$.

In [4]:
graded
Copied!
m = 2000
samples, labels = make_blobs(n_samples=m, 
                             centers=([2.5, 3], [6.7, 7.9], [2.1, 7.9], [7.4, 2.8]), 
                             cluster_std=1.1,
                             random_state=0)
labels[(labels == 0) | (labels == 1)] = 1
labels[(labels == 2) | (labels == 3)] = 0
X = np.transpose(samples)
Y = labels.reshape((1, m))

plt.scatter(X[0, :], X[1, :], c=Y, cmap=colors.ListedColormap(['blue', 'red']));

print ('The shape of X is: ' + str(X.shape))
print ('The shape of Y is: ' + str(Y.shape))
print ('I have m = %d training examples!' % (m))
m = 2000 samples, labels = make_blobs(n_samples=m, centers=([2.5, 3], [6.7, 7.9], [2.1, 7.9], [7.4, 2.8]), cluster_std=1.1, random_state=0) labels[(labels == 0) | (labels == 1)] = 1 labels[(labels == 2) | (labels == 3)] = 0 X = np.transpose(samples) Y = labels.reshape((1, m)) plt.scatter(X[0, :], X[1, :], c=Y, cmap=colors.ListedColormap(['blue', 'red'])); print ('The shape of X is: ' + str(X.shape)) print ('The shape of Y is: ' + str(Y.shape)) print ('I have m = %d training examples!' % (m))
The shape of X is: (2, 2000)
The shape of Y is: (1, 2000)
I have m = 2000 training examples!
No description has been provided for this image

2.3 - Define Activation Function¶

Exercise 1¶

Define sigmoid activation function $\sigma\left(z\right) =\frac{1}{1+e^{-z}} $.

In [5]:
graded
Copied!
def sigmoid(z):
    ### START CODE HERE ### (~ 1 line of code)
    res = 1/(1 + np.exp(-z))
    ### END CODE HERE ###
    
    return res
def sigmoid(z): ### START CODE HERE ### (~ 1 line of code) res = 1/(1 + np.exp(-z)) ### END CODE HERE ### return res
In [6]:
graded
Copied!
print("sigmoid(-2) = " + str(sigmoid(-2)))
print("sigmoid(0) = " + str(sigmoid(0)))
print("sigmoid(3.5) = " + str(sigmoid(3.5)))
print("sigmoid(-2) = " + str(sigmoid(-2))) print("sigmoid(0) = " + str(sigmoid(0))) print("sigmoid(3.5) = " + str(sigmoid(3.5)))
sigmoid(-2) = 0.11920292202211755
sigmoid(0) = 0.5
sigmoid(3.5) = 0.9706877692486436
Expected Output¶

Note: the values may vary in the last decimal places.

sigmoid(-2) = 0.11920292202211755
sigmoid(0) = 0.5
sigmoid(3.5) = 0.9706877692486436
In [7]:
Copied!
w3_unittest.test_sigmoid(sigmoid)
w3_unittest.test_sigmoid(sigmoid)
 All tests passed

3 - Implementation of the Neural Network Model with Two Layers¶

3.1 - Defining the Neural Network Structure¶

Exercise 2¶

Define three variables:

  • n_x: the size of the input layer
  • n_h: the size of the hidden layer (set it equal to 2 for now)
  • n_y: the size of the output layer
Hint

    Use shapes of X and Y to find n_x and n_y:
  • the size of the input layer n_x equals to the size of the input vectors placed in the columns of the array X,
  • the outpus for each of the data point will be saved in the columns of the the array Y.

In [8]:
graded
Copied!
# GRADED FUNCTION: layer_sizes

def layer_sizes(X, Y):
    """
    Arguments:
    X -- input dataset of shape (input size, number of examples)
    Y -- labels of shape (output size, number of examples)
    
    Returns:
    n_x -- the size of the input layer
    n_h -- the size of the hidden layer
    n_y -- the size of the output layer
    """
    ### START CODE HERE ### (~ 3 lines of code)
    # Size of input layer.
    n_x = X.shape[0]
    # Size of hidden layer.
    n_h = 2
    # Size of output layer.
    n_y = Y.shape[0] 
    ### END CODE HERE ###
    return (n_x, n_h, n_y)
# GRADED FUNCTION: layer_sizes def layer_sizes(X, Y): """ Arguments: X -- input dataset of shape (input size, number of examples) Y -- labels of shape (output size, number of examples) Returns: n_x -- the size of the input layer n_h -- the size of the hidden layer n_y -- the size of the output layer """ ### START CODE HERE ### (~ 3 lines of code) # Size of input layer. n_x = X.shape[0] # Size of hidden layer. n_h = 2 # Size of output layer. n_y = Y.shape[0] ### END CODE HERE ### return (n_x, n_h, n_y)
In [9]:
graded
Copied!
(n_x, n_h, n_y) = layer_sizes(X, Y)
print("The size of the input layer is: n_x = " + str(n_x))
print("The size of the hidden layer is: n_h = " + str(n_h))
print("The size of the output layer is: n_y = " + str(n_y))
(n_x, n_h, n_y) = layer_sizes(X, Y) print("The size of the input layer is: n_x = " + str(n_x)) print("The size of the hidden layer is: n_h = " + str(n_h)) print("The size of the output layer is: n_y = " + str(n_y))
The size of the input layer is: n_x = 2
The size of the hidden layer is: n_h = 2
The size of the output layer is: n_y = 1
Expected Output¶
The size of the input layer is: n_x = 2
The size of the hidden layer is: n_h = 2
The size of the output layer is: n_y = 1
In [10]:
Copied!
w3_unittest.test_layer_sizes(layer_sizes)
w3_unittest.test_layer_sizes(layer_sizes)
 All tests passed

3.2 - Initialize the Model's Parameters¶

Exercise 3¶

Implement the function initialize_parameters().

Instructions:

  • Make sure your parameters' sizes are right. Refer to the neural network figure above if needed.
  • You will initialize the weights matrix with random values.
    • Use: np.random.randn(a,b) * 0.01 to randomly initialize a matrix of shape (a,b).
  • You will initialize the bias vector as zeros.
    • Use: np.zeros((a,b)) to initialize a matrix of shape (a,b) with zeros.
In [11]:
graded
Copied!
# GRADED FUNCTION: initialize_parameters

def initialize_parameters(n_x, n_h, n_y):
    """
    Argument:
    n_x -- size of the input layer
    n_h -- size of the hidden layer
    n_y -- size of the output layer
    
    Returns:
    params -- python dictionary containing your parameters:
                    W1 -- weight matrix of shape (n_h, n_x)
                    b1 -- bias vector of shape (n_h, 1)
                    W2 -- weight matrix of shape (n_y, n_h)
                    b2 -- bias vector of shape (n_y, 1)
    """
    
    ### START CODE HERE ### (~ 4 lines of code)
    W1 = np.random.randn(n_h, n_x) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = np.random.randn(n_y, n_h) * 0.01
    b2 = np.zeros((n_y, 1))
    ### END CODE HERE ###
    
    assert (W1.shape == (n_h, n_x))
    assert (b1.shape == (n_h, 1))
    assert (W2.shape == (n_y, n_h))
    assert (b2.shape == (n_y, 1))
    
    parameters = {"W1": W1,
                  "b1": b1,
                  "W2": W2,
                  "b2": b2}
    
    return parameters
# GRADED FUNCTION: initialize_parameters def initialize_parameters(n_x, n_h, n_y): """ Argument: n_x -- size of the input layer n_h -- size of the hidden layer n_y -- size of the output layer Returns: params -- python dictionary containing your parameters: W1 -- weight matrix of shape (n_h, n_x) b1 -- bias vector of shape (n_h, 1) W2 -- weight matrix of shape (n_y, n_h) b2 -- bias vector of shape (n_y, 1) """ ### START CODE HERE ### (~ 4 lines of code) W1 = np.random.randn(n_h, n_x) * 0.01 b1 = np.zeros((n_h, 1)) W2 = np.random.randn(n_y, n_h) * 0.01 b2 = np.zeros((n_y, 1)) ### END CODE HERE ### assert (W1.shape == (n_h, n_x)) assert (b1.shape == (n_h, 1)) assert (W2.shape == (n_y, n_h)) assert (b2.shape == (n_y, 1)) parameters = {"W1": W1, "b1": b1, "W2": W2, "b2": b2} return parameters
In [12]:
graded
Copied!
parameters = initialize_parameters(n_x, n_h, n_y)

print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))
parameters = initialize_parameters(n_x, n_h, n_y) print("W1 = " + str(parameters["W1"])) print("b1 = " + str(parameters["b1"])) print("W2 = " + str(parameters["W2"])) print("b2 = " + str(parameters["b2"]))
W1 = [[ 0.01788628  0.0043651 ]
 [ 0.00096497 -0.01863493]]
b1 = [[0.]
 [0.]]
W2 = [[-0.00277388 -0.00354759]]
b2 = [[0.]]
Expected Output¶

Note: the elements of the arrays W1 and W2 maybe be different due to random initialization. You can try to restart the kernel to get the same values.

W1 = [[ 0.01788628  0.0043651 ]
 [ 0.00096497 -0.01863493]]
b1 = [[0.]
 [0.]]
W2 = [[-0.00277388 -0.00354759]]
b2 = [[0.]]
In [13]:
Copied!
# Note: 
# Actual values are not checked here in the unit tests (due to random initialization).
w3_unittest.test_initialize_parameters(initialize_parameters)
# Note: # Actual values are not checked here in the unit tests (due to random initialization). w3_unittest.test_initialize_parameters(initialize_parameters)
 All tests passed

3.3 - The Loop¶

Exercise 4¶

Implement forward_propagation().

Instructions:

  • Look above at the mathematical representation $(7)$ of your classifier (section 2.2): \begin{align} Z^{[1]} &= W^{[1]} X + b^{[1]},\\ A^{[1]} &= \sigma\left(Z^{[1]}\right),\\ Z^{[2]} &= W^{[2]} A^{[1]} + b^{[2]},\\ A^{[2]} &= \sigma\left(Z^{[2]}\right).\\ \end{align}
  • The steps you have to implement are:
    1. Retrieve each parameter from the dictionary "parameters" (which is the output of initialize_parameters()) by using parameters[".."].
    2. Implement Forward Propagation. Compute Z1 multiplying matrices W1, X and adding vector b1. Then find A1 using the sigmoid activation function. Perform similar computations for Z2 and A2.
In [14]:
graded
Copied!
# GRADED FUNCTION: forward_propagation

def forward_propagation(X, parameters):
    """
    Argument:
    X -- input data of size (n_x, m)
    parameters -- python dictionary containing your parameters (output of initialization function)
    
    Returns:
    A2 -- the sigmoid output of the second activation
    cache -- python dictionary containing Z1, A1, Z2, A2 
    (that simplifies the calculations in the back propagation step)
    """
    # Retrieve each parameter from the dictionary "parameters".
    ### START CODE HERE ### (~ 4 lines of code)
    W1 = parameters["W1"]
    b1 = parameters["b1"]
    W2 = parameters["W2"]
    b2 = parameters["b2"]
    ### END CODE HERE ###
    
    # Implement forward propagation to calculate A2.
    ### START CODE HERE ### (~ 4 lines of code)
    Z1 = np.matmul(W1, X) + b1
    A1 = sigmoid(Z1)
    Z2 = np.matmul(W2, A1) + b2
    A2 = sigmoid(Z2)
    ### END CODE HERE ###
    
    assert(A2.shape == (n_y, X.shape[1]))

    cache = {"Z1": Z1,
             "A1": A1,
             "Z2": Z2,
             "A2": A2}
    
    return A2, cache
# GRADED FUNCTION: forward_propagation def forward_propagation(X, parameters): """ Argument: X -- input data of size (n_x, m) parameters -- python dictionary containing your parameters (output of initialization function) Returns: A2 -- the sigmoid output of the second activation cache -- python dictionary containing Z1, A1, Z2, A2 (that simplifies the calculations in the back propagation step) """ # Retrieve each parameter from the dictionary "parameters". ### START CODE HERE ### (~ 4 lines of code) W1 = parameters["W1"] b1 = parameters["b1"] W2 = parameters["W2"] b2 = parameters["b2"] ### END CODE HERE ### # Implement forward propagation to calculate A2. ### START CODE HERE ### (~ 4 lines of code) Z1 = np.matmul(W1, X) + b1 A1 = sigmoid(Z1) Z2 = np.matmul(W2, A1) + b2 A2 = sigmoid(Z2) ### END CODE HERE ### assert(A2.shape == (n_y, X.shape[1])) cache = {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2} return A2, cache
In [15]:
graded
Copied!
A2, cache = forward_propagation(X, parameters)

print(A2)
A2, cache = forward_propagation(X, parameters) print(A2)
[[0.49920157 0.49922234 0.49921223 ... 0.49921215 0.49921043 0.49920665]]
Expected Output¶

Note: the elements of the array A2 maybe be different depending on the initial parameters. If you would like to get exactly the same output, try to restart the Kernel and rerun the notebook.

[[0.49920157 0.49922234 0.49921223 ... 0.49921215 0.49921043 0.49920665]]
In [16]:
Copied!
# Note: 
# Actual values are not checked here in the unit tests (due to random initialization).
w3_unittest.test_forward_propagation(forward_propagation)
# Note: # Actual values are not checked here in the unit tests (due to random initialization). w3_unittest.test_forward_propagation(forward_propagation)
Test case "change_weights_check". Wrong output of Z1 for X = 
[[ 5.46584646  6.71120407  7.21301753 ...  1.77559174  3.52245562
   7.86492998]
 [ 2.91868287 10.31812597  7.79616824 ...  2.43434264  3.64044705
   6.77517917]]
Test for i = 1, j = 1999. 
	Expected: 
-0.03577862954823146
	Got: 
-0.03577862954823145
 47  Tests passed
 1  Tests failed

Remember, that your weights were just initialized with some random values, so the model has not been trained yet.

Exercise 5¶

Define a cost function $(8)$ which will be used to train the model:

$$\mathcal{L}\left(W, b\right) = \frac{1}{m}\sum_{i=1}^{m} \large\left(\small - y^{(i)}\log\left(a^{(i)}\right) - (1-y^{(i)})\log\left(1- a^{(i)}\right) \large \right) \small.$$

In [17]:
graded
Copied!
def compute_cost(A2, Y):
    """
    Computes the cost function as a log loss
    
    Arguments:
    A2 -- The output of the neural network of shape (1, number of examples)
    Y -- "true" labels vector of shape (1, number of examples)
    
    Returns:
    cost -- log loss
    
    """
    # Number of examples.
    m = Y.shape[1]
    
    ### START CODE HERE ### (~ 2 lines of code)
    logloss = np.multiply(np.log(A2),Y) - np.multiply(np.log(1 - A2),1 - Y)
    cost = 1/m * np.sum(logloss)
    ### END CODE HERE ###

    assert(isinstance(cost, float))
    
    return cost
def compute_cost(A2, Y): """ Computes the cost function as a log loss Arguments: A2 -- The output of the neural network of shape (1, number of examples) Y -- "true" labels vector of shape (1, number of examples) Returns: cost -- log loss """ # Number of examples. m = Y.shape[1] ### START CODE HERE ### (~ 2 lines of code) logloss = np.multiply(np.log(A2),Y) - np.multiply(np.log(1 - A2),1 - Y) cost = 1/m * np.sum(logloss) ### END CODE HERE ### assert(isinstance(cost, float)) return cost
In [18]:
graded
Copied!
print("cost = " + str(compute_cost(A2, Y)))
print("cost = " + str(compute_cost(A2, Y)))
cost = -0.0015746396019415112
Expected Output¶

Note: the elements of the arrays W1 and W2 maybe be different!

cost = 0.6931477703826823
In [19]:
Copied!
# Note: 
# Actual values are not checked here in the unit tests (due to random initialization).
w3_unittest.test_compute_cost(compute_cost, A2)
# Note: # Actual values are not checked here in the unit tests (due to random initialization). w3_unittest.test_compute_cost(compute_cost, A2)
Test case "default_check". Wrong output of compute_cost. 
	Expected: 
0.6931477703826823
	Got: 
-0.0015746396019415112
Test case "extra_check". Wrong output of compute_cost. 
	Expected: 
0.5901032749748385
	Got: 
0.10571558462768871
 0  Tests passed
 2  Tests failed

Calculate partial derivatives as shown in $(15)$:

\begin{align} \frac{\partial \mathcal{L} }{ \partial W^{[2]} } &= \frac{1}{m}\left(A^{[2]}-Y\right)\left(A^{[1]}\right)^T,\\ \frac{\partial \mathcal{L} }{ \partial b^{[2]} } &= \frac{1}{m}\left(A^{[2]}-Y\right)\mathbf{1},\\ \frac{\partial \mathcal{L} }{ \partial W^{[1]}} &= \frac{1}{m}\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)X^T,\\ \frac{\partial \mathcal{L} }{ \partial b^{[1]}} &= \frac{1}{m}\left(\left(W^{[2]}\right)^T \left(A^{[2]} - Y\right)\cdot \left(A^{[1]}\cdot\left(1-A^{[1]}\right)\right)\right)\mathbf{1}.\\ \end{align}

In [20]:
graded
Copied!
def backward_propagation(parameters, cache, X, Y):
    """
    Implements the backward propagation, calculating gradients
    
    Arguments:
    parameters -- python dictionary containing our parameters 
    cache -- python dictionary containing Z1, A1, Z2, A2
    X -- input data of shape (n_x, number of examples)
    Y -- "true" labels vector of shape (n_y, number of examples)
    
    Returns:
    grads -- python dictionary containing gradients with respect to different parameters
    """
    m = X.shape[1]
    
    # First, retrieve W from the dictionary "parameters".
    W1 = parameters["W1"]
    W2 = parameters["W2"]
    
    # Retrieve also A1 and A2 from dictionary "cache".
    A1 = cache["A1"]
    A2 = cache["A2"]
    
    # Backward propagation: calculate partial derivatives denoted as dW1, db1, dW2, db2 for simplicity. 
    dZ2 = A2 - Y
    dW2 = 1/m * np.dot(dZ2, A1.T)
    db2 = 1/m * np.sum(dZ2, axis = 1, keepdims = True)
    dZ1 = np.dot(W2.T, dZ2) * A1 * (1 - A1)
    dW1 = 1/m * np.dot(dZ1, X.T)
    db1 = 1/m * np.sum(dZ1, axis = 1, keepdims = True)
    
    grads = {"dW1": dW1,
             "db1": db1,
             "dW2": dW2,
             "db2": db2}
    
    return grads

grads = backward_propagation(parameters, cache, X, Y)

print("dW1 = " + str(grads["dW1"]))
print("db1 = " + str(grads["db1"]))
print("dW2 = " + str(grads["dW2"]))
print("db2 = " + str(grads["db2"]))
def backward_propagation(parameters, cache, X, Y): """ Implements the backward propagation, calculating gradients Arguments: parameters -- python dictionary containing our parameters cache -- python dictionary containing Z1, A1, Z2, A2 X -- input data of shape (n_x, number of examples) Y -- "true" labels vector of shape (n_y, number of examples) Returns: grads -- python dictionary containing gradients with respect to different parameters """ m = X.shape[1] # First, retrieve W from the dictionary "parameters". W1 = parameters["W1"] W2 = parameters["W2"] # Retrieve also A1 and A2 from dictionary "cache". A1 = cache["A1"] A2 = cache["A2"] # Backward propagation: calculate partial derivatives denoted as dW1, db1, dW2, db2 for simplicity. dZ2 = A2 - Y dW2 = 1/m * np.dot(dZ2, A1.T) db2 = 1/m * np.sum(dZ2, axis = 1, keepdims = True) dZ1 = np.dot(W2.T, dZ2) * A1 * (1 - A1) dW1 = 1/m * np.dot(dZ1, X.T) db1 = 1/m * np.sum(dZ1, axis = 1, keepdims = True) grads = {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2} return grads grads = backward_propagation(parameters, cache, X, Y) print("dW1 = " + str(grads["dW1"])) print("db1 = " + str(grads["db1"])) print("dW2 = " + str(grads["dW2"])) print("db2 = " + str(grads["db2"]))
dW1 = [[-1.49856632e-05  1.67791519e-05]
 [-2.12394543e-05  2.43895135e-05]]
db1 = [[5.11207671e-07]
 [7.06236219e-07]]
dW2 = [[-0.00032641 -0.0002606 ]]
db2 = [[-0.00078732]]

Exercise 6¶

Implement update_parameters().

Instructions:

  • Update parameters as shown in $(9)$ (section 2.3): \begin{align} W^{[1]} &= W^{[1]} - \alpha \frac{\partial \mathcal{L} }{ \partial W^{[1]} },\\ b^{[1]} &= b^{[1]} - \alpha \frac{\partial \mathcal{L} }{ \partial b^{[1]} },\\ W^{[2]} &= W^{[2]} - \alpha \frac{\partial \mathcal{L} }{ \partial W^{[2]} },\\ b^{[2]} &= b^{[2]} - \alpha \frac{\partial \mathcal{L} }{ \partial b^{[2]} }.\\ \end{align}
  • The steps you have to implement are:
    1. Retrieve each parameter from the dictionary "parameters" (which is the output of initialize_parameters()) by using parameters[".."].
    2. Retrieve each derivative from the dictionary "grads" (which is the output of backward_propagation()) by using grads[".."].
    3. Update parameters.
In [21]:
graded
Copied!
def update_parameters(parameters, grads, learning_rate=1.2):
    """
    Updates parameters using the gradient descent update rule
    
    Arguments:
    parameters -- python dictionary containing parameters 
    grads -- python dictionary containing gradients
    learning_rate -- learning rate for gradient descent
    
    Returns:
    parameters -- python dictionary containing updated parameters 
    """
    # Retrieve each parameter from the dictionary "parameters".
    ### START CODE HERE ### (~ 4 lines of code)
    W1 = parameters["W1"]
    b1 = parameters["b1"]
    W2 = parameters["W2"]
    b2 = parameters["b2"]
    ### END CODE HERE ###
    
    # Retrieve each gradient from the dictionary "grads".
    ### START CODE HERE ### (~ 4 lines of code)
    dW1 = grads["dW1"]
    db1 = grads["db1"]
    dW2 = grads["dW2"]
    db2 = grads["db2"]
    ### END CODE HERE ###
    
    # Update rule for each parameter.
    ### START CODE HERE ### (~ 4 lines of code)
    W1 = W1 - learning_rate * dW1
    b1 = b1 - learning_rate * db1
    W2 = W2 - learning_rate * dW2
    b2 = b2 - learning_rate * db2
    ### END CODE HERE ###
    
    parameters = {"W1": W1,
                  "b1": b1,
                  "W2": W2,
                  "b2": b2}
    
    return parameters
def update_parameters(parameters, grads, learning_rate=1.2): """ Updates parameters using the gradient descent update rule Arguments: parameters -- python dictionary containing parameters grads -- python dictionary containing gradients learning_rate -- learning rate for gradient descent Returns: parameters -- python dictionary containing updated parameters """ # Retrieve each parameter from the dictionary "parameters". ### START CODE HERE ### (~ 4 lines of code) W1 = parameters["W1"] b1 = parameters["b1"] W2 = parameters["W2"] b2 = parameters["b2"] ### END CODE HERE ### # Retrieve each gradient from the dictionary "grads". ### START CODE HERE ### (~ 4 lines of code) dW1 = grads["dW1"] db1 = grads["db1"] dW2 = grads["dW2"] db2 = grads["db2"] ### END CODE HERE ### # Update rule for each parameter. ### START CODE HERE ### (~ 4 lines of code) W1 = W1 - learning_rate * dW1 b1 = b1 - learning_rate * db1 W2 = W2 - learning_rate * dW2 b2 = b2 - learning_rate * db2 ### END CODE HERE ### parameters = {"W1": W1, "b1": b1, "W2": W2, "b2": b2} return parameters
In [22]:
graded
Copied!
parameters_updated = update_parameters(parameters, grads)

print("W1 updated = " + str(parameters_updated["W1"]))
print("b1 updated = " + str(parameters_updated["b1"]))
print("W2 updated = " + str(parameters_updated["W2"]))
print("b2 updated = " + str(parameters_updated["b2"]))
parameters_updated = update_parameters(parameters, grads) print("W1 updated = " + str(parameters_updated["W1"])) print("b1 updated = " + str(parameters_updated["b1"])) print("W2 updated = " + str(parameters_updated["W2"])) print("b2 updated = " + str(parameters_updated["b2"]))
W1 updated = [[ 0.01790427  0.00434496]
 [ 0.00099046 -0.01866419]]
b1 updated = [[-6.13449205e-07]
 [-8.47483463e-07]]
W2 updated = [[-0.00238219 -0.00323487]]
b2 updated = [[0.00094478]]
Expected Output¶

Note: the actual values can be different!

W1 updated = [[ 0.01790427  0.00434496]
 [ 0.00099046 -0.01866419]]
b1 updated = [[-6.13449205e-07]
 [-8.47483463e-07]]
W2 updated = [[-0.00238219 -0.00323487]]
b2 updated = [[0.00094478]]
In [23]:
Copied!
w3_unittest.test_update_parameters(update_parameters)
w3_unittest.test_update_parameters(update_parameters)
 All tests passed

3.4 - Integrate parts 3.1, 3.2 and 3.3 in nn_model()¶

Exercise 7¶

Build your neural network model in nn_model().

Instructions: The neural network model has to use the previous functions in the right order.

In [24]:
graded
Copied!
# GRADED FUNCTION: nn_model

def nn_model(X, Y, n_h, num_iterations=10, learning_rate=1.2, print_cost=False):
    """
    Arguments:
    X -- dataset of shape (n_x, number of examples)
    Y -- labels of shape (n_y, number of examples)
    num_iterations -- number of iterations in the loop
    learning_rate -- learning rate parameter for gradient descent
    print_cost -- if True, print the cost every iteration
    
    Returns:
    parameters -- parameters learnt by the model. They can then be used to predict.
    """
    
    n_x = layer_sizes(X, Y)[0]
    n_y = layer_sizes(X, Y)[2]
    
    # Initialize parameters.
    ### START CODE HERE ### (~ 1 line of code)
    parameters = initialize_parameters(n_x,n_h,n_y)
    ### END CODE HERE ###
    
    # Loop.
    for i in range(0, num_iterations):
         
        ### START CODE HERE ### (~ 4 lines of code)
        # Forward propagation. Inputs: "X, parameters". Outputs: "A2, cache".
        A2, cache = forward_propagation(X, parameters)
        
        # Cost function. Inputs: "A2, Y". Outputs: "cost".
        cost = compute_cost(A2, Y)
        
        # Backpropagation. Inputs: "parameters, cache, X, Y". Outputs: "grads".
        grads = backward_propagation(parameters, cache, X, Y)
        
        # Gradient descent parameter update. Inputs: "parameters, grads, learning_rate". Outputs: "parameters".
        parameters = update_parameters(parameters, grads, learning_rate)
        ### END CODE HERE ###
        
        # Print the cost every iteration.
        if print_cost:
            print ("Cost after iteration %i: %f" %(i, cost))

    return parameters
# GRADED FUNCTION: nn_model def nn_model(X, Y, n_h, num_iterations=10, learning_rate=1.2, print_cost=False): """ Arguments: X -- dataset of shape (n_x, number of examples) Y -- labels of shape (n_y, number of examples) num_iterations -- number of iterations in the loop learning_rate -- learning rate parameter for gradient descent print_cost -- if True, print the cost every iteration Returns: parameters -- parameters learnt by the model. They can then be used to predict. """ n_x = layer_sizes(X, Y)[0] n_y = layer_sizes(X, Y)[2] # Initialize parameters. ### START CODE HERE ### (~ 1 line of code) parameters = initialize_parameters(n_x,n_h,n_y) ### END CODE HERE ### # Loop. for i in range(0, num_iterations): ### START CODE HERE ### (~ 4 lines of code) # Forward propagation. Inputs: "X, parameters". Outputs: "A2, cache". A2, cache = forward_propagation(X, parameters) # Cost function. Inputs: "A2, Y". Outputs: "cost". cost = compute_cost(A2, Y) # Backpropagation. Inputs: "parameters, cache, X, Y". Outputs: "grads". grads = backward_propagation(parameters, cache, X, Y) # Gradient descent parameter update. Inputs: "parameters, grads, learning_rate". Outputs: "parameters". parameters = update_parameters(parameters, grads, learning_rate) ### END CODE HERE ### # Print the cost every iteration. if print_cost: print ("Cost after iteration %i: %f" %(i, cost)) return parameters
In [25]:
graded
Copied!
parameters = nn_model(X, Y, n_h=2, num_iterations=3000, learning_rate=1.2, print_cost=True)
print("W1 = " + str(parameters["W1"]))
print("b1 = " + str(parameters["b1"]))
print("W2 = " + str(parameters["W2"]))
print("b2 = " + str(parameters["b2"]))

W1 = parameters["W1"]
b1 = parameters["b1"]
W2 = parameters["W2"]
b2 = parameters["b2"]
parameters = nn_model(X, Y, n_h=2, num_iterations=3000, learning_rate=1.2, print_cost=True) print("W1 = " + str(parameters["W1"])) print("b1 = " + str(parameters["b1"])) print("W2 = " + str(parameters["W2"])) print("b2 = " + str(parameters["b2"])) W1 = parameters["W1"] b1 = parameters["b1"] W2 = parameters["W2"] b2 = parameters["b2"]
Cost after iteration 0: -0.006110
Cost after iteration 1: -0.003385
Cost after iteration 2: -0.001868
Cost after iteration 3: -0.001024
Cost after iteration 4: -0.000554
Cost after iteration 5: -0.000293
Cost after iteration 6: -0.000148
Cost after iteration 7: -0.000068
Cost after iteration 8: -0.000023
Cost after iteration 9: 0.000001
Cost after iteration 10: 0.000014
Cost after iteration 11: 0.000021
Cost after iteration 12: 0.000024
Cost after iteration 13: 0.000026
Cost after iteration 14: 0.000026
Cost after iteration 15: 0.000026
Cost after iteration 16: 0.000025
Cost after iteration 17: 0.000025
Cost after iteration 18: 0.000024
Cost after iteration 19: 0.000023
Cost after iteration 20: 0.000022
Cost after iteration 21: 0.000021
Cost after iteration 22: 0.000020
Cost after iteration 23: 0.000019
Cost after iteration 24: 0.000018
Cost after iteration 25: 0.000017
Cost after iteration 26: 0.000016
Cost after iteration 27: 0.000015
Cost after iteration 28: 0.000014
Cost after iteration 29: 0.000013
Cost after iteration 30: 0.000012
Cost after iteration 31: 0.000011
Cost after iteration 32: 0.000011
Cost after iteration 33: 0.000010
Cost after iteration 34: 0.000009
Cost after iteration 35: 0.000008
Cost after iteration 36: 0.000007
Cost after iteration 37: 0.000006
Cost after iteration 38: 0.000005
Cost after iteration 39: 0.000004
Cost after iteration 40: 0.000003
Cost after iteration 41: 0.000002
Cost after iteration 42: 0.000001
Cost after iteration 43: -0.000000
Cost after iteration 44: -0.000001
Cost after iteration 45: -0.000002
Cost after iteration 46: -0.000003
Cost after iteration 47: -0.000004
Cost after iteration 48: -0.000005
Cost after iteration 49: -0.000006
Cost after iteration 50: -0.000007
Cost after iteration 51: -0.000008
Cost after iteration 52: -0.000009
Cost after iteration 53: -0.000010
Cost after iteration 54: -0.000011
Cost after iteration 55: -0.000012
Cost after iteration 56: -0.000013
Cost after iteration 57: -0.000014
Cost after iteration 58: -0.000015
Cost after iteration 59: -0.000016
Cost after iteration 60: -0.000017
Cost after iteration 61: -0.000018
Cost after iteration 62: -0.000019
Cost after iteration 63: -0.000020
Cost after iteration 64: -0.000021
Cost after iteration 65: -0.000022
Cost after iteration 66: -0.000023
Cost after iteration 67: -0.000024
Cost after iteration 68: -0.000025
Cost after iteration 69: -0.000026
Cost after iteration 70: -0.000027
Cost after iteration 71: -0.000028
Cost after iteration 72: -0.000029
Cost after iteration 73: -0.000030
Cost after iteration 74: -0.000031
Cost after iteration 75: -0.000032
Cost after iteration 76: -0.000033
Cost after iteration 77: -0.000034
Cost after iteration 78: -0.000035
Cost after iteration 79: -0.000036
Cost after iteration 80: -0.000037
Cost after iteration 81: -0.000038
Cost after iteration 82: -0.000039
Cost after iteration 83: -0.000041
Cost after iteration 84: -0.000042
Cost after iteration 85: -0.000043
Cost after iteration 86: -0.000044
Cost after iteration 87: -0.000045
Cost after iteration 88: -0.000046
Cost after iteration 89: -0.000047
Cost after iteration 90: -0.000048
Cost after iteration 91: -0.000049
Cost after iteration 92: -0.000050
Cost after iteration 93: -0.000052
Cost after iteration 94: -0.000053
Cost after iteration 95: -0.000054
Cost after iteration 96: -0.000055
Cost after iteration 97: -0.000056
Cost after iteration 98: -0.000057
Cost after iteration 99: -0.000059
Cost after iteration 100: -0.000060
Cost after iteration 101: -0.000061
Cost after iteration 102: -0.000062
Cost after iteration 103: -0.000063
Cost after iteration 104: -0.000064
Cost after iteration 105: -0.000066
Cost after iteration 106: -0.000067
Cost after iteration 107: -0.000068
Cost after iteration 108: -0.000069
Cost after iteration 109: -0.000071
Cost after iteration 110: -0.000072
Cost after iteration 111: -0.000073
Cost after iteration 112: -0.000074
Cost after iteration 113: -0.000076
Cost after iteration 114: -0.000077
Cost after iteration 115: -0.000078
Cost after iteration 116: -0.000079
Cost after iteration 117: -0.000081
Cost after iteration 118: -0.000082
Cost after iteration 119: -0.000083
Cost after iteration 120: -0.000085
Cost after iteration 121: -0.000086
Cost after iteration 122: -0.000087
Cost after iteration 123: -0.000089
Cost after iteration 124: -0.000090
Cost after iteration 125: -0.000091
Cost after iteration 126: -0.000093
Cost after iteration 127: -0.000094
Cost after iteration 128: -0.000096
Cost after iteration 129: -0.000097
Cost after iteration 130: -0.000098
Cost after iteration 131: -0.000100
Cost after iteration 132: -0.000101
Cost after iteration 133: -0.000103
Cost after iteration 134: -0.000104
Cost after iteration 135: -0.000106
Cost after iteration 136: -0.000107
Cost after iteration 137: -0.000109
Cost after iteration 138: -0.000110
Cost after iteration 139: -0.000112
Cost after iteration 140: -0.000113
Cost after iteration 141: -0.000115
Cost after iteration 142: -0.000116
Cost after iteration 143: -0.000118
Cost after iteration 144: -0.000119
Cost after iteration 145: -0.000121
Cost after iteration 146: -0.000123
Cost after iteration 147: -0.000124
Cost after iteration 148: -0.000126
Cost after iteration 149: -0.000128
Cost after iteration 150: -0.000129
Cost after iteration 151: -0.000131
Cost after iteration 152: -0.000133
Cost after iteration 153: -0.000134
Cost after iteration 154: -0.000136
Cost after iteration 155: -0.000138
Cost after iteration 156: -0.000140
Cost after iteration 157: -0.000142
Cost after iteration 158: -0.000143
Cost after iteration 159: -0.000145
Cost after iteration 160: -0.000147
Cost after iteration 161: -0.000149
Cost after iteration 162: -0.000151
Cost after iteration 163: -0.000153
Cost after iteration 164: -0.000155
Cost after iteration 165: -0.000157
Cost after iteration 166: -0.000159
Cost after iteration 167: -0.000161
Cost after iteration 168: -0.000163
Cost after iteration 169: -0.000165
Cost after iteration 170: -0.000167
Cost after iteration 171: -0.000169
Cost after iteration 172: -0.000172
Cost after iteration 173: -0.000174
Cost after iteration 174: -0.000176
Cost after iteration 175: -0.000179
Cost after iteration 176: -0.000181
Cost after iteration 177: -0.000183
Cost after iteration 178: -0.000186
Cost after iteration 179: -0.000188
Cost after iteration 180: -0.000191
Cost after iteration 181: -0.000193
Cost after iteration 182: -0.000196
Cost after iteration 183: -0.000199
Cost after iteration 184: -0.000202
Cost after iteration 185: -0.000204
Cost after iteration 186: -0.000207
Cost after iteration 187: -0.000210
Cost after iteration 188: -0.000213
Cost after iteration 189: -0.000217
Cost after iteration 190: -0.000220
Cost after iteration 191: -0.000223
Cost after iteration 192: -0.000227
Cost after iteration 193: -0.000230
Cost after iteration 194: -0.000234
Cost after iteration 195: -0.000238
Cost after iteration 196: -0.000241
Cost after iteration 197: -0.000245
Cost after iteration 198: -0.000250
Cost after iteration 199: -0.000254
Cost after iteration 200: -0.000258
Cost after iteration 201: -0.000263
Cost after iteration 202: -0.000268
Cost after iteration 203: -0.000273
Cost after iteration 204: -0.000278
Cost after iteration 205: -0.000283
Cost after iteration 206: -0.000289
Cost after iteration 207: -0.000295
Cost after iteration 208: -0.000301
Cost after iteration 209: -0.000307
Cost after iteration 210: -0.000314
Cost after iteration 211: -0.000321
Cost after iteration 212: -0.000329
Cost after iteration 213: -0.000336
Cost after iteration 214: -0.000345
Cost after iteration 215: -0.000353
Cost after iteration 216: -0.000363
Cost after iteration 217: -0.000372
Cost after iteration 218: -0.000382
Cost after iteration 219: -0.000393
Cost after iteration 220: -0.000405
Cost after iteration 221: -0.000417
Cost after iteration 222: -0.000430
Cost after iteration 223: -0.000444
Cost after iteration 224: -0.000459
Cost after iteration 225: -0.000475
Cost after iteration 226: -0.000492
Cost after iteration 227: -0.000511
Cost after iteration 228: -0.000530
Cost after iteration 229: -0.000552
Cost after iteration 230: -0.000575
Cost after iteration 231: -0.000600
Cost after iteration 232: -0.000627
Cost after iteration 233: -0.000656
Cost after iteration 234: -0.000688
Cost after iteration 235: -0.000723
Cost after iteration 236: -0.000761
Cost after iteration 237: -0.000803
Cost after iteration 238: -0.000850
Cost after iteration 239: -0.000901
Cost after iteration 240: -0.000958
Cost after iteration 241: -0.001021
Cost after iteration 242: -0.001091
Cost after iteration 243: -0.001169
Cost after iteration 244: -0.001257
Cost after iteration 245: -0.001356
Cost after iteration 246: -0.001467
Cost after iteration 247: -0.001593
Cost after iteration 248: -0.001735
Cost after iteration 249: -0.001896
Cost after iteration 250: -0.002079
Cost after iteration 251: -0.002286
Cost after iteration 252: -0.002520
Cost after iteration 253: -0.002783
Cost after iteration 254: -0.003079
Cost after iteration 255: -0.003409
Cost after iteration 256: -0.003773
Cost after iteration 257: -0.004170
Cost after iteration 258: -0.004599
Cost after iteration 259: -0.005054
Cost after iteration 260: -0.005528
Cost after iteration 261: -0.006011
Cost after iteration 262: -0.006491
Cost after iteration 263: -0.006955
Cost after iteration 264: -0.007388
Cost after iteration 265: -0.007776
Cost after iteration 266: -0.008105
Cost after iteration 267: -0.008363
Cost after iteration 268: -0.008538
Cost after iteration 269: -0.008624
Cost after iteration 270: -0.008615
Cost after iteration 271: -0.008509
Cost after iteration 272: -0.008304
Cost after iteration 273: -0.008004
Cost after iteration 274: -0.007610
Cost after iteration 275: -0.007127
Cost after iteration 276: -0.006562
Cost after iteration 277: -0.005919
Cost after iteration 278: -0.005206
Cost after iteration 279: -0.004427
Cost after iteration 280: -0.003589
Cost after iteration 281: -0.002697
Cost after iteration 282: -0.001755
Cost after iteration 283: -0.000768
Cost after iteration 284: 0.000261
Cost after iteration 285: 0.001329
Cost after iteration 286: 0.002436
Cost after iteration 287: 0.003579
Cost after iteration 288: 0.004762
Cost after iteration 289: 0.005986
Cost after iteration 290: 0.007256
Cost after iteration 291: 0.008581
Cost after iteration 292: 0.009971
Cost after iteration 293: 0.011440
Cost after iteration 294: 0.013002
Cost after iteration 295: 0.014675
Cost after iteration 296: 0.016473
Cost after iteration 297: 0.018401
Cost after iteration 298: 0.020455
Cost after iteration 299: 0.022610
Cost after iteration 300: 0.024825
Cost after iteration 301: 0.027045
Cost after iteration 302: 0.029216
Cost after iteration 303: 0.031298
Cost after iteration 304: 0.033272
Cost after iteration 305: 0.035134
Cost after iteration 306: 0.036882
Cost after iteration 307: 0.038511
Cost after iteration 308: 0.040004
Cost after iteration 309: 0.041342
Cost after iteration 310: 0.042505
Cost after iteration 311: 0.043478
Cost after iteration 312: 0.044248
Cost after iteration 313: 0.044811
Cost after iteration 314: 0.045171
Cost after iteration 315: 0.045333
Cost after iteration 316: 0.045310
Cost after iteration 317: 0.045118
Cost after iteration 318: 0.044772
Cost after iteration 319: 0.044291
Cost after iteration 320: 0.043693
Cost after iteration 321: 0.042995
Cost after iteration 322: 0.042214
Cost after iteration 323: 0.041363
Cost after iteration 324: 0.040458
Cost after iteration 325: 0.039510
Cost after iteration 326: 0.038529
Cost after iteration 327: 0.037526
Cost after iteration 328: 0.036507
Cost after iteration 329: 0.035481
Cost after iteration 330: 0.034452
Cost after iteration 331: 0.033426
Cost after iteration 332: 0.032408
Cost after iteration 333: 0.031399
Cost after iteration 334: 0.030403
Cost after iteration 335: 0.029422
Cost after iteration 336: 0.028459
Cost after iteration 337: 0.027514
Cost after iteration 338: 0.026588
Cost after iteration 339: 0.025683
Cost after iteration 340: 0.024799
Cost after iteration 341: 0.023935
Cost after iteration 342: 0.023094
Cost after iteration 343: 0.022273
Cost after iteration 344: 0.021475
Cost after iteration 345: 0.020697
Cost after iteration 346: 0.019941
Cost after iteration 347: 0.019205
Cost after iteration 348: 0.018490
Cost after iteration 349: 0.017796
Cost after iteration 350: 0.017121
Cost after iteration 351: 0.016465
Cost after iteration 352: 0.015829
Cost after iteration 353: 0.015211
Cost after iteration 354: 0.014611
Cost after iteration 355: 0.014028
Cost after iteration 356: 0.013463
Cost after iteration 357: 0.012915
Cost after iteration 358: 0.012383
Cost after iteration 359: 0.011867
Cost after iteration 360: 0.011367
Cost after iteration 361: 0.010881
Cost after iteration 362: 0.010410
Cost after iteration 363: 0.009954
Cost after iteration 364: 0.009511
Cost after iteration 365: 0.009082
Cost after iteration 366: 0.008667
Cost after iteration 367: 0.008263
Cost after iteration 368: 0.007873
Cost after iteration 369: 0.007494
Cost after iteration 370: 0.007128
Cost after iteration 371: 0.006772
Cost after iteration 372: 0.006428
Cost after iteration 373: 0.006095
Cost after iteration 374: 0.005773
Cost after iteration 375: 0.005460
Cost after iteration 376: 0.005158
Cost after iteration 377: 0.004866
Cost after iteration 378: 0.004583
Cost after iteration 379: 0.004309
Cost after iteration 380: 0.004045
Cost after iteration 381: 0.003789
Cost after iteration 382: 0.003542
Cost after iteration 383: 0.003303
Cost after iteration 384: 0.003072
Cost after iteration 385: 0.002850
Cost after iteration 386: 0.002635
Cost after iteration 387: 0.002428
Cost after iteration 388: 0.002228
Cost after iteration 389: 0.002036
Cost after iteration 390: 0.001850
Cost after iteration 391: 0.001672
Cost after iteration 392: 0.001500
Cost after iteration 393: 0.001335
Cost after iteration 394: 0.001176
Cost after iteration 395: 0.001023
Cost after iteration 396: 0.000877
Cost after iteration 397: 0.000736
Cost after iteration 398: 0.000602
Cost after iteration 399: 0.000473
Cost after iteration 400: 0.000350
Cost after iteration 401: 0.000232
Cost after iteration 402: 0.000120
Cost after iteration 403: 0.000013
Cost after iteration 404: -0.000090
Cost after iteration 405: -0.000187
Cost after iteration 406: -0.000279
Cost after iteration 407: -0.000366
Cost after iteration 408: -0.000449
Cost after iteration 409: -0.000527
Cost after iteration 410: -0.000601
Cost after iteration 411: -0.000670
Cost after iteration 412: -0.000735
Cost after iteration 413: -0.000796
Cost after iteration 414: -0.000853
Cost after iteration 415: -0.000906
Cost after iteration 416: -0.000955
Cost after iteration 417: -0.001001
Cost after iteration 418: -0.001042
Cost after iteration 419: -0.001080
Cost after iteration 420: -0.001115
Cost after iteration 421: -0.001146
Cost after iteration 422: -0.001173
Cost after iteration 423: -0.001198
Cost after iteration 424: -0.001219
Cost after iteration 425: -0.001237
Cost after iteration 426: -0.001252
Cost after iteration 427: -0.001264
Cost after iteration 428: -0.001273
Cost after iteration 429: -0.001279
Cost after iteration 430: -0.001283
Cost after iteration 431: -0.001284
Cost after iteration 432: -0.001282
Cost after iteration 433: -0.001278
Cost after iteration 434: -0.001272
Cost after iteration 435: -0.001263
Cost after iteration 436: -0.001252
Cost after iteration 437: -0.001238
Cost after iteration 438: -0.001223
Cost after iteration 439: -0.001205
Cost after iteration 440: -0.001186
Cost after iteration 441: -0.001164
Cost after iteration 442: -0.001141
Cost after iteration 443: -0.001116
Cost after iteration 444: -0.001089
Cost after iteration 445: -0.001060
Cost after iteration 446: -0.001030
Cost after iteration 447: -0.000998
Cost after iteration 448: -0.000965
Cost after iteration 449: -0.000930
Cost after iteration 450: -0.000894
Cost after iteration 451: -0.000857
Cost after iteration 452: -0.000818
Cost after iteration 453: -0.000778
Cost after iteration 454: -0.000737
Cost after iteration 455: -0.000696
Cost after iteration 456: -0.000653
Cost after iteration 457: -0.000609
Cost after iteration 458: -0.000564
Cost after iteration 459: -0.000519
Cost after iteration 460: -0.000472
Cost after iteration 461: -0.000425
Cost after iteration 462: -0.000378
Cost after iteration 463: -0.000329
Cost after iteration 464: -0.000281
Cost after iteration 465: -0.000231
Cost after iteration 466: -0.000181
Cost after iteration 467: -0.000131
Cost after iteration 468: -0.000081
Cost after iteration 469: -0.000030
Cost after iteration 470: 0.000021
Cost after iteration 471: 0.000073
Cost after iteration 472: 0.000124
Cost after iteration 473: 0.000176
Cost after iteration 474: 0.000228
Cost after iteration 475: 0.000279
Cost after iteration 476: 0.000331
Cost after iteration 477: 0.000383
Cost after iteration 478: 0.000435
Cost after iteration 479: 0.000486
Cost after iteration 480: 0.000538
Cost after iteration 481: 0.000589
Cost after iteration 482: 0.000640
Cost after iteration 483: 0.000691
Cost after iteration 484: 0.000742
Cost after iteration 485: 0.000792
Cost after iteration 486: 0.000842
Cost after iteration 487: 0.000892
Cost after iteration 488: 0.000941
Cost after iteration 489: 0.000990
Cost after iteration 490: 0.001039
Cost after iteration 491: 0.001087
Cost after iteration 492: 0.001134
Cost after iteration 493: 0.001181
Cost after iteration 494: 0.001228
Cost after iteration 495: 0.001274
Cost after iteration 496: 0.001320
Cost after iteration 497: 0.001365
Cost after iteration 498: 0.001409
Cost after iteration 499: 0.001453
Cost after iteration 500: 0.001496
Cost after iteration 501: 0.001539
Cost after iteration 502: 0.001581
Cost after iteration 503: 0.001622
Cost after iteration 504: 0.001663
Cost after iteration 505: 0.001703
Cost after iteration 506: 0.001743
Cost after iteration 507: 0.001782
Cost after iteration 508: 0.001820
Cost after iteration 509: 0.001858
Cost after iteration 510: 0.001894
Cost after iteration 511: 0.001931
Cost after iteration 512: 0.001966
Cost after iteration 513: 0.002002
Cost after iteration 514: 0.002033
Cost after iteration 515: 0.002073
Cost after iteration 516: 0.002095
Cost after iteration 517: 0.002148
Cost after iteration 518: 0.002139
Cost after iteration 519: 0.002248
Cost after iteration 520: 0.002125
Cost after iteration 521: 0.002458
Cost after iteration 522: 0.001885
Cost after iteration 523: 0.003113
Cost after iteration 524: 0.000733
Cost after iteration 525: 0.005620
Cost after iteration 526: -0.004213
Cost after iteration 527: 0.015984
Cost after iteration 528: -0.025523
Cost after iteration 529: 0.062823
Cost after iteration 530: -0.139427
Cost after iteration 531: 0.448499
Cost after iteration 532: -0.733372
Cost after iteration 533: 1.316679
Cost after iteration 534: -0.036884
Cost after iteration 535: -0.038385
Cost after iteration 536: -0.041186
Cost after iteration 537: -0.029265
Cost after iteration 538: -0.036512
Cost after iteration 539: -0.024970
Cost after iteration 540: -0.031464
Cost after iteration 541: -0.021597
Cost after iteration 542: -0.027179
Cost after iteration 543: -0.018580
Cost after iteration 544: -0.023661
Cost after iteration 545: -0.015820
Cost after iteration 546: -0.020789
Cost after iteration 547: -0.013274
Cost after iteration 548: -0.018465
Cost after iteration 549: -0.010893
Cost after iteration 550: -0.016634
Cost after iteration 551: -0.008611
Cost after iteration 552: -0.015278
Cost after iteration 553: -0.006338
Cost after iteration 554: -0.014421
Cost after iteration 555: -0.003953
Cost after iteration 556: -0.014147
Cost after iteration 557: -0.001272
Cost after iteration 558: -0.014620
Cost after iteration 559: 0.001983
Cost after iteration 560: -0.016141
Cost after iteration 561: 0.006271
Cost after iteration 562: -0.019242
Cost after iteration 563: 0.012374
Cost after iteration 564: -0.024870
Cost after iteration 565: 0.021694
Cost after iteration 566: -0.034727
Cost after iteration 567: 0.036895
Cost after iteration 568: -0.051822
Cost after iteration 569: 0.063061
Cost after iteration 570: -0.079696
Cost after iteration 571: 0.106195
Cost after iteration 572: -0.113940
Cost after iteration 573: 0.155437
Cost after iteration 574: -0.139886
Cost after iteration 575: 0.176298
Cost after iteration 576: -0.150973
Cost after iteration 577: 0.173169
Cost after iteration 578: -0.176356
Cost after iteration 579: 0.216804
Cost after iteration 580: -0.252699
Cost after iteration 581: 0.320606
Cost after iteration 582: -0.292918
Cost after iteration 583: 0.261081
Cost after iteration 584: -0.188249
Cost after iteration 585: 0.113602
Cost after iteration 586: -0.095153
Cost after iteration 587: 0.044753
Cost after iteration 588: -0.048699
Cost after iteration 589: 0.019948
Cost after iteration 590: -0.031814
Cost after iteration 591: 0.010168
Cost after iteration 592: -0.023484
Cost after iteration 593: 0.005437
Cost after iteration 594: -0.018373
Cost after iteration 595: 0.002930
Cost after iteration 596: -0.014934
Cost after iteration 597: 0.001606
Cost after iteration 598: -0.012505
Cost after iteration 599: 0.000973
Cost after iteration 600: -0.010738
Cost after iteration 601: 0.000769
Cost after iteration 602: -0.009431
Cost after iteration 603: 0.000851
Cost after iteration 604: -0.008468
Cost after iteration 605: 0.001138
Cost after iteration 606: -0.007784
Cost after iteration 607: 0.001596
Cost after iteration 608: -0.007350
Cost after iteration 609: 0.002219
Cost after iteration 610: -0.007165
Cost after iteration 611: 0.003029
Cost after iteration 612: -0.007258
Cost after iteration 613: 0.004081
Cost after iteration 614: -0.007694
Cost after iteration 615: 0.005471
Cost after iteration 616: -0.008592
Cost after iteration 617: 0.007363
Cost after iteration 618: -0.010150
Cost after iteration 619: 0.010025
Cost after iteration 620: -0.012706
Cost after iteration 621: 0.013919
Cost after iteration 622: -0.016839
Cost after iteration 623: 0.019845
Cost after iteration 624: -0.023568
Cost after iteration 625: 0.029269
Cost after iteration 626: -0.034763
Cost after iteration 627: 0.045008
Cost after iteration 628: -0.053955
Cost after iteration 629: 0.072680
Cost after iteration 630: -0.087077
Cost after iteration 631: 0.122131
Cost after iteration 632: -0.138343
Cost after iteration 633: 0.201790
Cost after iteration 634: -0.217638
Cost after iteration 635: 0.347757
Cost after iteration 636: -0.398656
Cost after iteration 637: 0.607390
Cost after iteration 638: -0.294902
Cost after iteration 639: 0.232367
Cost after iteration 640: -0.148671
Cost after iteration 641: 0.066478
Cost after iteration 642: -0.063923
Cost after iteration 643: 0.018599
Cost after iteration 644: -0.033062
Cost after iteration 645: 0.002797
Cost after iteration 646: -0.021312
Cost after iteration 647: -0.002862
Cost after iteration 648: -0.015504
Cost after iteration 649: -0.004963
Cost after iteration 650: -0.012103
Cost after iteration 651: -0.005599
Cost after iteration 652: -0.009863
Cost after iteration 653: -0.005578
Cost after iteration 654: -0.008250
Cost after iteration 655: -0.005254
Cost after iteration 656: -0.007006
Cost after iteration 657: -0.004793
Cost after iteration 658: -0.005998
Cost after iteration 659: -0.004281
Cost after iteration 660: -0.005153
Cost after iteration 661: -0.003758
Cost after iteration 662: -0.004426
Cost after iteration 663: -0.003247
Cost after iteration 664: -0.003792
Cost after iteration 665: -0.002755
Cost after iteration 666: -0.003233
Cost after iteration 667: -0.002288
Cost after iteration 668: -0.002738
Cost after iteration 669: -0.001845
Cost after iteration 670: -0.002299
Cost after iteration 671: -0.001425
Cost after iteration 672: -0.001912
Cost after iteration 673: -0.001022
Cost after iteration 674: -0.001575
Cost after iteration 675: -0.000631
Cost after iteration 676: -0.001288
Cost after iteration 677: -0.000243
Cost after iteration 678: -0.001058
Cost after iteration 679: 0.000154
Cost after iteration 680: -0.000895
Cost after iteration 681: 0.000582
Cost after iteration 682: -0.000821
Cost after iteration 683: 0.001073
Cost after iteration 684: -0.000874
Cost after iteration 685: 0.001683
Cost after iteration 686: -0.001123
Cost after iteration 687: 0.002508
Cost after iteration 688: -0.001696
Cost after iteration 689: 0.003727
Cost after iteration 690: -0.002834
Cost after iteration 691: 0.005676
Cost after iteration 692: -0.004998
Cost after iteration 693: 0.009014
Cost after iteration 694: -0.009112
Cost after iteration 695: 0.015079
Cost after iteration 696: -0.017089
Cost after iteration 697: 0.026734
Cost after iteration 698: -0.033125
Cost after iteration 699: 0.050666
Cost after iteration 700: -0.067708
Cost after iteration 701: 0.106531
Cost after iteration 702: -0.158704
Cost after iteration 703: 0.300726
Cost after iteration 704: -0.487258
Cost after iteration 705: 1.115513
Cost after iteration 706: -0.235622
Cost after iteration 707: 0.235093
Cost after iteration 708: -0.074296
Cost after iteration 709: 0.006576
Cost after iteration 710: -0.038883
Cost after iteration 711: -0.010588
Cost after iteration 712: -0.025650
Cost after iteration 713: -0.014807
Cost after iteration 714: -0.019845
Cost after iteration 715: -0.015093
Cost after iteration 716: -0.016625
Cost after iteration 717: -0.014176
Cost after iteration 718: -0.014421
Cost after iteration 719: -0.012929
Cost after iteration 720: -0.012689
Cost after iteration 721: -0.011640
Cost after iteration 722: -0.011224
Cost after iteration 723: -0.010408
Cost after iteration 724: -0.009941
Cost after iteration 725: -0.009264
Cost after iteration 726: -0.008797
Cost after iteration 727: -0.008215
Cost after iteration 728: -0.007770
Cost after iteration 729: -0.007257
Cost after iteration 730: -0.006842
Cost after iteration 731: -0.006386
Cost after iteration 732: -0.006003
Cost after iteration 733: -0.005593
Cost after iteration 734: -0.005243
Cost after iteration 735: -0.004872
Cost after iteration 736: -0.004553
Cost after iteration 737: -0.004218
Cost after iteration 738: -0.003928
Cost after iteration 739: -0.003622
Cost after iteration 740: -0.003360
Cost after iteration 741: -0.003080
Cost after iteration 742: -0.002844
Cost after iteration 743: -0.002587
Cost after iteration 744: -0.002375
Cost after iteration 745: -0.002137
Cost after iteration 746: -0.001949
Cost after iteration 747: -0.001727
Cost after iteration 748: -0.001563
Cost after iteration 749: -0.001351
Cost after iteration 750: -0.001212
Cost after iteration 751: -0.001005
Cost after iteration 752: -0.000896
Cost after iteration 753: -0.000684
Cost after iteration 754: -0.000614
Cost after iteration 755: -0.000382
Cost after iteration 756: -0.000367
Cost after iteration 757: -0.000091
Cost after iteration 758: -0.000163
Cost after iteration 759: 0.000204
Cost after iteration 760: -0.000015
Cost after iteration 761: 0.000531
Cost after iteration 762: 0.000042
Cost after iteration 763: 0.000942
Cost after iteration 764: -0.000065
Cost after iteration 765: 0.001548
Cost after iteration 766: -0.000493
Cost after iteration 767: 0.002587
Cost after iteration 768: -0.001596
Cost after iteration 769: 0.004593
Cost after iteration 770: -0.004174
Cost after iteration 771: 0.008793
Cost after iteration 772: -0.010104
Cost after iteration 773: 0.018128
Cost after iteration 774: -0.024006
Cost after iteration 775: 0.040370
Cost after iteration 776: -0.059206
Cost after iteration 777: 0.104386
Cost after iteration 778: -0.182714
Cost after iteration 779: 0.405727
Cost after iteration 780: -0.470780
Cost after iteration 781: 0.626908
Cost after iteration 782: -0.262575
Cost after iteration 783: 0.313981
Cost after iteration 784: -0.175559
Cost after iteration 785: 0.143060
Cost after iteration 786: -0.086782
Cost after iteration 787: 0.043847
Cost after iteration 788: -0.053617
Cost after iteration 789: 0.018057
Cost after iteration 790: -0.035906
Cost after iteration 791: 0.006060
Cost after iteration 792: -0.025396
Cost after iteration 793: 0.000028
Cost after iteration 794: -0.018997
Cost after iteration 795: -0.002916
Cost after iteration 796: -0.014919
Cost after iteration 797: -0.004238
Cost after iteration 798: -0.012162
Cost after iteration 799: -0.004701
Cost after iteration 800: -0.010182
Cost after iteration 801: -0.004705
Cost after iteration 802: -0.008681
Cost after iteration 803: -0.004462
Cost after iteration 804: -0.007492
Cost after iteration 805: -0.004091
Cost after iteration 806: -0.006518
Cost after iteration 807: -0.003655
Cost after iteration 808: -0.005702
Cost after iteration 809: -0.003190
Cost after iteration 810: -0.005008
Cost after iteration 811: -0.002714
Cost after iteration 812: -0.004414
Cost after iteration 813: -0.002237
Cost after iteration 814: -0.003908
Cost after iteration 815: -0.001760
Cost after iteration 816: -0.003482
Cost after iteration 817: -0.001280
Cost after iteration 818: -0.003136
Cost after iteration 819: -0.000788
Cost after iteration 820: -0.002876
Cost after iteration 821: -0.000269
Cost after iteration 822: -0.002712
Cost after iteration 823: 0.000298
Cost after iteration 824: -0.002670
Cost after iteration 825: 0.000951
Cost after iteration 826: -0.002788
Cost after iteration 827: 0.001745
Cost after iteration 828: -0.003132
Cost after iteration 829: 0.002770
Cost after iteration 830: -0.003815
Cost after iteration 831: 0.004176
Cost after iteration 832: -0.005021
Cost after iteration 833: 0.006214
Cost after iteration 834: -0.007074
Cost after iteration 835: 0.009321
Cost after iteration 836: -0.010538
Cost after iteration 837: 0.014277
Cost after iteration 838: -0.016436
Cost after iteration 839: 0.022535
Cost after iteration 840: -0.026700
Cost after iteration 841: 0.036964
Cost after iteration 842: -0.045272
Cost after iteration 843: 0.063792
Cost after iteration 844: -0.081063
Cost after iteration 845: 0.118553
Cost after iteration 846: -0.153835
Cost after iteration 847: 0.248365
Cost after iteration 848: -0.312999
Cost after iteration 849: 0.631565
Cost after iteration 850: -0.502117
Cost after iteration 851: 0.711097
Cost after iteration 852: -0.216877
Cost after iteration 853: 0.102810
Cost after iteration 854: -0.079645
Cost after iteration 855: 0.014905
Cost after iteration 856: -0.033590
Cost after iteration 857: -0.006052
Cost after iteration 858: -0.020311
Cost after iteration 859: -0.010992
Cost after iteration 860: -0.015459
Cost after iteration 861: -0.011720
Cost after iteration 862: -0.013046
Cost after iteration 863: -0.011250
Cost after iteration 864: -0.011480
Cost after iteration 865: -0.010437
Cost after iteration 866: -0.010268
Cost after iteration 867: -0.009550
Cost after iteration 868: -0.009238
Cost after iteration 869: -0.008680
Cost after iteration 870: -0.008325
Cost after iteration 871: -0.007856
Cost after iteration 872: -0.007498
Cost after iteration 873: -0.007087
Cost after iteration 874: -0.006742
Cost after iteration 875: -0.006374
Cost after iteration 876: -0.006049
Cost after iteration 877: -0.005715
Cost after iteration 878: -0.005411
Cost after iteration 879: -0.005106
Cost after iteration 880: -0.004824
Cost after iteration 881: -0.004544
Cost after iteration 882: -0.004283
Cost after iteration 883: -0.004026
Cost after iteration 884: -0.003784
Cost after iteration 885: -0.003548
Cost after iteration 886: -0.003324
Cost after iteration 887: -0.003106
Cost after iteration 888: -0.002900
Cost after iteration 889: -0.002699
Cost after iteration 890: -0.002508
Cost after iteration 891: -0.002322
Cost after iteration 892: -0.002146
Cost after iteration 893: -0.001974
Cost after iteration 894: -0.001811
Cost after iteration 895: -0.001653
Cost after iteration 896: -0.001502
Cost after iteration 897: -0.001356
Cost after iteration 898: -0.001216
Cost after iteration 899: -0.001080
Cost after iteration 900: -0.000951
Cost after iteration 901: -0.000825
Cost after iteration 902: -0.000706
Cost after iteration 903: -0.000589
Cost after iteration 904: -0.000478
Cost after iteration 905: -0.000370
Cost after iteration 906: -0.000267
Cost after iteration 907: -0.000166
Cost after iteration 908: -0.000071
Cost after iteration 909: 0.000023
Cost after iteration 910: 0.000111
Cost after iteration 911: 0.000199
Cost after iteration 912: 0.000280
Cost after iteration 913: 0.000363
Cost after iteration 914: 0.000437
Cost after iteration 915: 0.000515
Cost after iteration 916: 0.000583
Cost after iteration 917: 0.000658
Cost after iteration 918: 0.000719
Cost after iteration 919: 0.000791
Cost after iteration 920: 0.000845
Cost after iteration 921: 0.000917
Cost after iteration 922: 0.000961
Cost after iteration 923: 0.001036
Cost after iteration 924: 0.001068
Cost after iteration 925: 0.001150
Cost after iteration 926: 0.001163
Cost after iteration 927: 0.001261
Cost after iteration 928: 0.001246
Cost after iteration 929: 0.001374
Cost after iteration 930: 0.001311
Cost after iteration 931: 0.001497
Cost after iteration 932: 0.001349
Cost after iteration 933: 0.001645
Cost after iteration 934: 0.001340
Cost after iteration 935: 0.001845
Cost after iteration 936: 0.001244
Cost after iteration 937: 0.002155
Cost after iteration 938: 0.000982
Cost after iteration 939: 0.002691
Cost after iteration 940: 0.000387
Cost after iteration 941: 0.003694
Cost after iteration 942: -0.000890
Cost after iteration 943: 0.005675
Cost after iteration 944: -0.003597
Cost after iteration 945: 0.009737
Cost after iteration 946: -0.009356
Cost after iteration 947: 0.018317
Cost after iteration 948: -0.021809
Cost after iteration 949: 0.037091
Cost after iteration 950: -0.049736
Cost after iteration 951: 0.081691
Cost after iteration 952: -0.120704
Cost after iteration 953: 0.226062
Cost after iteration 954: -0.388987
Cost after iteration 955: 0.861571
Cost after iteration 956: -0.418904
Cost after iteration 957: 0.662295
Cost after iteration 958: -0.166095
Cost after iteration 959: 0.076757
Cost after iteration 960: -0.083090
Cost after iteration 961: 0.016754
Cost after iteration 962: -0.040743
Cost after iteration 963: -0.005096
Cost after iteration 964: -0.024537
Cost after iteration 965: -0.011657
Cost after iteration 966: -0.018258
Cost after iteration 967: -0.012993
Cost after iteration 968: -0.015205
Cost after iteration 969: -0.012693
Cost after iteration 970: -0.013306
Cost after iteration 971: -0.011889
Cost after iteration 972: -0.011890
Cost after iteration 973: -0.010954
Cost after iteration 974: -0.010715
Cost after iteration 975: -0.010015
Cost after iteration 976: -0.009687
Cost after iteration 977: -0.009119
Cost after iteration 978: -0.008764
Cost after iteration 979: -0.008278
Cost after iteration 980: -0.007925
Cost after iteration 981: -0.007496
Cost after iteration 982: -0.007157
Cost after iteration 983: -0.006772
Cost after iteration 984: -0.006452
Cost after iteration 985: -0.006103
Cost after iteration 986: -0.005803
Cost after iteration 987: -0.005484
Cost after iteration 988: -0.005205
Cost after iteration 989: -0.004913
Cost after iteration 990: -0.004654
Cost after iteration 991: -0.004385
Cost after iteration 992: -0.004145
Cost after iteration 993: -0.003898
Cost after iteration 994: -0.003676
Cost after iteration 995: -0.003447
Cost after iteration 996: -0.003242
Cost after iteration 997: -0.003031
Cost after iteration 998: -0.002842
Cost after iteration 999: -0.002646
Cost after iteration 1000: -0.002472
Cost after iteration 1001: -0.002289
Cost after iteration 1002: -0.002130
Cost after iteration 1003: -0.001960
Cost after iteration 1004: -0.001814
Cost after iteration 1005: -0.001654
Cost after iteration 1006: -0.001522
Cost after iteration 1007: -0.001371
Cost after iteration 1008: -0.001253
Cost after iteration 1009: -0.001108
Cost after iteration 1010: -0.001003
Cost after iteration 1011: -0.000864
Cost after iteration 1012: -0.000774
Cost after iteration 1013: -0.000636
Cost after iteration 1014: -0.000562
Cost after iteration 1015: -0.000422
Cost after iteration 1016: -0.000369
Cost after iteration 1017: -0.000220
Cost after iteration 1018: -0.000194
Cost after iteration 1019: -0.000027
Cost after iteration 1020: -0.000038
Cost after iteration 1021: 0.000162
Cost after iteration 1022: 0.000094
Cost after iteration 1023: 0.000354
Cost after iteration 1024: 0.000197
Cost after iteration 1025: 0.000558
Cost after iteration 1026: 0.000258
Cost after iteration 1027: 0.000795
Cost after iteration 1028: 0.000253
Cost after iteration 1029: 0.001098
Cost after iteration 1030: 0.000140
Cost after iteration 1031: 0.001527
Cost after iteration 1032: -0.000164
Cost after iteration 1033: 0.002196
Cost after iteration 1034: -0.000814
Cost after iteration 1035: 0.003321
Cost after iteration 1036: -0.002108
Cost after iteration 1037: 0.005319
Cost after iteration 1038: -0.004628
Cost after iteration 1039: 0.009012
Cost after iteration 1040: -0.009535
Cost after iteration 1041: 0.016057
Cost after iteration 1042: -0.019207
Cost after iteration 1043: 0.029961
Cost after iteration 1044: -0.038845
Cost after iteration 1045: 0.059264
Cost after iteration 1046: -0.082156
Cost after iteration 1047: 0.135244
Cost after iteration 1048: -0.212280
Cost after iteration 1049: 0.429218
Cost after iteration 1050: -0.465660
Cost after iteration 1051: 0.755148
Cost after iteration 1052: -0.335932
Cost after iteration 1053: 0.408880
Cost after iteration 1054: -0.135477
Cost after iteration 1055: 0.058406
Cost after iteration 1056: -0.058968
Cost after iteration 1057: 0.008496
Cost after iteration 1058: -0.030694
Cost after iteration 1059: -0.006119
Cost after iteration 1060: -0.020101
Cost after iteration 1061: -0.010394
Cost after iteration 1062: -0.015545
Cost after iteration 1063: -0.011282
Cost after iteration 1064: -0.013142
Cost after iteration 1065: -0.011018
Cost after iteration 1066: -0.011585
Cost after iteration 1067: -0.010366
Cost after iteration 1068: -0.010405
Cost after iteration 1069: -0.009599
Cost after iteration 1070: -0.009419
Cost after iteration 1071: -0.008820
Cost after iteration 1072: -0.008553
Cost after iteration 1073: -0.008068
Cost after iteration 1074: -0.007771
Cost after iteration 1075: -0.007357
Cost after iteration 1076: -0.007056
Cost after iteration 1077: -0.006691
Cost after iteration 1078: -0.006398
Cost after iteration 1079: -0.006069
Cost after iteration 1080: -0.005791
Cost after iteration 1081: -0.005491
Cost after iteration 1082: -0.005228
Cost after iteration 1083: -0.004954
Cost after iteration 1084: -0.004707
Cost after iteration 1085: -0.004454
Cost after iteration 1086: -0.004224
Cost after iteration 1087: -0.003990
Cost after iteration 1088: -0.003775
Cost after iteration 1089: -0.003559
Cost after iteration 1090: -0.003359
Cost after iteration 1091: -0.003159
Cost after iteration 1092: -0.002973
Cost after iteration 1093: -0.002787
Cost after iteration 1094: -0.002614
Cost after iteration 1095: -0.002441
Cost after iteration 1096: -0.002280
Cost after iteration 1097: -0.002120
Cost after iteration 1098: -0.001971
Cost after iteration 1099: -0.001821
Cost after iteration 1100: -0.001682
Cost after iteration 1101: -0.001543
Cost after iteration 1102: -0.001414
Cost after iteration 1103: -0.001284
Cost after iteration 1104: -0.001165
Cost after iteration 1105: -0.001044
Cost after iteration 1106: -0.000933
Cost after iteration 1107: -0.000819
Cost after iteration 1108: -0.000717
Cost after iteration 1109: -0.000610
Cost after iteration 1110: -0.000515
Cost after iteration 1111: -0.000415
Cost after iteration 1112: -0.000328
Cost after iteration 1113: -0.000233
Cost after iteration 1114: -0.000153
Cost after iteration 1115: -0.000062
Cost after iteration 1116: 0.000009
Cost after iteration 1117: 0.000097
Cost after iteration 1118: 0.000161
Cost after iteration 1119: 0.000247
Cost after iteration 1120: 0.000301
Cost after iteration 1121: 0.000388
Cost after iteration 1122: 0.000431
Cost after iteration 1123: 0.000521
Cost after iteration 1124: 0.000551
Cost after iteration 1125: 0.000649
Cost after iteration 1126: 0.000659
Cost after iteration 1127: 0.000772
Cost after iteration 1128: 0.000756
Cost after iteration 1129: 0.000895
Cost after iteration 1130: 0.000837
Cost after iteration 1131: 0.001023
Cost after iteration 1132: 0.000897
Cost after iteration 1133: 0.001162
Cost after iteration 1134: 0.000927
Cost after iteration 1135: 0.001329
Cost after iteration 1136: 0.000909
Cost after iteration 1137: 0.001547
Cost after iteration 1138: 0.000808
Cost after iteration 1139: 0.001865
Cost after iteration 1140: 0.000562
Cost after iteration 1141: 0.002369
Cost after iteration 1142: 0.000052
Cost after iteration 1143: 0.003225
Cost after iteration 1144: -0.000951
Cost after iteration 1145: 0.004754
Cost after iteration 1146: -0.002893
Cost after iteration 1147: 0.007588
Cost after iteration 1148: -0.006656
Cost after iteration 1149: 0.013001
Cost after iteration 1150: -0.014019
Cost after iteration 1151: 0.023634
Cost after iteration 1152: -0.028722
Cost after iteration 1153: 0.045351
Cost after iteration 1154: -0.059496
Cost after iteration 1155: 0.093206
Cost after iteration 1156: -0.130485
Cost after iteration 1157: 0.221994
Cost after iteration 1158: -0.292730
Cost after iteration 1159: 0.658923
Cost after iteration 1160: -0.610109
Cost after iteration 1161: 1.164123
Cost after iteration 1162: -0.013590
Cost after iteration 1163: -0.035949
Cost after iteration 1164: -0.023863
Cost after iteration 1165: -0.026127
Cost after iteration 1166: -0.023608
Cost after iteration 1167: -0.023007
Cost after iteration 1168: -0.021810
Cost after iteration 1169: -0.020914
Cost after iteration 1170: -0.019981
Cost after iteration 1171: -0.019129
Cost after iteration 1172: -0.018309
Cost after iteration 1173: -0.017534
Cost after iteration 1174: -0.016795
Cost after iteration 1175: -0.016090
Cost after iteration 1176: -0.015418
Cost after iteration 1177: -0.014775
Cost after iteration 1178: -0.014161
Cost after iteration 1179: -0.013572
Cost after iteration 1180: -0.013009
Cost after iteration 1181: -0.012468
Cost after iteration 1182: -0.011950
Cost after iteration 1183: -0.011452
Cost after iteration 1184: -0.010974
Cost after iteration 1185: -0.010515
Cost after iteration 1186: -0.010074
Cost after iteration 1187: -0.009650
Cost after iteration 1188: -0.009242
Cost after iteration 1189: -0.008849
Cost after iteration 1190: -0.008472
Cost after iteration 1191: -0.008108
Cost after iteration 1192: -0.007758
Cost after iteration 1193: -0.007421
Cost after iteration 1194: -0.007097
Cost after iteration 1195: -0.006784
Cost after iteration 1196: -0.006483
Cost after iteration 1197: -0.006193
Cost after iteration 1198: -0.005913
Cost after iteration 1199: -0.005644
Cost after iteration 1200: -0.005384
Cost after iteration 1201: -0.005134
Cost after iteration 1202: -0.004893
Cost after iteration 1203: -0.004660
Cost after iteration 1204: -0.004436
Cost after iteration 1205: -0.004220
Cost after iteration 1206: -0.004012
Cost after iteration 1207: -0.003811
Cost after iteration 1208: -0.003617
Cost after iteration 1209: -0.003430
Cost after iteration 1210: -0.003249
Cost after iteration 1211: -0.003075
Cost after iteration 1212: -0.002908
Cost after iteration 1213: -0.002746
Cost after iteration 1214: -0.002589
Cost after iteration 1215: -0.002438
Cost after iteration 1216: -0.002293
Cost after iteration 1217: -0.002152
Cost after iteration 1218: -0.002017
Cost after iteration 1219: -0.001886
Cost after iteration 1220: -0.001759
Cost after iteration 1221: -0.001637
Cost after iteration 1222: -0.001519
Cost after iteration 1223: -0.001405
Cost after iteration 1224: -0.001295
Cost after iteration 1225: -0.001188
Cost after iteration 1226: -0.001086
Cost after iteration 1227: -0.000986
Cost after iteration 1228: -0.000890
Cost after iteration 1229: -0.000797
Cost after iteration 1230: -0.000707
Cost after iteration 1231: -0.000620
Cost after iteration 1232: -0.000536
Cost after iteration 1233: -0.000455
Cost after iteration 1234: -0.000376
Cost after iteration 1235: -0.000300
Cost after iteration 1236: -0.000226
Cost after iteration 1237: -0.000155
Cost after iteration 1238: -0.000086
Cost after iteration 1239: -0.000019
Cost after iteration 1240: 0.000046
Cost after iteration 1241: 0.000109
Cost after iteration 1242: 0.000170
Cost after iteration 1243: 0.000229
Cost after iteration 1244: 0.000287
Cost after iteration 1245: 0.000342
Cost after iteration 1246: 0.000396
Cost after iteration 1247: 0.000448
Cost after iteration 1248: 0.000499
Cost after iteration 1249: 0.000548
Cost after iteration 1250: 0.000596
Cost after iteration 1251: 0.000643
Cost after iteration 1252: 0.000688
Cost after iteration 1253: 0.000732
Cost after iteration 1254: 0.000774
Cost after iteration 1255: 0.000816
Cost after iteration 1256: 0.000856
Cost after iteration 1257: 0.000895
Cost after iteration 1258: 0.000933
Cost after iteration 1259: 0.000970
Cost after iteration 1260: 0.001006
Cost after iteration 1261: 0.001041
Cost after iteration 1262: 0.001075
Cost after iteration 1263: 0.001109
Cost after iteration 1264: 0.001141
Cost after iteration 1265: 0.001173
Cost after iteration 1266: 0.001203
Cost after iteration 1267: 0.001233
Cost after iteration 1268: 0.001262
Cost after iteration 1269: 0.001291
Cost after iteration 1270: 0.001318
Cost after iteration 1271: 0.001345
Cost after iteration 1272: 0.001372
Cost after iteration 1273: 0.001397
Cost after iteration 1274: 0.001422
Cost after iteration 1275: 0.001447
Cost after iteration 1276: 0.001471
Cost after iteration 1277: 0.001494
Cost after iteration 1278: 0.001517
Cost after iteration 1279: 0.001539
Cost after iteration 1280: 0.001561
Cost after iteration 1281: 0.001581
Cost after iteration 1282: 0.001603
Cost after iteration 1283: 0.001621
Cost after iteration 1284: 0.001644
Cost after iteration 1285: 0.001659
Cost after iteration 1286: 0.001684
Cost after iteration 1287: 0.001694
Cost after iteration 1288: 0.001724
Cost after iteration 1289: 0.001725
Cost after iteration 1290: 0.001766
Cost after iteration 1291: 0.001749
Cost after iteration 1292: 0.001813
Cost after iteration 1293: 0.001761
Cost after iteration 1294: 0.001874
Cost after iteration 1295: 0.001751
Cost after iteration 1296: 0.001966
Cost after iteration 1297: 0.001693
Cost after iteration 1298: 0.002123
Cost after iteration 1299: 0.001535
Cost after iteration 1300: 0.002424
Cost after iteration 1301: 0.001160
Cost after iteration 1302: 0.003043
Cost after iteration 1303: 0.000311
Cost after iteration 1304: 0.004371
Cost after iteration 1305: -0.001600
Cost after iteration 1306: 0.007296
Cost after iteration 1307: -0.005911
Cost after iteration 1308: 0.013883
Cost after iteration 1309: -0.015739
Cost after iteration 1310: 0.029095
Cost after iteration 1311: -0.038705
Cost after iteration 1312: 0.065964
Cost after iteration 1313: -0.097049
Cost after iteration 1314: 0.170856
Cost after iteration 1315: -0.254359
Cost after iteration 1316: 0.597570
Cost after iteration 1317: -0.619834
Cost after iteration 1318: 1.479987
Cost after iteration 1319: 0.291628
Cost after iteration 1320: -0.114393
Cost after iteration 1321: -0.003724
Cost after iteration 1322: -0.052759
Cost after iteration 1323: -0.025602
Cost after iteration 1324: -0.035679
Cost after iteration 1325: -0.028098
Cost after iteration 1326: -0.029499
Cost after iteration 1327: -0.026576
Cost after iteration 1328: -0.026034
Cost after iteration 1329: -0.024394
Cost after iteration 1330: -0.023467
Cost after iteration 1331: -0.022269
Cost after iteration 1332: -0.021326
Cost after iteration 1333: -0.020335
Cost after iteration 1334: -0.019459
Cost after iteration 1335: -0.018595
Cost after iteration 1336: -0.017798
Cost after iteration 1337: -0.017029
Cost after iteration 1338: -0.016305
Cost after iteration 1339: -0.015612
Cost after iteration 1340: -0.014955
Cost after iteration 1341: -0.014325
Cost after iteration 1342: -0.013726
Cost after iteration 1343: -0.013152
Cost after iteration 1344: -0.012604
Cost after iteration 1345: -0.012079
Cost after iteration 1346: -0.011576
Cost after iteration 1347: -0.011094
Cost after iteration 1348: -0.010631
Cost after iteration 1349: -0.010187
Cost after iteration 1350: -0.009761
Cost after iteration 1351: -0.009351
Cost after iteration 1352: -0.008957
Cost after iteration 1353: -0.008579
Cost after iteration 1354: -0.008215
Cost after iteration 1355: -0.007864
Cost after iteration 1356: -0.007527
Cost after iteration 1357: -0.007203
Cost after iteration 1358: -0.006891
Cost after iteration 1359: -0.006590
Cost after iteration 1360: -0.006300
Cost after iteration 1361: -0.006021
Cost after iteration 1362: -0.005752
Cost after iteration 1363: -0.005493
Cost after iteration 1364: -0.005243
Cost after iteration 1365: -0.005002
Cost after iteration 1366: -0.004770
Cost after iteration 1367: -0.004546
Cost after iteration 1368: -0.004331
Cost after iteration 1369: -0.004123
Cost after iteration 1370: -0.003922
Cost after iteration 1371: -0.003728
Cost after iteration 1372: -0.003542
Cost after iteration 1373: -0.003362
Cost after iteration 1374: -0.003188
Cost after iteration 1375: -0.003021
Cost after iteration 1376: -0.002859
Cost after iteration 1377: -0.002703
Cost after iteration 1378: -0.002553
Cost after iteration 1379: -0.002407
Cost after iteration 1380: -0.002267
Cost after iteration 1381: -0.002132
Cost after iteration 1382: -0.002001
Cost after iteration 1383: -0.001875
Cost after iteration 1384: -0.001753
Cost after iteration 1385: -0.001635
Cost after iteration 1386: -0.001522
Cost after iteration 1387: -0.001412
Cost after iteration 1388: -0.001306
Cost after iteration 1389: -0.001203
Cost after iteration 1390: -0.001104
Cost after iteration 1391: -0.001009
Cost after iteration 1392: -0.000916
Cost after iteration 1393: -0.000826
Cost after iteration 1394: -0.000740
Cost after iteration 1395: -0.000656
Cost after iteration 1396: -0.000575
Cost after iteration 1397: -0.000497
Cost after iteration 1398: -0.000421
Cost after iteration 1399: -0.000348
Cost after iteration 1400: -0.000277
Cost after iteration 1401: -0.000208
Cost after iteration 1402: -0.000141
Cost after iteration 1403: -0.000076
Cost after iteration 1404: -0.000014
Cost after iteration 1405: 0.000047
Cost after iteration 1406: 0.000105
Cost after iteration 1407: 0.000162
Cost after iteration 1408: 0.000217
Cost after iteration 1409: 0.000271
Cost after iteration 1410: 0.000322
Cost after iteration 1411: 0.000374
Cost after iteration 1412: 0.000421
Cost after iteration 1413: 0.000471
Cost after iteration 1414: 0.000513
Cost after iteration 1415: 0.000563
Cost after iteration 1416: 0.000600
Cost after iteration 1417: 0.000650
Cost after iteration 1418: 0.000682
Cost after iteration 1419: 0.000733
Cost after iteration 1420: 0.000756
Cost after iteration 1421: 0.000815
Cost after iteration 1422: 0.000825
Cost after iteration 1423: 0.000895
Cost after iteration 1424: 0.000884
Cost after iteration 1425: 0.000978
Cost after iteration 1426: 0.000931
Cost after iteration 1427: 0.001070
Cost after iteration 1428: 0.000958
Cost after iteration 1429: 0.001179
Cost after iteration 1430: 0.000953
Cost after iteration 1431: 0.001325
Cost after iteration 1432: 0.000891
Cost after iteration 1433: 0.001543
Cost after iteration 1434: 0.000723
Cost after iteration 1435: 0.001899
Cost after iteration 1436: 0.000359
Cost after iteration 1437: 0.002521
Cost after iteration 1438: -0.000381
Cost after iteration 1439: 0.003659
Cost after iteration 1440: -0.001846
Cost after iteration 1441: 0.005804
Cost after iteration 1442: -0.004727
Cost after iteration 1443: 0.009930
Cost after iteration 1444: -0.010410
Cost after iteration 1445: 0.018017
Cost after iteration 1446: -0.021742
Cost after iteration 1447: 0.034243
Cost after iteration 1448: -0.044903
Cost after iteration 1449: 0.068669
Cost after iteration 1450: -0.095729
Cost after iteration 1451: 0.157740
Cost after iteration 1452: -0.246158
Cost after iteration 1453: 0.509462
Cost after iteration 1454: -0.508784
Cost after iteration 1455: 1.020744
Cost after iteration 1456: -0.213196
Cost after iteration 1457: 0.127073
Cost after iteration 1458: -0.077840
Cost after iteration 1459: 0.014752
Cost after iteration 1460: -0.038637
Cost after iteration 1461: -0.005404
Cost after iteration 1462: -0.024481
Cost after iteration 1463: -0.011406
Cost after iteration 1464: -0.018597
Cost after iteration 1465: -0.012869
Cost after iteration 1466: -0.015602
Cost after iteration 1467: -0.012784
Cost after iteration 1468: -0.013736
Cost after iteration 1469: -0.012159
Cost after iteration 1470: -0.012367
Cost after iteration 1471: -0.011361
Cost after iteration 1472: -0.011251
Cost after iteration 1473: -0.010529
Cost after iteration 1474: -0.010283
Cost after iteration 1475: -0.009715
Cost after iteration 1476: -0.009415
Cost after iteration 1477: -0.008939
Cost after iteration 1478: -0.008625
Cost after iteration 1479: -0.008210
Cost after iteration 1480: -0.007897
Cost after iteration 1481: -0.007526
Cost after iteration 1482: -0.007225
Cost after iteration 1483: -0.006888
Cost after iteration 1484: -0.006601
Cost after iteration 1485: -0.006292
Cost after iteration 1486: -0.006021
Cost after iteration 1487: -0.005737
Cost after iteration 1488: -0.005482
Cost after iteration 1489: -0.005218
Cost after iteration 1490: -0.004979
Cost after iteration 1491: -0.004735
Cost after iteration 1492: -0.004512
Cost after iteration 1493: -0.004285
Cost after iteration 1494: -0.004076
Cost after iteration 1495: -0.003865
Cost after iteration 1496: -0.003669
Cost after iteration 1497: -0.003473
Cost after iteration 1498: -0.003290
Cost after iteration 1499: -0.003107
Cost after iteration 1500: -0.002937
Cost after iteration 1501: -0.002766
Cost after iteration 1502: -0.002607
Cost after iteration 1503: -0.002447
Cost after iteration 1504: -0.002300
Cost after iteration 1505: -0.002150
Cost after iteration 1506: -0.002013
Cost after iteration 1507: -0.001873
Cost after iteration 1508: -0.001745
Cost after iteration 1509: -0.001614
Cost after iteration 1510: -0.001495
Cost after iteration 1511: -0.001372
Cost after iteration 1512: -0.001262
Cost after iteration 1513: -0.001145
Cost after iteration 1514: -0.001044
Cost after iteration 1515: -0.000934
Cost after iteration 1516: -0.000841
Cost after iteration 1517: -0.000735
Cost after iteration 1518: -0.000651
Cost after iteration 1519: -0.000550
Cost after iteration 1520: -0.000474
Cost after iteration 1521: -0.000376
Cost after iteration 1522: -0.000309
Cost after iteration 1523: -0.000212
Cost after iteration 1524: -0.000155
Cost after iteration 1525: -0.000057
Cost after iteration 1526: -0.000013
Cost after iteration 1527: 0.000089
Cost after iteration 1528: 0.000117
Cost after iteration 1529: 0.000230
Cost after iteration 1530: 0.000236
Cost after iteration 1531: 0.000366
Cost after iteration 1532: 0.000342
Cost after iteration 1533: 0.000501
Cost after iteration 1534: 0.000431
Cost after iteration 1535: 0.000640
Cost after iteration 1536: 0.000499
Cost after iteration 1537: 0.000790
Cost after iteration 1538: 0.000538
Cost after iteration 1539: 0.000965
Cost after iteration 1540: 0.000529
Cost after iteration 1541: 0.001188
Cost after iteration 1542: 0.000445
Cost after iteration 1543: 0.001498
Cost after iteration 1544: 0.000233
Cost after iteration 1545: 0.001968
Cost after iteration 1546: -0.000206
Cost after iteration 1547: 0.002732
Cost after iteration 1548: -0.001049
Cost after iteration 1549: 0.004037
Cost after iteration 1550: -0.002636
Cost after iteration 1551: 0.006359
Cost after iteration 1552: -0.005608
Cost after iteration 1553: 0.010619
Cost after iteration 1554: -0.011217
Cost after iteration 1555: 0.018671
Cost after iteration 1556: -0.021991
Cost after iteration 1557: 0.034475
Cost after iteration 1558: -0.043640
Cost after iteration 1559: 0.067563
Cost after iteration 1560: -0.091139
Cost after iteration 1561: 0.147217
Cost after iteration 1562: -0.179592
Cost after iteration 1563: 0.328583
Cost after iteration 1564: -0.263138
Cost after iteration 1565: 0.418919
Cost after iteration 1566: -0.338148
Cost after iteration 1567: 0.489429
Cost after iteration 1568: -0.269155
Cost after iteration 1569: 0.256084
Cost after iteration 1570: -0.168499
Cost after iteration 1571: 0.098474
Cost after iteration 1572: -0.085164
Cost after iteration 1573: 0.029838
Cost after iteration 1574: -0.038811
Cost after iteration 1575: 0.002956
Cost after iteration 1576: -0.021263
Cost after iteration 1577: -0.005741
Cost after iteration 1578: -0.014697
Cost after iteration 1579: -0.008264
Cost after iteration 1580: -0.011757
Cost after iteration 1581: -0.008756
Cost after iteration 1582: -0.010124
Cost after iteration 1583: -0.008539
Cost after iteration 1584: -0.009023
Cost after iteration 1585: -0.008071
Cost after iteration 1586: -0.008167
Cost after iteration 1587: -0.007524
Cost after iteration 1588: -0.007443
Cost after iteration 1589: -0.006962
Cost after iteration 1590: -0.006803
Cost after iteration 1591: -0.006415
Cost after iteration 1592: -0.006221
Cost after iteration 1593: -0.005892
Cost after iteration 1594: -0.005686
Cost after iteration 1595: -0.005397
Cost after iteration 1596: -0.005191
Cost after iteration 1597: -0.004931
Cost after iteration 1598: -0.004731
Cost after iteration 1599: -0.004495
Cost after iteration 1600: -0.004302
Cost after iteration 1601: -0.004085
Cost after iteration 1602: -0.003902
Cost after iteration 1603: -0.003702
Cost after iteration 1604: -0.003529
Cost after iteration 1605: -0.003343
Cost after iteration 1606: -0.003180
Cost after iteration 1607: -0.003007
Cost after iteration 1608: -0.002853
Cost after iteration 1609: -0.002692
Cost after iteration 1610: -0.002548
Cost after iteration 1611: -0.002398
Cost after iteration 1612: -0.002263
Cost after iteration 1613: -0.002122
Cost after iteration 1614: -0.001996
Cost after iteration 1615: -0.001864
Cost after iteration 1616: -0.001746
Cost after iteration 1617: -0.001622
Cost after iteration 1618: -0.001512
Cost after iteration 1619: -0.001395
Cost after iteration 1620: -0.001292
Cost after iteration 1621: -0.001182
Cost after iteration 1622: -0.001087
Cost after iteration 1623: -0.000983
Cost after iteration 1624: -0.000894
Cost after iteration 1625: -0.000796
Cost after iteration 1626: -0.000714
Cost after iteration 1627: -0.000620
Cost after iteration 1628: -0.000545
Cost after iteration 1629: -0.000455
Cost after iteration 1630: -0.000386
Cost after iteration 1631: -0.000300
Cost after iteration 1632: -0.000238
Cost after iteration 1633: -0.000153
Cost after iteration 1634: -0.000099
Cost after iteration 1635: -0.000015
Cost after iteration 1636: 0.000031
Cost after iteration 1637: 0.000116
Cost after iteration 1638: 0.000152
Cost after iteration 1639: 0.000240
Cost after iteration 1640: 0.000264
Cost after iteration 1641: 0.000360
Cost after iteration 1642: 0.000368
Cost after iteration 1643: 0.000475
Cost after iteration 1644: 0.000461
Cost after iteration 1645: 0.000589
Cost after iteration 1646: 0.000543
Cost after iteration 1647: 0.000703
Cost after iteration 1648: 0.000611
Cost after iteration 1649: 0.000824
Cost after iteration 1650: 0.000659
Cost after iteration 1651: 0.000957
Cost after iteration 1652: 0.000680
Cost after iteration 1653: 0.001116
Cost after iteration 1654: 0.000656
Cost after iteration 1655: 0.001321
Cost after iteration 1656: 0.000563
Cost after iteration 1657: 0.001609
Cost after iteration 1658: 0.000351
Cost after iteration 1659: 0.002045
Cost after iteration 1660: -0.000066
Cost after iteration 1661: 0.002746
Cost after iteration 1662: -0.000844
Cost after iteration 1663: 0.003928
Cost after iteration 1664: -0.002274
Cost after iteration 1665: 0.005994
Cost after iteration 1666: -0.004899
Cost after iteration 1667: 0.009717
Cost after iteration 1668: -0.009758
Cost after iteration 1669: 0.016617
Cost after iteration 1670: -0.018902
Cost after iteration 1671: 0.029858
Cost after iteration 1672: -0.036776
Cost after iteration 1673: 0.056737
Cost after iteration 1674: -0.074803
Cost after iteration 1675: 0.117859
Cost after iteration 1676: -0.150515
Cost after iteration 1677: 0.268238
Cost after iteration 1678: -0.240138
Cost after iteration 1679: 0.393659
Cost after iteration 1680: -0.266787
Cost after iteration 1681: 0.389838
Cost after iteration 1682: -0.354138
Cost after iteration 1683: 0.392308
Cost after iteration 1684: -0.204855
Cost after iteration 1685: 0.145530
Cost after iteration 1686: -0.109700
Cost after iteration 1687: 0.046970
Cost after iteration 1688: -0.048781
Cost after iteration 1689: 0.009215
Cost after iteration 1690: -0.024451
Cost after iteration 1691: -0.003428
Cost after iteration 1692: -0.015742
Cost after iteration 1693: -0.007244
Cost after iteration 1694: -0.012085
Cost after iteration 1695: -0.008216
Cost after iteration 1696: -0.010188
Cost after iteration 1697: -0.008206
Cost after iteration 1698: -0.008985
Cost after iteration 1699: -0.007839
Cost after iteration 1700: -0.008091
Cost after iteration 1701: -0.007347
Cost after iteration 1702: -0.007355
Cost after iteration 1703: -0.006821
Cost after iteration 1704: -0.006714
Cost after iteration 1705: -0.006297
Cost after iteration 1706: -0.006138
Cost after iteration 1707: -0.005792
Cost after iteration 1708: -0.005610
Cost after iteration 1709: -0.005311
Cost after iteration 1710: -0.005123
Cost after iteration 1711: -0.004858
Cost after iteration 1712: -0.004670
Cost after iteration 1713: -0.004431
Cost after iteration 1714: -0.004248
Cost after iteration 1715: -0.004030
Cost after iteration 1716: -0.003855
Cost after iteration 1717: -0.003654
Cost after iteration 1718: -0.003487
Cost after iteration 1719: -0.003301
Cost after iteration 1720: -0.003144
Cost after iteration 1721: -0.002971
Cost after iteration 1722: -0.002823
Cost after iteration 1723: -0.002661
Cost after iteration 1724: -0.002522
Cost after iteration 1725: -0.002371
Cost after iteration 1726: -0.002240
Cost after iteration 1727: -0.002099
Cost after iteration 1728: -0.001977
Cost after iteration 1729: -0.001844
Cost after iteration 1730: -0.001729
Cost after iteration 1731: -0.001605
Cost after iteration 1732: -0.001498
Cost after iteration 1733: -0.001381
Cost after iteration 1734: -0.001281
Cost after iteration 1735: -0.001170
Cost after iteration 1736: -0.001078
Cost after iteration 1737: -0.000973
Cost after iteration 1738: -0.000887
Cost after iteration 1739: -0.000787
Cost after iteration 1740: -0.000708
Cost after iteration 1741: -0.000612
Cost after iteration 1742: -0.000540
Cost after iteration 1743: -0.000448
Cost after iteration 1744: -0.000383
Cost after iteration 1745: -0.000293
Cost after iteration 1746: -0.000236
Cost after iteration 1747: -0.000147
Cost after iteration 1748: -0.000098
Cost after iteration 1749: -0.000009
Cost after iteration 1750: 0.000031
Cost after iteration 1751: 0.000122
Cost after iteration 1752: 0.000150
Cost after iteration 1753: 0.000248
Cost after iteration 1754: 0.000261
Cost after iteration 1755: 0.000369
Cost after iteration 1756: 0.000361
Cost after iteration 1757: 0.000487
Cost after iteration 1758: 0.000451
Cost after iteration 1759: 0.000605
Cost after iteration 1760: 0.000527
Cost after iteration 1761: 0.000727
Cost after iteration 1762: 0.000586
Cost after iteration 1763: 0.000859
Cost after iteration 1764: 0.000621
Cost after iteration 1765: 0.001010
Cost after iteration 1766: 0.000618
Cost after iteration 1767: 0.001198
Cost after iteration 1768: 0.000557
Cost after iteration 1769: 0.001453
Cost after iteration 1770: 0.000400
Cost after iteration 1771: 0.001825
Cost after iteration 1772: 0.000079
Cost after iteration 1773: 0.002406
Cost after iteration 1774: -0.000529
Cost after iteration 1775: 0.003364
Cost after iteration 1776: -0.001646
Cost after iteration 1777: 0.005007
Cost after iteration 1778: -0.003688
Cost after iteration 1779: 0.007921
Cost after iteration 1780: -0.007439
Cost after iteration 1781: 0.013242
Cost after iteration 1782: -0.014419
Cost after iteration 1783: 0.023282
Cost after iteration 1784: -0.027797
Cost after iteration 1785: 0.043160
Cost after iteration 1786: -0.055329
Cost after iteration 1787: 0.086188
Cost after iteration 1788: -0.114388
Cost after iteration 1789: 0.194880
Cost after iteration 1790: -0.200786
Cost after iteration 1791: 0.350569
Cost after iteration 1792: -0.215212
Cost after iteration 1793: 0.267006
Cost after iteration 1794: -0.229281
Cost after iteration 1795: 0.325138
Cost after iteration 1796: -0.309451
Cost after iteration 1797: 0.342493
Cost after iteration 1798: -0.204918
Cost after iteration 1799: 0.154194
Cost after iteration 1800: -0.111614
Cost after iteration 1801: 0.050265
Cost after iteration 1802: -0.049424
Cost after iteration 1803: 0.011106
Cost after iteration 1804: -0.024379
Cost after iteration 1805: -0.002068
Cost after iteration 1806: -0.015462
Cost after iteration 1807: -0.006148
Cost after iteration 1808: -0.011703
Cost after iteration 1809: -0.007304
Cost after iteration 1810: -0.009751
Cost after iteration 1811: -0.007428
Cost after iteration 1812: -0.008524
Cost after iteration 1813: -0.007161
Cost after iteration 1814: -0.007626
Cost after iteration 1815: -0.006745
Cost after iteration 1816: -0.006901
Cost after iteration 1817: -0.006278
Cost after iteration 1818: -0.006278
Cost after iteration 1819: -0.005803
Cost after iteration 1820: -0.005722
Cost after iteration 1821: -0.005338
Cost after iteration 1822: -0.005217
Cost after iteration 1823: -0.004893
Cost after iteration 1824: -0.004752
Cost after iteration 1825: -0.004470
Cost after iteration 1826: -0.004322
Cost after iteration 1827: -0.004071
Cost after iteration 1828: -0.003922
Cost after iteration 1829: -0.003696
Cost after iteration 1830: -0.003549
Cost after iteration 1831: -0.003342
Cost after iteration 1832: -0.003201
Cost after iteration 1833: -0.003011
Cost after iteration 1834: -0.002876
Cost after iteration 1835: -0.002699
Cost after iteration 1836: -0.002572
Cost after iteration 1837: -0.002407
Cost after iteration 1838: -0.002287
Cost after iteration 1839: -0.002133
Cost after iteration 1840: -0.002020
Cost after iteration 1841: -0.001875
Cost after iteration 1842: -0.001771
Cost after iteration 1843: -0.001634
Cost after iteration 1844: -0.001537
Cost after iteration 1845: -0.001407
Cost after iteration 1846: -0.001318
Cost after iteration 1847: -0.001194
Cost after iteration 1848: -0.001113
Cost after iteration 1849: -0.000993
Cost after iteration 1850: -0.000921
Cost after iteration 1851: -0.000804
Cost after iteration 1852: -0.000741
Cost after iteration 1853: -0.000626
Cost after iteration 1854: -0.000573
Cost after iteration 1855: -0.000458
Cost after iteration 1856: -0.000416
Cost after iteration 1857: -0.000299
Cost after iteration 1858: -0.000270
Cost after iteration 1859: -0.000147
Cost after iteration 1860: -0.000134
Cost after iteration 1861: -0.000002
Cost after iteration 1862: -0.000010
Cost after iteration 1863: 0.000138
Cost after iteration 1864: 0.000102
Cost after iteration 1865: 0.000276
Cost after iteration 1866: 0.000201
Cost after iteration 1867: 0.000414
Cost after iteration 1868: 0.000284
Cost after iteration 1869: 0.000558
Cost after iteration 1870: 0.000346
Cost after iteration 1871: 0.000714
Cost after iteration 1872: 0.000378
Cost after iteration 1873: 0.000894
Cost after iteration 1874: 0.000367
Cost after iteration 1875: 0.001117
Cost after iteration 1876: 0.000288
Cost after iteration 1877: 0.001415
Cost after iteration 1878: 0.000101
Cost after iteration 1879: 0.001844
Cost after iteration 1880: -0.000267
Cost after iteration 1881: 0.002501
Cost after iteration 1882: -0.000943
Cost after iteration 1883: 0.003557
Cost after iteration 1884: -0.002153
Cost after iteration 1885: 0.005325
Cost after iteration 1886: -0.004311
Cost after iteration 1887: 0.008384
Cost after iteration 1888: -0.008181
Cost after iteration 1889: 0.013844
Cost after iteration 1890: -0.015225
Cost after iteration 1891: 0.023928
Cost after iteration 1892: -0.028473
Cost after iteration 1893: 0.043532
Cost after iteration 1894: -0.055327
Cost after iteration 1895: 0.085255
Cost after iteration 1896: -0.111031
Cost after iteration 1897: 0.186274
Cost after iteration 1898: -0.186737
Cost after iteration 1899: 0.317392
Cost after iteration 1900: -0.192579
Cost after iteration 1901: 0.225863
Cost after iteration 1902: -0.171470
Cost after iteration 1903: 0.199705
Cost after iteration 1904: -0.223392
Cost after iteration 1905: 0.301234
Cost after iteration 1906: -0.256839
Cost after iteration 1907: 0.266228
Cost after iteration 1908: -0.173421
Cost after iteration 1909: 0.120696
Cost after iteration 1910: -0.093844
Cost after iteration 1911: 0.041214
Cost after iteration 1912: -0.041887
Cost after iteration 1913: 0.009321
Cost after iteration 1914: -0.021645
Cost after iteration 1915: -0.001438
Cost after iteration 1916: -0.014093
Cost after iteration 1917: -0.004973
Cost after iteration 1918: -0.010717
Cost after iteration 1919: -0.006078
Cost after iteration 1920: -0.008884
Cost after iteration 1921: -0.006274
Cost after iteration 1922: -0.007706
Cost after iteration 1923: -0.006102
Cost after iteration 1924: -0.006844
Cost after iteration 1925: -0.005777
Cost after iteration 1926: -0.006152
Cost after iteration 1927: -0.005391
Cost after iteration 1928: -0.005563
Cost after iteration 1929: -0.004986
Cost after iteration 1930: -0.005044
Cost after iteration 1931: -0.004584
Cost after iteration 1932: -0.004576
Cost after iteration 1933: -0.004195
Cost after iteration 1934: -0.004149
Cost after iteration 1935: -0.003821
Cost after iteration 1936: -0.003755
Cost after iteration 1937: -0.003467
Cost after iteration 1938: -0.003389
Cost after iteration 1939: -0.003132
Cost after iteration 1940: -0.003050
Cost after iteration 1941: -0.002816
Cost after iteration 1942: -0.002734
Cost after iteration 1943: -0.002518
Cost after iteration 1944: -0.002439
Cost after iteration 1945: -0.002237
Cost after iteration 1946: -0.002163
Cost after iteration 1947: -0.001974
Cost after iteration 1948: -0.001906
Cost after iteration 1949: -0.001725
Cost after iteration 1950: -0.001665
Cost after iteration 1951: -0.001491
Cost after iteration 1952: -0.001441
Cost after iteration 1953: -0.001271
Cost after iteration 1954: -0.001231
Cost after iteration 1955: -0.001063
Cost after iteration 1956: -0.001035
Cost after iteration 1957: -0.000866
Cost after iteration 1958: -0.000854
Cost after iteration 1959: -0.000678
Cost after iteration 1960: -0.000685
Cost after iteration 1961: -0.000500
Cost after iteration 1962: -0.000530
Cost after iteration 1963: -0.000328
Cost after iteration 1964: -0.000388
Cost after iteration 1965: -0.000162
Cost after iteration 1966: -0.000261
Cost after iteration 1967: 0.000002
Cost after iteration 1968: -0.000151
Cost after iteration 1969: 0.000168
Cost after iteration 1970: -0.000060
Cost after iteration 1971: 0.000340
Cost after iteration 1972: 0.000005
Cost after iteration 1973: 0.000527
Cost after iteration 1974: 0.000036
Cost after iteration 1975: 0.000741
Cost after iteration 1976: 0.000018
Cost after iteration 1977: 0.001003
Cost after iteration 1978: -0.000074
Cost after iteration 1979: 0.001344
Cost after iteration 1980: -0.000280
Cost after iteration 1981: 0.001819
Cost after iteration 1982: -0.000669
Cost after iteration 1983: 0.002519
Cost after iteration 1984: -0.001359
Cost after iteration 1985: 0.003602
Cost after iteration 1986: -0.002554
Cost after iteration 1987: 0.005345
Cost after iteration 1988: -0.004613
Cost after iteration 1989: 0.008246
Cost after iteration 1990: -0.008179
Cost after iteration 1991: 0.013227
Cost after iteration 1992: -0.014447
Cost after iteration 1993: 0.022080
Cost after iteration 1994: -0.025801
Cost after iteration 1995: 0.038590
Cost after iteration 1996: -0.047778
Cost after iteration 1997: 0.071899
Cost after iteration 1998: -0.092736
Cost after iteration 1999: 0.147274
Cost after iteration 2000: -0.157436
Cost after iteration 2001: 0.265495
Cost after iteration 2002: -0.184340
Cost after iteration 2003: 0.238041
Cost after iteration 2004: -0.152441
Cost after iteration 2005: 0.153181
Cost after iteration 2006: -0.150922
Cost after iteration 2007: 0.168801
Cost after iteration 2008: -0.187166
Cost after iteration 2009: 0.238474
Cost after iteration 2010: -0.218943
Cost after iteration 2011: 0.239816
Cost after iteration 2012: -0.171348
Cost after iteration 2013: 0.135355
Cost after iteration 2014: -0.103924
Cost after iteration 2015: 0.054346
Cost after iteration 2016: -0.049790
Cost after iteration 2017: 0.016991
Cost after iteration 2018: -0.024958
Cost after iteration 2019: 0.002782
Cost after iteration 2020: -0.015287
Cost after iteration 2021: -0.002311
Cost after iteration 2022: -0.010994
Cost after iteration 2023: -0.004180
Cost after iteration 2024: -0.008738
Cost after iteration 2025: -0.004799
Cost after iteration 2026: -0.007356
Cost after iteration 2027: -0.004888
Cost after iteration 2028: -0.006395
Cost after iteration 2029: -0.004737
Cost after iteration 2030: -0.005660
Cost after iteration 2031: -0.004478
Cost after iteration 2032: -0.005059
Cost after iteration 2033: -0.004170
Cost after iteration 2034: -0.004545
Cost after iteration 2035: -0.003845
Cost after iteration 2036: -0.004092
Cost after iteration 2037: -0.003518
Cost after iteration 2038: -0.003685
Cost after iteration 2039: -0.003199
Cost after iteration 2040: -0.003315
Cost after iteration 2041: -0.002891
Cost after iteration 2042: -0.002975
Cost after iteration 2043: -0.002596
Cost after iteration 2044: -0.002662
Cost after iteration 2045: -0.002315
Cost after iteration 2046: -0.002372
Cost after iteration 2047: -0.002048
Cost after iteration 2048: -0.002104
Cost after iteration 2049: -0.001795
Cost after iteration 2050: -0.001855
Cost after iteration 2051: -0.001555
Cost after iteration 2052: -0.001624
Cost after iteration 2053: -0.001326
Cost after iteration 2054: -0.001411
Cost after iteration 2055: -0.001109
Cost after iteration 2056: -0.001214
Cost after iteration 2057: -0.000900
Cost after iteration 2058: -0.001034
Cost after iteration 2059: -0.000699
Cost after iteration 2060: -0.000870
Cost after iteration 2061: -0.000504
Cost after iteration 2062: -0.000724
Cost after iteration 2063: -0.000311
Cost after iteration 2064: -0.000598
Cost after iteration 2065: -0.000117
Cost after iteration 2066: -0.000494
Cost after iteration 2067: 0.000084
Cost after iteration 2068: -0.000418
Cost after iteration 2069: 0.000300
Cost after iteration 2070: -0.000379
Cost after iteration 2071: 0.000541
Cost after iteration 2072: -0.000388
Cost after iteration 2073: 0.000826
Cost after iteration 2074: -0.000467
Cost after iteration 2075: 0.001182
Cost after iteration 2076: -0.000650
Cost after iteration 2077: 0.001652
Cost after iteration 2078: -0.000991
Cost after iteration 2079: 0.002309
Cost after iteration 2080: -0.001579
Cost after iteration 2081: 0.003270
Cost after iteration 2082: -0.002563
Cost after iteration 2083: 0.004733
Cost after iteration 2084: -0.004194
Cost after iteration 2085: 0.007039
Cost after iteration 2086: -0.006905
Cost after iteration 2087: 0.010792
Cost after iteration 2088: -0.011459
Cost after iteration 2089: 0.017098
Cost after iteration 2090: -0.019275
Cost after iteration 2091: 0.028123
Cost after iteration 2092: -0.033286
Cost after iteration 2093: 0.048539
Cost after iteration 2094: -0.060430
Cost after iteration 2095: 0.089804
Cost after iteration 2096: -0.110310
Cost after iteration 2097: 0.175968
Cost after iteration 2098: -0.164947
Cost after iteration 2099: 0.256696
Cost after iteration 2100: -0.163833
Cost after iteration 2101: 0.183172
Cost after iteration 2102: -0.131537
Cost after iteration 2103: 0.127303
Cost after iteration 2104: -0.129169
Cost after iteration 2105: 0.132489
Cost after iteration 2106: -0.145035
Cost after iteration 2107: 0.166434
Cost after iteration 2108: -0.174326
Cost after iteration 2109: 0.206289
Cost after iteration 2110: -0.178513
Cost after iteration 2111: 0.178686
Cost after iteration 2112: -0.137137
Cost after iteration 2113: 0.100255
Cost after iteration 2114: -0.083637
Cost after iteration 2115: 0.044436
Cost after iteration 2116: -0.042370
Cost after iteration 2117: 0.016171
Cost after iteration 2118: -0.023043
Cost after iteration 2119: 0.004554
Cost after iteration 2120: -0.014650
Cost after iteration 2121: -0.000173
Cost after iteration 2122: -0.010564
Cost after iteration 2123: -0.002172
Cost after iteration 2124: -0.008288
Cost after iteration 2125: -0.003000
Cost after iteration 2126: -0.006857
Cost after iteration 2127: -0.003284
Cost after iteration 2128: -0.005860
Cost after iteration 2129: -0.003300
Cost after iteration 2130: -0.005109
Cost after iteration 2131: -0.003178
Cost after iteration 2132: -0.004508
Cost after iteration 2133: -0.002984
Cost after iteration 2134: -0.004007
Cost after iteration 2135: -0.002754
Cost after iteration 2136: -0.003575
Cost after iteration 2137: -0.002508
Cost after iteration 2138: -0.003196
Cost after iteration 2139: -0.002256
Cost after iteration 2140: -0.002858
Cost after iteration 2141: -0.002004
Cost after iteration 2142: -0.002555
Cost after iteration 2143: -0.001757
Cost after iteration 2144: -0.002282
Cost after iteration 2145: -0.001514
Cost after iteration 2146: -0.002036
Cost after iteration 2147: -0.001276
Cost after iteration 2148: -0.001815
Cost after iteration 2149: -0.001042
Cost after iteration 2150: -0.001619
Cost after iteration 2151: -0.000811
Cost after iteration 2152: -0.001447
Cost after iteration 2153: -0.000579
Cost after iteration 2154: -0.001303
Cost after iteration 2155: -0.000342
Cost after iteration 2156: -0.001188
Cost after iteration 2157: -0.000095
Cost after iteration 2158: -0.001110
Cost after iteration 2159: 0.000171
Cost after iteration 2160: -0.001075
Cost after iteration 2161: 0.000468
Cost after iteration 2162: -0.001099
Cost after iteration 2163: 0.000815
Cost after iteration 2164: -0.001201
Cost after iteration 2165: 0.001239
Cost after iteration 2166: -0.001414
Cost after iteration 2167: 0.001781
Cost after iteration 2168: -0.001788
Cost after iteration 2169: 0.002506
Cost after iteration 2170: -0.002402
Cost after iteration 2171: 0.003517
Cost after iteration 2172: -0.003383
Cost after iteration 2173: 0.004980
Cost after iteration 2174: -0.004934
Cost after iteration 2175: 0.007168
Cost after iteration 2176: -0.007393
Cost after iteration 2177: 0.010545
Cost after iteration 2178: -0.011333
Cost after iteration 2179: 0.015923
Cost after iteration 2180: -0.017767
Cost after iteration 2181: 0.024810
Cost after iteration 2182: -0.028656
Cost after iteration 2183: 0.040241
Cost after iteration 2184: -0.048321
Cost after iteration 2185: 0.068993
Cost after iteration 2186: -0.085009
Cost after iteration 2187: 0.125956
Cost after iteration 2188: -0.133011
Cost after iteration 2189: 0.204714
Cost after iteration 2190: -0.159265
Cost after iteration 2191: 0.207177
Cost after iteration 2192: -0.137546
Cost after iteration 2193: 0.136767
Cost after iteration 2194: -0.117673
Cost after iteration 2195: 0.111722
Cost after iteration 2196: -0.114522
Cost after iteration 2197: 0.111875
Cost after iteration 2198: -0.120016
Cost after iteration 2199: 0.124679
Cost after iteration 2200: -0.133837
Cost after iteration 2201: 0.146991
Cost after iteration 2202: -0.147622
Cost after iteration 2203: 0.158237
Cost after iteration 2204: -0.142141
Cost after iteration 2205: 0.131773
Cost after iteration 2206: -0.113027
Cost after iteration 2207: 0.083087
Cost after iteration 2208: -0.072804
Cost after iteration 2209: 0.042685
Cost after iteration 2210: -0.040954
Cost after iteration 2211: 0.019618
Cost after iteration 2212: -0.024127
Cost after iteration 2213: 0.008652
Cost after iteration 2214: -0.015794
Cost after iteration 2215: 0.003473
Cost after iteration 2216: -0.011358
Cost after iteration 2217: 0.000931
Cost after iteration 2218: -0.008763
Cost after iteration 2219: -0.000343
Cost after iteration 2220: -0.007106
Cost after iteration 2221: -0.000970
Cost after iteration 2222: -0.005963
Cost after iteration 2223: -0.001246
Cost after iteration 2224: -0.005125
Cost after iteration 2225: -0.001323
Cost after iteration 2226: -0.004479
Cost after iteration 2227: -0.001283
Cost after iteration 2228: -0.003964
Cost after iteration 2229: -0.001170
Cost after iteration 2230: -0.003542
Cost after iteration 2231: -0.001012
Cost after iteration 2232: -0.003193
Cost after iteration 2233: -0.000822
Cost after iteration 2234: -0.002902
Cost after iteration 2235: -0.000608
Cost after iteration 2236: -0.002664
Cost after iteration 2237: -0.000374
Cost after iteration 2238: -0.002474
Cost after iteration 2239: -0.000118
Cost after iteration 2240: -0.002334
Cost after iteration 2241: 0.000163
Cost after iteration 2242: -0.002247
Cost after iteration 2243: 0.000478
Cost after iteration 2244: -0.002222
Cost after iteration 2245: 0.000838
Cost after iteration 2246: -0.002271
Cost after iteration 2247: 0.001263
Cost after iteration 2248: -0.002417
Cost after iteration 2249: 0.001779
Cost after iteration 2250: -0.002691
Cost after iteration 2251: 0.002428
Cost after iteration 2252: -0.003141
Cost after iteration 2253: 0.003273
Cost after iteration 2254: -0.003841
Cost after iteration 2255: 0.004407
Cost after iteration 2256: -0.004903
Cost after iteration 2257: 0.005978
Cost after iteration 2258: -0.006503
Cost after iteration 2259: 0.008216
Cost after iteration 2260: -0.008919
Cost after iteration 2261: 0.011501
Cost after iteration 2262: -0.012605
Cost after iteration 2263: 0.016467
Cost after iteration 2264: -0.018333
Cost after iteration 2265: 0.024239
Cost after iteration 2266: -0.027528
Cost after iteration 2267: 0.036961
Cost after iteration 2268: -0.043116
Cost after iteration 2269: 0.059055
Cost after iteration 2270: -0.070815
Cost after iteration 2271: 0.099440
Cost after iteration 2272: -0.111123
Cost after iteration 2273: 0.161811
Cost after iteration 2274: -0.142097
Cost after iteration 2275: 0.191957
Cost after iteration 2276: -0.137993
Cost after iteration 2277: 0.148960
Cost after iteration 2278: -0.117011
Cost after iteration 2279: 0.112722
Cost after iteration 2280: -0.109077
Cost after iteration 2281: 0.103829
Cost after iteration 2282: -0.107061
Cost after iteration 2283: 0.103663
Cost after iteration 2284: -0.109754
Cost after iteration 2285: 0.109967
Cost after iteration 2286: -0.116632
Cost after iteration 2287: 0.120432
Cost after iteration 2288: -0.124005
Cost after iteration 2289: 0.127178
Cost after iteration 2290: -0.123814
Cost after iteration 2291: 0.118100
Cost after iteration 2292: -0.109419
Cost after iteration 2293: 0.090743
Cost after iteration 2294: -0.082543
Cost after iteration 2295: 0.058048
Cost after iteration 2296: -0.053888
Cost after iteration 2297: 0.033358
Cost after iteration 2298: -0.034087
Cost after iteration 2299: 0.018837
Cost after iteration 2300: -0.022683
Cost after iteration 2301: 0.010926
Cost after iteration 2302: -0.016157
Cost after iteration 2303: 0.006572
Cost after iteration 2304: -0.012235
Cost after iteration 2305: 0.004115
Cost after iteration 2306: -0.009737
Cost after iteration 2307: 0.002705
Cost after iteration 2308: -0.008060
Cost after iteration 2309: 0.001903
Cost after iteration 2310: -0.006884
Cost after iteration 2311: 0.001470
Cost after iteration 2312: -0.006033
Cost after iteration 2313: 0.001272
Cost after iteration 2314: -0.005405
Cost after iteration 2315: 0.001236
Cost after iteration 2316: -0.004943
Cost after iteration 2317: 0.001316
Cost after iteration 2318: -0.004614
Cost after iteration 2319: 0.001491
Cost after iteration 2320: -0.004400
Cost after iteration 2321: 0.001752
Cost after iteration 2322: -0.004296
Cost after iteration 2323: 0.002100
Cost after iteration 2324: -0.004308
Cost after iteration 2325: 0.002547
Cost after iteration 2326: -0.004450
Cost after iteration 2327: 0.003116
Cost after iteration 2328: -0.004748
Cost after iteration 2329: 0.003843
Cost after iteration 2330: -0.005247
Cost after iteration 2331: 0.004785
Cost after iteration 2332: -0.006012
Cost after iteration 2333: 0.006026
Cost after iteration 2334: -0.007141
Cost after iteration 2335: 0.007696
Cost after iteration 2336: -0.008785
Cost after iteration 2337: 0.009992
Cost after iteration 2338: -0.011174
Cost after iteration 2339: 0.013224
Cost after iteration 2340: -0.014674
Cost after iteration 2341: 0.017897
Cost after iteration 2342: -0.019885
Cost after iteration 2343: 0.024868
Cost after iteration 2344: -0.027868
Cost after iteration 2345: 0.035685
Cost after iteration 2346: -0.040661
Cost after iteration 2347: 0.053321
Cost after iteration 2348: -0.062094
Cost after iteration 2349: 0.083285
Cost after iteration 2350: -0.094689
Cost after iteration 2351: 0.130054
Cost after iteration 2352: -0.124406
Cost after iteration 2353: 0.166004
Cost after iteration 2354: -0.132435
Cost after iteration 2355: 0.152595
Cost after iteration 2356: -0.119956
Cost after iteration 2357: 0.121140
Cost after iteration 2358: -0.111481
Cost after iteration 2359: 0.108586
Cost after iteration 2360: -0.108539
Cost after iteration 2361: 0.105708
Cost after iteration 2362: -0.109297
Cost after iteration 2363: 0.108477
Cost after iteration 2364: -0.113721
Cost after iteration 2365: 0.116039
Cost after iteration 2366: -0.120668
Cost after iteration 2367: 0.124966
Cost after iteration 2368: -0.125366
Cost after iteration 2369: 0.126167
Cost after iteration 2370: -0.120130
Cost after iteration 2371: 0.110222
Cost after iteration 2372: -0.101113
Cost after iteration 2373: 0.080239
Cost after iteration 2374: -0.072918
Cost after iteration 2375: 0.050302
Cost after iteration 2376: -0.047408
Cost after iteration 2377: 0.029674
Cost after iteration 2378: -0.031042
Cost after iteration 2379: 0.017762
Cost after iteration 2380: -0.021544
Cost after iteration 2381: 0.011108
Cost after iteration 2382: -0.015927
Cost after iteration 2383: 0.007313
Cost after iteration 2384: -0.012433
Cost after iteration 2385: 0.005097
Cost after iteration 2386: -0.010146
Cost after iteration 2387: 0.003789
Cost after iteration 2388: -0.008582
Cost after iteration 2389: 0.003030
Cost after iteration 2390: -0.007480
Cost after iteration 2391: 0.002621
Cost after iteration 2392: -0.006690
Cost after iteration 2393: 0.002448
Cost after iteration 2394: -0.006128
Cost after iteration 2395: 0.002448
Cost after iteration 2396: -0.005745
Cost after iteration 2397: 0.002586
Cost after iteration 2398: -0.005516
Cost after iteration 2399: 0.002847
Cost after iteration 2400: -0.005431
Cost after iteration 2401: 0.003230
Cost after iteration 2402: -0.005497
Cost after iteration 2403: 0.003750
Cost after iteration 2404: -0.005730
Cost after iteration 2405: 0.004434
Cost after iteration 2406: -0.006167
Cost after iteration 2407: 0.005330
Cost after iteration 2408: -0.006860
Cost after iteration 2409: 0.006509
Cost after iteration 2410: -0.007895
Cost after iteration 2411: 0.008081
Cost after iteration 2412: -0.009398
Cost after iteration 2413: 0.010213
Cost after iteration 2414: -0.011562
Cost after iteration 2415: 0.013163
Cost after iteration 2416: -0.014687
Cost after iteration 2417: 0.017341
Cost after iteration 2418: -0.019255
Cost after iteration 2419: 0.023424
Cost after iteration 2420: -0.026091
Cost after iteration 2421: 0.032600
Cost after iteration 2422: -0.036716
Cost after iteration 2423: 0.047061
Cost after iteration 2424: -0.053998
Cost after iteration 2425: 0.070850
Cost after iteration 2426: -0.081378
Cost after iteration 2427: 0.108994
Cost after iteration 2428: -0.112008
Cost after iteration 2429: 0.149826
Cost after iteration 2430: -0.127700
Cost after iteration 2431: 0.154520
Cost after iteration 2432: -0.121450
Cost after iteration 2433: 0.127341
Cost after iteration 2434: -0.111248
Cost after iteration 2435: 0.109371
Cost after iteration 2436: -0.106227
Cost after iteration 2437: 0.102725
Cost after iteration 2438: -0.104259
Cost after iteration 2439: 0.101137
Cost after iteration 2440: -0.104917
Cost after iteration 2441: 0.103315
Cost after iteration 2442: -0.107944
Cost after iteration 2443: 0.108032
Cost after iteration 2444: -0.111693
Cost after iteration 2445: 0.111728
Cost after iteration 2446: -0.112285
Cost after iteration 2447: 0.108289
Cost after iteration 2448: -0.104998
Cost after iteration 2449: 0.093269
Cost after iteration 2450: -0.088026
Cost after iteration 2451: 0.069819
Cost after iteration 2452: -0.065481
Cost after iteration 2453: 0.046996
Cost after iteration 2454: -0.045529
Cost after iteration 2455: 0.030507
Cost after iteration 2456: -0.031839
Cost after iteration 2457: 0.020131
Cost after iteration 2458: -0.023183
Cost after iteration 2459: 0.013820
Cost after iteration 2460: -0.017701
Cost after iteration 2461: 0.009962
Cost after iteration 2462: -0.014128
Cost after iteration 2463: 0.007578
Cost after iteration 2464: -0.011727
Cost after iteration 2465: 0.006105
Cost after iteration 2466: -0.010071
Cost after iteration 2467: 0.005219
Cost after iteration 2468: -0.008917
Cost after iteration 2469: 0.004728
Cost after iteration 2470: -0.008120
Cost after iteration 2471: 0.004523
Cost after iteration 2472: -0.007595
Cost after iteration 2473: 0.004541
Cost after iteration 2474: -0.007297
Cost after iteration 2475: 0.004751
Cost after iteration 2476: -0.007205
Cost after iteration 2477: 0.005144
Cost after iteration 2478: -0.007321
Cost after iteration 2479: 0.005734
Cost after iteration 2480: -0.007664
Cost after iteration 2481: 0.006554
Cost after iteration 2482: -0.008276
Cost after iteration 2483: 0.007661
Cost after iteration 2484: -0.009225
Cost after iteration 2485: 0.009148
Cost after iteration 2486: -0.010618
Cost after iteration 2487: 0.011154
Cost after iteration 2488: -0.012616
Cost after iteration 2489: 0.013895
Cost after iteration 2490: -0.015469
Cost after iteration 2491: 0.017708
Cost after iteration 2492: -0.019569
Cost after iteration 2493: 0.023136
Cost after iteration 2494: -0.025565
Cost after iteration 2495: 0.031097
Cost after iteration 2496: -0.034606
Cost after iteration 2497: 0.043221
Cost after iteration 2498: -0.048805
Cost after iteration 2499: 0.062450
Cost after iteration 2500: -0.071305
Cost after iteration 2501: 0.093041
Cost after iteration 2502: -0.100733
Cost after iteration 2503: 0.132720
Cost after iteration 2504: -0.122248
Cost after iteration 2505: 0.152467
Cost after iteration 2506: -0.125084
Cost after iteration 2507: 0.137973
Cost after iteration 2508: -0.117764
Cost after iteration 2509: 0.119808
Cost after iteration 2510: -0.114196
Cost after iteration 2511: 0.113845
Cost after iteration 2512: -0.114709
Cost after iteration 2513: 0.115379
Cost after iteration 2514: -0.119293
Cost after iteration 2515: 0.123366
Cost after iteration 2516: -0.128256
Cost after iteration 2517: 0.136928
Cost after iteration 2518: -0.138610
Cost after iteration 2519: 0.147956
Cost after iteration 2520: -0.140264
Cost after iteration 2521: 0.138919
Cost after iteration 2522: -0.123919
Cost after iteration 2523: 0.105253
Cost after iteration 2524: -0.092352
Cost after iteration 2525: 0.065507
Cost after iteration 2526: -0.058596
Cost after iteration 2527: 0.036615
Cost after iteration 2528: -0.036145
Cost after iteration 2529: 0.020388
Cost after iteration 2530: -0.023855
Cost after iteration 2531: 0.011878
Cost after iteration 2532: -0.017034
Cost after iteration 2533: 0.007276
Cost after iteration 2534: -0.012981
Cost after iteration 2535: 0.004686
Cost after iteration 2536: -0.010401
Cost after iteration 2537: 0.003189
Cost after iteration 2538: -0.008662
Cost after iteration 2539: 0.002322
Cost after iteration 2540: -0.007436
Cost after iteration 2541: 0.001841
Cost after iteration 2542: -0.006545
Cost after iteration 2543: 0.001608
Cost after iteration 2544: -0.005887
Cost after iteration 2545: 0.001546
Cost after iteration 2546: -0.005403
Cost after iteration 2547: 0.001612
Cost after iteration 2548: -0.005060
Cost after iteration 2549: 0.001780
Cost after iteration 2550: -0.004841
Cost after iteration 2551: 0.002042
Cost after iteration 2552: -0.004739
Cost after iteration 2553: 0.002400
Cost after iteration 2554: -0.004761
Cost after iteration 2555: 0.002866
Cost after iteration 2556: -0.004921
Cost after iteration 2557: 0.003462
Cost after iteration 2558: -0.005248
Cost after iteration 2559: 0.004229
Cost after iteration 2560: -0.005787
Cost after iteration 2561: 0.005223
Cost after iteration 2562: -0.006605
Cost after iteration 2563: 0.006534
Cost after iteration 2564: -0.007805
Cost after iteration 2565: 0.008293
Cost after iteration 2566: -0.009540
Cost after iteration 2567: 0.010706
Cost after iteration 2568: -0.012047
Cost after iteration 2569: 0.014090
Cost after iteration 2570: -0.015699
Cost after iteration 2571: 0.018965
Cost after iteration 2572: -0.021112
Cost after iteration 2573: 0.026207
Cost after iteration 2574: -0.029378
Cost after iteration 2575: 0.037402
Cost after iteration 2576: -0.042586
Cost after iteration 2577: 0.055557
Cost after iteration 2578: -0.064436
Cost after iteration 2579: 0.085918
Cost after iteration 2580: -0.096069
Cost after iteration 2581: 0.130688
Cost after iteration 2582: -0.122307
Cost after iteration 2583: 0.159626
Cost after iteration 2584: -0.126592
Cost after iteration 2585: 0.142549
Cost after iteration 2586: -0.113641
Cost after iteration 2587: 0.113366
Cost after iteration 2588: -0.104453
Cost after iteration 2589: 0.100011
Cost after iteration 2590: -0.099022
Cost after iteration 2591: 0.093544
Cost after iteration 2592: -0.095737
Cost after iteration 2593: 0.090432
Cost after iteration 2594: -0.094315
Cost after iteration 2595: 0.089736
Cost after iteration 2596: -0.094229
Cost after iteration 2597: 0.090073
Cost after iteration 2598: -0.094001
Cost after iteration 2599: 0.088990
Cost after iteration 2600: -0.091366
Cost after iteration 2601: 0.083730
Cost after iteration 2602: -0.084279
Cost after iteration 2603: 0.073045
Cost after iteration 2604: -0.072443
Cost after iteration 2605: 0.058746
Cost after iteration 2606: -0.058221
Cost after iteration 2607: 0.044586
Cost after iteration 2608: -0.045061
Cost after iteration 2609: 0.033132
Cost after iteration 2610: -0.034758
Cost after iteration 2611: 0.024818
Cost after iteration 2612: -0.027294
Cost after iteration 2613: 0.019059
Cost after iteration 2614: -0.022022
Cost after iteration 2615: 0.015145
Cost after iteration 2616: -0.018309
Cost after iteration 2617: 0.012512
Cost after iteration 2618: -0.015687
Cost after iteration 2619: 0.010775
Cost after iteration 2620: -0.013840
Cost after iteration 2621: 0.009676
Cost after iteration 2622: -0.012563
Cost after iteration 2623: 0.009054
Cost after iteration 2624: -0.011726
Cost after iteration 2625: 0.008810
Cost after iteration 2626: -0.011255
Cost after iteration 2627: 0.008889
Cost after iteration 2628: -0.011111
Cost after iteration 2629: 0.009272
Cost after iteration 2630: -0.011288
Cost after iteration 2631: 0.009970
Cost after iteration 2632: -0.011809
Cost after iteration 2633: 0.011024
Cost after iteration 2634: -0.012727
Cost after iteration 2635: 0.012512
Cost after iteration 2636: -0.014137
Cost after iteration 2637: 0.014561
Cost after iteration 2638: -0.016184
Cost after iteration 2639: 0.017368
Cost after iteration 2640: -0.019097
Cost after iteration 2641: 0.021240
Cost after iteration 2642: -0.023232
Cost after iteration 2643: 0.026664
Cost after iteration 2644: -0.029169
Cost after iteration 2645: 0.034442
Cost after iteration 2646: -0.037901
Cost after iteration 2647: 0.045951
Cost after iteration 2648: -0.051170
Cost after iteration 2649: 0.063550
Cost after iteration 2650: -0.071497
Cost after iteration 2651: 0.090488
Cost after iteration 2652: -0.098582
Cost after iteration 2653: 0.125840
Cost after iteration 2654: -0.121351
Cost after iteration 2655: 0.149542
Cost after iteration 2656: -0.129935
Cost after iteration 2657: 0.146401
Cost after iteration 2658: -0.129315
Cost after iteration 2659: 0.136836
Cost after iteration 2660: -0.131970
Cost after iteration 2661: 0.139479
Cost after iteration 2662: -0.142151
Cost after iteration 2663: 0.156759
Cost after iteration 2664: -0.162018
Cost after iteration 2665: 0.188476
Cost after iteration 2666: -0.181548
Cost after iteration 2667: 0.203775
Cost after iteration 2668: -0.170466
Cost after iteration 2669: 0.160739
Cost after iteration 2670: -0.129090
Cost after iteration 2671: 0.094553
Cost after iteration 2672: -0.079196
Cost after iteration 2673: 0.047588
Cost after iteration 2674: -0.043989
Cost after iteration 2675: 0.022716
Cost after iteration 2676: -0.026522
Cost after iteration 2677: 0.011021
Cost after iteration 2678: -0.017783
Cost after iteration 2679: 0.005269
Cost after iteration 2680: -0.012949
Cost after iteration 2681: 0.002271
Cost after iteration 2682: -0.010017
Cost after iteration 2683: 0.000652
Cost after iteration 2684: -0.008097
Cost after iteration 2685: -0.000226
Cost after iteration 2686: -0.006761
Cost after iteration 2687: -0.000684
Cost after iteration 2688: -0.005781
Cost after iteration 2689: -0.000892
Cost after iteration 2690: -0.005033
Cost after iteration 2691: -0.000947
Cost after iteration 2692: -0.004443
Cost after iteration 2693: -0.000904
Cost after iteration 2694: -0.003967
Cost after iteration 2695: -0.000796
Cost after iteration 2696: -0.003577
Cost after iteration 2697: -0.000643
Cost after iteration 2698: -0.003258
Cost after iteration 2699: -0.000456
Cost after iteration 2700: -0.002999
Cost after iteration 2701: -0.000240
Cost after iteration 2702: -0.002794
Cost after iteration 2703: 0.000002
Cost after iteration 2704: -0.002643
Cost after iteration 2705: 0.000275
Cost after iteration 2706: -0.002549
Cost after iteration 2707: 0.000584
Cost after iteration 2708: -0.002519
Cost after iteration 2709: 0.000941
Cost after iteration 2710: -0.002564
Cost after iteration 2711: 0.001362
Cost after iteration 2712: -0.002705
Cost after iteration 2713: 0.001873
Cost after iteration 2714: -0.002969
Cost after iteration 2715: 0.002511
Cost after iteration 2716: -0.003401
Cost after iteration 2717: 0.003332
Cost after iteration 2718: -0.004066
Cost after iteration 2719: 0.004421
Cost after iteration 2720: -0.005063
Cost after iteration 2721: 0.005904
Cost after iteration 2722: -0.006543
Cost after iteration 2723: 0.007982
Cost after iteration 2724: -0.008743
Cost after iteration 2725: 0.010974
Cost after iteration 2726: -0.012043
Cost after iteration 2727: 0.015402
Cost after iteration 2728: -0.017069
Cost after iteration 2729: 0.022168
Cost after iteration 2730: -0.024938
Cost after iteration 2731: 0.032927
Cost after iteration 2732: -0.037845
Cost after iteration 2733: 0.050955
Cost after iteration 2734: -0.060223
Cost after iteration 2735: 0.082848
Cost after iteration 2736: -0.095921
Cost after iteration 2737: 0.136185
Cost after iteration 2738: -0.129516
Cost after iteration 2739: 0.179190
Cost after iteration 2740: -0.136699
Cost after iteration 2741: 0.157861
Cost after iteration 2742: -0.117577
Cost after iteration 2743: 0.115200
Cost after iteration 2744: -0.104883
Cost after iteration 2745: 0.098337
Cost after iteration 2746: -0.098093
Cost after iteration 2747: 0.090952
Cost after iteration 2748: -0.094187
Cost after iteration 2749: 0.087710
Cost after iteration 2750: -0.092706
Cost after iteration 2751: 0.087337
Cost after iteration 2752: -0.092769
Cost after iteration 2753: 0.087861
Cost after iteration 2754: -0.092389
Cost after iteration 2755: 0.086323
Cost after iteration 2756: -0.089014
Cost after iteration 2757: 0.079956
Cost after iteration 2758: -0.080772
Cost after iteration 2759: 0.068133
Cost after iteration 2760: -0.067989
Cost after iteration 2761: 0.053459
Cost after iteration 2762: -0.053639
Cost after iteration 2763: 0.039775
Cost after iteration 2764: -0.041038
Cost after iteration 2765: 0.029121
Cost after iteration 2766: -0.031455
Cost after iteration 2767: 0.021543
Cost after iteration 2768: -0.024598
Cost after iteration 2769: 0.016346
Cost after iteration 2770: -0.019770
Cost after iteration 2771: 0.012830
Cost after iteration 2772: -0.016361
Cost after iteration 2773: 0.010470
Cost after iteration 2774: -0.013938
Cost after iteration 2775: 0.008908
Cost after iteration 2776: -0.012210
Cost after iteration 2777: 0.007912
Cost after iteration 2778: -0.010988
Cost after iteration 2779: 0.007332
Cost after iteration 2780: -0.010154
Cost after iteration 2781: 0.007075
Cost after iteration 2782: -0.009635
Cost after iteration 2783: 0.007089
Cost after iteration 2784: -0.009392
Cost after iteration 2785: 0.007349
Cost after iteration 2786: -0.009411
Cost after iteration 2787: 0.007857
Cost after iteration 2788: -0.009704
Cost after iteration 2789: 0.008638
Cost after iteration 2790: -0.010305
Cost after iteration 2791: 0.009744
Cost after iteration 2792: -0.011278
Cost after iteration 2793: 0.011260
Cost after iteration 2794: -0.012722
Cost after iteration 2795: 0.013321
Cost after iteration 2796: -0.014795
Cost after iteration 2797: 0.016134
Cost after iteration 2798: -0.017732
Cost after iteration 2799: 0.020017
Cost after iteration 2800: -0.021907
Cost after iteration 2801: 0.025478
Cost after iteration 2802: -0.027922
Cost after iteration 2803: 0.033358
Cost after iteration 2804: -0.036821
Cost after iteration 2805: 0.045118
Cost after iteration 2806: -0.050480
Cost after iteration 2807: 0.063340
Cost after iteration 2808: -0.071790
Cost after iteration 2809: 0.091859
Cost after iteration 2810: -0.100836
Cost after iteration 2811: 0.130494
Cost after iteration 2812: -0.125533
Cost after iteration 2813: 0.156742
Cost after iteration 2814: -0.134801
Cost after iteration 2815: 0.152754
Cost after iteration 2816: -0.134102
Cost after iteration 2817: 0.142782
Cost after iteration 2818: -0.138679
Cost after iteration 2819: 0.149485
Cost after iteration 2820: -0.153848
Cost after iteration 2821: 0.176205
Cost after iteration 2822: -0.179960
Cost after iteration 2823: 0.214024
Cost after iteration 2824: -0.191748
Cost after iteration 2825: 0.204345
Cost after iteration 2826: -0.159998
Cost after iteration 2827: 0.133876
Cost after iteration 2828: -0.108086
Cost after iteration 2829: 0.069780
Cost after iteration 2830: -0.059994
Cost after iteration 2831: 0.032532
Cost after iteration 2832: -0.033663
Cost after iteration 2833: 0.014914
Cost after iteration 2834: -0.021195
Cost after iteration 2835: 0.006716
Cost after iteration 2836: -0.014752
Cost after iteration 2837: 0.002647
Cost after iteration 2838: -0.011058
Cost after iteration 2839: 0.000534
Cost after iteration 2840: -0.008746
Cost after iteration 2841: -0.000581
Cost after iteration 2842: -0.007191
Cost after iteration 2843: -0.001152
Cost after iteration 2844: -0.006080
Cost after iteration 2845: -0.001414
Cost after iteration 2846: -0.005246
Cost after iteration 2847: -0.001493
Cost after iteration 2848: -0.004593
Cost after iteration 2849: -0.001459
Cost after iteration 2850: -0.004067
Cost after iteration 2851: -0.001356
Cost after iteration 2852: -0.003632
Cost after iteration 2853: -0.001208
Cost after iteration 2854: -0.003268
Cost after iteration 2855: -0.001030
Cost after iteration 2856: -0.002961
Cost after iteration 2857: -0.000830
Cost after iteration 2858: -0.002703
Cost after iteration 2859: -0.000612
Cost after iteration 2860: -0.002490
Cost after iteration 2861: -0.000378
Cost after iteration 2862: -0.002321
Cost after iteration 2863: -0.000124
Cost after iteration 2864: -0.002197
Cost after iteration 2865: 0.000152
Cost after iteration 2866: -0.002123
Cost after iteration 2867: 0.000461
Cost after iteration 2868: -0.002107
Cost after iteration 2869: 0.000813
Cost after iteration 2870: -0.002163
Cost after iteration 2871: 0.001228
Cost after iteration 2872: -0.002313
Cost after iteration 2873: 0.001733
Cost after iteration 2874: -0.002586
Cost after iteration 2875: 0.002367
Cost after iteration 2876: -0.003030
Cost after iteration 2877: 0.003190
Cost after iteration 2878: -0.003716
Cost after iteration 2879: 0.004295
Cost after iteration 2880: -0.004752
Cost after iteration 2881: 0.005822
Cost after iteration 2882: -0.006307
Cost after iteration 2883: 0.007992
Cost after iteration 2884: -0.008646
Cost after iteration 2885: 0.011165
Cost after iteration 2886: -0.012197
Cost after iteration 2887: 0.015940
Cost after iteration 2888: -0.017687
Cost after iteration 2889: 0.023368
Cost after iteration 2890: -0.026435
Cost after iteration 2891: 0.035424
Cost after iteration 2892: -0.041113
Cost after iteration 2893: 0.056126
Cost after iteration 2894: -0.067036
Cost after iteration 2895: 0.093612
Cost after iteration 2896: -0.106014
Cost after iteration 2897: 0.153437
Cost after iteration 2898: -0.137670
Cost after iteration 2899: 0.187677
Cost after iteration 2900: -0.136064
Cost after iteration 2901: 0.149094
Cost after iteration 2902: -0.113818
Cost after iteration 2903: 0.108735
Cost after iteration 2904: -0.102728
Cost after iteration 2905: 0.095237
Cost after iteration 2906: -0.096556
Cost after iteration 2907: 0.089071
Cost after iteration 2908: -0.093310
Cost after iteration 2909: 0.086885
Cost after iteration 2910: -0.092501
Cost after iteration 2911: 0.087209
Cost after iteration 2912: -0.092765
Cost after iteration 2913: 0.087507
Cost after iteration 2914: -0.091689
Cost after iteration 2915: 0.084563
Cost after iteration 2916: -0.086702
Cost after iteration 2917: 0.076067
Cost after iteration 2918: -0.076519
Cost after iteration 2919: 0.062589
Cost after iteration 2920: -0.062542
Cost after iteration 2921: 0.047693
Cost after iteration 2922: -0.048363
Cost after iteration 2923: 0.034891
Cost after iteration 2924: -0.036733
Cost after iteration 2925: 0.025374
Cost after iteration 2926: -0.028183
Cost after iteration 2927: 0.018752
Cost after iteration 2928: -0.022141
Cost after iteration 2929: 0.014256
Cost after iteration 2930: -0.017896
Cost after iteration 2931: 0.011230
Cost after iteration 2932: -0.014894
Cost after iteration 2933: 0.009208
Cost after iteration 2934: -0.012751
Cost after iteration 2935: 0.007879
Cost after iteration 2936: -0.011217
Cost after iteration 2937: 0.007044
Cost after iteration 2938: -0.010129
Cost after iteration 2939: 0.006573
Cost after iteration 2940: -0.009386
Cost after iteration 2941: 0.006389
Cost after iteration 2942: -0.008927
Cost after iteration 2943: 0.006447
Cost after iteration 2944: -0.008718
Cost after iteration 2945: 0.006728
Cost after iteration 2946: -0.008751
Cost after iteration 2947: 0.007236
Cost after iteration 2948: -0.009038
Cost after iteration 2949: 0.007998
Cost after iteration 2950: -0.009613
Cost after iteration 2951: 0.009063
Cost after iteration 2952: -0.010539
Cost after iteration 2953: 0.010515
Cost after iteration 2954: -0.011912
Cost after iteration 2955: 0.012482
Cost after iteration 2956: -0.013881
Cost after iteration 2957: 0.015160
Cost after iteration 2958: -0.016671
Cost after iteration 2959: 0.018853
Cost after iteration 2960: -0.020636
Cost after iteration 2961: 0.024040
Cost after iteration 2962: -0.026344
Cost after iteration 2963: 0.031514
Cost after iteration 2964: -0.034772
Cost after iteration 2965: 0.042648
Cost after iteration 2966: -0.047687
Cost after iteration 2967: 0.059897
Cost after iteration 2968: -0.068011
Cost after iteration 2969: 0.087193
Cost after iteration 2970: -0.097026
Cost after iteration 2971: 0.126189
Cost after iteration 2972: -0.123990
Cost after iteration 2973: 0.157228
Cost after iteration 2974: -0.135845
Cost after iteration 2975: 0.156501
Cost after iteration 2976: -0.135182
Cost after iteration 2977: 0.144531
Cost after iteration 2978: -0.138702
Cost after iteration 2979: 0.149173
Cost after iteration 2980: -0.153086
Cost after iteration 2981: 0.174804
Cost after iteration 2982: -0.179416
Cost after iteration 2983: 0.214382
Cost after iteration 2984: -0.193869
Cost after iteration 2985: 0.209256
Cost after iteration 2986: -0.163672
Cost after iteration 2987: 0.139072
Cost after iteration 2988: -0.111542
Cost after iteration 2989: 0.072549
Cost after iteration 2990: -0.062014
Cost after iteration 2991: 0.033643
Cost after iteration 2992: -0.034465
Cost after iteration 2993: 0.015231
Cost after iteration 2994: -0.021512
Cost after iteration 2995: 0.006745
Cost after iteration 2996: -0.014888
Cost after iteration 2997: 0.002572
Cost after iteration 2998: -0.011121
Cost after iteration 2999: 0.000421
W1 = [[ 2.08801776 -1.89503493]
 [ 2.46515642 -2.07480543]]
b1 = [[-4.84976773]
 [ 6.30623075]]
W2 = [[-7.20236323  7.07357254]]
b2 = [[-3.45260284]]
Expected Output¶

Note: the actual values can be different!

Cost after iteration 0: 0.693148
Cost after iteration 1: 0.693147
Cost after iteration 2: 0.693147
Cost after iteration 3: 0.693147
Cost after iteration 4: 0.693147
Cost after iteration 5: 0.693147
...
Cost after iteration 2995: 0.209524
Cost after iteration 2996: 0.208025
Cost after iteration 2997: 0.210427
Cost after iteration 2998: 0.208929
Cost after iteration 2999: 0.211306
W1 = [[ 2.14274251 -1.93155541]
 [ 2.20268789 -2.1131799 ]]
b1 = [[-4.83079243]
 [ 6.2845223 ]]
W2 = [[-7.21370685  7.0898022 ]]
b2 = [[-3.48755239]]
In [26]:
Copied!
# Note: 
# Actual values are not checked here in the unit tests (due to random initialization).
w3_unittest.test_nn_model(nn_model)
# Note: # Actual values are not checked here in the unit tests (due to random initialization). w3_unittest.test_nn_model(nn_model)
 All tests passed

The final model parameters can be used to find the boundary line and for making predictions.

Exercise 8¶

Computes probabilities using forward propagation, and make classification to 0/1 using 0.5 as the threshold.

In [27]:
graded
Copied!
# GRADED FUNCTION: predict

def predict(X, parameters):
    """
    Using the learned parameters, predicts a class for each example in X
    
    Arguments:
    parameters -- python dictionary containing your parameters 
    X -- input data of size (n_x, m)
    
    Returns
    predictions -- vector of predictions of our model (blue: 0 / red: 1)
    """
    
    ### START CODE HERE ### (≈ 2 lines of code)
    A2, cache = forward_propagation(X, parameters)
    predictions = A2 > 0.5
    ### END CODE HERE ###
    
    return predictions
# GRADED FUNCTION: predict def predict(X, parameters): """ Using the learned parameters, predicts a class for each example in X Arguments: parameters -- python dictionary containing your parameters X -- input data of size (n_x, m) Returns predictions -- vector of predictions of our model (blue: 0 / red: 1) """ ### START CODE HERE ### (≈ 2 lines of code) A2, cache = forward_propagation(X, parameters) predictions = A2 > 0.5 ### END CODE HERE ### return predictions
In [28]:
graded
Copied!
X_pred = np.array([[2, 8, 2, 8], [2, 8, 8, 2]])
Y_pred = predict(X_pred, parameters)

print(f"Coordinates (in the columns):\n{X_pred}")
print(f"Predictions:\n{Y_pred}")
X_pred = np.array([[2, 8, 2, 8], [2, 8, 8, 2]]) Y_pred = predict(X_pred, parameters) print(f"Coordinates (in the columns):\n{X_pred}") print(f"Predictions:\n{Y_pred}")
Coordinates (in the columns):
[[2 8 2 8]
 [2 8 8 2]]
Predictions:
[[ True  True False False]]
Expected Output¶
Coordinates (in the columns):
[[2 8 2 8]
 [2 8 8 2]]
Predictions:
[[ True  True False False]]
In [29]:
Copied!
w3_unittest.test_predict(predict)
w3_unittest.test_predict(predict)
 All tests passed

Let's visualize the boundary line. Do not worry if you don't understand the function plot_decision_boundary line by line - it simply makes prediction for some points on the plane and plots them as a contour plot (just two colors - blue and red).

In [30]:
graded
Copied!
def plot_decision_boundary(predict, parameters, X, Y):
    # Define bounds of the domain.
    min1, max1 = X[0, :].min()-1, X[0, :].max()+1
    min2, max2 = X[1, :].min()-1, X[1, :].max()+1
    # Define the x and y scale.
    x1grid = np.arange(min1, max1, 0.1)
    x2grid = np.arange(min2, max2, 0.1)
    # Create all of the lines and rows of the grid.
    xx, yy = np.meshgrid(x1grid, x2grid)
    # Flatten each grid to a vector.
    r1, r2 = xx.flatten(), yy.flatten()
    r1, r2 = r1.reshape((1, len(r1))), r2.reshape((1, len(r2)))
    # Vertical stack vectors to create x1,x2 input for the model.
    grid = np.vstack((r1,r2))
    # Make predictions for the grid.
    predictions = predict(grid, parameters)
    # Reshape the predictions back into a grid.
    zz = predictions.reshape(xx.shape)
    # Plot the grid of x, y and z values as a surface.
    plt.contourf(xx, yy, zz, cmap=plt.cm.Spectral.reversed())
    plt.scatter(X[0, :], X[1, :], c=Y, cmap=colors.ListedColormap(['blue', 'red']));

# Plot the decision boundary.
plot_decision_boundary(predict, parameters, X, Y)
plt.title("Decision Boundary for hidden layer size " + str(n_h))
def plot_decision_boundary(predict, parameters, X, Y): # Define bounds of the domain. min1, max1 = X[0, :].min()-1, X[0, :].max()+1 min2, max2 = X[1, :].min()-1, X[1, :].max()+1 # Define the x and y scale. x1grid = np.arange(min1, max1, 0.1) x2grid = np.arange(min2, max2, 0.1) # Create all of the lines and rows of the grid. xx, yy = np.meshgrid(x1grid, x2grid) # Flatten each grid to a vector. r1, r2 = xx.flatten(), yy.flatten() r1, r2 = r1.reshape((1, len(r1))), r2.reshape((1, len(r2))) # Vertical stack vectors to create x1,x2 input for the model. grid = np.vstack((r1,r2)) # Make predictions for the grid. predictions = predict(grid, parameters) # Reshape the predictions back into a grid. zz = predictions.reshape(xx.shape) # Plot the grid of x, y and z values as a surface. plt.contourf(xx, yy, zz, cmap=plt.cm.Spectral.reversed()) plt.scatter(X[0, :], X[1, :], c=Y, cmap=colors.ListedColormap(['blue', 'red'])); # Plot the decision boundary. plot_decision_boundary(predict, parameters, X, Y) plt.title("Decision Boundary for hidden layer size " + str(n_h))
Out[30]:
Text(0.5, 1.0, 'Decision Boundary for hidden layer size 2')
No description has been provided for this image

That's great, you can see that more complicated classification problems can be solved with two layer neural network!

4 - Optional: Other Dataset¶

Build a slightly different dataset:

In [31]:
graded
Copied!
n_samples = 2000
samples, labels = make_blobs(n_samples=n_samples, 
                             centers=([2.5, 3], [6.7, 7.9], [2.1, 7.9], [7.4, 2.8]), 
                             cluster_std=1.1,
                             random_state=0)
labels[(labels == 0)] = 0
labels[(labels == 1)] = 1
labels[(labels == 2) | (labels == 3)] = 1
X_2 = np.transpose(samples)
Y_2 = labels.reshape((1,n_samples))

plt.scatter(X_2[0, :], X_2[1, :], c=Y_2, cmap=colors.ListedColormap(['blue', 'red']));
n_samples = 2000 samples, labels = make_blobs(n_samples=n_samples, centers=([2.5, 3], [6.7, 7.9], [2.1, 7.9], [7.4, 2.8]), cluster_std=1.1, random_state=0) labels[(labels == 0)] = 0 labels[(labels == 1)] = 1 labels[(labels == 2) | (labels == 3)] = 1 X_2 = np.transpose(samples) Y_2 = labels.reshape((1,n_samples)) plt.scatter(X_2[0, :], X_2[1, :], c=Y_2, cmap=colors.ListedColormap(['blue', 'red']));
No description has been provided for this image

Notice that when building your neural network, a number of the nodes in the hidden layer could be taken as a parameter. Try to change this parameter and investigate the results:

In [32]:
graded
Copied!
# parameters_2 = nn_model(X_2, Y_2, n_h=1, num_iterations=3000, learning_rate=1.2, print_cost=False)
parameters_2 = nn_model(X_2, Y_2, n_h=2, num_iterations=3000, learning_rate=1.2, print_cost=False)
# parameters_2 = nn_model(X_2, Y_2, n_h=15, num_iterations=3000, learning_rate=1.2, print_cost=False)

# This function will call predict function 
plot_decision_boundary(predict, parameters_2, X_2, Y_2)
plt.title("Decision Boundary for hidden layer size " + str(n_h))
# parameters_2 = nn_model(X_2, Y_2, n_h=1, num_iterations=3000, learning_rate=1.2, print_cost=False) parameters_2 = nn_model(X_2, Y_2, n_h=2, num_iterations=3000, learning_rate=1.2, print_cost=False) # parameters_2 = nn_model(X_2, Y_2, n_h=15, num_iterations=3000, learning_rate=1.2, print_cost=False) # This function will call predict function plot_decision_boundary(predict, parameters_2, X_2, Y_2) plt.title("Decision Boundary for hidden layer size " + str(n_h))
Out[32]:
Text(0.5, 1.0, 'Decision Boundary for hidden layer size 2')
No description has been provided for this image

You can see that there are some misclassified points - real-world datasets are usually linearly inseparable, and there will be a small percentage of errors. More than that, you do not want to build a model that fits too closely, almost exactly to a particular set of data - it may fail to predict future observations. This problem is known as overfitting.

Congrats on finishing this programming assignment!

In [ ]:
graded
Copied!


Documentation built with MkDocs.

Keyboard Shortcuts

Keys Action
? Open this help
n Next page
p Previous page
s Search