
Commit ac266fb

PyGAD 2.7.0 Regression Support
Changes in PyGAD 2.7.0 (11 September 2020):

1. The `learning_rate` parameter in the `pygad.nn.train()` function defaults to **0.01**.
2. Added support for building neural networks for regression using the new parameter named `problem_type`. It is accepted by both the `pygad.nn.train()` and `pygad.nn.predict()` functions. Its value can be either **"classification"** or **"regression"** to define the problem type; it defaults to **"classification"**.
3. The activation function of a layer can be set to the string `"None"` to indicate that the layer has no activation function. As a result, the supported values for the activation function are `"sigmoid"`, `"relu"`, `"softmax"`, and `"None"`.

To build a regression network using the `pygad.nn` module, do the following:

1. Set the `problem_type` parameter in the `pygad.nn.train()` and `pygad.nn.predict()` functions to the string `"regression"`.
2. Set the activation function of the output layer to the string `"None"`. This places no limit on the range of the outputs, which can be anywhere from `-infinity` to `+infinity`. If you are sure that all outputs will be nonnegative, use the ReLU function instead.

Check the documentation of the `pygad.nn` module for an example that builds a neural network for regression. The regression example is also available in this GitHub project: https://github.com/ahmedfgad/NumPyANN

To build and train a regression network using the `pygad.gann` module, do the following (a minimal sketch follows this changelog):

1. Set the `problem_type` parameter in the `pygad.nn.train()` and `pygad.nn.predict()` functions to the string `"regression"`.
2. Set the `output_activation` parameter in the constructor of the `pygad.gann.GANN` class to `"None"`.

Check the documentation of the `pygad.gann` module for an example that builds and trains a neural network for regression. The regression example is also available in this GitHub project: https://github.com/ahmedfgad/NeuralGenetic

To build a classification network, either omit the `problem_type` parameter or set it to `"classification"` (the default). In this case, the activation function of the last layer can be of any type (e.g. softmax).
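The `pygad.nn` regression flow is shown verbatim by the new `example_regression.py` file below. The `pygad.gann` flow is not part of this commit, so here is a minimal sketch of it, assuming the PyGAD 2.7.0 API (`pygad.gann.GANN`, `pygad.gann.population_as_vectors()`/`population_as_matrices()`, the two-argument `fitness_func(solution, sol_idx)` signature, and the `callback_generation` parameter, all as documented for that release). The toy data, network sizes, generation count, and the epsilon guard in the fitness function are illustrative choices only:

```python
import numpy
import pygad
import pygad.nn
import pygad.gann

# Illustrative toy data: 2 samples, 4 features, 1 continuous target each.
data_inputs = numpy.array([[2, 5, -3, 0.1],
                           [8, 15, 20, 13]])
data_outputs = numpy.array([0.1, 1.5])

# A population of networks whose output layer has no activation function,
# as required for regression (output_activation="None").
GANN_instance = pygad.gann.GANN(num_solutions=6,
                                num_neurons_input=4,
                                num_neurons_hidden_layers=[2],
                                num_neurons_output=1,
                                hidden_activations=["relu"],
                                output_activation="None")

def fitness_func(solution, sol_idx):
    # Predict with problem_type="regression"; reward low mean absolute error.
    predictions = pygad.nn.predict(last_layer=GANN_instance.population_networks[sol_idx],
                                   data_inputs=data_inputs,
                                   problem_type="regression")
    abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
    return 1.0 / (abs_error + 1e-8)  # epsilon guard against division by zero (illustrative)

def callback_generation(ga_instance):
    # Copy the evolved weight vectors back into the networks after each generation.
    population_matrices = pygad.gann.population_as_matrices(population_networks=GANN_instance.population_networks,
                                                            population_vectors=ga_instance.population)
    GANN_instance.update_population_trained_weights(population_trained_weights=population_matrices)

# The initial GA population is the networks' weights flattened into vectors.
initial_population = pygad.gann.population_as_vectors(population_networks=GANN_instance.population_networks)

ga_instance = pygad.GA(num_generations=50,
                       num_parents_mating=4,
                       initial_population=initial_population,
                       fitness_func=fitness_func,
                       callback_generation=callback_generation)
ga_instance.run()
```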
1 parent c08cd0e commit ac266fb

6 files changed (+358 -3 lines)

Fish.csv (new file, +160 lines)
Species,Weight,Length1,Length2,Length3,Height,Width
Bream,242,23.2,25.4,30,11.52,4.02
Bream,290,24,26.3,31.2,12.48,4.3056
Bream,340,23.9,26.5,31.1,12.3778,4.6961
Bream,363,26.3,29,33.5,12.73,4.4555
Bream,430,26.5,29,34,12.444,5.134
Bream,450,26.8,29.7,34.7,13.6024,4.9274
Bream,500,26.8,29.7,34.5,14.1795,5.2785
Bream,390,27.6,30,35,12.67,4.69
Bream,450,27.6,30,35.1,14.0049,4.8438
Bream,500,28.5,30.7,36.2,14.2266,4.9594
Bream,475,28.4,31,36.2,14.2628,5.1042
Bream,500,28.7,31,36.2,14.3714,4.8146
Bream,500,29.1,31.5,36.4,13.7592,4.368
Bream,340,29.5,32,37.3,13.9129,5.0728
Bream,600,29.4,32,37.2,14.9544,5.1708
Bream,600,29.4,32,37.2,15.438,5.58
Bream,700,30.4,33,38.3,14.8604,5.2854
Bream,700,30.4,33,38.5,14.938,5.1975
Bream,610,30.9,33.5,38.6,15.633,5.1338
Bream,650,31,33.5,38.7,14.4738,5.7276
Bream,575,31.3,34,39.5,15.1285,5.5695
Bream,685,31.4,34,39.2,15.9936,5.3704
Bream,620,31.5,34.5,39.7,15.5227,5.2801
Bream,680,31.8,35,40.6,15.4686,6.1306
Bream,700,31.9,35,40.5,16.2405,5.589
Bream,725,31.8,35,40.9,16.36,6.0532
Bream,720,32,35,40.6,16.3618,6.09
Bream,714,32.7,36,41.5,16.517,5.8515
Bream,850,32.8,36,41.6,16.8896,6.1984
Bream,1000,33.5,37,42.6,18.957,6.603
Bream,920,35,38.5,44.1,18.0369,6.3063
Bream,955,35,38.5,44,18.084,6.292
Bream,925,36.2,39.5,45.3,18.7542,6.7497
Bream,975,37.4,41,45.9,18.6354,6.7473
Bream,950,38,41,46.5,17.6235,6.3705
Roach,40,12.9,14.1,16.2,4.1472,2.268
Roach,69,16.5,18.2,20.3,5.2983,2.8217
Roach,78,17.5,18.8,21.2,5.5756,2.9044
Roach,87,18.2,19.8,22.2,5.6166,3.1746
Roach,120,18.6,20,22.2,6.216,3.5742
Roach,0,19,20.5,22.8,6.4752,3.3516
Roach,110,19.1,20.8,23.1,6.1677,3.3957
Roach,120,19.4,21,23.7,6.1146,3.2943
Roach,150,20.4,22,24.7,5.8045,3.7544
Roach,145,20.5,22,24.3,6.6339,3.5478
Roach,160,20.5,22.5,25.3,7.0334,3.8203
Roach,140,21,22.5,25,6.55,3.325
Roach,160,21.1,22.5,25,6.4,3.8
Roach,169,22,24,27.2,7.5344,3.8352
Roach,161,22,23.4,26.7,6.9153,3.6312
Roach,200,22.1,23.5,26.8,7.3968,4.1272
Roach,180,23.6,25.2,27.9,7.0866,3.906
Roach,290,24,26,29.2,8.8768,4.4968
Roach,272,25,27,30.6,8.568,4.7736
Roach,390,29.5,31.7,35,9.485,5.355
Whitefish,270,23.6,26,28.7,8.3804,4.2476
Whitefish,270,24.1,26.5,29.3,8.1454,4.2485
Whitefish,306,25.6,28,30.8,8.778,4.6816
Whitefish,540,28.5,31,34,10.744,6.562
Whitefish,800,33.7,36.4,39.6,11.7612,6.5736
Whitefish,1000,37.3,40,43.5,12.354,6.525
Parkki,55,13.5,14.7,16.5,6.8475,2.3265
Parkki,60,14.3,15.5,17.4,6.5772,2.3142
Parkki,90,16.3,17.7,19.8,7.4052,2.673
Parkki,120,17.5,19,21.3,8.3922,2.9181
Parkki,150,18.4,20,22.4,8.8928,3.2928
Parkki,140,19,20.7,23.2,8.5376,3.2944
Parkki,170,19,20.7,23.2,9.396,3.4104
Parkki,145,19.8,21.5,24.1,9.7364,3.1571
Parkki,200,21.2,23,25.8,10.3458,3.6636
Parkki,273,23,25,28,11.088,4.144
Parkki,300,24,26,29,11.368,4.234
Perch,5.9,7.5,8.4,8.8,2.112,1.408
Perch,32,12.5,13.7,14.7,3.528,1.9992
Perch,40,13.8,15,16,3.824,2.432
Perch,51.5,15,16.2,17.2,4.5924,2.6316
Perch,70,15.7,17.4,18.5,4.588,2.9415
Perch,100,16.2,18,19.2,5.2224,3.3216
Perch,78,16.8,18.7,19.4,5.1992,3.1234
Perch,80,17.2,19,20.2,5.6358,3.0502
Perch,85,17.8,19.6,20.8,5.1376,3.0368
Perch,85,18.2,20,21,5.082,2.772
Perch,110,19,21,22.5,5.6925,3.555
Perch,115,19,21,22.5,5.9175,3.3075
Perch,125,19,21,22.5,5.6925,3.6675
Perch,130,19.3,21.3,22.8,6.384,3.534
Perch,120,20,22,23.5,6.11,3.4075
Perch,120,20,22,23.5,5.64,3.525
Perch,130,20,22,23.5,6.11,3.525
Perch,135,20,22,23.5,5.875,3.525
Perch,110,20,22,23.5,5.5225,3.995
Perch,130,20.5,22.5,24,5.856,3.624
Perch,150,20.5,22.5,24,6.792,3.624
Perch,145,20.7,22.7,24.2,5.9532,3.63
Perch,150,21,23,24.5,5.2185,3.626
Perch,170,21.5,23.5,25,6.275,3.725
Perch,225,22,24,25.5,7.293,3.723
Perch,145,22,24,25.5,6.375,3.825
Perch,188,22.6,24.6,26.2,6.7334,4.1658
Perch,180,23,25,26.5,6.4395,3.6835
Perch,197,23.5,25.6,27,6.561,4.239
Perch,218,25,26.5,28,7.168,4.144
Perch,300,25.2,27.3,28.7,8.323,5.1373
Perch,260,25.4,27.5,28.9,7.1672,4.335
Perch,265,25.4,27.5,28.9,7.0516,4.335
Perch,250,25.4,27.5,28.9,7.2828,4.5662
Perch,250,25.9,28,29.4,7.8204,4.2042
Perch,300,26.9,28.7,30.1,7.5852,4.6354
Perch,320,27.8,30,31.6,7.6156,4.7716
Perch,514,30.5,32.8,34,10.03,6.018
Perch,556,32,34.5,36.5,10.2565,6.3875
Perch,840,32.5,35,37.3,11.4884,7.7957
Perch,685,34,36.5,39,10.881,6.864
Perch,700,34,36,38.3,10.6091,6.7408
Perch,700,34.5,37,39.4,10.835,6.2646
Perch,690,34.6,37,39.3,10.5717,6.3666
Perch,900,36.5,39,41.4,11.1366,7.4934
Perch,650,36.5,39,41.4,11.1366,6.003
Perch,820,36.6,39,41.3,12.4313,7.3514
Perch,850,36.9,40,42.3,11.9286,7.1064
Perch,900,37,40,42.5,11.73,7.225
Perch,1015,37,40,42.4,12.3808,7.4624
Perch,820,37.1,40,42.5,11.135,6.63
Perch,1100,39,42,44.6,12.8002,6.8684
Perch,1000,39.8,43,45.2,11.9328,7.2772
Perch,1100,40.1,43,45.5,12.5125,7.4165
Perch,1000,40.2,43.5,46,12.604,8.142
Perch,1000,41.1,44,46.6,12.4888,7.5958
Pike,200,30,32.3,34.8,5.568,3.3756
Pike,300,31.7,34,37.8,5.7078,4.158
Pike,300,32.7,35,38.8,5.9364,4.3844
Pike,300,34.8,37.3,39.8,6.2884,4.0198
Pike,430,35.5,38,40.5,7.29,4.5765
Pike,345,36,38.5,41,6.396,3.977
Pike,456,40,42.5,45.5,7.28,4.3225
Pike,510,40,42.5,45.5,6.825,4.459
Pike,540,40.1,43,45.8,7.786,5.1296
Pike,500,42,45,48,6.96,4.896
Pike,567,43.2,46,48.7,7.792,4.87
Pike,770,44.8,48,51.2,7.68,5.376
Pike,950,48.3,51.7,55.1,8.9262,6.1712
Pike,1250,52,56,59.7,10.6863,6.9849
Pike,1600,56,60,64,9.6,6.144
Pike,1550,56,60,64,9.6,6.144
Pike,1650,59,63.4,68,10.812,7.48
Smelt,6.7,9.3,9.8,10.8,1.7388,1.0476
Smelt,7.5,10,10.5,11.6,1.972,1.16
Smelt,7,10.1,10.6,11.6,1.7284,1.1484
Smelt,9.7,10.4,11,12,2.196,1.38
Smelt,9.8,10.7,11.2,12.4,2.0832,1.2772
Smelt,8.7,10.8,11.3,12.6,1.9782,1.2852
Smelt,10,11.3,11.8,13.1,2.2139,1.2838
Smelt,9.9,11.3,11.8,13.1,2.2139,1.1659
Smelt,9.8,11.4,12,13.2,2.2044,1.1484
Smelt,12.2,11.5,12.2,13.4,2.0904,1.3936
Smelt,13.4,11.7,12.4,13.5,2.43,1.269
Smelt,12.2,12.1,13,13.8,2.277,1.2558
Smelt,19.7,13.2,14.3,15.2,2.8728,2.0672
Smelt,19.9,13.8,15,16.2,2.9322,1.8792

README.md (+3 -3)

@@ -1,12 +1,12 @@
 # NumPyANN: Building Neural Networks using NumPy

-[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is a Python project for training neural networks using NumPy.
+[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is a Python project for building artificial neural networks using NumPy.

-[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is part of [PyGAD](https://pypi.org/project/pygad) which is an open-source Python 3 library for implementing the genetic algorithm and optimizing machine learning algorithms.
+[NumPyANN](https://github.com/ahmedfgad/NumPyCNN) is part of [PyGAD](https://pypi.org/project/pygad) which is an open-source Python 3 library for implementing the genetic algorithm and optimizing machine learning algorithms. Both regression and classification neural networks are supported starting from PyGAD 2.7.0.

 Check documentation of the [NeuralGenetic](https://github.com/ahmedfgad/NeuralGenetic) project in the PyGAD's documentation: https://pygad.readthedocs.io/en/latest/README_pygad_nn_ReadTheDocs.html

-The library is under active development and more features in the genetic algorithm will be added like working with binary problems. This is in addition to supporting more machine learning algorithms.
+The library is under active development and more features are added regularly. If you want a feature to be supported, please check the **Contact Us** section to send a request.

 Before using [NumPyANN](https://github.com/ahmedfgad/NumPyCNN), install PyGAD.
example_XOR_classification.py (new file, +51 lines)

import numpy
import pygad.nn

"""
This project creates a neural network where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask questions. I am also available for e-mails at [email protected]
"""

# Preparing the NumPy array of the inputs.
data_inputs = numpy.array([[1, 1],
                           [1, 0],
                           [0, 1],
                           [0, 0]])

# Preparing the NumPy array of the outputs.
data_outputs = numpy.array([0,
                            1,
                            1,
                            0])

# The number of inputs (i.e. feature vector length) per sample.
num_inputs = data_inputs.shape[1]
# Number of outputs per sample.
num_outputs = 2

HL1_neurons = 2

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer1, activation_function="softmax")

# Training the network.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01)

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer, data_inputs=data_inputs)

# Calculating some statistics.
num_wrong = numpy.where(predictions != data_outputs)[0]
num_correct = data_outputs.size - num_wrong.size
accuracy = 100 * (num_correct/data_outputs.size)
print("Number of correct classifications : {num_correct}.".format(num_correct=num_correct))
print("Number of wrong classifications : {num_wrong}.".format(num_wrong=num_wrong.size))
print("Classification accuracy : {accuracy}.".format(accuracy=accuracy))

example_classification.py (new file, +51 lines)

import numpy
import pygad.nn

"""
This project creates a neural network where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask questions. I am also available for e-mails at [email protected]
"""

# Reading the data features. Check the 'extract_features.py' script for extracting the features & preparing the outputs of the dataset.
data_inputs = numpy.load("dataset_features.npy") # Download from https://github.com/ahmedfgad/NumPyANN/blob/master/dataset_features.npy

# Optional step for filtering the features using the standard deviation.
features_STDs = numpy.std(a=data_inputs, axis=0)
data_inputs = data_inputs[:, features_STDs > 50]

# Reading the data outputs. Check the 'extract_features.py' script for extracting the features & preparing the outputs of the dataset.
data_outputs = numpy.load("outputs.npy") # Download from https://github.com/ahmedfgad/NumPyANN/blob/master/outputs.npy

# The number of inputs (i.e. feature vector length) per sample.
num_inputs = data_inputs.shape[1]
# Number of outputs per sample.
num_outputs = 4

HL1_neurons = 150
HL2_neurons = 60

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
hidden_layer2 = pygad.nn.DenseLayer(num_neurons=HL2_neurons, previous_layer=hidden_layer1, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer2, activation_function="softmax")

# Training the network.
pygad.nn.train(num_epochs=10,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01)

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer, data_inputs=data_inputs)

# Calculating some statistics.
num_wrong = numpy.where(predictions != data_outputs)[0]
num_correct = data_outputs.size - num_wrong.size
accuracy = 100 * (num_correct/data_outputs.size)
print("Number of correct classifications : {num_correct}.".format(num_correct=num_correct))
print("Number of wrong classifications : {num_wrong}.".format(num_wrong=num_wrong.size))
print("Classification accuracy : {accuracy}.".format(accuracy=accuracy))

example_regression.py (new file, +46 lines)

import numpy
import pygad.nn

"""
This example creates a neural network for regression where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask questions. I am also available for e-mails at [email protected]
"""

# Preparing the NumPy array of the inputs.
data_inputs = numpy.array([[2, 5, -3, 0.1],
                           [8, 15, 20, 13]])

# Preparing the NumPy array of the outputs.
data_outputs = numpy.array([0.1,
                            1.5])

# The number of inputs (i.e. feature vector length) per sample.
num_inputs = data_inputs.shape[1]
# Number of outputs per sample.
num_outputs = 1

HL1_neurons = 2

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer1, activation_function="None")

# Training the network.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01,
               problem_type="regression")

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")

# Calculating some statistics.
abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
print("Absolute error : {abs_error}.".format(abs_error=abs_error))

example_regression_fish.py (new file, +47 lines)

import numpy
import pygad.nn
import pandas

"""
This example creates a neural network for regression where the architecture has input and dense layers only. More layers will be added in the future.
The project only implements the forward pass of a neural network and no training algorithm is used.
For training a neural network using the genetic algorithm, check this project (https://github.com/ahmedfgad/NeuralGenetic) in which the genetic algorithm is used for training the network.
Feel free to leave an issue in this project (https://github.com/ahmedfgad/NumPyANN) in case something is not working properly or to ask questions. I am also available for e-mails at [email protected]
"""

data = numpy.array(pandas.read_csv("Fish.csv"))

# Preparing the NumPy array of the inputs.
data_inputs = numpy.asarray(data[:, 2:], dtype=numpy.float32)

# Preparing the NumPy array of the outputs.
data_outputs = numpy.asarray(data[:, 1], dtype=numpy.float32) # Fish Weight

# The number of inputs (i.e. feature vector length) per sample.
num_inputs = data_inputs.shape[1]
# Number of outputs per sample.
num_outputs = 1

HL1_neurons = 2

# Building the network architecture.
input_layer = pygad.nn.InputLayer(num_inputs)
hidden_layer1 = pygad.nn.DenseLayer(num_neurons=HL1_neurons, previous_layer=input_layer, activation_function="relu")
output_layer = pygad.nn.DenseLayer(num_neurons=num_outputs, previous_layer=hidden_layer1, activation_function="None")

# Training the network.
pygad.nn.train(num_epochs=100,
               last_layer=output_layer,
               data_inputs=data_inputs,
               data_outputs=data_outputs,
               learning_rate=0.01,
               problem_type="regression")

# Using the trained network for predictions.
predictions = pygad.nn.predict(last_layer=output_layer,
                               data_inputs=data_inputs,
                               problem_type="regression")

# Calculating some statistics.
abs_error = numpy.mean(numpy.abs(predictions - data_outputs))
print("Absolute error : {abs_error}.".format(abs_error=abs_error))
