@@ -24,8 +24,8 @@ outputs = 1
network = NeuralNetwork(inputs, outputs, cost = "mse")

# Add 2 hidden layers with 16 neurons each and activation function 'tanh'
- network.addLayer(16, activation_function = "tanh")
- network.addLayer(16, activation_function = "tanh")
+ network.add_layer(16, activation_function = "tanh")
+ network.add_layer(16, activation_function = "tanh")

# Finish the neural network by adding the output layer with sigmoid activation function.
network.compile(activation_function = "sigmoid")
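As an aside, the `cost = "mse"` argument above selects mean squared error as the training cost. A minimal, library-independent sketch of what that cost computes (the `mean_squared_error` helper below is illustrative only, not part of this library's API):

```python
def mean_squared_error(targets, predictions):
    """Average of the squared differences between targets and predictions."""
    n = len(targets)
    return sum((t - p) ** 2 for t, p in zip(targets, predictions)) / n

# Perfect predictions give a cost of 0; errors are penalized quadratically.
print(mean_squared_error([1.0, 0.0], [1.0, 0.0]))
print(mean_squared_error([1.0, 0.0], [0.8, 0.1]))
```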
@@ -44,9 +44,9 @@ input_file = "inputs.csv"
target_file = "targets.csv"

# Create a dataset object with the same inputs and outputs defined for the network.
- datasetCreator = Dataset(inputs, outputs)
- datasetCreator.makeDataset(input_file, target_file)
- data, size = datasetCreator.getRawData()
+ dataset_handler = Dataset(inputs, outputs)
+ dataset_handler.make_dataset(input_file, target_file)
+ data, size = dataset_handler.get_raw_data()

```
If you want to manually make a dataset, follow these rules:
@@ -80,15 +80,15 @@ For eg, a typical XOR data set looks something like :
### Training The network
The library provides a *Train* function which accepts the dataset, the dataset size, and optional parameters such as epochs and logging.
```python3
- def Train(dataset, size, epochs = 5000, logging = True):
+ def Train(self, dataset: T_Dataset, size, epochs = 100, logging = False, epoch_logging = True, prediction_evaulator = None):
    ....
    ....
```
For example, to train your network for 1000 epochs:
```python3
>>> network.Train(data, size, epochs = 1000)
```
- Notice that I didn't change the value of log_outputs as I want the output to printed for each epoch.
+ Notice that I didn't change the value of `epoch_logging`, as I want the output to be printed for each epoch.
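For intuition about what per-epoch output looks like, here is a toy, self-contained loop; `run_epochs` and its halving fake loss are invented for this sketch and are not part of the library:

```python
def run_epochs(epochs, epoch_logging=True):
    """Toy training loop: halve a fake loss each epoch, optionally logging it."""
    loss, history = 1.0, []
    for epoch in range(1, epochs + 1):
        loss /= 2  # stand-in for one real training step
        history.append(loss)
        if epoch_logging:
            print(f"Epoch {epoch}/{epochs} - loss: {loss:.4f}")
    return history

run_epochs(3)
```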
### Debugging
@@ -109,7 +109,7 @@ To take a look at all the layers' info
Sometimes, the learning rate might have to be altered for better convergence.
```python3
- >>> network.setLearningRate(0.1)
+ >>> network.set_learning_rate(0.1)
```
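To see why the learning rate matters for convergence, here is a library-independent toy: plain gradient descent on f(x) = x², whose gradient is 2x. The `descend` helper is purely illustrative and assumes nothing about this library:

```python
def descend(learning_rate, steps=50, x0=1.0):
    """Minimize f(x) = x^2 by gradient descent from x0."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * 2 * x  # one gradient step
    return x

# A moderate rate converges quickly, a tiny rate crawls,
# and a rate that is too large overshoots and diverges.
print(abs(descend(0.1)))    # very close to the minimum at 0
print(abs(descend(0.001)))  # still far from 0 after 50 steps
print(abs(descend(1.1)))    # blows up
```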
### Exporting Model