Conversation

mpjlu commented Jul 30, 2018

The Python Dropout op uses the following code to check the keep_prob value:
`if tensor_util.constant_value(keep_prob) == 1: return x`
If keep_prob is a placeholder, `tensor_util.constant_value(keep_prob)` returns None, so the if statement is always false and the early return never fires.
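
A minimal repro of the failure mode, assuming TensorFlow 1.x (where `tf.nn.dropout` still takes `keep_prob` and performs the quoted check):

```python
# Sketch of why the check misfires: constant_value can only fold tensors
# whose value is known at graph-construction time.
import tensorflow as tf
from tensorflow.python.framework import tensor_util

keep_prob_const = tf.constant(1.0)
keep_prob_ph = tf.placeholder(tf.float32)  # value is only supplied at run time

print(tensor_util.constant_value(keep_prob_const))  # 1.0 -> dropout is skipped
print(tensor_util.constant_value(keep_prob_ph))     # None -> `== 1` is always False
```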

jakeret (Owner) commented Jul 31, 2018

Thanks for your contribution. I see why this is better during training. But how should we control the dropout during validation and prediction? There we want to set keep_prob to 1.
Or am I missing something?

mpjlu (Author) commented Aug 2, 2018

For prediction, we don't need dropout.
If keep_prob is set to 1, the right behavior is for the dropout layer to return its input directly.
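
To illustrate, a small check under the TF 1.x versions that contain the quoted `constant_value` test:

```python
# With keep_prob known at graph-construction time, tf.nn.dropout returns
# x itself; with a placeholder, the full dropout op stays in the graph.
import tensorflow as tf

x = tf.ones([4, 4])
print(tf.nn.dropout(x, keep_prob=1.0) is x)  # True: folded away entirely

kp = tf.placeholder(tf.float32)
print(tf.nn.dropout(x, keep_prob=kp) is x)   # False: dropout runs every step,
                                             # even when kp is fed 1.0
```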

jakeret (Owner) commented Aug 2, 2018

Right. So during training we want keep_prob to be < 1 and during validation it should be = 1.
How can we control this?

mpjlu (Author) commented Aug 3, 2018

We can create two U-Nets with different keep_prob values, one for training and one for validation (see the sketch below). What do you think?
Since the dropout layer is quite time-consuming, it is better to skip it during validation and inference.
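
One possible shape of that idea, sketched under TF 1.x; `build_net` is a hypothetical stand-in for the real tf_unet construction code, and sharing variables via `tf.variable_scope(..., reuse=True)` is one way to avoid any weight transfer:

```python
import tensorflow as tf

def build_net(x, keep_prob):
    w = tf.get_variable("w", [3, 3, 1, 8])  # stand-in for the U-Net weights
    h = tf.nn.relu(tf.nn.conv2d(x, w, [1, 1, 1, 1], "SAME"))
    # keep_prob is a plain Python float here, so this branch is resolved at
    # graph-construction time and no dropout op is created when it equals 1.
    return h if keep_prob == 1.0 else tf.nn.dropout(h, keep_prob)

x = tf.placeholder(tf.float32, [None, 128, 128, 1])
with tf.variable_scope("unet"):
    train_out = build_net(x, keep_prob=0.75)  # dropout active
with tf.variable_scope("unet", reuse=True):
    val_out = build_net(x, keep_prob=1.0)     # dropout elided, same weights
```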

jakeret (Owner) commented Aug 3, 2018

Don't we have to train two models then?
I wasn't aware that dropout is so time-consuming. How much does it affect training/validation performance?

mpjlu (Author) commented Aug 6, 2018

For inference, dropout accounts for about 16% of the iteration time; see the second row of the profiling screenshot below.
We don't need to train two models. We just need to create a new model (with keep_prob = 1) for inference/validation.
[profiling screenshot]

mpjlu (Author) commented Aug 15, 2018

Hi @jakeret, any comments on the data? The numbers were measured on CPU.

jakeret (Owner) commented Aug 16, 2018

A 16% performance improvement is nice.
However, I still don't fully understand what the training/validation procedure would look like. If a new model is created for validation, how would you transfer the learned weights?

mpjlu (Author) commented Sep 18, 2018

Sorry for the late reply.
How about passing two nets when creating the Trainer object: train_net for training and validation_net for validation? train_net can save the model after each epoch, and validation_net can restore it for validation (rough sketch below). What do you think?
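
For concreteness, a hypothetical sketch of what that Trainer interface could look like; the two-net constructor is an assumption, not the existing tf_unet API:

```python
class Trainer:
    # Hypothetical interface sketch, not the current tf_unet API.
    def __init__(self, train_net, validation_net, model_path="unet_model.ckpt"):
        self.train_net = train_net            # built with keep_prob < 1
        self.validation_net = validation_net  # built with keep_prob = 1
        self.model_path = model_path

    def train_epoch(self, sess, saver):
        # ... optimize self.train_net on one epoch of data ...
        saver.save(sess, self.model_path)     # checkpoint after each epoch

    def validate(self, sess, saver):
        saver.restore(sess, self.model_path)  # load weights into the dropout-free net
        # ... evaluate self.validation_net ...
```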

jakeret (Owner) commented Sep 19, 2018

I don't see how this could be implemented. The computation graph would be different for the two networks, which makes it hard to transfer the weights from one to the other.

mpjlu (Author) commented Sep 20, 2018

The dropout layer has no weights, so it is fine to save the model from the training net and restore it in the validation net (rough sketch below).
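
A sketch of that save/restore handoff under TF 1.x, assuming both nets are built by the same code so the variable names match; `model_path` and the single weight are hypothetical stand-ins:

```python
import tensorflow as tf

model_path = "unet_model.ckpt"  # hypothetical checkpoint path

# Training graph: built with keep_prob < 1, so it contains dropout ops.
train_graph = tf.Graph()
with train_graph.as_default():
    w = tf.get_variable("w", [3, 3, 1, 8])  # stand-in for the real U-Net weights
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # ... run a training epoch ...
        saver.save(sess, model_path)

# Validation graph: same construction code with keep_prob = 1,
# so no dropout ops exist at all.
val_graph = tf.Graph()
with val_graph.as_default():
    w = tf.get_variable("w", [3, 3, 1, 8])  # identical name -> restores cleanly
    saver = tf.train.Saver()
    with tf.Session() as sess:
        # Dropout contributes no variables, so the checkpoint loads without issue.
        saver.restore(sess, model_path)
        # ... run validation ...
```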
