Commit 657c076

nps1ngh and pmeier authored

fix doc of ops.misc.MLP (#7041)

Co-authored-by: Philip Meier <[email protected]>
1 parent 386cfe2 commit 657c076

File tree: 1 file changed (+2 additions, -2 deletions)


torchvision/ops/misc.py

Lines changed: 2 additions & 2 deletions
@@ -266,8 +266,8 @@ class MLP(torch.nn.Sequential):
     Args:
         in_channels (int): Number of channels of the input
         hidden_channels (List[int]): List of the hidden channel dimensions
-        norm_layer (Callable[..., torch.nn.Module], optional): Norm layer that will be stacked on top of the convolution layer. If ``None`` this layer wont be used. Default: ``None``
-        activation_layer (Callable[..., torch.nn.Module], optional): Activation function which will be stacked on top of the normalization layer (if not None), otherwise on top of the conv layer. If ``None`` this layer wont be used. Default: ``torch.nn.ReLU``
+        norm_layer (Callable[..., torch.nn.Module], optional): Norm layer that will be stacked on top of the linear layer. If ``None`` this layer wont be used. Default: ``None``
+        activation_layer (Callable[..., torch.nn.Module], optional): Activation function which will be stacked on top of the normalization layer (if not None), otherwise on top of the linear layer. If ``None`` this layer wont be used. Default: ``torch.nn.ReLU``
         inplace (bool): Parameter for the activation layer, which can optionally do the operation in-place. Default ``True``
         bias (bool): Whether to use bias in the linear layer. Default ``True``
         dropout (float): The probability for the dropout layer. Default: 0.0
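
For context, a minimal usage sketch of torchvision.ops.MLP, the class whose docstring this commit corrects: the norm and activation layers are stacked on top of each linear (not convolution) layer. The sizes (64, [128, 32]), the LayerNorm choice, and the dropout value below are illustrative assumptions, not taken from the commit.

import torch
from torchvision.ops import MLP

# For each intermediate hidden dim: Linear -> norm_layer (if given) -> activation_layer -> Dropout;
# the final Linear is followed by Dropout only. Norm/activation sit on top of a *linear* layer,
# which is what the corrected docstring now states.
mlp = MLP(
    in_channels=64,                    # illustrative value
    hidden_channels=[128, 32],         # illustrative value
    norm_layer=torch.nn.LayerNorm,     # optional; default is None
    activation_layer=torch.nn.ReLU,    # default
    dropout=0.1,
)

x = torch.randn(8, 64)                 # (batch, in_channels)
print(mlp(x).shape)                    # torch.Size([8, 32])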
