Some notes on segmentation tasks #25
Is the deconv (transposed convolution) in FCN initialized differently from the preceding conv layers?

```python
for m in self.modules():
    # Conv and linear layers: small Gaussian weights, zero bias
    if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear):
        m.weight.data.normal_(0.0, 0.01)
        m.bias.data.fill_(0)
    # Transposed conv layers: bilinear-interpolation weights
    if isinstance(m, nn.ConvTranspose2d):
        assert m.kernel_size[0] == m.kernel_size[1]
        initial_weight = get_upsampling_weight(
            m.in_channels, m.out_channels, m.kernel_size[0])
        m.weight.data.copy_(initial_weight)
```

The initialization of the transposed convolution in this code is somewhat unusual, and I don't quite understand it:

```python
import numpy as np
import torch

def get_upsampling_weight(in_channels, out_channels, kernel_size):
    """Make a 2D bilinear kernel suitable for upsampling."""
    factor = (kernel_size + 1) // 2
    if kernel_size % 2 == 1:
        center = factor - 1
    else:
        center = factor - 0.5
    og = np.ogrid[:kernel_size, :kernel_size]
    filt = (1 - abs(og[0] - center) / factor) * \
           (1 - abs(og[1] - center) / factor)
    # Only the diagonal (i, i) channel pairs are filled, so each channel
    # upsamples itself; this assumes in_channels == out_channels.
    weight = np.zeros((in_channels, out_channels, kernel_size, kernel_size),
                      dtype=np.float64)
    weight[range(in_channels), range(out_channels), :, :] = filt
    return torch.from_numpy(weight).float()
```

The 2x upsampling is initialized to bilinear interpolation, but the parameters are still allowed to be learned ("We initialize the upsampling to bilinear interpolation, but allow the parameters to be learned as described in Section 3.3."). So the initialization here uses bilinear interpolation.
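As a sanity check (a minimal sketch of my own, not from the thread), for `kernel_size=4` the function returns the outer product of the 1-D bilinear taps `[0.25, 0.75, 0.75, 0.25]`, and copying it into an `nn.ConvTranspose2d` with stride 2 yields a 2x bilinear upsampler:

```python
import numpy as np
import torch
import torch.nn as nn

def get_upsampling_weight(in_channels, out_channels, kernel_size):
    """Bilinear kernel, same as the snippet above."""
    factor = (kernel_size + 1) // 2
    center = factor - 1 if kernel_size % 2 == 1 else factor - 0.5
    og = np.ogrid[:kernel_size, :kernel_size]
    filt = (1 - abs(og[0] - center) / factor) * \
           (1 - abs(og[1] - center) / factor)
    weight = np.zeros((in_channels, out_channels, kernel_size, kernel_size),
                      dtype=np.float64)
    weight[range(in_channels), range(out_channels), :, :] = filt
    return torch.from_numpy(weight).float()

w = get_upsampling_weight(1, 1, 4)
# 1-D taps are [0.25, 0.75, 0.75, 0.25]; the 2-D kernel is their outer product.
print(w[0, 0, 1, 1].item())  # 0.5625 (= 0.75 * 0.75)

# FCN-style 2x upsampling: kernel_size=4, stride=2, padding=1
up = nn.ConvTranspose2d(1, 1, kernel_size=4, stride=2, padding=1, bias=False)
with torch.no_grad():
    up.weight.copy_(w)
y = up(torch.ones(1, 1, 4, 4))
print(y.shape)  # torch.Size([1, 1, 8, 8])
```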
In a sense, upsampling with a factor f is a convolution with a fractional input stride of 1/f. As long as f is an integer, a natural way to upsample is therefore backwards convolution (sometimes called deconvolution) with an output stride of f. Such an operation is trivial to implement, since it simply reverses the forward and backward passes of convolution. Upsampling is thus performed in-network, learned end-to-end by backpropagation from the pixelwise loss. Note that the deconvolution filter in such a layer need not be fixed (e.g., to bilinear upsampling) but can be learned; a stack of deconvolution layers and activation functions can even learn a nonlinear upsampling. In our experiments, we found that in-network upsampling is fast and effective for learning dense prediction.
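As a minimal sketch of the passage above (the layer sizes here are illustrative choices of mine, not from the thread), a transposed convolution with stride f realizes exactly this "backwards convolution with output stride f", while its weights remain learnable:

```python
import torch
import torch.nn as nn

# Upsample by factor f = 2 with a learnable transposed convolution.
f = 2
deconv = nn.ConvTranspose2d(in_channels=3, out_channels=3,
                            kernel_size=2 * f, stride=f, padding=f // 2,
                            bias=False)

x = torch.randn(1, 3, 16, 16)
y = deconv(x)
print(y.shape)  # torch.Size([1, 3, 32, 32]) -- spatial size doubled
```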
The development of semantic-segmentation methods over the past years can be traced through the related-work section of this recent paper: https://www.groundai.com/project/pyramid-attention-network-for-semantic-segmentation/#section_3
Summary:
How different networks implement upsampling.
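As an illustration of that summary (the shapes and layer choices here are my own assumptions, not from the thread), three common ways networks implement 2x upsampling in PyTorch:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 8, 16, 16)

# 1. Fixed bilinear interpolation: no learnable parameters.
y1 = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=False)

# 2. Learnable transposed convolution, as in FCN
#    (there it is initialized to the bilinear kernel).
deconv = nn.ConvTranspose2d(8, 8, kernel_size=4, stride=2, padding=1)
y2 = deconv(x)

# 3. Sub-pixel convolution: conv to f*f times the channels, then PixelShuffle.
conv = nn.Conv2d(8, 8 * 4, kernel_size=3, padding=1)
y3 = F.pixel_shuffle(conv(x), upscale_factor=2)

print(y1.shape, y2.shape, y3.shape)  # all torch.Size([1, 8, 32, 32])
```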