【Posted】: 2020-11-09 07:32:30
【Question】:
I am new to deep learning and have built a graph convolutional network, trained with 5-fold cross-validation. After plotting the averaged train_loss (blue) and validate_loss (orange) together, I got this plot.
As you can see from the trend of the validate_loss curve, my network seems to be learning very little. (Is it the data? The GCN framework? The learning rate?)
Could you help me pinpoint what is going wrong?
I would really appreciate it! If anything is unclear, please let me know.
import torch as th
import torch.nn as nn
import torch.nn.functional as F

class Scorer(nn.Module):
    """
    Three conv_layers and two fc_layers with Dropout
    """
    def __init__(self):
        super(Scorer, self).__init__()
        self.conv_layer1 = GraphConvNet(5, 64)     # GraphConvNet is defined elsewhere
        self.conv_layer2 = GraphConvNet(64, 128)
        self.conv_layer3 = GraphConvNet(128, 256)  # (I have tried deleting conv_layer3)
        self.fc_layer1 = nn.Linear(256, 128)
        self.drop_layer1 = nn.Dropout(0.5)
        self.fc_layer2 = nn.Linear(128, 64)
        self.drop_layer2 = nn.Dropout(0.5)
        self.out_layer = nn.Linear(64, 1)

    def forward(self, NormLap, feat):
        h = self.conv_layer1(NormLap, feat)
        h = F.leaky_relu(h)
        h = self.conv_layer2(NormLap, h)
        h = F.leaky_relu(h)
        h = self.conv_layer3(NormLap, h)
        h = F.leaky_relu(h)
        h = self.fc_layer1(h)
        h = self.drop_layer1(h)
        h = F.leaky_relu(h)
        h = self.fc_layer2(h)
        h = self.drop_layer2(h)
        h = F.leaky_relu(h)
        h = self.out_layer(h)
        h = F.leaky_relu(h)
        return h  # note: the snippet as posted was missing this return statement
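One detail worth checking (an observation, not necessarily the cause): the final F.leaky_relu is applied to the regression output itself. With the default negative slope of 0.01, a negative prediction can only be produced by a pre-activation roughly 100x larger in magnitude. A minimal plain-Python sketch (not the actual model) of that effect:

```python
def leaky_relu(x, slope=0.01):
    """Scalar leaky ReLU, matching F.leaky_relu's default negative slope."""
    return x if x > 0 else slope * x

# Positive outputs pass through unchanged:
print(leaky_relu(3.0))
# To emit a prediction near -2.0, the layer before must output about -200:
print(leaky_relu(-200.0))
```

If the targets can be negative, this is one plausible reason the validation loss plateaus; for MSE regression the usual setup is to return out_layer's output directly, with no activation after it.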
Here are my parameter settings:
# parameter setting
learning_rate = 0.001 # (I have tried 1e-1, 1e-2)
weight_decay = 1e-3 # (I have tried 1e-4)
epochs = 500
batch_size = 50 # (I have tried 30)
model = Scorer()
loss_func = nn.MSELoss()
optimizer = th.optim.Adam(model.parameters(), lr=learning_rate, weight_decay=weight_decay)
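For context, a minimal sketch of how the 5-fold train/validate index splits might be generated before averaging losses across folds (plain Python with a hypothetical `kfold_splits` helper; this is not the code actually used):

```python
import random

def kfold_splits(n_samples, k=5, seed=0):
    """Shuffle indices once, cut them into k near-equal folds, and yield
    (train_idx, val_idx) pairs, one per fold."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    base, rem = divmod(n_samples, k)
    folds, start = [], 0
    for i in range(k):
        size = base + (1 if i < rem else 0)
        folds.append(idx[start:start + size])
        start += size
    for i in range(k):
        val_idx = folds[i]
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train_idx, val_idx

# Each fold serves as the validation set exactly once:
for train_idx, val_idx in kfold_splits(50, k=5):
    assert len(val_idx) == 10 and len(train_idx) == 40
```

One related PyTorch detail: because the model uses nn.Dropout, it matters to call model.train() before each training pass and model.eval() before computing validate_loss; otherwise dropout noise inflates the validation curve.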
【Discussion】:
Tags: parameters deep-learning pytorch data-loss