【Posted】:2021-08-01 10:47:00
【Question】:
This code comes from the PyTorch Transformer:
self.linear1 = Linear(d_model, dim_feedforward, **factory_kwargs)
self.dropout = Dropout(dropout)
self.linear2 = Linear(dim_feedforward, d_model, **factory_kwargs)
self.norm1 = LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)
self.norm2 = LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)
self.norm3 = LayerNorm(d_model, eps=layer_norm_eps, **factory_kwargs)
self.dropout1 = Dropout(dropout)
self.dropout2 = Dropout(dropout)
self.dropout3 = Dropout(dropout)
Why add self.dropout1, ...2, ...3 when self.dropout already exists and does exactly the same thing?
Also, what is the difference between (self.linear1, self.linear2) and self.linear?
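A minimal sketch (with assumed dimensions, not the real Transformer sizes) illustrating the distinction the question touches on: nn.Dropout holds no learnable parameters, so multiple instances with the same p are interchangeable, while each nn.Linear holds its own weight and bias, so linear1 and linear2 are genuinely different layers:

```python
import torch
from torch import nn

# Assumed toy dimensions for illustration only
d_model, dim_feedforward = 4, 8

linear1 = nn.Linear(d_model, dim_feedforward)
linear2 = nn.Linear(dim_feedforward, d_model)
dropout = nn.Dropout(0.1)

# Dropout has no learnable state: its parameter list is empty
print(list(dropout.parameters()))  # []

# The two Linear layers have different shapes and independent weights
print(linear1.weight.shape)  # torch.Size([8, 4])
print(linear2.weight.shape)  # torch.Size([4, 8])

# In eval mode dropout is the identity, so dropout1/2/3 would all agree
dropout.eval()
x = torch.randn(2, d_model)
print(torch.equal(dropout(x), x))  # True
```

Under these assumptions, the separate dropout1/dropout2/dropout3 names mainly give each application site its own registered submodule (useful for hooks and readable module names), not different behavior.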
【Discussion】:
Tags: python pytorch instance dropout