【Title】: PyTorch: How to convert pretrained FC layers in a CNN to Conv layers
【Posted】: 2017-10-24 02:32:28
【Question】:

I want to convert a pretrained CNN (such as VGG-16) into a fully convolutional network in PyTorch. How can I do this?

【Comments】:

    Tags: neural-network pytorch conv-neural-network


    【Solution 1】:

    You can do it in the following way (see the comments for explanation):

    import torch
    import torch.nn as nn
    from torchvision import models
    
    # 1. LOAD PRE-TRAINED VGG16
    model = models.vgg16(pretrained=True)
    
    # 2. GET CONV LAYERS
    features = model.features
    
    # 3. GET FULLY CONNECTED LAYERS
    fcLayers = nn.Sequential(
        # stop at last layer
        *list(model.classifier.children())[:-1]
    )
    
    # 4. CONVERT FULLY CONNECTED LAYERS TO CONVOLUTIONAL LAYERS
    
    ### convert first fc layer to conv layer with 512x7x7 kernel
    fc = fcLayers[0].state_dict()
    in_ch = 512
    out_ch = fc["weight"].size(0)
    
    # kernel_size=7, stride=1 (the original's positional "7, 7" set stride=7,
    # which would skip positions when sliding over inputs larger than 7x7)
    firstConv = nn.Conv2d(in_ch, out_ch, kernel_size=7)
    
    ### get the weights from the fc layer
    firstConv.load_state_dict({"weight":fc["weight"].view(out_ch, in_ch, 7, 7),
                               "bias":fc["bias"]})
    
    # CREATE A LIST OF CONVS
    convList = [firstConv]
    
    # Similarly convert the remaining linear layers to conv layers
    for layer in fcLayers[1:]:
        if isinstance(layer, nn.Linear):
            # Convert the nn.Linear to an equivalent 1x1 nn.Conv2d
            fc = layer.state_dict()
            in_ch = fc["weight"].size(1)
            out_ch = fc["weight"].size(0)
            conv = nn.Conv2d(in_ch, out_ch, kernel_size=1)

            conv.load_state_dict({"weight": fc["weight"].view(out_ch, in_ch, 1, 1),
                                  "bias": fc["bias"]})

            convList += [conv]
        else:
            # Append other layers such as ReLU and Dropout unchanged
            convList += [layer]
    
    # Set the conv layers as a nn.Sequential module
    convLayers = nn.Sequential(*convList)  
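    As a self-contained sanity check (not part of the original answer; `fc_to_conv` is a hypothetical helper that wraps the same weight-reshaping trick), the converted layers can be compared against the original classifier. Random weights are used to keep the sketch download-free; in eval mode, where Dropout is the identity, the two paths should agree numerically:

    ```python
    import torch
    import torch.nn as nn
    from torchvision import models

    def fc_to_conv(fc: nn.Linear, kernel_size: int) -> nn.Conv2d:
        """Copy an nn.Linear's weights into an equivalent nn.Conv2d."""
        out_ch, in_features = fc.weight.shape
        in_ch = in_features // (kernel_size * kernel_size)
        conv = nn.Conv2d(in_ch, out_ch, kernel_size=kernel_size)
        conv.load_state_dict({
            "weight": fc.weight.data.view(out_ch, in_ch, kernel_size, kernel_size),
            "bias": fc.bias.data,
        })
        return conv

    # Randomly initialized VGG-16; the mechanics are identical with pretrained weights
    model = models.vgg16().eval()
    linears = [m for m in model.classifier if isinstance(m, nn.Linear)]

    fcn = nn.Sequential(
        model.features,
        fc_to_conv(linears[0], 7),   # Linear(25088, 4096) -> Conv2d(512, 4096, 7)
        nn.ReLU(inplace=True),
        fc_to_conv(linears[1], 1),   # Linear(4096, 4096)  -> Conv2d(4096, 4096, 1)
        nn.ReLU(inplace=True),
    ).eval()

    x = torch.randn(1, 3, 224, 224)
    with torch.no_grad():
        out = fcn(x)                 # spatial output: (1, 4096, 1, 1) for 224x224 input
        # reference path through the original FC classifier (minus the last layer)
        ref = nn.Sequential(*list(model.classifier.children())[:-1])(
            model.features(x).flatten(1))

    print(out.shape)
    print(torch.allclose(out.flatten(1), ref, atol=1e-4))
    ```

    Unlike the original classifier, `fcn` also accepts inputs larger than 224×224, producing a spatial grid of predictions instead of a single vector, which is the point of the conversion.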
    

    【Discussion】:
