    PyTorch implementation of InceptionV3: https://github.com/pytorch/vision/blob/master/torchvision/models/inception.py

    "2a" denotes the 1st block of group 2; blocks within a group share the same spatial dimensions.
    But why are there no 3a and 5a? (Presumably those slots are occupied by the unnamed max-pool layers.)

    (299, 299, 3)
    
    →【1a, Cout=32, f=3, s=2】→(149, 149, 32)
    
    →【2a, Cout=32, f=3】→(147, 147, 32)→【2b, Cout=64, f=3, p=1】→(147, 147, 64)
    →【max pool, f=3, s=2】→(73, 73, 64)
    
    →【3b, Cout=80, f=1】→(73, 73, 80)
    
    →【4a, Cout=192, f=3】→(71, 71, 192)→【max pool, f=3, s=2】→(35, 35, 192)
    
    →【Mixed_5b, InceptionA】→(35, 35, 256)→【Mixed_5c, InceptionA】→(35, 35, 288)
    →【Mixed_5d, InceptionA】→(35, 35, 288)
    
    →【Mixed_6a, InceptionB】→(17, 17, 768)
    →【Mixed_6b, InceptionC】→(17, 17, 768)→【Mixed_6c, InceptionC】→(17, 17, 768)
    →【Mixed_6d, InceptionC】→(17, 17, 768)→【Mixed_6e, InceptionC】→(17, 17, 768)
    
    →【Mixed_7a, InceptionD】→(8, 8, 1280)
    →【Mixed_7b, InceptionE】→(8, 8, 2048)→【Mixed_7c, InceptionE】→(8, 8, 2048)
    
    →【global avg pool】→(2048,)→【dropout】→(2048,)→【fc】→(1000,)
    
    Auxiliary branch: →【Mixed_6e, InceptionC】→(17, 17, 768)→【InceptionAux】→(1000,)
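Every spatial size in the chain above follows the standard conv/pool output formula out = ⌊(n + 2p − f) / s⌋ + 1. A minimal pure-Python check of the stem shapes (the `out_size` helper is my own name, not from torchvision):

```python
def out_size(n, f, s=1, p=0):
    """Spatial output size of a conv/pool layer: floor((n + 2p - f) / s) + 1."""
    return (n + 2 * p - f) // s + 1

n = 299
n = out_size(n, f=3, s=2)  # 1a       -> 149
n = out_size(n, f=3)       # 2a       -> 147
n = out_size(n, f=3, p=1)  # 2b       -> 147
n = out_size(n, f=3, s=2)  # max pool -> 73
n = out_size(n, f=1)       # 3b       -> 73
n = out_size(n, f=3)       # 4a       -> 71
n = out_size(n, f=3, s=2)  # max pool -> 35
print(n)  # 35
```

The same formula gives 35→17 in Mixed_6a and 17→8 in Mixed_7a (both f=3, s=2, p=0).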
    

    InceptionA is used 3 times, in Mixed_5b, Mixed_5c, and Mixed_5d. It takes a parameter pool_features; the input is (35, 35, in_channels) and the output is always (35, 35, 224+pool_features).

    (35, 35, 192)→【Mixed_5b,InceptionA, pool_features=32】→(35, 35, 224+32=256)
    (35, 35, 256)→【Mixed_5c,InceptionA, pool_features=64】→(35, 35, 224+64=288)
    (35, 35, 288)→【Mixed_5d,InceptionA, pool_features=64】→(35, 35, 224+64=288)
    

    Take (35, 35, 192)→【Mixed_5b, InceptionA, pool_features=32】→(35, 35, 256) as an example:

    Input: (35, 35, 192)
    
    Branch 1: →【BasicConv2d, Cout=64, f=1】→(35, 35, 64)
    
    Branch 2: →【BasicConv2d, Cout=48, f=1】→(35, 35, 48)
    →【BasicConv2d, Cout=64, f=5, p=2】→(35, 35, 64)
    
    Branch 3: →【BasicConv2d, Cout=64, f=1】→(35, 35, 64)
    →【BasicConv2d, Cout=96, f=3, p=1】→(35, 35, 96)
    →【BasicConv2d, Cout=96, f=3, p=1】→(35, 35, 96)
    
    Branch 4: →【avg pool, f=3, s=1, p=1】→(35, 35, 192)
    →【BasicConv2d, Cout=pool_features, f=1】→(35, 35, pool_features)
    
    Concat: (35, 35, 224+pool_features)
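Only the pooling branch depends on pool_features, so the output channel count is the fixed 64+64+96 plus pool_features. A quick sketch (function name is mine):

```python
def inception_a_out_channels(pool_features):
    branch1 = 64             # 1x1 conv
    branch2 = 64             # 1x1 -> 5x5
    branch3 = 96             # 1x1 -> 3x3 -> 3x3
    branch4 = pool_features  # avg pool -> 1x1
    return branch1 + branch2 + branch3 + branch4

print(inception_a_out_channels(32))  # 256 (Mixed_5b)
print(inception_a_out_channels(64))  # 288 (Mixed_5c, Mixed_5d)
```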
    

    InceptionB is used only once, in Mixed_6a.
    (35, 35, 288)→【Mixed_6a, InceptionB】→(17, 17, 768): spatial dimensions are halved and the channel count grows to roughly 2.7×.

    Input: (35, 35, 288)
    
    Branch 1: →【BasicConv2d, Cout=384, f=3, s=2】→(17, 17, 384)
    
    Branch 2: →【BasicConv2d, Cout=64, f=1】→(35, 35, 64)
    →【BasicConv2d, Cout=96, f=3, p=1】→(35, 35, 96)
    →【BasicConv2d, Cout=96, f=3, s=2】→(17, 17, 96)
    
    Branch 3: →【max pool, f=3, s=2】→(17, 17, 288)
    
    Concat: (17, 17, 384+96+288=768)
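Because the max-pool branch passes its input channels through unchanged, the output width of this reduction block depends on the input width. A sketch (function name is mine):

```python
def inception_b_out_channels(in_channels):
    # two stride-2 conv branches (384 and 96) plus the
    # max-pool branch, which keeps the input channel count
    return 384 + 96 + in_channels

print(inception_b_out_channels(288))  # 768
```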
    

    InceptionC is used 4 times, in Mixed_6b, Mixed_6c, Mixed_6d, and Mixed_6e. Input and output are both (17, 17, 768); only the parameter channels_7x7 differs, abbreviated c7 below.

    Mixed_6b,c7=128
    Mixed_6c,c7=160
    Mixed_6d,c7=160
    Mixed_6e,c7=192
    
    Input: (17, 17, 768)
    
    Branch 1: →【BasicConv2d, Cout=192, f=1】→(17, 17, 192)
    
    Branch 2: →【BasicConv2d, Cout=c7, f=1】→(17, 17, c7)
    →【BasicConv2d, Cout=c7, f=(1, 7), p=(0, 3)】→(17, 17, c7)
    →【BasicConv2d, Cout=192, f=(7, 1), p=(3, 0)】→(17, 17, 192)
    
    Branch 3: →【BasicConv2d, Cout=c7, f=1】→(17, 17, c7)
    →【BasicConv2d, Cout=c7, f=(7, 1), p=(3, 0)】→(17, 17, c7)
    →【BasicConv2d, Cout=c7, f=(1, 7), p=(0, 3)】→(17, 17, c7)
    →【BasicConv2d, Cout=c7, f=(7, 1), p=(3, 0)】→(17, 17, c7)
    →【BasicConv2d, Cout=192, f=(1, 7), p=(0, 3)】→(17, 17, 192)
    
    Branch 4: →【avg pool, f=3, s=1, p=1】→(17, 17, 768)
    →【BasicConv2d, Cout=192, f=1】→(17, 17, 192)
    
    Concat: (17, 17, 192×4=768)
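The point of the (1, 7)/(7, 1) pairs is that factorizing a 7×7 convolution into 1×7 and 7×1 keeps the receptive field while cutting the weight count. Rough weight counts for a c→c mapping, ignoring BN and bias (illustration only, function names are mine):

```python
def full_7x7_params(c):
    return 7 * 7 * c * c          # one 7x7 conv, c -> c

def factored_7x7_params(c):
    return (7 + 7) * c * c        # a 1x7 followed by a 7x1, c -> c

c7 = 128
print(full_7x7_params(c7))      # 802816
print(factored_7x7_params(c7))  # 229376, i.e. 14/49 ≈ 29% of the weights
```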
    

    InceptionD is used only once, in Mixed_7a.
    (17, 17, 768)→【Mixed_7a, InceptionD】→(8, 8, 1280): spatial dimensions are halved and the channel count grows to roughly 1.7×.

    Input: (17, 17, 768)
    
    Branch 1: →【BasicConv2d, Cout=192, f=1】→(17, 17, 192)
    →【BasicConv2d, Cout=320, f=3, s=2】→(8, 8, 320)
    
    Branch 2: →【BasicConv2d, Cout=192, f=1】→(17, 17, 192)
    →【BasicConv2d, Cout=192, f=(1, 7), p=(0, 3)】→(17, 17, 192)
    →【BasicConv2d, Cout=192, f=(7, 1), p=(3, 0)】→(17, 17, 192)
    →【BasicConv2d, Cout=192, f=3, s=2】→(8, 8, 192)
    
    Branch 3: →【max pool, f=3, s=2】→(8, 8, 768)
    
    Concat: (8, 8, 320+192+768=1280)
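As in InceptionB, the max-pool branch passes the input channels through, so the output width again depends on the input width. A sketch (function name is mine):

```python
def inception_d_out_channels(in_channels):
    # two stride-2 conv branches (320 and 192) + max-pool branch
    return 320 + 192 + in_channels

print(inception_d_out_channels(768))  # 1280
```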
    

    InceptionE is used twice, in Mixed_7b and Mixed_7c. The input is (8, 8, in_channels) and the output is always (8, 8, 2048).
    Take (8, 8, 1280)→【Mixed_7b, InceptionE】→(8, 8, 2048) as an example:

    Input: (8, 8, 1280)
    
    Branch 1: →【BasicConv2d, Cout=320, f=1】→(8, 8, 320)
    
    Branch 2: →【BasicConv2d, Cout=384, f=1】→(8, 8, 384)
      Branch 2-1: →【BasicConv2d, Cout=384, f=(1, 3), p=(0, 1)】→(8, 8, 384)
      Branch 2-2: →【BasicConv2d, Cout=384, f=(3, 1), p=(1, 0)】→(8, 8, 384)
      Concat: (8, 8, 384×2=768)
    
    Branch 3: →【BasicConv2d, Cout=448, f=1】→(8, 8, 448)
          →【BasicConv2d, Cout=384, f=3, p=1】→(8, 8, 384)
      Branch 3-1: →【BasicConv2d, Cout=384, f=(1, 3), p=(0, 1)】→(8, 8, 384)
      Branch 3-2: →【BasicConv2d, Cout=384, f=(3, 1), p=(1, 0)】→(8, 8, 384)
      Concat: (8, 8, 384×2=768)
    
    Branch 4: →【avg pool, f=3, s=1, p=1】→(8, 8, 1280)
    →【BasicConv2d, Cout=192, f=1】→(8, 8, 192)
    
    Concat: (8, 8, 320+768+768+192=2048)
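Since every branch ends in fixed-width convolutions, the output is always 2048 channels regardless of in_channels. A sketch (function name is mine):

```python
def inception_e_out_channels():
    branch1 = 320
    branch2 = 384 + 384  # (1,3) and (3,1) sub-branches concatenated
    branch3 = 384 + 384  # likewise
    branch4 = 192        # avg pool -> 1x1
    return branch1 + branch2 + branch3 + branch4

print(inception_e_out_channels())  # 2048
```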
    

    InceptionAux is used only once; it is attached to the (17, 17, 768) output of 【Mixed_6e, InceptionC】.

    (17, 17, 768)
    →【avg pool, f=5, s=3】→(5, 5, 768)
    →【BasicConv2d, Cout=128, f=1】→(5, 5, 128)
    →【BasicConv2d, Cout=768, f=5】→(1, 1, 768)
    →【reshape】→(768,)→【fc】→(1000,)
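The auxiliary head's spatial sizes follow the same output-size formula as the stem (`out_size` is my own helper name):

```python
def out_size(n, f, s=1, p=0):
    """floor((n + 2p - f) / s) + 1"""
    return (n + 2 * p - f) // s + 1

n = out_size(17, f=5, s=3)  # avg pool -> 5
n = out_size(n, f=1)        # 1x1 conv -> 5
n = out_size(n, f=5)        # 5x5 conv -> 1
print(n)  # 1
```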
    

    A few noteworthy points about the pooling layers:
    (147, 147, 64)→【max pool, f=3, s=2】→(73, 73, 64)
    (71, 71, 192)→【max pool, f=3, s=2】→(35, 35, 192); both use f=3 (f=2 is more common)

    InceptionV3 in Keras has 313 layers in total.
    Reference: https://github.com/keras-team/keras-applications/blob/master/keras_applications/inception_v3.py
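The 313 total can be cross-checked from the per-block layer counts implied by the numbering in the summary below. The InceptionA/B/C counts come from that numbering; the InceptionD/InceptionE counts are my own tally from the block definitions (note the Keras model has no auxiliary classifier):

```python
stem        = 18  # layers 0-17: input + 5 conv-bn-relu triples + 2 max pools
inception_a = 23  # e.g. layers 18-40 (Mixed_5b)
inception_b = 14  # layers 87-100 (Mixed_6a)
inception_c = 32  # e.g. layers 101-132 (Mixed_6b)
inception_d = 20  # my tally: 6 + 12 conv-bn-relu layers + pool + concat
inception_e = 31  # my tally, including the two inner concatenations
head        = 2   # global average pooling + fully-connected

total = (stem + 3 * inception_a + inception_b + 4 * inception_c
         + inception_d + 2 * inception_e + head)
print(total)  # 313
```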

    __________________________________________________________________________________________________
    Layer (type)                    Output Shape         Param #     Connected to                     
    ==================================================================================================
    
    Layer 0
    __________________________________________________________________________________________________
    input_1 (InputLayer)            (None, 299, 299, 3)  0                                            
    __________________________________________________________________________________________________
    
    
    
    Layers 1-3: (299, 299, 3)→【1a, Cout=32, f=3, s=2】→(149, 149, 32)
    __________________________________________________________________________________________________
    conv2d_1 (Conv2D)               (None, 149, 149, 32) 864         input_1[0][0]                    
    __________________________________________________________________________________________________
    batch_normalization_1 (BatchNor (None, 149, 149, 32) 96          conv2d_1[0][0]                   
    __________________________________________________________________________________________________
    activation_1 (Activation)       (None, 149, 149, 32) 0           batch_normalization_1[0][0]      
    __________________________________________________________________________________________________
    
    
    
    Layers 4-6: (149, 149, 32)→【2a, Cout=32, f=3】→(147, 147, 32)
    __________________________________________________________________________________________________
    conv2d_2 (Conv2D)               (None, 147, 147, 32) 9216        activation_1[0][0]               
    __________________________________________________________________________________________________
    batch_normalization_2 (BatchNor (None, 147, 147, 32) 96          conv2d_2[0][0]                   
    __________________________________________________________________________________________________
    activation_2 (Activation)       (None, 147, 147, 32) 0           batch_normalization_2[0][0]      
    __________________________________________________________________________________________________
    
    
    
    Layers 7-9: (147, 147, 32)→【2b, Cout=64, f=3, p=1】→(147, 147, 64)
    __________________________________________________________________________________________________
    conv2d_3 (Conv2D)               (None, 147, 147, 64) 18432       activation_2[0][0]               
    __________________________________________________________________________________________________
    batch_normalization_3 (BatchNor (None, 147, 147, 64) 192         conv2d_3[0][0]                   
    __________________________________________________________________________________________________
    activation_3 (Activation)       (None, 147, 147, 64) 0           batch_normalization_3[0][0]      
    __________________________________________________________________________________________________
    
    
    
    Layer 10: (147, 147, 64)→【max pool, f=3, s=2】→(73, 73, 64)
    __________________________________________________________________________________________________
    max_pooling2d_1 (MaxPooling2D)  (None, 73, 73, 64)   0           activation_3[0][0]               
    __________________________________________________________________________________________________
    
    
    
    Layers 11-13: (73, 73, 64)→【3b, Cout=80, f=1】→(73, 73, 80)
    __________________________________________________________________________________________________
    conv2d_4 (Conv2D)               (None, 73, 73, 80)   5120        max_pooling2d_1[0][0]            
    __________________________________________________________________________________________________
    batch_normalization_4 (BatchNor (None, 73, 73, 80)   240         conv2d_4[0][0]                   
    __________________________________________________________________________________________________
    activation_4 (Activation)       (None, 73, 73, 80)   0           batch_normalization_4[0][0]      
    __________________________________________________________________________________________________
    
    
    
    Layers 14-16: (73, 73, 80)→【4a, Cout=192, f=3】→(71, 71, 192)
    __________________________________________________________________________________________________
    conv2d_5 (Conv2D)               (None, 71, 71, 192)  138240      activation_4[0][0]               
    __________________________________________________________________________________________________
    batch_normalization_5 (BatchNor (None, 71, 71, 192)  576         conv2d_5[0][0]                   
    __________________________________________________________________________________________________
    activation_5 (Activation)       (None, 71, 71, 192)  0           batch_normalization_5[0][0]      
    __________________________________________________________________________________________________
    
    
    
    Layer 17: (71, 71, 192)→【max pool, f=3, s=2】→(35, 35, 192)
    __________________________________________________________________________________________________
    max_pooling2d_2 (MaxPooling2D)  (None, 35, 35, 192)  0           activation_5[0][0]               
    __________________________________________________________________________________________________
    
    
    
    
    Layers 18-40: (35, 35, 192)→【Mixed_5b, InceptionA】→(35, 35, 256)
    __________________________________________________________________________________________________
    conv2d_9 (Conv2D)               (None, 35, 35, 64)   12288       max_pooling2d_2[0][0]            
    __________________________________________________________________________________________________
    batch_normalization_9 (BatchNor (None, 35, 35, 64)   192         conv2d_9[0][0]                   
    __________________________________________________________________________________________________
    activation_9 (Activation)       (None, 35, 35, 64)   0           batch_normalization_9[0][0]      
    __________________________________________________________________________________________________
    conv2d_7 (Conv2D)               (None, 35, 35, 48)   9216        max_pooling2d_2[0][0]            
    __________________________________________________________________________________________________
    conv2d_10 (Conv2D)              (None, 35, 35, 96)   55296       activation_9[0][0]               
    __________________________________________________________________________________________________
    batch_normalization_7 (BatchNor (None, 35, 35, 48)   144         conv2d_7[0][0]                   
    __________________________________________________________________________________________________
    batch_normalization_10 (BatchNo (None, 35, 35, 96)   288         conv2d_10[0][0]                  
    __________________________________________________________________________________________________
    activation_7 (Activation)       (None, 35, 35, 48)   0           batch_normalization_7[0][0]      
    __________________________________________________________________________________________________
    activation_10 (Activation)      (None, 35, 35, 96)   0           batch_normalization_10[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_1 (AveragePoo (None, 35, 35, 192)  0           max_pooling2d_2[0][0]            
    __________________________________________________________________________________________________
    conv2d_6 (Conv2D)               (None, 35, 35, 64)   12288       max_pooling2d_2[0][0]            
    __________________________________________________________________________________________________
    conv2d_8 (Conv2D)               (None, 35, 35, 64)   76800       activation_7[0][0]               
    __________________________________________________________________________________________________
    conv2d_11 (Conv2D)              (None, 35, 35, 96)   82944       activation_10[0][0]              
    __________________________________________________________________________________________________
    conv2d_12 (Conv2D)              (None, 35, 35, 32)   6144        average_pooling2d_1[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_6 (BatchNor (None, 35, 35, 64)   192         conv2d_6[0][0]                   
    __________________________________________________________________________________________________
    batch_normalization_8 (BatchNor (None, 35, 35, 64)   192         conv2d_8[0][0]                   
    __________________________________________________________________________________________________
    batch_normalization_11 (BatchNo (None, 35, 35, 96)   288         conv2d_11[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_12 (BatchNo (None, 35, 35, 32)   96          conv2d_12[0][0]                  
    __________________________________________________________________________________________________
    activation_6 (Activation)       (None, 35, 35, 64)   0           batch_normalization_6[0][0]      
    __________________________________________________________________________________________________
    activation_8 (Activation)       (None, 35, 35, 64)   0           batch_normalization_8[0][0]      
    __________________________________________________________________________________________________
    activation_11 (Activation)      (None, 35, 35, 96)   0           batch_normalization_11[0][0]     
    __________________________________________________________________________________________________
    activation_12 (Activation)      (None, 35, 35, 32)   0           batch_normalization_12[0][0]     
    __________________________________________________________________________________________________
    mixed0 (Concatenate)            (None, 35, 35, 256)  0           activation_6[0][0]               
                                                                     activation_8[0][0]               
                                                                     activation_11[0][0]              
                                                                     activation_12[0][0]              
    __________________________________________________________________________________________________
    
    
    
    
    Layers 41-63: (35, 35, 256)→【Mixed_5c, InceptionA】→(35, 35, 288)
    __________________________________________________________________________________________________
    conv2d_16 (Conv2D)              (None, 35, 35, 64)   16384       mixed0[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_16 (BatchNo (None, 35, 35, 64)   192         conv2d_16[0][0]                  
    __________________________________________________________________________________________________
    activation_16 (Activation)      (None, 35, 35, 64)   0           batch_normalization_16[0][0]     
    __________________________________________________________________________________________________
    conv2d_14 (Conv2D)              (None, 35, 35, 48)   12288       mixed0[0][0]                     
    __________________________________________________________________________________________________
    conv2d_17 (Conv2D)              (None, 35, 35, 96)   55296       activation_16[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_14 (BatchNo (None, 35, 35, 48)   144         conv2d_14[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_17 (BatchNo (None, 35, 35, 96)   288         conv2d_17[0][0]                  
    __________________________________________________________________________________________________
    activation_14 (Activation)      (None, 35, 35, 48)   0           batch_normalization_14[0][0]     
    __________________________________________________________________________________________________
    activation_17 (Activation)      (None, 35, 35, 96)   0           batch_normalization_17[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_2 (AveragePoo (None, 35, 35, 256)  0           mixed0[0][0]                     
    __________________________________________________________________________________________________
    conv2d_13 (Conv2D)              (None, 35, 35, 64)   16384       mixed0[0][0]                     
    __________________________________________________________________________________________________
    conv2d_15 (Conv2D)              (None, 35, 35, 64)   76800       activation_14[0][0]              
    __________________________________________________________________________________________________
    conv2d_18 (Conv2D)              (None, 35, 35, 96)   82944       activation_17[0][0]              
    __________________________________________________________________________________________________
    conv2d_19 (Conv2D)              (None, 35, 35, 64)   16384       average_pooling2d_2[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_13 (BatchNo (None, 35, 35, 64)   192         conv2d_13[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_15 (BatchNo (None, 35, 35, 64)   192         conv2d_15[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_18 (BatchNo (None, 35, 35, 96)   288         conv2d_18[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_19 (BatchNo (None, 35, 35, 64)   192         conv2d_19[0][0]                  
    __________________________________________________________________________________________________
    activation_13 (Activation)      (None, 35, 35, 64)   0           batch_normalization_13[0][0]     
    __________________________________________________________________________________________________
    activation_15 (Activation)      (None, 35, 35, 64)   0           batch_normalization_15[0][0]     
    __________________________________________________________________________________________________
    activation_18 (Activation)      (None, 35, 35, 96)   0           batch_normalization_18[0][0]     
    __________________________________________________________________________________________________
    activation_19 (Activation)      (None, 35, 35, 64)   0           batch_normalization_19[0][0]     
    __________________________________________________________________________________________________
    mixed1 (Concatenate)            (None, 35, 35, 288)  0           activation_13[0][0]              
                                                                     activation_15[0][0]              
                                                                     activation_18[0][0]              
                                                                     activation_19[0][0]              
    __________________________________________________________________________________________________
    
    
    
    
    Layers 64-86: (35, 35, 288)→【Mixed_5d, InceptionA】→(35, 35, 288)
    __________________________________________________________________________________________________
    conv2d_23 (Conv2D)              (None, 35, 35, 64)   18432       mixed1[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_23 (BatchNo (None, 35, 35, 64)   192         conv2d_23[0][0]                  
    __________________________________________________________________________________________________
    activation_23 (Activation)      (None, 35, 35, 64)   0           batch_normalization_23[0][0]     
    __________________________________________________________________________________________________
    conv2d_21 (Conv2D)              (None, 35, 35, 48)   13824       mixed1[0][0]                     
    __________________________________________________________________________________________________
    conv2d_24 (Conv2D)              (None, 35, 35, 96)   55296       activation_23[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_21 (BatchNo (None, 35, 35, 48)   144         conv2d_21[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_24 (BatchNo (None, 35, 35, 96)   288         conv2d_24[0][0]                  
    __________________________________________________________________________________________________
    activation_21 (Activation)      (None, 35, 35, 48)   0           batch_normalization_21[0][0]     
    __________________________________________________________________________________________________
    activation_24 (Activation)      (None, 35, 35, 96)   0           batch_normalization_24[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_3 (AveragePoo (None, 35, 35, 288)  0           mixed1[0][0]                     
    __________________________________________________________________________________________________
    conv2d_20 (Conv2D)              (None, 35, 35, 64)   18432       mixed1[0][0]                     
    __________________________________________________________________________________________________
    conv2d_22 (Conv2D)              (None, 35, 35, 64)   76800       activation_21[0][0]              
    __________________________________________________________________________________________________
    conv2d_25 (Conv2D)              (None, 35, 35, 96)   82944       activation_24[0][0]              
    __________________________________________________________________________________________________
    conv2d_26 (Conv2D)              (None, 35, 35, 64)   18432       average_pooling2d_3[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_20 (BatchNo (None, 35, 35, 64)   192         conv2d_20[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_22 (BatchNo (None, 35, 35, 64)   192         conv2d_22[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_25 (BatchNo (None, 35, 35, 96)   288         conv2d_25[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_26 (BatchNo (None, 35, 35, 64)   192         conv2d_26[0][0]                  
    __________________________________________________________________________________________________
    activation_20 (Activation)      (None, 35, 35, 64)   0           batch_normalization_20[0][0]     
    __________________________________________________________________________________________________
    activation_22 (Activation)      (None, 35, 35, 64)   0           batch_normalization_22[0][0]     
    __________________________________________________________________________________________________
    activation_25 (Activation)      (None, 35, 35, 96)   0           batch_normalization_25[0][0]     
    __________________________________________________________________________________________________
    activation_26 (Activation)      (None, 35, 35, 64)   0           batch_normalization_26[0][0]     
    __________________________________________________________________________________________________
    mixed2 (Concatenate)            (None, 35, 35, 288)  0           activation_20[0][0]              
                                                                     activation_22[0][0]              
                                                                     activation_25[0][0]              
                                                                     activation_26[0][0]              
    __________________________________________________________________________________________________
    
    
    
    Layers 87-100: (35, 35, 288)→【Mixed_6a, InceptionB】→(17, 17, 768)
    __________________________________________________________________________________________________
    conv2d_28 (Conv2D)              (None, 35, 35, 64)   18432       mixed2[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_28 (BatchNo (None, 35, 35, 64)   192         conv2d_28[0][0]                  
    __________________________________________________________________________________________________
    activation_28 (Activation)      (None, 35, 35, 64)   0           batch_normalization_28[0][0]     
    __________________________________________________________________________________________________
    conv2d_29 (Conv2D)              (None, 35, 35, 96)   55296       activation_28[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_29 (BatchNo (None, 35, 35, 96)   288         conv2d_29[0][0]                  
    __________________________________________________________________________________________________
    activation_29 (Activation)      (None, 35, 35, 96)   0           batch_normalization_29[0][0]     
    __________________________________________________________________________________________________
    conv2d_27 (Conv2D)              (None, 17, 17, 384)  995328      mixed2[0][0]                     
    __________________________________________________________________________________________________
    conv2d_30 (Conv2D)              (None, 17, 17, 96)   82944       activation_29[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_27 (BatchNo (None, 17, 17, 384)  1152        conv2d_27[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_30 (BatchNo (None, 17, 17, 96)   288         conv2d_30[0][0]                  
    __________________________________________________________________________________________________
    activation_27 (Activation)      (None, 17, 17, 384)  0           batch_normalization_27[0][0]     
    __________________________________________________________________________________________________
    activation_30 (Activation)      (None, 17, 17, 96)   0           batch_normalization_30[0][0]     
    __________________________________________________________________________________________________
    max_pooling2d_3 (MaxPooling2D)  (None, 17, 17, 288)  0           mixed2[0][0]                     
    __________________________________________________________________________________________________
    mixed3 (Concatenate)            (None, 17, 17, 768)  0           activation_27[0][0]              
                                                                     activation_30[0][0]              
                                                                     max_pooling2d_3[0][0]            
    __________________________________________________________________________________________________
    
    
    
    Layers 101-132: (17, 17, 768)→【Mixed_6b, InceptionC】→(17, 17, 768)
    __________________________________________________________________________________________________
    conv2d_35 (Conv2D)              (None, 17, 17, 128)  98304       mixed3[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_35 (BatchNo (None, 17, 17, 128)  384         conv2d_35[0][0]                  
    __________________________________________________________________________________________________
    activation_35 (Activation)      (None, 17, 17, 128)  0           batch_normalization_35[0][0]     
    __________________________________________________________________________________________________
    conv2d_36 (Conv2D)              (None, 17, 17, 128)  114688      activation_35[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_36 (BatchNo (None, 17, 17, 128)  384         conv2d_36[0][0]                  
    __________________________________________________________________________________________________
    activation_36 (Activation)      (None, 17, 17, 128)  0           batch_normalization_36[0][0]     
    __________________________________________________________________________________________________
    conv2d_32 (Conv2D)              (None, 17, 17, 128)  98304       mixed3[0][0]                     
    __________________________________________________________________________________________________
    conv2d_37 (Conv2D)              (None, 17, 17, 128)  114688      activation_36[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_32 (BatchNo (None, 17, 17, 128)  384         conv2d_32[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_37 (BatchNo (None, 17, 17, 128)  384         conv2d_37[0][0]                  
    __________________________________________________________________________________________________
    activation_32 (Activation)      (None, 17, 17, 128)  0           batch_normalization_32[0][0]     
    __________________________________________________________________________________________________
    activation_37 (Activation)      (None, 17, 17, 128)  0           batch_normalization_37[0][0]     
    __________________________________________________________________________________________________
    conv2d_33 (Conv2D)              (None, 17, 17, 128)  114688      activation_32[0][0]              
    __________________________________________________________________________________________________
    conv2d_38 (Conv2D)              (None, 17, 17, 128)  114688      activation_37[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_33 (BatchNo (None, 17, 17, 128)  384         conv2d_33[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_38 (BatchNo (None, 17, 17, 128)  384         conv2d_38[0][0]                  
    __________________________________________________________________________________________________
    activation_33 (Activation)      (None, 17, 17, 128)  0           batch_normalization_33[0][0]     
    __________________________________________________________________________________________________
    activation_38 (Activation)      (None, 17, 17, 128)  0           batch_normalization_38[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_4 (AveragePoo (None, 17, 17, 768)  0           mixed3[0][0]                     
    __________________________________________________________________________________________________
    conv2d_31 (Conv2D)              (None, 17, 17, 192)  147456      mixed3[0][0]                     
    __________________________________________________________________________________________________
    conv2d_34 (Conv2D)              (None, 17, 17, 192)  172032      activation_33[0][0]              
    __________________________________________________________________________________________________
    conv2d_39 (Conv2D)              (None, 17, 17, 192)  172032      activation_38[0][0]              
    __________________________________________________________________________________________________
    conv2d_40 (Conv2D)              (None, 17, 17, 192)  147456      average_pooling2d_4[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_31 (BatchNo (None, 17, 17, 192)  576         conv2d_31[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_34 (BatchNo (None, 17, 17, 192)  576         conv2d_34[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_39 (BatchNo (None, 17, 17, 192)  576         conv2d_39[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_40 (BatchNo (None, 17, 17, 192)  576         conv2d_40[0][0]                  
    __________________________________________________________________________________________________
    activation_31 (Activation)      (None, 17, 17, 192)  0           batch_normalization_31[0][0]     
    __________________________________________________________________________________________________
    activation_34 (Activation)      (None, 17, 17, 192)  0           batch_normalization_34[0][0]     
    __________________________________________________________________________________________________
    activation_39 (Activation)      (None, 17, 17, 192)  0           batch_normalization_39[0][0]     
    __________________________________________________________________________________________________
    activation_40 (Activation)      (None, 17, 17, 192)  0           batch_normalization_40[0][0]     
    __________________________________________________________________________________________________
    mixed4 (Concatenate)            (None, 17, 17, 768)  0           activation_31[0][0]              
                                                                     activation_34[0][0]              
                                                                     activation_39[0][0]              
                                                                     activation_40[0][0]              
    __________________________________________________________________________________________________
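从上表的参数量可以反推出 InceptionC 的卷积核形状:Conv2D 均不带偏置,BatchNormalization 使用 scale=False(无 gamma,每通道只有 beta、moving_mean、moving_variance 共 3 个参数)。下面用一段纯 Python 核对 Mixed_6b 表中的几行(conv_params、bn_params 为本文自拟的辅助函数,不是 Keras API):

```python
def conv_params(kh, kw, cin, cout):
    # Keras InceptionV3 中 Conv2D 均为 use_bias=False,参数量 = kh*kw*Cin*Cout
    return kh * kw * cin * cout

def bn_params(c):
    # BatchNormalization 使用 scale=False,每通道参数 = beta + moving_mean + moving_variance = 3
    return 3 * c

# 对照 Mixed_6b(InceptionC,c7=128)表中的几行:
assert conv_params(1, 1, 768, 128) == 98304    # conv2d_35:1x1 降维
assert conv_params(1, 7, 128, 128) == 114688   # conv2d_36:1x7 卷积
assert conv_params(7, 1, 128, 192) == 172032   # conv2d_34:7x1 卷积
assert conv_params(1, 1, 768, 192) == 147456   # conv2d_31 / conv2d_40:1x1
assert bn_params(128) == 384                   # batch_normalization_35 等
assert bn_params(192) == 576                   # batch_normalization_31 等
```

其中 114688 = 128×128×7 正是 7×7 卷积被分解为 1×7 与 7×1 两个一维卷积的证据,这是 InceptionC 的核心设计。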
    
    
    
    第133-164层,(17, 17, 768)→【Mixed_6c, InceptionC】→(17, 17, 768)
    __________________________________________________________________________________________________
    conv2d_45 (Conv2D)              (None, 17, 17, 160)  122880      mixed4[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_45 (BatchNo (None, 17, 17, 160)  480         conv2d_45[0][0]                  
    __________________________________________________________________________________________________
    activation_45 (Activation)      (None, 17, 17, 160)  0           batch_normalization_45[0][0]     
    __________________________________________________________________________________________________
    conv2d_46 (Conv2D)              (None, 17, 17, 160)  179200      activation_45[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_46 (BatchNo (None, 17, 17, 160)  480         conv2d_46[0][0]                  
    __________________________________________________________________________________________________
    activation_46 (Activation)      (None, 17, 17, 160)  0           batch_normalization_46[0][0]     
    __________________________________________________________________________________________________
    conv2d_42 (Conv2D)              (None, 17, 17, 160)  122880      mixed4[0][0]                     
    __________________________________________________________________________________________________
    conv2d_47 (Conv2D)              (None, 17, 17, 160)  179200      activation_46[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_42 (BatchNo (None, 17, 17, 160)  480         conv2d_42[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_47 (BatchNo (None, 17, 17, 160)  480         conv2d_47[0][0]                  
    __________________________________________________________________________________________________
    activation_42 (Activation)      (None, 17, 17, 160)  0           batch_normalization_42[0][0]     
    __________________________________________________________________________________________________
    activation_47 (Activation)      (None, 17, 17, 160)  0           batch_normalization_47[0][0]     
    __________________________________________________________________________________________________
    conv2d_43 (Conv2D)              (None, 17, 17, 160)  179200      activation_42[0][0]              
    __________________________________________________________________________________________________
    conv2d_48 (Conv2D)              (None, 17, 17, 160)  179200      activation_47[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_43 (BatchNo (None, 17, 17, 160)  480         conv2d_43[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_48 (BatchNo (None, 17, 17, 160)  480         conv2d_48[0][0]                  
    __________________________________________________________________________________________________
    activation_43 (Activation)      (None, 17, 17, 160)  0           batch_normalization_43[0][0]     
    __________________________________________________________________________________________________
    activation_48 (Activation)      (None, 17, 17, 160)  0           batch_normalization_48[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_5 (AveragePoo (None, 17, 17, 768)  0           mixed4[0][0]                     
    __________________________________________________________________________________________________
    conv2d_41 (Conv2D)              (None, 17, 17, 192)  147456      mixed4[0][0]                     
    __________________________________________________________________________________________________
    conv2d_44 (Conv2D)              (None, 17, 17, 192)  215040      activation_43[0][0]              
    __________________________________________________________________________________________________
    conv2d_49 (Conv2D)              (None, 17, 17, 192)  215040      activation_48[0][0]              
    __________________________________________________________________________________________________
    conv2d_50 (Conv2D)              (None, 17, 17, 192)  147456      average_pooling2d_5[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_41 (BatchNo (None, 17, 17, 192)  576         conv2d_41[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_44 (BatchNo (None, 17, 17, 192)  576         conv2d_44[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_49 (BatchNo (None, 17, 17, 192)  576         conv2d_49[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_50 (BatchNo (None, 17, 17, 192)  576         conv2d_50[0][0]                  
    __________________________________________________________________________________________________
    activation_41 (Activation)      (None, 17, 17, 192)  0           batch_normalization_41[0][0]     
    __________________________________________________________________________________________________
    activation_44 (Activation)      (None, 17, 17, 192)  0           batch_normalization_44[0][0]     
    __________________________________________________________________________________________________
    activation_49 (Activation)      (None, 17, 17, 192)  0           batch_normalization_49[0][0]     
    __________________________________________________________________________________________________
    activation_50 (Activation)      (None, 17, 17, 192)  0           batch_normalization_50[0][0]     
    __________________________________________________________________________________________________
    mixed5 (Concatenate)            (None, 17, 17, 768)  0           activation_41[0][0]              
                                                                     activation_44[0][0]              
                                                                     activation_49[0][0]              
                                                                     activation_50[0][0]              
    __________________________________________________________________________________________________
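上面两个 InceptionC 块的四条分支结构完全一致,可以仿照文首链接的 torchvision 实现写一个最小的 PyTorch 结构示意(BasicConv2d 为简化的辅助模块,非完整官方实现;注意 PyTorch 版 BN 默认带 gamma,参数量与上表 Keras 的 scale=False 略有差异):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BasicConv2d(nn.Module):
    """Conv(无偏置) + BN + ReLU,对应表中 Conv2D/BatchNormalization/Activation 三层一组"""
    def __init__(self, cin, cout, **kw):
        super().__init__()
        self.conv = nn.Conv2d(cin, cout, bias=False, **kw)
        self.bn = nn.BatchNorm2d(cout, eps=0.001)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

class InceptionC(nn.Module):
    def __init__(self, cin, c7):
        super().__init__()
        # 分支1:1x1
        self.branch1x1 = BasicConv2d(cin, 192, kernel_size=1)
        # 分支2:1x1 -> 1x7 -> 7x1
        self.branch7x7_1 = BasicConv2d(cin, c7, kernel_size=1)
        self.branch7x7_2 = BasicConv2d(c7, c7, kernel_size=(1, 7), padding=(0, 3))
        self.branch7x7_3 = BasicConv2d(c7, 192, kernel_size=(7, 1), padding=(3, 0))
        # 分支3:1x1 -> 7x1 -> 1x7 -> 7x1 -> 1x7
        self.branch7x7dbl_1 = BasicConv2d(cin, c7, kernel_size=1)
        self.branch7x7dbl_2 = BasicConv2d(c7, c7, kernel_size=(7, 1), padding=(3, 0))
        self.branch7x7dbl_3 = BasicConv2d(c7, c7, kernel_size=(1, 7), padding=(0, 3))
        self.branch7x7dbl_4 = BasicConv2d(c7, c7, kernel_size=(7, 1), padding=(3, 0))
        self.branch7x7dbl_5 = BasicConv2d(c7, 192, kernel_size=(1, 7), padding=(0, 3))
        # 分支4:avg pool -> 1x1
        self.branch_pool = BasicConv2d(cin, 192, kernel_size=1)

    def forward(self, x):
        b1 = self.branch1x1(x)
        b2 = self.branch7x7_3(self.branch7x7_2(self.branch7x7_1(x)))
        b3 = self.branch7x7dbl_1(x)
        for m in (self.branch7x7dbl_2, self.branch7x7dbl_3,
                  self.branch7x7dbl_4, self.branch7x7dbl_5):
            b3 = m(b3)
        b4 = F.avg_pool2d(x, kernel_size=3, stride=1, padding=1)
        b4 = self.branch_pool(b4)
        return torch.cat([b1, b2, b3, b4], dim=1)  # 4 x 192 = 768 通道

x = torch.randn(1, 768, 17, 17)
print(InceptionC(768, c7=160)(x).shape)  # torch.Size([1, 768, 17, 17])
```

c7 即两条 7×7 分解分支的中间通道数,Mixed_6b/6c/6d/6e 分别取 128、160、160、192;空间维度和输出通道数始终保持 (17, 17, 768)。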
    
    
    
    第165-196层,(17, 17, 768)→【Mixed_6d, InceptionC】→(17, 17, 768)
    __________________________________________________________________________________________________
    conv2d_55 (Conv2D)              (None, 17, 17, 160)  122880      mixed5[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_55 (BatchNo (None, 17, 17, 160)  480         conv2d_55[0][0]                  
    __________________________________________________________________________________________________
    activation_55 (Activation)      (None, 17, 17, 160)  0           batch_normalization_55[0][0]     
    __________________________________________________________________________________________________
    conv2d_56 (Conv2D)              (None, 17, 17, 160)  179200      activation_55[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_56 (BatchNo (None, 17, 17, 160)  480         conv2d_56[0][0]                  
    __________________________________________________________________________________________________
    activation_56 (Activation)      (None, 17, 17, 160)  0           batch_normalization_56[0][0]     
    __________________________________________________________________________________________________
    conv2d_52 (Conv2D)              (None, 17, 17, 160)  122880      mixed5[0][0]                     
    __________________________________________________________________________________________________
    conv2d_57 (Conv2D)              (None, 17, 17, 160)  179200      activation_56[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_52 (BatchNo (None, 17, 17, 160)  480         conv2d_52[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_57 (BatchNo (None, 17, 17, 160)  480         conv2d_57[0][0]                  
    __________________________________________________________________________________________________
    activation_52 (Activation)      (None, 17, 17, 160)  0           batch_normalization_52[0][0]     
    __________________________________________________________________________________________________
    activation_57 (Activation)      (None, 17, 17, 160)  0           batch_normalization_57[0][0]     
    __________________________________________________________________________________________________
    conv2d_53 (Conv2D)              (None, 17, 17, 160)  179200      activation_52[0][0]              
    __________________________________________________________________________________________________
    conv2d_58 (Conv2D)              (None, 17, 17, 160)  179200      activation_57[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_53 (BatchNo (None, 17, 17, 160)  480         conv2d_53[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_58 (BatchNo (None, 17, 17, 160)  480         conv2d_58[0][0]                  
    __________________________________________________________________________________________________
    activation_53 (Activation)      (None, 17, 17, 160)  0           batch_normalization_53[0][0]     
    __________________________________________________________________________________________________
    activation_58 (Activation)      (None, 17, 17, 160)  0           batch_normalization_58[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_6 (AveragePoo (None, 17, 17, 768)  0           mixed5[0][0]                     
    __________________________________________________________________________________________________
    conv2d_51 (Conv2D)              (None, 17, 17, 192)  147456      mixed5[0][0]                     
    __________________________________________________________________________________________________
    conv2d_54 (Conv2D)              (None, 17, 17, 192)  215040      activation_53[0][0]              
    __________________________________________________________________________________________________
    conv2d_59 (Conv2D)              (None, 17, 17, 192)  215040      activation_58[0][0]              
    __________________________________________________________________________________________________
    conv2d_60 (Conv2D)              (None, 17, 17, 192)  147456      average_pooling2d_6[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_51 (BatchNo (None, 17, 17, 192)  576         conv2d_51[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_54 (BatchNo (None, 17, 17, 192)  576         conv2d_54[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_59 (BatchNo (None, 17, 17, 192)  576         conv2d_59[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_60 (BatchNo (None, 17, 17, 192)  576         conv2d_60[0][0]                  
    __________________________________________________________________________________________________
    activation_51 (Activation)      (None, 17, 17, 192)  0           batch_normalization_51[0][0]     
    __________________________________________________________________________________________________
    activation_54 (Activation)      (None, 17, 17, 192)  0           batch_normalization_54[0][0]     
    __________________________________________________________________________________________________
    activation_59 (Activation)      (None, 17, 17, 192)  0           batch_normalization_59[0][0]     
    __________________________________________________________________________________________________
    activation_60 (Activation)      (None, 17, 17, 192)  0           batch_normalization_60[0][0]     
    __________________________________________________________________________________________________
    mixed6 (Concatenate)            (None, 17, 17, 768)  0           activation_51[0][0]              
                                                                     activation_54[0][0]              
                                                                     activation_59[0][0]              
                                                                     activation_60[0][0]              
    __________________________________________________________________________________________________
    
    
    
    第197-228层,(17, 17, 768)→【Mixed_6e, InceptionC】→(17, 17, 768)
    __________________________________________________________________________________________________
    conv2d_65 (Conv2D)              (None, 17, 17, 192)  147456      mixed6[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_65 (BatchNo (None, 17, 17, 192)  576         conv2d_65[0][0]                  
    __________________________________________________________________________________________________
    activation_65 (Activation)      (None, 17, 17, 192)  0           batch_normalization_65[0][0]     
    __________________________________________________________________________________________________
    conv2d_66 (Conv2D)              (None, 17, 17, 192)  258048      activation_65[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_66 (BatchNo (None, 17, 17, 192)  576         conv2d_66[0][0]                  
    __________________________________________________________________________________________________
    activation_66 (Activation)      (None, 17, 17, 192)  0           batch_normalization_66[0][0]     
    __________________________________________________________________________________________________
    conv2d_62 (Conv2D)              (None, 17, 17, 192)  147456      mixed6[0][0]                     
    __________________________________________________________________________________________________
    conv2d_67 (Conv2D)              (None, 17, 17, 192)  258048      activation_66[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_62 (BatchNo (None, 17, 17, 192)  576         conv2d_62[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_67 (BatchNo (None, 17, 17, 192)  576         conv2d_67[0][0]                  
    __________________________________________________________________________________________________
    activation_62 (Activation)      (None, 17, 17, 192)  0           batch_normalization_62[0][0]     
    __________________________________________________________________________________________________
    activation_67 (Activation)      (None, 17, 17, 192)  0           batch_normalization_67[0][0]     
    __________________________________________________________________________________________________
    conv2d_63 (Conv2D)              (None, 17, 17, 192)  258048      activation_62[0][0]              
    __________________________________________________________________________________________________
    conv2d_68 (Conv2D)              (None, 17, 17, 192)  258048      activation_67[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_63 (BatchNo (None, 17, 17, 192)  576         conv2d_63[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_68 (BatchNo (None, 17, 17, 192)  576         conv2d_68[0][0]                  
    __________________________________________________________________________________________________
    activation_63 (Activation)      (None, 17, 17, 192)  0           batch_normalization_63[0][0]     
    __________________________________________________________________________________________________
    activation_68 (Activation)      (None, 17, 17, 192)  0           batch_normalization_68[0][0]     
    __________________________________________________________________________________________________
    average_pooling2d_7 (AveragePoo (None, 17, 17, 768)  0           mixed6[0][0]                     
    __________________________________________________________________________________________________
    conv2d_61 (Conv2D)              (None, 17, 17, 192)  147456      mixed6[0][0]                     
    __________________________________________________________________________________________________
    conv2d_64 (Conv2D)              (None, 17, 17, 192)  258048      activation_63[0][0]              
    __________________________________________________________________________________________________
    conv2d_69 (Conv2D)              (None, 17, 17, 192)  258048      activation_68[0][0]              
    __________________________________________________________________________________________________
    conv2d_70 (Conv2D)              (None, 17, 17, 192)  147456      average_pooling2d_7[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_61 (BatchNo (None, 17, 17, 192)  576         conv2d_61[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_64 (BatchNo (None, 17, 17, 192)  576         conv2d_64[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_69 (BatchNo (None, 17, 17, 192)  576         conv2d_69[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_70 (BatchNo (None, 17, 17, 192)  576         conv2d_70[0][0]                  
    __________________________________________________________________________________________________
    activation_61 (Activation)      (None, 17, 17, 192)  0           batch_normalization_61[0][0]     
    __________________________________________________________________________________________________
    activation_64 (Activation)      (None, 17, 17, 192)  0           batch_normalization_64[0][0]     
    __________________________________________________________________________________________________
    activation_69 (Activation)      (None, 17, 17, 192)  0           batch_normalization_69[0][0]     
    __________________________________________________________________________________________________
    activation_70 (Activation)      (None, 17, 17, 192)  0           batch_normalization_70[0][0]     
    __________________________________________________________________________________________________
    mixed7 (Concatenate)            (None, 17, 17, 768)  0           activation_61[0][0]              
                                                                     activation_64[0][0]              
                                                                     activation_69[0][0]              
                                                                     activation_70[0][0]              
    __________________________________________________________________________________________________
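Mixed_6b、6c、6d、6e 四个 InceptionC 块结构完全相同,只有 c7 不同(128、160、160、192)。整块参数量可以用一个小函数汇总核对(inception_c_params 为自拟函数,按"Conv 无偏置、BN 每通道 3 个参数"计算):

```python
def inception_c_params(cin=768, c7=128, cout=192):
    """InceptionC 整块参数量:Conv 无偏置,BN 为 scale=False(每通道 3 个参数)"""
    conv = lambda kh, kw, ci, co: kh * kw * ci * co
    bn = lambda c: 3 * c
    # 分支1:1x1
    p = conv(1, 1, cin, cout) + bn(cout)
    # 分支2:1x1 -> 1x7 -> 7x1
    p += conv(1, 1, cin, c7) + bn(c7)
    p += conv(1, 7, c7, c7) + bn(c7)
    p += conv(7, 1, c7, cout) + bn(cout)
    # 分支3:1x1 -> 7x1 -> 1x7 -> 7x1 -> 1x7
    p += conv(1, 1, cin, c7) + bn(c7)
    p += 3 * (conv(7, 1, c7, c7) + bn(c7))  # 7x1 与 1x7 参数量相同,共 3 个 c7->c7
    p += conv(1, 7, c7, cout) + bn(cout)
    # 分支4:avg pool -> 1x1
    p += conv(1, 1, cin, cout) + bn(cout)
    return p

print(inception_c_params(c7=128))  # 1298944,Mixed_6b
print(inception_c_params(c7=160))  # 1692736,Mixed_6c / Mixed_6d
print(inception_c_params(c7=192))  # 2143872,Mixed_6e
```

这三个总数正好等于把对应表中 10 个 Conv2D 与 10 个 BatchNormalization 的参数量逐行相加的结果,可自行验证。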
    
    
    
    第229-248层,(17, 17, 768)→【Mixed_7a, InceptionD】→(8, 8, 1280)
    __________________________________________________________________________________________________
    conv2d_73 (Conv2D)              (None, 17, 17, 192)  147456      mixed7[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_73 (BatchNo (None, 17, 17, 192)  576         conv2d_73[0][0]                  
    __________________________________________________________________________________________________
    activation_73 (Activation)      (None, 17, 17, 192)  0           batch_normalization_73[0][0]     
    __________________________________________________________________________________________________
    conv2d_74 (Conv2D)              (None, 17, 17, 192)  258048      activation_73[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_74 (BatchNo (None, 17, 17, 192)  576         conv2d_74[0][0]                  
    __________________________________________________________________________________________________
    activation_74 (Activation)      (None, 17, 17, 192)  0           batch_normalization_74[0][0]     
    __________________________________________________________________________________________________
    conv2d_71 (Conv2D)              (None, 17, 17, 192)  147456      mixed7[0][0]                     
    __________________________________________________________________________________________________
    conv2d_75 (Conv2D)              (None, 17, 17, 192)  258048      activation_74[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_71 (BatchNo (None, 17, 17, 192)  576         conv2d_71[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_75 (BatchNo (None, 17, 17, 192)  576         conv2d_75[0][0]                  
    __________________________________________________________________________________________________
    activation_71 (Activation)      (None, 17, 17, 192)  0           batch_normalization_71[0][0]     
    __________________________________________________________________________________________________
    activation_75 (Activation)      (None, 17, 17, 192)  0           batch_normalization_75[0][0]     
    __________________________________________________________________________________________________
    conv2d_72 (Conv2D)              (None, 8, 8, 320)    552960      activation_71[0][0]              
    __________________________________________________________________________________________________
    conv2d_76 (Conv2D)              (None, 8, 8, 192)    331776      activation_75[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_72 (BatchNo (None, 8, 8, 320)    960         conv2d_72[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_76 (BatchNo (None, 8, 8, 192)    576         conv2d_76[0][0]                  
    __________________________________________________________________________________________________
    activation_72 (Activation)      (None, 8, 8, 320)    0           batch_normalization_72[0][0]     
    __________________________________________________________________________________________________
    activation_76 (Activation)      (None, 8, 8, 192)    0           batch_normalization_76[0][0]     
    __________________________________________________________________________________________________
    max_pooling2d_4 (MaxPooling2D)  (None, 8, 8, 768)    0           mixed7[0][0]                     
    __________________________________________________________________________________________________
    mixed8 (Concatenate)            (None, 8, 8, 1280)   0           activation_72[0][0]              
                                                                     activation_76[0][0]              
                                                                     max_pooling2d_4[0][0]            
    __________________________________________________________________________________________________
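Mixed_7a(InceptionD)负责把空间维度从 17×17 降到 8×8:两条卷积分支的末端都是 stride=2、valid padding 的 3×3 卷积,再与同为 stride=2 的 max pooling 直通分支拼接,通道数 320+192+768=1280。同样可以用纯 Python 核对(out_size、conv_params 为自拟函数):

```python
def out_size(n, k, s):
    # valid padding 下的输出尺寸:floor((n - k) / s) + 1
    return (n - k) // s + 1

def conv_params(kh, kw, cin, cout):
    # Conv2D 无偏置,参数量 = kh*kw*Cin*Cout
    return kh * kw * cin * cout

# 两条分支末端的 stride=2 卷积(对照表中行):
assert conv_params(3, 3, 192, 320) == 552960   # conv2d_72:3x3/2,192->320
assert conv_params(3, 3, 192, 192) == 331776   # conv2d_76:3x3/2,192->192
# 空间维度 17x17 -> 8x8
assert out_size(17, 3, 2) == 8
# mixed8 的通道数:两条卷积分支 + 直通的 max pooling
assert 320 + 192 + 768 == 1280
```

注意这里降采样用的是 stride=2 卷积与 max pooling 并联,而不是单独的 pooling 层,这是论文中为避免表示瓶颈(representational bottleneck)采用的网格缩减结构。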
    
    
    
    第249-279层,(8, 8, 1280)→【Mixed_7b, InceptionE】→(8, 8, 2048)
    __________________________________________________________________________________________________
    conv2d_81 (Conv2D)              (None, 8, 8, 448)    573440      mixed8[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_81 (BatchNo (None, 8, 8, 448)    1344        conv2d_81[0][0]                  
    __________________________________________________________________________________________________
    activation_81 (Activation)      (None, 8, 8, 448)    0           batch_normalization_81[0][0]     
    __________________________________________________________________________________________________
    conv2d_78 (Conv2D)              (None, 8, 8, 384)    491520      mixed8[0][0]                     
    __________________________________________________________________________________________________
    conv2d_82 (Conv2D)              (None, 8, 8, 384)    1548288     activation_81[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_78 (BatchNo (None, 8, 8, 384)    1152        conv2d_78[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_82 (BatchNo (None, 8, 8, 384)    1152        conv2d_82[0][0]                  
    __________________________________________________________________________________________________
    activation_78 (Activation)      (None, 8, 8, 384)    0           batch_normalization_78[0][0]     
    __________________________________________________________________________________________________
    activation_82 (Activation)      (None, 8, 8, 384)    0           batch_normalization_82[0][0]     
    __________________________________________________________________________________________________
    conv2d_79 (Conv2D)              (None, 8, 8, 384)    442368      activation_78[0][0]              
    __________________________________________________________________________________________________
    conv2d_80 (Conv2D)              (None, 8, 8, 384)    442368      activation_78[0][0]              
    __________________________________________________________________________________________________
    conv2d_83 (Conv2D)              (None, 8, 8, 384)    442368      activation_82[0][0]              
    __________________________________________________________________________________________________
    conv2d_84 (Conv2D)              (None, 8, 8, 384)    442368      activation_82[0][0]              
    __________________________________________________________________________________________________
    average_pooling2d_8 (AveragePoo (None, 8, 8, 1280)   0           mixed8[0][0]                     
    __________________________________________________________________________________________________
    conv2d_77 (Conv2D)              (None, 8, 8, 320)    409600      mixed8[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_79 (BatchNo (None, 8, 8, 384)    1152        conv2d_79[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_80 (BatchNo (None, 8, 8, 384)    1152        conv2d_80[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_83 (BatchNo (None, 8, 8, 384)    1152        conv2d_83[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_84 (BatchNo (None, 8, 8, 384)    1152        conv2d_84[0][0]                  
    __________________________________________________________________________________________________
    conv2d_85 (Conv2D)              (None, 8, 8, 192)    245760      average_pooling2d_8[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_77 (BatchNo (None, 8, 8, 320)    960         conv2d_77[0][0]                  
    __________________________________________________________________________________________________
    activation_79 (Activation)      (None, 8, 8, 384)    0           batch_normalization_79[0][0]     
    __________________________________________________________________________________________________
    activation_80 (Activation)      (None, 8, 8, 384)    0           batch_normalization_80[0][0]     
    __________________________________________________________________________________________________
    activation_83 (Activation)      (None, 8, 8, 384)    0           batch_normalization_83[0][0]     
    __________________________________________________________________________________________________
    activation_84 (Activation)      (None, 8, 8, 384)    0           batch_normalization_84[0][0]     
    __________________________________________________________________________________________________
    batch_normalization_85 (BatchNo (None, 8, 8, 192)    576         conv2d_85[0][0]                  
    __________________________________________________________________________________________________
    activation_77 (Activation)      (None, 8, 8, 320)    0           batch_normalization_77[0][0]     
    __________________________________________________________________________________________________
    mixed9_0 (Concatenate)          (None, 8, 8, 768)    0           activation_79[0][0]              
                                                                     activation_80[0][0]              
    __________________________________________________________________________________________________
    concatenate_1 (Concatenate)     (None, 8, 8, 768)    0           activation_83[0][0]              
                                                                     activation_84[0][0]              
    __________________________________________________________________________________________________
    activation_85 (Activation)      (None, 8, 8, 192)    0           batch_normalization_85[0][0]     
    __________________________________________________________________________________________________
    mixed9 (Concatenate)            (None, 8, 8, 2048)   0           activation_77[0][0]              
                                                                     mixed9_0[0][0]                   
                                                                     concatenate_1[0][0]              
                                                                     activation_85[0][0]              
    __________________________________________________________________________________________________
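    As a quick sanity check on the mixed9 concatenation above, the four InceptionE branch widths add up to the 2048 output channels (a minimal arithmetic sketch; the comments name the layers from the summary):

    ```python
    # Channel bookkeeping for the mixed9 (InceptionE) concatenation.
    branch_1x1 = 320            # activation_77
    branch_3x3 = 384 + 384      # mixed9_0: two parallel 384-channel convolutions
    branch_3x3dbl = 384 + 384   # concatenate_1: two parallel 384-channel convolutions
    branch_pool = 192           # activation_85
    total = branch_1x1 + branch_3x3 + branch_3x3dbl + branch_pool
    print(total)  # 2048
    ```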
    
    
    
    Layers 280-310: (8, 8, 2048)→【Mixed_7c, InceptionE】→(8, 8, 2048)
    __________________________________________________________________________________________________
    conv2d_90 (Conv2D)              (None, 8, 8, 448)    917504      mixed9[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_90 (BatchNo (None, 8, 8, 448)    1344        conv2d_90[0][0]                  
    __________________________________________________________________________________________________
    activation_90 (Activation)      (None, 8, 8, 448)    0           batch_normalization_90[0][0]     
    __________________________________________________________________________________________________
    conv2d_87 (Conv2D)              (None, 8, 8, 384)    786432      mixed9[0][0]                     
    __________________________________________________________________________________________________
    conv2d_91 (Conv2D)              (None, 8, 8, 384)    1548288     activation_90[0][0]              
    __________________________________________________________________________________________________
    batch_normalization_87 (BatchNo (None, 8, 8, 384)    1152        conv2d_87[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_91 (BatchNo (None, 8, 8, 384)    1152        conv2d_91[0][0]                  
    __________________________________________________________________________________________________
    activation_87 (Activation)      (None, 8, 8, 384)    0           batch_normalization_87[0][0]     
    __________________________________________________________________________________________________
    activation_91 (Activation)      (None, 8, 8, 384)    0           batch_normalization_91[0][0]     
    __________________________________________________________________________________________________
    conv2d_88 (Conv2D)              (None, 8, 8, 384)    442368      activation_87[0][0]              
    __________________________________________________________________________________________________
    conv2d_89 (Conv2D)              (None, 8, 8, 384)    442368      activation_87[0][0]              
    __________________________________________________________________________________________________
    conv2d_92 (Conv2D)              (None, 8, 8, 384)    442368      activation_91[0][0]              
    __________________________________________________________________________________________________
    conv2d_93 (Conv2D)              (None, 8, 8, 384)    442368      activation_91[0][0]              
    __________________________________________________________________________________________________
    average_pooling2d_9 (AveragePoo (None, 8, 8, 2048)   0           mixed9[0][0]                     
    __________________________________________________________________________________________________
    conv2d_86 (Conv2D)              (None, 8, 8, 320)    655360      mixed9[0][0]                     
    __________________________________________________________________________________________________
    batch_normalization_88 (BatchNo (None, 8, 8, 384)    1152        conv2d_88[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_89 (BatchNo (None, 8, 8, 384)    1152        conv2d_89[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_92 (BatchNo (None, 8, 8, 384)    1152        conv2d_92[0][0]                  
    __________________________________________________________________________________________________
    batch_normalization_93 (BatchNo (None, 8, 8, 384)    1152        conv2d_93[0][0]                  
    __________________________________________________________________________________________________
    conv2d_94 (Conv2D)              (None, 8, 8, 192)    393216      average_pooling2d_9[0][0]        
    __________________________________________________________________________________________________
    batch_normalization_86 (BatchNo (None, 8, 8, 320)    960         conv2d_86[0][0]                  
    __________________________________________________________________________________________________
    activation_88 (Activation)      (None, 8, 8, 384)    0           batch_normalization_88[0][0]     
    __________________________________________________________________________________________________
    activation_89 (Activation)      (None, 8, 8, 384)    0           batch_normalization_89[0][0]     
    __________________________________________________________________________________________________
    activation_92 (Activation)      (None, 8, 8, 384)    0           batch_normalization_92[0][0]     
    __________________________________________________________________________________________________
    activation_93 (Activation)      (None, 8, 8, 384)    0           batch_normalization_93[0][0]     
    __________________________________________________________________________________________________
    batch_normalization_94 (BatchNo (None, 8, 8, 192)    576         conv2d_94[0][0]                  
    __________________________________________________________________________________________________
    activation_86 (Activation)      (None, 8, 8, 320)    0           batch_normalization_86[0][0]     
    __________________________________________________________________________________________________
    mixed9_1 (Concatenate)          (None, 8, 8, 768)    0           activation_88[0][0]              
                                                                     activation_89[0][0]              
    __________________________________________________________________________________________________
    concatenate_2 (Concatenate)     (None, 8, 8, 768)    0           activation_92[0][0]              
                                                                     activation_93[0][0]              
    __________________________________________________________________________________________________
    activation_94 (Activation)      (None, 8, 8, 192)    0           batch_normalization_94[0][0]     
    __________________________________________________________________________________________________
    mixed10 (Concatenate)           (None, 8, 8, 2048)   0           activation_86[0][0]              
                                                                     mixed9_1[0][0]                   
                                                                     concatenate_2[0][0]              
                                                                     activation_94[0][0]              
    __________________________________________________________________________________________________
    
    
    
    Layers 311-312: (8, 8, 2048)→【global avg pool】→(2048,)→【dropout】→(2048,)→【fc】→(1000,)
    __________________________________________________________________________________________________
    avg_pool (GlobalAveragePooling2 (None, 2048)         0           mixed10[0][0]                    
    __________________________________________________________________________________________________
    predictions (Dense)             (None, 1000)         2049000     avg_pool[0][0]                   
    ==================================================================================================
    Total params: 23,851,784
    Trainable params: 23,817,352
    Non-trainable params: 34,432
    __________________________________________________________________________________________________
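    The 2,049,000 parameters reported for the predictions layer are simply the fully connected weights plus biases (a quick arithmetic check):

    ```python
    # Dense layer parameters: a (2048 x 1000) weight matrix plus 1000 biases.
    dense_params = 2048 * 1000 + 1000
    print(dense_params)  # 2049000
    ```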
    

    The Non-trainable params all come from the BatchNormalization layers.

    In the source code, x = layers.BatchNormalization(axis=bn_axis, scale=False, name=bn_name)(x) sets scale=False, which drops the scaling parameter γ (gamma in the code). The stated reason: when the next layer is linear (this also holds for nn.relu), the scaling can be disabled since it will be done by the next layer.

    Take the first BN layer, (149, 149, 32)→【bn】→(149, 149, 32), as an example: batch norm is applied to each of the 32 channels, and each channel carries three parameters, μ, σ, and β (moving_mean, moving_variance, and beta in the code). μ and σ are non-trainable, while β is trainable.

    So this BN layer has 32×3 = 96 parameters in total, of which 32×2 = 64 are non-trainable.
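    The same counting rule applies to every BN layer in the network; a small helper (hypothetical, not from the source) makes the split explicit:

    ```python
    # Parameter split of a BatchNormalization layer built with scale=False:
    # per channel, beta is trainable while moving_mean and moving_variance are not.
    def bn_param_counts(channels, scale=False):
        trainable = channels * (2 if scale else 1)   # gamma (if present) and beta
        non_trainable = channels * 2                 # moving_mean, moving_variance
        return trainable + non_trainable, trainable, non_trainable

    print(bn_param_counts(32))  # (96, 32, 64) for the first BN layer
    ```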

  • InceptionV3 transfer learning

    2019-11-02 13:35:25

    InceptionV3.py:

    import tensorflow as tf
    import os
    import flower_photo_dispose as fd
    from tensorflow.python.platform import gfile
    print("hello world")
    model_path = "inception_dec_2015/"
    model_file = "tensorflow_inception_graph.pb"
    
    
    num_steps = 4000
    BATCH_SIZE = 100
    
    bottleneck_size = 2048  # number of nodes in the InceptionV3 bottleneck layer
    
    # Call create_image_dict() to get the dictionary it returns
    image_lists = fd.create_image_dict()
    num_classes = len(image_lists.keys())  # num_classes = 5, one per flower category
    
    # Load the pre-trained Inception-v3 model.
    with gfile.FastGFile(os.path.join(model_path, model_file), 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
    
    # import_graph_def() loads the InceptionV3 graph and returns the tensors for
    # the image-data input node and the bottleneck output. Prototype:
    # import_graph_def(graph_def, input_map, return_elements, name, op_dict, producer_op_list)
    bottleneck_tensor, jpeg_data_tensor = tf.import_graph_def(graph_def,
                                                              return_elements=["pool_3/_reshape:0",
                                                                               "DecodeJpeg/contents:0"])
    
    x = tf.placeholder(tf.float32, [None, bottleneck_size], name='BottleneckInputPlaceholder')
    y_ = tf.placeholder(tf.float32, [None, num_classes], name='GroundTruthInput')
    
    # Define a single fully connected layer
    with tf.name_scope("final_training_ops"):
        weights = tf.Variable(tf.truncated_normal([bottleneck_size, num_classes], stddev=0.001))
        biases = tf.Variable(tf.zeros([num_classes]))
        logits = tf.matmul(x, weights) + biases
        final_tensor = tf.nn.softmax(logits)
    
    # Cross-entropy loss and the gradient-descent optimizer used by train_step
    cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_)
    cross_entropy_mean = tf.reduce_mean(cross_entropy)
    train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy_mean)
    
    # Ops for computing accuracy
    correct_prediction = tf.equal(tf.argmax(final_tensor, 1), tf.argmax(y_, 1))
    evaluation_step = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
    
    with tf.Session() as sess:
        init = tf.global_variables_initializer()
        sess.run(init)
        for i in range(num_steps):
            # get_random_bottlenecks() draws a random batch of feature vectors
            # and their labels for training; run() performs one training step
            train_bottlenecks, train_labels = fd.get_random_bottlenecks(sess, num_classes,
                                                                        image_lists, BATCH_SIZE,
                                                                        "training",
                                                                        jpeg_data_tensor, bottleneck_tensor)
            sess.run(train_step, feed_dict={x: train_bottlenecks, y_: train_labels})
    
            # Validate periodically, again using get_random_bottlenecks() to draw
            # random feature vectors and their labels
            if i % 100 == 0:
                validation_bottlenecks, validation_labels = fd.get_random_bottlenecks(sess,
                                                                                      num_classes, image_lists,
                                                                                      BATCH_SIZE, "validation",
                                                                                      jpeg_data_tensor, bottleneck_tensor)
                validation_accuracy = sess.run(evaluation_step, feed_dict={
                    x: validation_bottlenecks,
                    y_: validation_labels})
                print("Step %d: Validation accuracy = %.1f%%" % (i, validation_accuracy * 100))
    
        # Finally measure accuracy on the test data; get_test_bottlenecks() returns
        # the feature vectors of all test images as the feature data
        test_bottlenecks, test_labels = fd.get_test_bottlenecks(sess, image_lists, num_classes,
                                                                jpeg_data_tensor, bottleneck_tensor)
        test_accuracy = sess.run(evaluation_step, feed_dict={x: test_bottlenecks,
                                                             y_: test_labels})
        print("Finally test accuracy = %.1f%%" % (test_accuracy * 100))
    

     

    flower_photo_dispose.py:

    import glob
    import os.path
    import random
    import numpy as np
    from tensorflow.python.platform import gfile
    import tensorflow as tf
    input_data = "flower_photos"
    CACHE_DIR  = "bottleneck"
    
    
    def create_image_dict():
        result = {}
        # path_list holds the flower_photos folder path plus its sub-folder
        # paths; printed, os.walk() yields entries such as:
        # /home/jiangziyang/flower_photos, /home/jiangziyang/flower_photos/daisy,
        # /home/jiangziyang/flower_photos/tulips, /home/jiangziyang/flower_photos/roses,
        # /home/jiangziyang/flower_photos/dandelion, /home/jiangziyang/flower_photos/sunflowers
        path_list = [x[0] for x in os.walk(input_data)]
        is_root_dir = True
        for sub_dirs in path_list:
            if is_root_dir:
                is_root_dir = False
                continue  # skip the root directory and move to the next iteration
    
            # extension_name lists the possible image file extensions
            extension_name = ['jpg', 'jpeg', 'JPG', 'JPEG']
            # List collecting the image file names
            images_list = []
            for extension in extension_name:
                # join() builds a path using each extension_name entry as suffix, e.g.:
                # /home/jiangziyang/flower_photos/daisy/*.jpg
                # /home/jiangziyang/flower_photos/daisy/*.jpeg
                # /home/jiangziyang/flower_photos/daisy/*.JPG
                # /home/jiangziyang/flower_photos/daisy/*.JPEG
                file_glob = os.path.join(sub_dirs, '*.' + extension)
    
                # glob() returns the file names matching the pattern; for
                # /home/jiangziyang/flower_photos/daisy/*.jpg it finds every
                # file with the .jpg suffix in that directory, for example:
                # /home/jiangziyang/flower_photos/daisy/7924174040_444d5bbb8a.jpg
                images_list.extend(glob.glob(file_glob))
    
            # basename() drops the directory part of a path, so
            # /home/jiangziyang/flower_photos/daisy becomes just daisy;
            # flower_category is the image category, taken from the sub-folder name
            dir_name = os.path.basename(sub_dirs)
            flower_category = dir_name
    
            # Initialize the training, testing, and validation image-name lists
            # for this flower category
            training_images = []
            testing_images = []
            validation_images = []
    
            for image_name in images_list:
                # Entries in images_list include the path, but only the file
                # name is needed, so use basename() again
                image_name = os.path.basename(image_name)
                # np.random.randint() draws a uniformly distributed integer
                score = np.random.randint(100)
                if score < 10:
                    validation_images.append(image_name)
                elif score < 20:
                    testing_images.append(image_name)
                else:
                    training_images.append(image_name)
    
            # Each pass through the outer loop updates result, a dictionary keyed
            # by flower_category whose value is another dictionary storing all
            # image names split by data set; the function finally returns result
            result[flower_category] = {
                "dir": dir_name,
                "training": training_images,
                "testing": testing_images,
                "validation": validation_images,
            }
        return result
    
    
    def get_image_path(image_lists, image_dir, flower_category, image_index, data_category):
        # category_list holds, as a list, one data split of one flower category;
        # flower_category is passed in from get_random_bottlenecks()
        category_list = image_lists[flower_category][data_category]
    
        # actual_index is the position of an image within category_list;
        # image_index is likewise passed in from get_random_bottlenecks()
        actual_index = image_index % len(category_list)
    
        # image_name is the image file name
        image_name = category_list[actual_index]
    
        # sub_dir is the flower_photos sub-folder for this flower category
        sub_dir = image_lists[flower_category]["dir"]
    
        # Join into a full path including the file name; create_bottleneck()
        # uses it as the feature-vector file name for each image
        full_path = os.path.join(image_dir, sub_dir, image_name)
        return full_path
    
    
    def create_bottleneck(sess, image_lists, flower_category, image_index,
                          data_category, jpeg_data_tensor, bottleneck_tensor):
        # sub_dir is the folder name under flower_photos for the category
        # selected by flower_category
        sub_dir = image_lists[flower_category]["dir"]
    
        # Join the cache path: CACHE_DIR plus sub_dir
        sub_dir_path = os.path.join(CACHE_DIR, sub_dir)
    
        # If the joined path does not exist, create the sub-folder under CACHE_DIR
        if not os.path.exists(sub_dir_path):
            os.makedirs(sub_dir_path)
    
        # Full name (including path) of the feature-vector file for one image;
        # .txt is appended after the image's .jpg suffix. get_image_path()
        # supplies the file name without the .txt suffix, including its path
        bottleneck_path = get_image_path(image_lists, CACHE_DIR, flower_category,
                                         image_index, data_category) + ".txt"
    
        # If the feature-vector file does not exist yet, compute the vector with
        # the InceptionV3 model; the result is also written to the file
        if not os.path.exists(bottleneck_path):
            # The original image name, including its full path
            image_path = get_image_path(image_lists, input_data, flower_category,
                                        image_index, data_category)
            # Read the image contents
            image_data = gfile.FastGFile(image_path, "rb").read()
    
            # Feed the image into the InceptionV3 model and evaluate the
            # bottleneck tensor; its value is this image's feature vector.
            # The result is 4-D, so squeeze() flattens it to 1-D to serve as
            # input to the fully connected layer
            bottleneck_values = sess.run(bottleneck_tensor, feed_dict={jpeg_data_tensor: image_data})
    
            # Flatten to 1-D
            bottleneck_values = np.squeeze(bottleneck_values)
    
            # Write the feature vector to the file, comma-separated between
            # values so it is easy to parse when read back
            bottleneck_string = ','.join(str(x) for x in bottleneck_values)
            with open(bottleneck_path, "w") as bottleneck_file:
                bottleneck_file.write(bottleneck_string)
        else:
            # The feature-vector file already exists, so read the data
            # directly from bottleneck_path
            with open(bottleneck_path, "r") as bottleneck_file:
                bottleneck_string = bottleneck_file.read()
    
            # The data read back is a string; split on commas to get a list
            bottleneck_values = [float(x) for x in bottleneck_string.split(',')]
        return bottleneck_values
    
    
    def get_random_bottlenecks(sess, num_classes, image_lists, batch_size, data_category, jpeg_data_tensor,
                               bottleneck_tensor):
        # bottlenecks stores one batch of feature vectors;
        # labels stores that batch's labels
        bottlenecks = []
        labels = []
    
        for i in range(batch_size):
            # random_index is a category number drawn at random from the five
            # flower classes; image_lists.keys() holds the five category names
            random_index = random.randrange(num_classes)
            flower_category = list(image_lists.keys())[random_index]
    
            # image_index is a randomly drawn image number; get_image_path()
            # shows how this number and the category chosen via random_index
            # determine the image file name
            image_index = random.randrange(65536)
    
            # create_bottleneck() fetches or creates the image's feature
            # vector; it calls get_image_path() internally
            bottleneck = create_bottleneck(sess, image_lists, flower_category, image_index,
                                           data_category, jpeg_data_tensor, bottleneck_tensor)
    
            # Build the one-hot label for each sample, then append() it into
            # the batch list; the complete lists are returned
            label = np.zeros(num_classes, dtype=np.float32)
            label[random_index] = 1.0
            labels.append(label)
            bottlenecks.append(bottleneck)
        # Returns the batch of image feature vectors and their labels
        return bottlenecks, labels
    
    
    def get_test_bottlenecks(sess, image_lists, num_classes, jpeg_data_tensor, bottleneck_tensor):
        bottlenecks = []
        labels = []
    
        # flower_category_list is the list of keys of image_lists; printed it
        # looks like: ['roses', 'sunflowers', 'daisy', 'dandelion', 'tulips']
        flower_category_list = list(image_lists.keys())
    
        data_category = "testing"
    
        # Enumerate every category and each test image within it. In the outer
        # loop, label_index is the index into flower_category_list and
        # flower_category is the corresponding value
        for label_index, flower_category in enumerate(flower_category_list):
    
            # The inner loop enumerates, via flower_category and "testing", the
            # test image names of each flower class; the name itself is
            # unused_base_name, and only image_index is needed
            for image_index, unused_base_name in enumerate(image_lists[flower_category]
                                                           ["testing"]):
                # create_bottleneck() builds the feature vectors; the test
                # images got no feature vectors during training or validation,
                # so they are all generated here in one pass
                bottleneck = create_bottleneck(sess, image_lists, flower_category,
                                               image_index, data_category,
                                               jpeg_data_tensor, bottleneck_tensor)
    
                # From here on, same as get_random_bottlenecks()
                label = np.zeros(num_classes, dtype=np.float32)
                label[label_index] = 1.0
                labels.append(label)
                bottlenecks.append(bottleneck)
        return bottlenecks, labels

     

  • Visualizing the InceptionV3 structure

    2018-04-01 20:59:30

    Reference post: https://blog.csdn.net/u014365862/article/details/54380246

    A previous post already covered the InceptionV3 paper, including a fairly detailed implementation of its forward pass, and more InceptionV3 material will follow. The idea is to start from InceptionV3 and step through how the convolutions are constructed and how a model is built, trained, tested, and transferred.

    On to the main topic.

    1. Import the required packages

    import tensorflow as tf
    import os
    import tarfile
    import requests

    2. Download the model

    2.1 Save the download URL in inception_pretrain_model_url

    inception_pretrain_model_url = 'http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz'

    2.2 Create the folder that will hold the model

    # Name and path (under the current directory) of the folder to create
    inception_pretrain_model_dir = "inception_pretrain"
    # Create the folder if inception_pretrain_model_dir does not exist
    if not os.path.exists(inception_pretrain_model_dir):
        # os.path.exists() checks whether the given path really exists; returns a bool
        os.makedirs(inception_pretrain_model_dir)
        # os.makedirs(path) creates the folder; note that creating an existing folder raises an error
    
    filename = inception_pretrain_model_url.split('/')[-1]
    # inception_pretrain_model_url is http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz
    # filename takes the last '/'-separated component, i.e. inception-2015-12-05.tgz
    filepath = os.path.join(inception_pretrain_model_dir, filename)
    # Join the two paths: inception_pretrain\inception-2015-12-05.tgz

    2.3 Check whether the file already exists under the path; if not, download it

    # If the file path does not exist (i.e. nothing is at that path), start downloading
    if not os.path.exists(filepath):
        print("Downloading: ", filename)
        r = requests.get(inception_pretrain_model_url, stream=True)
        # requests.get downloads the content from the given HTTP address
        with open(filepath, 'wb') as f:
            for chunk in r.iter_content(chunk_size=1024):
                if chunk:
                    f.write(chunk)
        # The with statement opens the file and also closes it; 'wb' writes a
        # binary file, and since the archive is large it is written in chunks

    2.4 Extract the file, printing a notice before extracting

    print("Download complete, extracting: ", filename)
    # Among the extracted files, classify_image_graph_def.pb is the trained
    # Inception-v3 model and imagenet_synset_to_human_label_map.txt is the category file.
    tarfile.open(filepath, 'r:gz').extractall(inception_pretrain_model_dir)
    # tarfile extracts the archive

    3. Create the TensorBoard log directory

    log_dir = 'inception_log'  # directory path
    if not os.path.exists(log_dir):
        os.makedirs(log_dir)
    
    # Load the inception graph
    inception_graph_def_file = os.path.join(inception_pretrain_model_dir, 'classify_image_graph_def.pb')
    with tf.Session() as sess:
        with tf.gfile.FastGFile(inception_graph_def_file, 'rb') as f:  # read the file as binary
            graph_def = tf.GraphDef()
            # Build the graph
            graph_def.ParseFromString(f.read())
            tf.import_graph_def(graph_def, name='')
        writer = tf.summary.FileWriter(log_dir, sess.graph)
        # tf.train.SummaryWriter raises AttributeError: module 'tensorflow.python.training.training'
        # has no attribute 'SummaryWriter', so tf.summary.FileWriter is used instead
        writer.close()

    The extracted files are as follows (screenshot omitted).

    Once all of the code has run, an events file is written to the inception_log directory (screenshot omitted).
    Next, open this file; this requires the command line.

    Note that on my machine the inception_log directory is at D:/python/neural network/Inception/inception_log

    4. Switch the command line to the inception_log root directory

    After switching to that directory, run tensorboard --logdir=log_dir (the path where the log was created) to open the file.

    Copy the resulting address, http://DESKTOP-JIUMT28:6006, into the Chrome browser to get the visualization.

    Expand the nodes to see the details.

    That's it; feel free to message me if anything is unclear.

  • Can you provide a pretrained InceptionV3 model? Is the InceptionV3 you use not the one from http://mxnet.io/model_zoo/ ? (This question comes from the open-source project zhreshold/mxnet-ssd)
  • A brief introduction to the InceptionV3 algorithm (paper overview), detailed architecture walkthrough, example applications, and a collection of diagrams. Contents: introduction to the InceptionV2 & InceptionV3 papers; architecture details of InceptionV2 & InceptionV3: 1. convolution factorization 2. ...
  • InceptionV3 code walkthrough

    2018-03-30 16:13:10
    Reference book: search CSDN if you need the electronic version. Reference blog post: https://blog.csdn.net/superman_xxx/article/details/65451916. After reading Google's GoogleNet and InceptionV3 papers, I decided to implement the network myself; it is hard, but there are plenty of resources online, so ...
  • Common CNN models, 8: InceptionV3. from keras.models import Model; from keras import layers; from keras.layers import Activation, Dense, Input, BatchNormalization, Conv2D, MaxPooling2D, AveragePooling2D ...
  • Reproducing InceptionV3 in TensorFlow 2

    2019-09-03 00:18:16
    1. The InceptionV3 network structure 2. Figure 5, Figure 6, Figure 7. Here the author compares two approaches: reducing the dimensionality before downsampling, or downsampling first and then reducing the dimensionality. The former is fast but violates general design principle one by introducing a bottleneck, while the latter costs three times the computation; neither seems ...
  • Transfer learning on the InceptionV3 model with MATLAB

    2020-04-01 16:52:51
    Use MATLAB's built-in inceptionv3 model for transfer learning. If the inceptionv3 model support package is not installed, type inceptionv3 in the command window and click the download link to install it. Training environment: Windows 10, MATLAB 2018b, Intel i3 CPU at 3.7 GHz, 4 GB RAM. Using ...
  • InceptionV3 network structure diagrams

    2019-07-20 17:36:27
    Structure diagrams of the InceptionV3 network, going straight to the figures: 1. the overall structure; 2. the diagram above expanded one level; 3. expanded one further level ...
  • The InceptionV3 paper, Rethinking the Inception Architecture for Computer Vision, together with a network structure diagram. Includes a MobileNetV1 model (8 classes, 16.3 MB), an InceptionV3 model (8 classes, 83.4 MB), and the label_image.py script for loading the models; only a few images were selected as data ...
  • A while ago I built a simple image-classification feature using the InceptionV3 and InceptionV4 network models under TensorFlow-slim; here I record and compare the accuracy, training time, and so on of the two during training. Project address: ...
  • Image captioning: generating image captions with InceptionV3 and beam search
  • This file contains an InceptionV3 network, together with methods for creating and reading a dataset in TFRecord format.
