Model: "encoder_r1" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 2048, 3)] 0 __________________________________________________________________________________________________ tf.split (TFOpLambda) [(None, 2048, 1), (N 0 input_1[0][0] __________________________________________________________________________________________________ shared_det_0_conv_0 (Conv1D) (None, 2048, 32) 2080 tf.split[0][0] __________________________________________________________________________________________________ shared_det_0_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_0_conv_0[0][0] __________________________________________________________________________________________________ leaky_re_lu (LeakyReLU) multiple 0 shared_det_0_conv_0_batchnorm[0][ shared_det_0_conv_1_batchnorm[0][ shared_det_0_conv_2_batchnorm[0][ shared_det_0_conv_3_batchnorm[0][ shared_det_1_conv_0_batchnorm[0][ shared_det_1_conv_1_batchnorm[0][ shared_det_1_conv_2_batchnorm[0][ shared_det_1_conv_3_batchnorm[0][ shared_det_2_conv_0_batchnorm[0][ shared_det_2_conv_1_batchnorm[0][ shared_det_2_conv_2_batchnorm[0][ shared_det_2_conv_3_batchnorm[0][ __________________________________________________________________________________________________ max_pooling1d (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[0][0] __________________________________________________________________________________________________ shared_det_0_conv_1 (Conv1D) (None, 512, 32) 32800 max_pooling1d[0][0] __________________________________________________________________________________________________ shared_det_0_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_0_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_1 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[1][0] __________________________________________________________________________________________________ shared_det_0_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_1[0][0] __________________________________________________________________________________________________ shared_det_0_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_0_conv_2[0][0] __________________________________________________________________________________________________ max_pooling1d_2 (MaxPooling1D) (None, 128, 32) 0 leaky_re_lu[2][0] __________________________________________________________________________________________________ shared_det_0_conv_3 (Conv1D) (None, 64, 32) 8224 max_pooling1d_2[0][0] __________________________________________________________________________________________________ shared_det_1_conv_0 (Conv1D) (None, 2048, 32) 2080 tf.split[0][1] __________________________________________________________________________________________________ shared_det_0_conv_3_batchnorm ( (None, 64, 32) 128 shared_det_0_conv_3[0][0] __________________________________________________________________________________________________ shared_det_1_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_1_conv_0[0][0] __________________________________________________________________________________________________ max_pooling1d_4 (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[4][0] __________________________________________________________________________________________________ shared_det_1_conv_1 (Conv1D) (None, 512, 32) 32800 
max_pooling1d_4[0][0] __________________________________________________________________________________________________ shared_det_1_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_1_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_5 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[5][0] __________________________________________________________________________________________________ shared_det_1_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_5[0][0] __________________________________________________________________________________________________ shared_det_1_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_1_conv_2[0][0] __________________________________________________________________________________________________ max_pooling1d_6 (MaxPooling1D) (None, 128, 32) 0 leaky_re_lu[6][0] __________________________________________________________________________________________________ shared_det_1_conv_3 (Conv1D) (None, 64, 32) 8224 max_pooling1d_6[0][0] __________________________________________________________________________________________________ shared_det_2_conv_0 (Conv1D) (None, 2048, 32) 2080 tf.split[0][2] __________________________________________________________________________________________________ shared_det_1_conv_3_batchnorm ( (None, 64, 32) 128 shared_det_1_conv_3[0][0] __________________________________________________________________________________________________ shared_det_2_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_2_conv_0[0][0] __________________________________________________________________________________________________ max_pooling1d_8 (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[8][0] __________________________________________________________________________________________________ shared_det_2_conv_1 (Conv1D) (None, 512, 32) 32800 max_pooling1d_8[0][0] __________________________________________________________________________________________________ shared_det_2_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_2_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_9 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[9][0] __________________________________________________________________________________________________ shared_det_2_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_9[0][0] __________________________________________________________________________________________________ shared_det_2_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_2_conv_2[0][0] __________________________________________________________________________________________________ max_pooling1d_10 (MaxPooling1D) (None, 128, 32) 0 leaky_re_lu[10][0] __________________________________________________________________________________________________ shared_det_2_conv_3 (Conv1D) (None, 64, 32) 8224 max_pooling1d_10[0][0] __________________________________________________________________________________________________ shared_det_2_conv_3_batchnorm ( (None, 64, 32) 128 shared_det_2_conv_3[0][0] __________________________________________________________________________________________________ max_pooling1d_3 (MaxPooling1D) (None, 32, 32) 0 leaky_re_lu[3][0] __________________________________________________________________________________________________ max_pooling1d_7 (MaxPooling1D) (None, 32, 32) 0 leaky_re_lu[7][0] __________________________________________________________________________________________________ 
max_pooling1d_11 (MaxPooling1D) (None, 32, 32) 0 leaky_re_lu[11][0] __________________________________________________________________________________________________ tf.expand_dims (TFOpLambda) (None, 32, 32, 1) 0 max_pooling1d_3[0][0] __________________________________________________________________________________________________ tf.expand_dims_1 (TFOpLambda) (None, 32, 32, 1) 0 max_pooling1d_7[0][0] __________________________________________________________________________________________________ tf.expand_dims_2 (TFOpLambda) (None, 32, 32, 1) 0 max_pooling1d_11[0][0] __________________________________________________________________________________________________ concatenate (Concatenate) (None, 32, 32, 3) 0 tf.expand_dims[0][0] tf.expand_dims_1[0][0] tf.expand_dims_2[0][0] __________________________________________________________________________________________________ conv1_pad (ZeroPadding2D) (None, 38, 38, 3) 0 concatenate[0][0] __________________________________________________________________________________________________ conv1_conv (Conv2D) (None, 16, 16, 64) 9472 conv1_pad[0][0] __________________________________________________________________________________________________ conv1_bn (BatchNormalization) (None, 16, 16, 64) 256 conv1_conv[0][0] __________________________________________________________________________________________________ conv1_relu (Activation) (None, 16, 16, 64) 0 conv1_bn[0][0] __________________________________________________________________________________________________ pool1_pad (ZeroPadding2D) (None, 18, 18, 64) 0 conv1_relu[0][0] __________________________________________________________________________________________________ pool1_pool (MaxPooling2D) (None, 8, 8, 64) 0 pool1_pad[0][0] __________________________________________________________________________________________________ conv2_block1_1_conv (Conv2D) (None, 8, 8, 64) 4160 pool1_pool[0][0] __________________________________________________________________________________________________ conv2_block1_1_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block1_1_conv[0][0] __________________________________________________________________________________________________ conv2_block1_1_relu (Activation (None, 8, 8, 64) 0 conv2_block1_1_bn[0][0] __________________________________________________________________________________________________ conv2_block1_2_conv (Conv2D) (None, 8, 8, 64) 36928 conv2_block1_1_relu[0][0] __________________________________________________________________________________________________ conv2_block1_2_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block1_2_conv[0][0] __________________________________________________________________________________________________ conv2_block1_2_relu (Activation (None, 8, 8, 64) 0 conv2_block1_2_bn[0][0] __________________________________________________________________________________________________ conv2_block1_0_conv (Conv2D) (None, 8, 8, 256) 16640 pool1_pool[0][0] __________________________________________________________________________________________________ conv2_block1_3_conv (Conv2D) (None, 8, 8, 256) 16640 conv2_block1_2_relu[0][0] __________________________________________________________________________________________________ conv2_block1_0_bn (BatchNormali (None, 8, 8, 256) 1024 conv2_block1_0_conv[0][0] __________________________________________________________________________________________________ conv2_block1_3_bn (BatchNormali (None, 8, 8, 256) 1024 conv2_block1_3_conv[0][0] 
__________________________________________________________________________________________________ conv2_block1_add (Add) (None, 8, 8, 256) 0 conv2_block1_0_bn[0][0] conv2_block1_3_bn[0][0] __________________________________________________________________________________________________ conv2_block1_out (Activation) (None, 8, 8, 256) 0 conv2_block1_add[0][0] __________________________________________________________________________________________________ conv2_block2_1_conv (Conv2D) (None, 8, 8, 64) 16448 conv2_block1_out[0][0] __________________________________________________________________________________________________ conv2_block2_1_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block2_1_conv[0][0] __________________________________________________________________________________________________ conv2_block2_1_relu (Activation (None, 8, 8, 64) 0 conv2_block2_1_bn[0][0] __________________________________________________________________________________________________ conv2_block2_2_conv (Conv2D) (None, 8, 8, 64) 36928 conv2_block2_1_relu[0][0] __________________________________________________________________________________________________ conv2_block2_2_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block2_2_conv[0][0] __________________________________________________________________________________________________ conv2_block2_2_relu (Activation (None, 8, 8, 64) 0 conv2_block2_2_bn[0][0] __________________________________________________________________________________________________ conv2_block2_3_conv (Conv2D) (None, 8, 8, 256) 16640 conv2_block2_2_relu[0][0] __________________________________________________________________________________________________ conv2_block2_3_bn (BatchNormali (None, 8, 8, 256) 1024 conv2_block2_3_conv[0][0] __________________________________________________________________________________________________ conv2_block2_add (Add) (None, 8, 8, 256) 0 conv2_block1_out[0][0] conv2_block2_3_bn[0][0] __________________________________________________________________________________________________ conv2_block2_out (Activation) (None, 8, 8, 256) 0 conv2_block2_add[0][0] __________________________________________________________________________________________________ conv2_block3_1_conv (Conv2D) (None, 8, 8, 64) 16448 conv2_block2_out[0][0] __________________________________________________________________________________________________ conv2_block3_1_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block3_1_conv[0][0] __________________________________________________________________________________________________ conv2_block3_1_relu (Activation (None, 8, 8, 64) 0 conv2_block3_1_bn[0][0] __________________________________________________________________________________________________ conv2_block3_2_conv (Conv2D) (None, 8, 8, 64) 36928 conv2_block3_1_relu[0][0] __________________________________________________________________________________________________ conv2_block3_2_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block3_2_conv[0][0] __________________________________________________________________________________________________ conv2_block3_2_relu (Activation (None, 8, 8, 64) 0 conv2_block3_2_bn[0][0] __________________________________________________________________________________________________ conv2_block3_3_conv (Conv2D) (None, 8, 8, 256) 16640 conv2_block3_2_relu[0][0] __________________________________________________________________________________________________ conv2_block3_3_bn (BatchNormali (None, 8, 8, 256) 1024 conv2_block3_3_conv[0][0] 
__________________________________________________________________________________________________ conv2_block3_add (Add) (None, 8, 8, 256) 0 conv2_block2_out[0][0] conv2_block3_3_bn[0][0] __________________________________________________________________________________________________ conv2_block3_out (Activation) (None, 8, 8, 256) 0 conv2_block3_add[0][0] __________________________________________________________________________________________________ conv3_block1_1_conv (Conv2D) (None, 4, 4, 128) 32896 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block1_1_conv[0][0] __________________________________________________________________________________________________ conv3_block1_1_relu (Activation (None, 4, 4, 128) 0 conv3_block1_1_bn[0][0] __________________________________________________________________________________________________ conv3_block1_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block1_1_relu[0][0] __________________________________________________________________________________________________ conv3_block1_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block1_2_conv[0][0] __________________________________________________________________________________________________ conv3_block1_2_relu (Activation (None, 4, 4, 128) 0 conv3_block1_2_bn[0][0] __________________________________________________________________________________________________ conv3_block1_0_conv (Conv2D) (None, 4, 4, 512) 131584 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block1_2_relu[0][0] __________________________________________________________________________________________________ conv3_block1_0_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block1_0_conv[0][0] __________________________________________________________________________________________________ conv3_block1_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block1_3_conv[0][0] __________________________________________________________________________________________________ conv3_block1_add (Add) (None, 4, 4, 512) 0 conv3_block1_0_bn[0][0] conv3_block1_3_bn[0][0] __________________________________________________________________________________________________ conv3_block1_out (Activation) (None, 4, 4, 512) 0 conv3_block1_add[0][0] __________________________________________________________________________________________________ conv3_block2_1_conv (Conv2D) (None, 4, 4, 128) 65664 conv3_block1_out[0][0] __________________________________________________________________________________________________ conv3_block2_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block2_1_conv[0][0] __________________________________________________________________________________________________ conv3_block2_1_relu (Activation (None, 4, 4, 128) 0 conv3_block2_1_bn[0][0] __________________________________________________________________________________________________ conv3_block2_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block2_1_relu[0][0] __________________________________________________________________________________________________ conv3_block2_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block2_2_conv[0][0] __________________________________________________________________________________________________ conv3_block2_2_relu (Activation (None, 4, 4, 128) 0 
conv3_block2_2_bn[0][0] __________________________________________________________________________________________________ conv3_block2_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block2_2_relu[0][0] __________________________________________________________________________________________________ conv3_block2_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block2_3_conv[0][0] __________________________________________________________________________________________________ conv3_block2_add (Add) (None, 4, 4, 512) 0 conv3_block1_out[0][0] conv3_block2_3_bn[0][0] __________________________________________________________________________________________________ conv3_block2_out (Activation) (None, 4, 4, 512) 0 conv3_block2_add[0][0] __________________________________________________________________________________________________ conv3_block3_1_conv (Conv2D) (None, 4, 4, 128) 65664 conv3_block2_out[0][0] __________________________________________________________________________________________________ conv3_block3_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block3_1_conv[0][0] __________________________________________________________________________________________________ conv3_block3_1_relu (Activation (None, 4, 4, 128) 0 conv3_block3_1_bn[0][0] __________________________________________________________________________________________________ conv3_block3_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block3_1_relu[0][0] __________________________________________________________________________________________________ conv3_block3_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block3_2_conv[0][0] __________________________________________________________________________________________________ conv3_block3_2_relu (Activation (None, 4, 4, 128) 0 conv3_block3_2_bn[0][0] __________________________________________________________________________________________________ conv3_block3_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block3_2_relu[0][0] __________________________________________________________________________________________________ conv3_block3_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block3_3_conv[0][0] __________________________________________________________________________________________________ conv3_block3_add (Add) (None, 4, 4, 512) 0 conv3_block2_out[0][0] conv3_block3_3_bn[0][0] __________________________________________________________________________________________________ conv3_block3_out (Activation) (None, 4, 4, 512) 0 conv3_block3_add[0][0] __________________________________________________________________________________________________ conv3_block4_1_conv (Conv2D) (None, 4, 4, 128) 65664 conv3_block3_out[0][0] __________________________________________________________________________________________________ conv3_block4_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block4_1_conv[0][0] __________________________________________________________________________________________________ conv3_block4_1_relu (Activation (None, 4, 4, 128) 0 conv3_block4_1_bn[0][0] __________________________________________________________________________________________________ conv3_block4_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block4_1_relu[0][0] __________________________________________________________________________________________________ conv3_block4_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block4_2_conv[0][0] __________________________________________________________________________________________________ conv3_block4_2_relu (Activation (None, 4, 4, 
128) 0 conv3_block4_2_bn[0][0] __________________________________________________________________________________________________ conv3_block4_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block4_2_relu[0][0] __________________________________________________________________________________________________ conv3_block4_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block4_3_conv[0][0] __________________________________________________________________________________________________ conv3_block4_add (Add) (None, 4, 4, 512) 0 conv3_block3_out[0][0] conv3_block4_3_bn[0][0] __________________________________________________________________________________________________ conv3_block4_out (Activation) (None, 4, 4, 512) 0 conv3_block4_add[0][0] __________________________________________________________________________________________________ conv4_block1_1_conv (Conv2D) (None, 2, 2, 256) 131328 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block1_1_conv[0][0] __________________________________________________________________________________________________ conv4_block1_1_relu (Activation (None, 2, 2, 256) 0 conv4_block1_1_bn[0][0] __________________________________________________________________________________________________ conv4_block1_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block1_1_relu[0][0] __________________________________________________________________________________________________ conv4_block1_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block1_2_conv[0][0] __________________________________________________________________________________________________ conv4_block1_2_relu (Activation (None, 2, 2, 256) 0 conv4_block1_2_bn[0][0] __________________________________________________________________________________________________ conv4_block1_0_conv (Conv2D) (None, 2, 2, 1024) 525312 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block1_2_relu[0][0] __________________________________________________________________________________________________ conv4_block1_0_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block1_0_conv[0][0] __________________________________________________________________________________________________ conv4_block1_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block1_3_conv[0][0] __________________________________________________________________________________________________ conv4_block1_add (Add) (None, 2, 2, 1024) 0 conv4_block1_0_bn[0][0] conv4_block1_3_bn[0][0] __________________________________________________________________________________________________ conv4_block1_out (Activation) (None, 2, 2, 1024) 0 conv4_block1_add[0][0] __________________________________________________________________________________________________ conv4_block2_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block1_out[0][0] __________________________________________________________________________________________________ conv4_block2_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block2_1_conv[0][0] __________________________________________________________________________________________________ conv4_block2_1_relu (Activation (None, 2, 2, 256) 0 conv4_block2_1_bn[0][0] __________________________________________________________________________________________________ conv4_block2_2_conv 
(Conv2D) (None, 2, 2, 256) 590080 conv4_block2_1_relu[0][0] __________________________________________________________________________________________________ conv4_block2_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block2_2_conv[0][0] __________________________________________________________________________________________________ conv4_block2_2_relu (Activation (None, 2, 2, 256) 0 conv4_block2_2_bn[0][0] __________________________________________________________________________________________________ conv4_block2_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block2_2_relu[0][0] __________________________________________________________________________________________________ conv4_block2_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block2_3_conv[0][0] __________________________________________________________________________________________________ conv4_block2_add (Add) (None, 2, 2, 1024) 0 conv4_block1_out[0][0] conv4_block2_3_bn[0][0] __________________________________________________________________________________________________ conv4_block2_out (Activation) (None, 2, 2, 1024) 0 conv4_block2_add[0][0] __________________________________________________________________________________________________ conv4_block3_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block2_out[0][0] __________________________________________________________________________________________________ conv4_block3_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block3_1_conv[0][0] __________________________________________________________________________________________________ conv4_block3_1_relu (Activation (None, 2, 2, 256) 0 conv4_block3_1_bn[0][0] __________________________________________________________________________________________________ conv4_block3_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block3_1_relu[0][0] __________________________________________________________________________________________________ conv4_block3_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block3_2_conv[0][0] __________________________________________________________________________________________________ conv4_block3_2_relu (Activation (None, 2, 2, 256) 0 conv4_block3_2_bn[0][0] __________________________________________________________________________________________________ conv4_block3_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block3_2_relu[0][0] __________________________________________________________________________________________________ conv4_block3_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block3_3_conv[0][0] __________________________________________________________________________________________________ conv4_block3_add (Add) (None, 2, 2, 1024) 0 conv4_block2_out[0][0] conv4_block3_3_bn[0][0] __________________________________________________________________________________________________ conv4_block3_out (Activation) (None, 2, 2, 1024) 0 conv4_block3_add[0][0] __________________________________________________________________________________________________ conv4_block4_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block3_out[0][0] __________________________________________________________________________________________________ conv4_block4_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block4_1_conv[0][0] __________________________________________________________________________________________________ conv4_block4_1_relu (Activation (None, 2, 2, 256) 0 conv4_block4_1_bn[0][0] 
__________________________________________________________________________________________________ conv4_block4_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block4_1_relu[0][0] __________________________________________________________________________________________________ conv4_block4_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block4_2_conv[0][0] __________________________________________________________________________________________________ conv4_block4_2_relu (Activation (None, 2, 2, 256) 0 conv4_block4_2_bn[0][0] __________________________________________________________________________________________________ conv4_block4_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block4_2_relu[0][0] __________________________________________________________________________________________________ conv4_block4_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block4_3_conv[0][0] __________________________________________________________________________________________________ conv4_block4_add (Add) (None, 2, 2, 1024) 0 conv4_block3_out[0][0] conv4_block4_3_bn[0][0] __________________________________________________________________________________________________ conv4_block4_out (Activation) (None, 2, 2, 1024) 0 conv4_block4_add[0][0] __________________________________________________________________________________________________ conv4_block5_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block4_out[0][0] __________________________________________________________________________________________________ conv4_block5_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block5_1_conv[0][0] __________________________________________________________________________________________________ conv4_block5_1_relu (Activation (None, 2, 2, 256) 0 conv4_block5_1_bn[0][0] __________________________________________________________________________________________________ conv4_block5_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block5_1_relu[0][0] __________________________________________________________________________________________________ conv4_block5_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block5_2_conv[0][0] __________________________________________________________________________________________________ conv4_block5_2_relu (Activation (None, 2, 2, 256) 0 conv4_block5_2_bn[0][0] __________________________________________________________________________________________________ conv4_block5_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block5_2_relu[0][0] __________________________________________________________________________________________________ conv4_block5_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block5_3_conv[0][0] __________________________________________________________________________________________________ conv4_block5_add (Add) (None, 2, 2, 1024) 0 conv4_block4_out[0][0] conv4_block5_3_bn[0][0] __________________________________________________________________________________________________ conv4_block5_out (Activation) (None, 2, 2, 1024) 0 conv4_block5_add[0][0] __________________________________________________________________________________________________ conv4_block6_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block5_out[0][0] __________________________________________________________________________________________________ conv4_block6_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block6_1_conv[0][0] __________________________________________________________________________________________________ conv4_block6_1_relu (Activation (None, 2, 2, 256) 0 
conv4_block6_1_bn[0][0] __________________________________________________________________________________________________ conv4_block6_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block6_1_relu[0][0] __________________________________________________________________________________________________ conv4_block6_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block6_2_conv[0][0] __________________________________________________________________________________________________ conv4_block6_2_relu (Activation (None, 2, 2, 256) 0 conv4_block6_2_bn[0][0] __________________________________________________________________________________________________ conv4_block6_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block6_2_relu[0][0] __________________________________________________________________________________________________ conv4_block6_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block6_3_conv[0][0] __________________________________________________________________________________________________ conv4_block6_add (Add) (None, 2, 2, 1024) 0 conv4_block5_out[0][0] conv4_block6_3_bn[0][0] __________________________________________________________________________________________________ conv4_block6_out (Activation) (None, 2, 2, 1024) 0 conv4_block6_add[0][0] __________________________________________________________________________________________________ conv5_block1_1_conv (Conv2D) (None, 1, 1, 512) 524800 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block1_1_conv[0][0] __________________________________________________________________________________________________ conv5_block1_1_relu (Activation (None, 1, 1, 512) 0 conv5_block1_1_bn[0][0] __________________________________________________________________________________________________ conv5_block1_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block1_1_relu[0][0] __________________________________________________________________________________________________ conv5_block1_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block1_2_conv[0][0] __________________________________________________________________________________________________ conv5_block1_2_relu (Activation (None, 1, 1, 512) 0 conv5_block1_2_bn[0][0] __________________________________________________________________________________________________ conv5_block1_0_conv (Conv2D) (None, 1, 1, 2048) 2099200 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block1_2_relu[0][0] __________________________________________________________________________________________________ conv5_block1_0_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block1_0_conv[0][0] __________________________________________________________________________________________________ conv5_block1_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block1_3_conv[0][0] __________________________________________________________________________________________________ conv5_block1_add (Add) (None, 1, 1, 2048) 0 conv5_block1_0_bn[0][0] conv5_block1_3_bn[0][0] __________________________________________________________________________________________________ conv5_block1_out (Activation) (None, 1, 1, 2048) 0 conv5_block1_add[0][0] __________________________________________________________________________________________________ conv5_block2_1_conv 
(Conv2D) (None, 1, 1, 512) 1049088 conv5_block1_out[0][0] __________________________________________________________________________________________________ conv5_block2_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block2_1_conv[0][0] __________________________________________________________________________________________________ conv5_block2_1_relu (Activation (None, 1, 1, 512) 0 conv5_block2_1_bn[0][0] __________________________________________________________________________________________________ conv5_block2_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block2_1_relu[0][0] __________________________________________________________________________________________________ conv5_block2_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block2_2_conv[0][0] __________________________________________________________________________________________________ conv5_block2_2_relu (Activation (None, 1, 1, 512) 0 conv5_block2_2_bn[0][0] __________________________________________________________________________________________________ conv5_block2_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block2_2_relu[0][0] __________________________________________________________________________________________________ conv5_block2_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block2_3_conv[0][0] __________________________________________________________________________________________________ conv5_block2_add (Add) (None, 1, 1, 2048) 0 conv5_block1_out[0][0] conv5_block2_3_bn[0][0] __________________________________________________________________________________________________ conv5_block2_out (Activation) (None, 1, 1, 2048) 0 conv5_block2_add[0][0] __________________________________________________________________________________________________ conv5_block3_1_conv (Conv2D) (None, 1, 1, 512) 1049088 conv5_block2_out[0][0] __________________________________________________________________________________________________ conv5_block3_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block3_1_conv[0][0] __________________________________________________________________________________________________ conv5_block3_1_relu (Activation (None, 1, 1, 512) 0 conv5_block3_1_bn[0][0] __________________________________________________________________________________________________ conv5_block3_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block3_1_relu[0][0] __________________________________________________________________________________________________ conv5_block3_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block3_2_conv[0][0] __________________________________________________________________________________________________ conv5_block3_2_relu (Activation (None, 1, 1, 512) 0 conv5_block3_2_bn[0][0] __________________________________________________________________________________________________ conv5_block3_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block3_2_relu[0][0] __________________________________________________________________________________________________ conv5_block3_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block3_3_conv[0][0] __________________________________________________________________________________________________ conv5_block3_add (Add) (None, 1, 1, 2048) 0 conv5_block2_out[0][0] conv5_block3_3_bn[0][0] __________________________________________________________________________________________________ conv5_block3_out (Activation) (None, 1, 1, 2048) 0 conv5_block3_add[0][0] 
__________________________________________________________________________________________________ avg_pool (GlobalAveragePooling2 (None, 2048) 0 conv5_block3_out[0][0] __________________________________________________________________________________________________ r1_mean_dense (Dense) (None, 1024) 2098176 avg_pool[0][0] __________________________________________________________________________________________________ r1_logvar_dense (Dense) (None, 1024) 2098176 avg_pool[0][0] __________________________________________________________________________________________________ r1_modes_dense (Dense) (None, 32) 65568 avg_pool[0][0] __________________________________________________________________________________________________ concatenate_1 (Concatenate) (None, 2080) 0 r1_mean_dense[0][0] r1_logvar_dense[0][0] r1_modes_dense[0][0] ================================================================================================== Total params: 28,029,728 Trainable params: 27,975,840 Non-trainable params: 53,888 __________________________________________________________________________________________________ Model: "encoder_q" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== input_1 (InputLayer) [(None, 2048, 3)] 0 __________________________________________________________________________________________________ tf.split (TFOpLambda) [(None, 2048, 1), (N 0 input_1[0][0] __________________________________________________________________________________________________ shared_det_0_conv_0 (Conv1D) (None, 2048, 32) 2080 tf.split[0][0] __________________________________________________________________________________________________ shared_det_0_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_0_conv_0[0][0] __________________________________________________________________________________________________ leaky_re_lu (LeakyReLU) multiple 0 shared_det_0_conv_0_batchnorm[0][ shared_det_0_conv_1_batchnorm[0][ shared_det_0_conv_2_batchnorm[0][ shared_det_0_conv_3_batchnorm[0][ shared_det_1_conv_0_batchnorm[0][ shared_det_1_conv_1_batchnorm[0][ shared_det_1_conv_2_batchnorm[0][ shared_det_1_conv_3_batchnorm[0][ shared_det_2_conv_0_batchnorm[0][ shared_det_2_conv_1_batchnorm[0][ shared_det_2_conv_2_batchnorm[0][ shared_det_2_conv_3_batchnorm[0][ __________________________________________________________________________________________________ max_pooling1d (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[0][0] __________________________________________________________________________________________________ shared_det_0_conv_1 (Conv1D) (None, 512, 32) 32800 max_pooling1d[0][0] __________________________________________________________________________________________________ shared_det_0_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_0_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_1 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[1][0] __________________________________________________________________________________________________ shared_det_0_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_1[0][0] __________________________________________________________________________________________________ shared_det_0_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_0_conv_2[0][0] 
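For orientation, the following is a minimal tf.keras sketch (not the authors' code) of how a model with the encoder_r1 layout above could be assembled: each detector channel goes through its own Conv1D/BatchNorm/LeakyReLU/MaxPool stack, the three resulting 32x32 feature maps are stacked into a 3-channel image, passed through a ResNet-50 trunk with global average pooling, and projected into concatenated mean, log-variance and mode outputs. The kernel sizes (64, 32, 16, 8) and strides (1, 2, 1, 2) are inferred from the parameter counts and output lengths in the summary; names such as build_encoder_r1, n_samples, latent_dim and n_modes are illustrative assumptions.

    # Sketch only: layer sizes follow the encoder_r1 summary; hyperparameters not
    # visible in the summary (LeakyReLU slope, initialisers, etc.) are assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers

    def build_encoder_r1(n_samples=2048, n_detectors=3, latent_dim=1024, n_modes=32):
        x_in = tf.keras.Input(shape=(n_samples, n_detectors), name="input_1")
        strains = tf.split(x_in, n_detectors, axis=-1)       # three (None, 2048, 1) tensors

        act = layers.LeakyReLU()                              # one shared activation ("multiple" in the summary)
        maps = []
        for d, h in enumerate(strains):
            # Per-detector Conv1D stack: shared_det_{d}_conv_{0..3} with batchnorm + pooling.
            for i, (kernel, stride) in enumerate([(64, 1), (32, 2), (16, 1), (8, 2)]):
                h = layers.Conv1D(32, kernel, strides=stride, padding="same",
                                  name=f"shared_det_{d}_conv_{i}")(h)
                h = layers.BatchNormalization(name=f"shared_det_{d}_conv_{i}_batchnorm")(h)
                h = act(h)
                h = layers.MaxPooling1D(2)(h)                 # halves the time axis each stage
            maps.append(tf.expand_dims(h, -1))                # (None, 32, 32, 1)

        img = layers.Concatenate()(maps)                      # (None, 32, 32, 3) "image"
        resnet = tf.keras.applications.ResNet50(include_top=False, weights=None,
                                                input_shape=(32, 32, 3), pooling="avg")
        feat = resnet(img)                                    # (None, 2048) after global average pooling

        mean   = layers.Dense(latent_dim, name="r1_mean_dense")(feat)
        logvar = layers.Dense(latent_dim, name="r1_logvar_dense")(feat)
        modes  = layers.Dense(n_modes,    name="r1_modes_dense")(feat)
        out = layers.Concatenate()([mean, logvar, modes])     # (None, 2080)
        return tf.keras.Model(x_in, out, name="encoder_r1")

Note that this sketch nests ResNet-50 as a sub-model, whereas the summary above shows its layers inlined into the encoder graph; the computation is the same. The encoder_q summary that follows uses the same trunk but additionally flattens the 15-dimensional input_2, expands it with the q_expand Dense layer, reshapes it to a 32x32x1 map, and concatenates it as a fourth channel before the ResNet stage, which is why its conv1_conv has 12,608 parameters rather than 9,472.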
Model: "encoder_q"
__________________________________________________________________________________________________
Layer (type)  Output Shape  Param #  Connected to
==================================================================================================
input_1 (InputLayer)  [(None, 2048, 3)]  0
tf.split (TFOpLambda)  [(None, 2048, 1), (None, 2048, 1), (None, 2048, 1)]  0  input_1[0][0]
shared_det_0_conv_0 (Conv1D)  (None, 2048, 32)  2080  tf.split[0][0]
shared_det_0_conv_0_batchnorm (BatchNormalization)  (None, 2048, 32)  128  shared_det_0_conv_0[0][0]
leaky_re_lu (LeakyReLU)  multiple  0  shared_det_0_conv_0_batchnorm[0][0], shared_det_0_conv_1_batchnorm[0][0], shared_det_0_conv_2_batchnorm[0][0], shared_det_0_conv_3_batchnorm[0][0], shared_det_1_conv_0_batchnorm[0][0], shared_det_1_conv_1_batchnorm[0][0], shared_det_1_conv_2_batchnorm[0][0], shared_det_1_conv_3_batchnorm[0][0], shared_det_2_conv_0_batchnorm[0][0], shared_det_2_conv_1_batchnorm[0][0], shared_det_2_conv_2_batchnorm[0][0], shared_det_2_conv_3_batchnorm[0][0]
max_pooling1d (MaxPooling1D)  (None, 1024, 32)  0  leaky_re_lu[0][0]
shared_det_0_conv_1 (Conv1D)  (None, 512, 32)  32800  max_pooling1d[0][0]
shared_det_0_conv_1_batchnorm (BatchNormalization)  (None, 512, 32)  128  shared_det_0_conv_1[0][0]
max_pooling1d_1 (MaxPooling1D)  (None, 256, 32)  0  leaky_re_lu[1][0]
shared_det_0_conv_2 (Conv1D)  (None, 256, 32)  16416  max_pooling1d_1[0][0]
shared_det_0_conv_2_batchnorm (BatchNormalization)  (None, 256, 32)  128  shared_det_0_conv_2[0][0]
max_pooling1d_2 (MaxPooling1D)  (None, 128, 32)  0  leaky_re_lu[2][0]
shared_det_0_conv_3 (Conv1D)  (None, 64, 32)  8224  max_pooling1d_2[0][0]
shared_det_1_conv_0 (Conv1D)  (None, 2048, 32)  2080  tf.split[0][1]
shared_det_0_conv_3_batchnorm (BatchNormalization)  (None, 64, 32)  128  shared_det_0_conv_3[0][0]
shared_det_1_conv_0_batchnorm (BatchNormalization)  (None, 2048, 32)  128  shared_det_1_conv_0[0][0]
max_pooling1d_4 (MaxPooling1D)  (None, 1024, 32)  0  leaky_re_lu[4][0]
shared_det_1_conv_1 (Conv1D)  (None, 512, 32)  32800  max_pooling1d_4[0][0]
shared_det_1_conv_1_batchnorm (BatchNormalization)  (None, 512, 32)  128  shared_det_1_conv_1[0][0]
max_pooling1d_5 (MaxPooling1D)  (None, 256, 32)  0  leaky_re_lu[5][0]
shared_det_1_conv_2 (Conv1D)  (None, 256, 32)  16416  max_pooling1d_5[0][0]
shared_det_1_conv_2_batchnorm (BatchNormalization)  (None, 256, 32)  128  shared_det_1_conv_2[0][0]
max_pooling1d_6 (MaxPooling1D)  (None, 128, 32)  0  leaky_re_lu[6][0]
shared_det_1_conv_3 (Conv1D)  (None, 64, 32)  8224  max_pooling1d_6[0][0]
shared_det_2_conv_0 (Conv1D)  (None, 2048, 32)  2080  tf.split[0][2]
shared_det_1_conv_3_batchnorm (BatchNormalization)  (None, 64, 32)  128  shared_det_1_conv_3[0][0]
shared_det_2_conv_0_batchnorm (BatchNormalization)  (None, 2048, 32)  128  shared_det_2_conv_0[0][0]
max_pooling1d_8 (MaxPooling1D)  (None, 1024, 32)  0  leaky_re_lu[8][0]
shared_det_2_conv_1 (Conv1D)  (None, 512, 32)  32800  max_pooling1d_8[0][0]
shared_det_2_conv_1_batchnorm (BatchNormalization)  (None, 512, 32)  128  shared_det_2_conv_1[0][0]
max_pooling1d_9 (MaxPooling1D)  (None, 256, 32)  0  leaky_re_lu[9][0]
shared_det_2_conv_2 (Conv1D)  (None, 256, 32)  16416  max_pooling1d_9[0][0]
shared_det_2_conv_2_batchnorm (BatchNormalization)  (None, 256, 32)  128  shared_det_2_conv_2[0][0]
max_pooling1d_10 (MaxPooling1D)  (None, 128, 32)  0  leaky_re_lu[10][0]
shared_det_2_conv_3 (Conv1D)  (None, 64, 32)  8224  max_pooling1d_10[0][0]
shared_det_2_conv_3_batchnorm (BatchNormalization)  (None, 64, 32)  128  shared_det_2_conv_3[0][0]
input_2 (InputLayer)  [(None, 15)]  0
max_pooling1d_3 (MaxPooling1D)  (None, 32, 32)  0  leaky_re_lu[3][0]
max_pooling1d_7 (MaxPooling1D)  (None, 32, 32)  0  leaky_re_lu[7][0]
max_pooling1d_11 (MaxPooling1D)  (None, 32, 32)  0  leaky_re_lu[11][0]
flatten (Flatten)  (None, 15)  0  input_2[0][0]
tf.expand_dims (TFOpLambda)  (None, 32, 32, 1)  0  max_pooling1d_3[0][0]
tf.expand_dims_1 (TFOpLambda)  (None, 32, 32, 1)  0  max_pooling1d_7[0][0]
tf.expand_dims_2 (TFOpLambda)  (None, 32, 32, 1)  0  max_pooling1d_11[0][0]
q_expand (Dense)  (None, 1024)  16384  flatten[0][0]
concatenate (Concatenate)  (None, 32, 32, 3)  0  tf.expand_dims[0][0], tf.expand_dims_1[0][0], tf.expand_dims_2[0][0]
tf.reshape (TFOpLambda)  (None, 32, 32, 1)  0  q_expand[0][0]
concatenate_2 (Concatenate)  (None, 32, 32, 4)  0  concatenate[0][0], tf.reshape[0][0]
conv1_pad (ZeroPadding2D)  (None, 38, 38, 4)  0  concatenate_2[0][0]
conv1_conv (Conv2D)  (None, 16, 16, 64)  12608  conv1_pad[0][0]
conv1_bn (BatchNormalization)  (None, 16, 16, 64)  256  conv1_conv[0][0]
conv1_relu (Activation)  (None, 16, 16, 64)  0  conv1_bn[0][0]
pool1_pad (ZeroPadding2D)  (None, 18, 18, 64)  0  conv1_relu[0][0]
pool1_pool (MaxPooling2D)  (None, 8, 8, 64)  0  pool1_pad[0][0]
conv2_block1_1_conv (Conv2D)  (None, 8, 8, 64)  4160  pool1_pool[0][0]
conv2_block1_1_bn (BatchNormalization)  (None, 8, 8, 64)  256  conv2_block1_1_conv[0][0]
conv2_block1_1_relu (Activation)  (None, 8, 8, 64)  0  conv2_block1_1_bn[0][0]
conv2_block1_2_conv (Conv2D)  (None, 8, 8, 64)  36928  conv2_block1_1_relu[0][0]
conv2_block1_2_bn (BatchNormalization)  (None, 8, 8, 64)  256  conv2_block1_2_conv[0][0]
conv2_block1_2_relu (Activation)  (None, 8, 8, 64)  0  conv2_block1_2_bn[0][0]
conv2_block1_0_conv (Conv2D)  (None, 8, 8, 256)  16640  pool1_pool[0][0]
conv2_block1_3_conv (Conv2D)  (None, 8, 8, 256)  16640  conv2_block1_2_relu[0][0]
conv2_block1_0_bn (BatchNormalization)  (None, 8, 8, 256)  1024  conv2_block1_0_conv[0][0]
conv2_block1_3_bn (BatchNormalization)  (None, 8, 8, 256)  1024  conv2_block1_3_conv[0][0]
conv2_block1_add (Add)  (None, 8, 8, 256)  0  conv2_block1_0_bn[0][0], conv2_block1_3_bn[0][0]
conv2_block1_out (Activation)  (None, 8, 8, 256)  0  conv2_block1_add[0][0]
conv2_block2_1_conv (Conv2D)  (None, 8, 8, 64)  16448  conv2_block1_out[0][0]
conv2_block2_1_bn (BatchNormalization)  (None, 8, 8, 64)  256  conv2_block2_1_conv[0][0]
conv2_block2_1_relu (Activation)  (None, 8, 8, 64)  0  conv2_block2_1_bn[0][0]
conv2_block2_2_conv (Conv2D)  (None, 8, 8, 64)  36928  conv2_block2_1_relu[0][0]
conv2_block2_2_bn (BatchNormalization)  (None, 8, 8, 64)  256  conv2_block2_2_conv[0][0]
conv2_block2_2_relu (Activation)  (None, 8, 8, 64)  0  conv2_block2_2_bn[0][0]
conv2_block2_3_conv (Conv2D)  (None, 8, 8, 256)  16640  conv2_block2_2_relu[0][0]
conv2_block2_3_bn (BatchNormalization)  (None, 8, 8, 256)  1024  conv2_block2_3_conv[0][0]
conv2_block2_add (Add)  (None, 8, 8, 256)  0  conv2_block1_out[0][0], conv2_block2_3_bn[0][0]
conv2_block2_out (Activation)  (None, 8, 8, 256)  0  conv2_block2_add[0][0]
conv2_block3_1_conv (Conv2D)  (None, 8, 8, 64)  16448  conv2_block2_out[0][0]
conv2_block3_1_bn (BatchNormalization)  (None, 8, 8, 64)  256  conv2_block3_1_conv[0][0]
conv2_block3_1_relu (Activation)  (None, 8, 8, 64)  0  conv2_block3_1_bn[0][0]
conv2_block3_2_conv (Conv2D)  (None, 8, 8, 64)  36928  conv2_block3_1_relu[0][0]
conv2_block3_2_bn (BatchNormalization)  (None, 8, 8, 64)  256  conv2_block3_2_conv[0][0]
conv2_block3_2_relu (Activation)  (None, 8, 8, 64)  0  conv2_block3_2_bn[0][0]
conv2_block3_3_conv (Conv2D)  (None, 8, 8, 256)  16640  conv2_block3_2_relu[0][0]
conv2_block3_3_bn (BatchNormalization)  (None, 8, 8, 256)  1024  conv2_block3_3_conv[0][0]
conv2_block3_add (Add)  (None, 8, 8, 256)  0  conv2_block2_out[0][0], conv2_block3_3_bn[0][0]
conv2_block3_out (Activation)  (None, 8, 8, 256)  0  conv2_block3_add[0][0]
conv3_block1_1_conv (Conv2D)  (None, 4, 4, 128)  32896  conv2_block3_out[0][0]
conv3_block1_1_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block1_1_conv[0][0]
conv3_block1_1_relu (Activation)  (None, 4, 4, 128)  0  conv3_block1_1_bn[0][0]
conv3_block1_2_conv (Conv2D)  (None, 4, 4, 128)  147584  conv3_block1_1_relu[0][0]
conv3_block1_2_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block1_2_conv[0][0]
conv3_block1_2_relu (Activation)  (None, 4, 4, 128)  0  conv3_block1_2_bn[0][0]
conv3_block1_0_conv (Conv2D)  (None, 4, 4, 512)  131584  conv2_block3_out[0][0]
conv3_block1_3_conv (Conv2D)  (None, 4, 4, 512)  66048  conv3_block1_2_relu[0][0]
conv3_block1_0_bn (BatchNormalization)  (None, 4, 4, 512)  2048  conv3_block1_0_conv[0][0]
conv3_block1_3_bn (BatchNormalization)  (None, 4, 4, 512)  2048  conv3_block1_3_conv[0][0]
conv3_block1_add (Add)  (None, 4, 4, 512)  0  conv3_block1_0_bn[0][0], conv3_block1_3_bn[0][0]
conv3_block1_out (Activation)  (None, 4, 4, 512)  0  conv3_block1_add[0][0]
conv3_block2_1_conv (Conv2D)  (None, 4, 4, 128)  65664  conv3_block1_out[0][0]
conv3_block2_1_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block2_1_conv[0][0]
conv3_block2_1_relu (Activation)  (None, 4, 4, 128)  0  conv3_block2_1_bn[0][0]
conv3_block2_2_conv (Conv2D)  (None, 4, 4, 128)  147584  conv3_block2_1_relu[0][0]
conv3_block2_2_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block2_2_conv[0][0]
conv3_block2_2_relu (Activation)  (None, 4, 4, 128)  0  conv3_block2_2_bn[0][0]
conv3_block2_3_conv (Conv2D)  (None, 4, 4, 512)  66048  conv3_block2_2_relu[0][0]
conv3_block2_3_bn (BatchNormalization)  (None, 4, 4, 512)  2048  conv3_block2_3_conv[0][0]
conv3_block2_add (Add)  (None, 4, 4, 512)  0  conv3_block1_out[0][0], conv3_block2_3_bn[0][0]
conv3_block2_out (Activation)  (None, 4, 4, 512)  0  conv3_block2_add[0][0]
conv3_block3_1_conv (Conv2D)  (None, 4, 4, 128)  65664  conv3_block2_out[0][0]
conv3_block3_1_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block3_1_conv[0][0]
conv3_block3_1_relu (Activation)  (None, 4, 4, 128)  0  conv3_block3_1_bn[0][0]
conv3_block3_2_conv (Conv2D)  (None, 4, 4, 128)  147584  conv3_block3_1_relu[0][0]
conv3_block3_2_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block3_2_conv[0][0]
conv3_block3_2_relu (Activation)  (None, 4, 4, 128)  0  conv3_block3_2_bn[0][0]
conv3_block3_3_conv (Conv2D)  (None, 4, 4, 512)  66048  conv3_block3_2_relu[0][0]
conv3_block3_3_bn (BatchNormalization)  (None, 4, 4, 512)  2048  conv3_block3_3_conv[0][0]
conv3_block3_add (Add)  (None, 4, 4, 512)  0  conv3_block2_out[0][0], conv3_block3_3_bn[0][0]
conv3_block3_out (Activation)  (None, 4, 4, 512)  0  conv3_block3_add[0][0]
conv3_block4_1_conv (Conv2D)  (None, 4, 4, 128)  65664  conv3_block3_out[0][0]
conv3_block4_1_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block4_1_conv[0][0]
conv3_block4_1_relu (Activation)  (None, 4, 4, 128)  0  conv3_block4_1_bn[0][0]
conv3_block4_2_conv (Conv2D)  (None, 4, 4, 128)  147584  conv3_block4_1_relu[0][0]
conv3_block4_2_bn (BatchNormalization)  (None, 4, 4, 128)  512  conv3_block4_2_conv[0][0]
conv3_block4_2_relu (Activation)  (None, 4, 4, 128)  0  conv3_block4_2_bn[0][0]
conv3_block4_3_conv (Conv2D)  (None, 4, 4, 512)  66048  conv3_block4_2_relu[0][0]
conv3_block4_3_bn (BatchNormalization)  (None, 4, 4, 512)  2048  conv3_block4_3_conv[0][0]
conv3_block4_add (Add)  (None, 4, 4, 512)  0  conv3_block3_out[0][0], conv3_block4_3_bn[0][0]
conv3_block4_out (Activation)  (None, 4, 4, 512)  0  conv3_block4_add[0][0]
conv4_block1_1_conv (Conv2D)  (None, 2, 2, 256)  131328  conv3_block4_out[0][0]
conv4_block1_1_bn (BatchNormalization)  (None, 2, 2, 256)  1024  conv4_block1_1_conv[0][0]
conv4_block1_1_relu (Activation)  (None, 2, 2, 256)  0  conv4_block1_1_bn[0][0]
conv4_block1_2_conv (Conv2D)  (None, 2, 2, 256)  590080  conv4_block1_1_relu[0][0]
conv4_block1_2_bn (BatchNormali (None, 2, 2,
256) 1024 conv4_block1_2_conv[0][0] __________________________________________________________________________________________________ conv4_block1_2_relu (Activation (None, 2, 2, 256) 0 conv4_block1_2_bn[0][0] __________________________________________________________________________________________________ conv4_block1_0_conv (Conv2D) (None, 2, 2, 1024) 525312 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block1_2_relu[0][0] __________________________________________________________________________________________________ conv4_block1_0_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block1_0_conv[0][0] __________________________________________________________________________________________________ conv4_block1_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block1_3_conv[0][0] __________________________________________________________________________________________________ conv4_block1_add (Add) (None, 2, 2, 1024) 0 conv4_block1_0_bn[0][0] conv4_block1_3_bn[0][0] __________________________________________________________________________________________________ conv4_block1_out (Activation) (None, 2, 2, 1024) 0 conv4_block1_add[0][0] __________________________________________________________________________________________________ conv4_block2_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block1_out[0][0] __________________________________________________________________________________________________ conv4_block2_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block2_1_conv[0][0] __________________________________________________________________________________________________ conv4_block2_1_relu (Activation (None, 2, 2, 256) 0 conv4_block2_1_bn[0][0] __________________________________________________________________________________________________ conv4_block2_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block2_1_relu[0][0] __________________________________________________________________________________________________ conv4_block2_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block2_2_conv[0][0] __________________________________________________________________________________________________ conv4_block2_2_relu (Activation (None, 2, 2, 256) 0 conv4_block2_2_bn[0][0] __________________________________________________________________________________________________ conv4_block2_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block2_2_relu[0][0] __________________________________________________________________________________________________ conv4_block2_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block2_3_conv[0][0] __________________________________________________________________________________________________ conv4_block2_add (Add) (None, 2, 2, 1024) 0 conv4_block1_out[0][0] conv4_block2_3_bn[0][0] __________________________________________________________________________________________________ conv4_block2_out (Activation) (None, 2, 2, 1024) 0 conv4_block2_add[0][0] __________________________________________________________________________________________________ conv4_block3_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block2_out[0][0] __________________________________________________________________________________________________ conv4_block3_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block3_1_conv[0][0] __________________________________________________________________________________________________ 
conv4_block3_1_relu (Activation (None, 2, 2, 256) 0 conv4_block3_1_bn[0][0] __________________________________________________________________________________________________ conv4_block3_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block3_1_relu[0][0] __________________________________________________________________________________________________ conv4_block3_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block3_2_conv[0][0] __________________________________________________________________________________________________ conv4_block3_2_relu (Activation (None, 2, 2, 256) 0 conv4_block3_2_bn[0][0] __________________________________________________________________________________________________ conv4_block3_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block3_2_relu[0][0] __________________________________________________________________________________________________ conv4_block3_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block3_3_conv[0][0] __________________________________________________________________________________________________ conv4_block3_add (Add) (None, 2, 2, 1024) 0 conv4_block2_out[0][0] conv4_block3_3_bn[0][0] __________________________________________________________________________________________________ conv4_block3_out (Activation) (None, 2, 2, 1024) 0 conv4_block3_add[0][0] __________________________________________________________________________________________________ conv4_block4_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block3_out[0][0] __________________________________________________________________________________________________ conv4_block4_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block4_1_conv[0][0] __________________________________________________________________________________________________ conv4_block4_1_relu (Activation (None, 2, 2, 256) 0 conv4_block4_1_bn[0][0] __________________________________________________________________________________________________ conv4_block4_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block4_1_relu[0][0] __________________________________________________________________________________________________ conv4_block4_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block4_2_conv[0][0] __________________________________________________________________________________________________ conv4_block4_2_relu (Activation (None, 2, 2, 256) 0 conv4_block4_2_bn[0][0] __________________________________________________________________________________________________ conv4_block4_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block4_2_relu[0][0] __________________________________________________________________________________________________ conv4_block4_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block4_3_conv[0][0] __________________________________________________________________________________________________ conv4_block4_add (Add) (None, 2, 2, 1024) 0 conv4_block3_out[0][0] conv4_block4_3_bn[0][0] __________________________________________________________________________________________________ conv4_block4_out (Activation) (None, 2, 2, 1024) 0 conv4_block4_add[0][0] __________________________________________________________________________________________________ conv4_block5_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block4_out[0][0] __________________________________________________________________________________________________ conv4_block5_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block5_1_conv[0][0] 
__________________________________________________________________________________________________ conv4_block5_1_relu (Activation (None, 2, 2, 256) 0 conv4_block5_1_bn[0][0] __________________________________________________________________________________________________ conv4_block5_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block5_1_relu[0][0] __________________________________________________________________________________________________ conv4_block5_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block5_2_conv[0][0] __________________________________________________________________________________________________ conv4_block5_2_relu (Activation (None, 2, 2, 256) 0 conv4_block5_2_bn[0][0] __________________________________________________________________________________________________ conv4_block5_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block5_2_relu[0][0] __________________________________________________________________________________________________ conv4_block5_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block5_3_conv[0][0] __________________________________________________________________________________________________ conv4_block5_add (Add) (None, 2, 2, 1024) 0 conv4_block4_out[0][0] conv4_block5_3_bn[0][0] __________________________________________________________________________________________________ conv4_block5_out (Activation) (None, 2, 2, 1024) 0 conv4_block5_add[0][0] __________________________________________________________________________________________________ conv4_block6_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block5_out[0][0] __________________________________________________________________________________________________ conv4_block6_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block6_1_conv[0][0] __________________________________________________________________________________________________ conv4_block6_1_relu (Activation (None, 2, 2, 256) 0 conv4_block6_1_bn[0][0] __________________________________________________________________________________________________ conv4_block6_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block6_1_relu[0][0] __________________________________________________________________________________________________ conv4_block6_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block6_2_conv[0][0] __________________________________________________________________________________________________ conv4_block6_2_relu (Activation (None, 2, 2, 256) 0 conv4_block6_2_bn[0][0] __________________________________________________________________________________________________ conv4_block6_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block6_2_relu[0][0] __________________________________________________________________________________________________ conv4_block6_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block6_3_conv[0][0] __________________________________________________________________________________________________ conv4_block6_add (Add) (None, 2, 2, 1024) 0 conv4_block5_out[0][0] conv4_block6_3_bn[0][0] __________________________________________________________________________________________________ conv4_block6_out (Activation) (None, 2, 2, 1024) 0 conv4_block6_add[0][0] __________________________________________________________________________________________________ conv5_block1_1_conv (Conv2D) (None, 1, 1, 512) 524800 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_1_bn (BatchNormali (None, 1, 1, 512) 2048 
conv5_block1_1_conv[0][0] __________________________________________________________________________________________________ conv5_block1_1_relu (Activation (None, 1, 1, 512) 0 conv5_block1_1_bn[0][0] __________________________________________________________________________________________________ conv5_block1_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block1_1_relu[0][0] __________________________________________________________________________________________________ conv5_block1_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block1_2_conv[0][0] __________________________________________________________________________________________________ conv5_block1_2_relu (Activation (None, 1, 1, 512) 0 conv5_block1_2_bn[0][0] __________________________________________________________________________________________________ conv5_block1_0_conv (Conv2D) (None, 1, 1, 2048) 2099200 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block1_2_relu[0][0] __________________________________________________________________________________________________ conv5_block1_0_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block1_0_conv[0][0] __________________________________________________________________________________________________ conv5_block1_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block1_3_conv[0][0] __________________________________________________________________________________________________ conv5_block1_add (Add) (None, 1, 1, 2048) 0 conv5_block1_0_bn[0][0] conv5_block1_3_bn[0][0] __________________________________________________________________________________________________ conv5_block1_out (Activation) (None, 1, 1, 2048) 0 conv5_block1_add[0][0] __________________________________________________________________________________________________ conv5_block2_1_conv (Conv2D) (None, 1, 1, 512) 1049088 conv5_block1_out[0][0] __________________________________________________________________________________________________ conv5_block2_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block2_1_conv[0][0] __________________________________________________________________________________________________ conv5_block2_1_relu (Activation (None, 1, 1, 512) 0 conv5_block2_1_bn[0][0] __________________________________________________________________________________________________ conv5_block2_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block2_1_relu[0][0] __________________________________________________________________________________________________ conv5_block2_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block2_2_conv[0][0] __________________________________________________________________________________________________ conv5_block2_2_relu (Activation (None, 1, 1, 512) 0 conv5_block2_2_bn[0][0] __________________________________________________________________________________________________ conv5_block2_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block2_2_relu[0][0] __________________________________________________________________________________________________ conv5_block2_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block2_3_conv[0][0] __________________________________________________________________________________________________ conv5_block2_add (Add) (None, 1, 1, 2048) 0 conv5_block1_out[0][0] conv5_block2_3_bn[0][0] __________________________________________________________________________________________________ 
conv5_block2_out (Activation)    (None, 1, 1, 2048)   0           conv5_block2_add[0][0]
__________________________________________________________________________________________________
conv5_block3_1_conv (Conv2D)     (None, 1, 1, 512)    1049088     conv5_block2_out[0][0]
__________________________________________________________________________________________________
conv5_block3_1_bn (BatchNormali  (None, 1, 1, 512)    2048        conv5_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation  (None, 1, 1, 512)    0           conv5_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_2_conv (Conv2D)     (None, 1, 1, 512)    2359808     conv5_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_2_bn (BatchNormali  (None, 1, 1, 512)    2048        conv5_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_2_relu (Activation  (None, 1, 1, 512)    0           conv5_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_3_conv (Conv2D)     (None, 1, 1, 2048)   1050624     conv5_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv5_block3_3_bn (BatchNormali  (None, 1, 1, 2048)   8192        conv5_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv5_block3_add (Add)           (None, 1, 1, 2048)   0           conv5_block2_out[0][0]
                                                                  conv5_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv5_block3_out (Activation)    (None, 1, 1, 2048)   0           conv5_block3_add[0][0]
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2  (None, 2048)         0           conv5_block3_out[0][0]
__________________________________________________________________________________________________
q_mean_dense (Dense)             (None, 32)           65568       avg_pool[0][0]
__________________________________________________________________________________________________
q_logvar_dense (Dense)           (None, 32)           65568       avg_pool[0][0]
__________________________________________________________________________________________________
concatenate_3 (Concatenate)      (None, 64)           0           q_mean_dense[0][0]
                                                                  q_logvar_dense[0][0]
==================================================================================================
Total params: 23,918,464
Trainable params: 23,864,576
Non-trainable params: 53,888
__________________________________________________________________________________________________
Model: "decoder_r2"
__________________________________________________________________________________________________
Layer (type)                     Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)             [(None, 2048, 3)]    0
__________________________________________________________________________________________________
tf.split (TFOpLambda)            [(None, 2048, 1), (N 0           input_1[0][0]
__________________________________________________________________________________________________
shared_det_0_conv_0 (Conv1D)     (None, 2048, 32)     2080        tf.split[0][0]
__________________________________________________________________________________________________
shared_det_0_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_0_conv_0[0][0] __________________________________________________________________________________________________ leaky_re_lu (LeakyReLU) multiple 0 shared_det_0_conv_0_batchnorm[0][ shared_det_0_conv_1_batchnorm[0][ shared_det_0_conv_2_batchnorm[0][ shared_det_0_conv_3_batchnorm[0][ shared_det_1_conv_0_batchnorm[0][ shared_det_1_conv_1_batchnorm[0][ shared_det_1_conv_2_batchnorm[0][ shared_det_1_conv_3_batchnorm[0][ shared_det_2_conv_0_batchnorm[0][ shared_det_2_conv_1_batchnorm[0][ shared_det_2_conv_2_batchnorm[0][ shared_det_2_conv_3_batchnorm[0][ __________________________________________________________________________________________________ max_pooling1d (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[0][0] __________________________________________________________________________________________________ shared_det_0_conv_1 (Conv1D) (None, 512, 32) 32800 max_pooling1d[0][0] __________________________________________________________________________________________________ shared_det_0_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_0_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_1 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[1][0] __________________________________________________________________________________________________ shared_det_0_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_1[0][0] __________________________________________________________________________________________________ shared_det_0_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_0_conv_2[0][0] __________________________________________________________________________________________________ max_pooling1d_2 (MaxPooling1D) (None, 128, 32) 0 leaky_re_lu[2][0] __________________________________________________________________________________________________ shared_det_0_conv_3 (Conv1D) (None, 64, 32) 8224 max_pooling1d_2[0][0] __________________________________________________________________________________________________ shared_det_1_conv_0 (Conv1D) (None, 2048, 32) 2080 tf.split[0][1] __________________________________________________________________________________________________ shared_det_0_conv_3_batchnorm ( (None, 64, 32) 128 shared_det_0_conv_3[0][0] __________________________________________________________________________________________________ shared_det_1_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_1_conv_0[0][0] __________________________________________________________________________________________________ max_pooling1d_4 (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[4][0] __________________________________________________________________________________________________ shared_det_1_conv_1 (Conv1D) (None, 512, 32) 32800 max_pooling1d_4[0][0] __________________________________________________________________________________________________ shared_det_1_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_1_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_5 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[5][0] __________________________________________________________________________________________________ shared_det_1_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_5[0][0] __________________________________________________________________________________________________ shared_det_1_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_1_conv_2[0][0] 
__________________________________________________________________________________________________ max_pooling1d_6 (MaxPooling1D) (None, 128, 32) 0 leaky_re_lu[6][0] __________________________________________________________________________________________________ shared_det_1_conv_3 (Conv1D) (None, 64, 32) 8224 max_pooling1d_6[0][0] __________________________________________________________________________________________________ shared_det_2_conv_0 (Conv1D) (None, 2048, 32) 2080 tf.split[0][2] __________________________________________________________________________________________________ shared_det_1_conv_3_batchnorm ( (None, 64, 32) 128 shared_det_1_conv_3[0][0] __________________________________________________________________________________________________ shared_det_2_conv_0_batchnorm ( (None, 2048, 32) 128 shared_det_2_conv_0[0][0] __________________________________________________________________________________________________ max_pooling1d_8 (MaxPooling1D) (None, 1024, 32) 0 leaky_re_lu[8][0] __________________________________________________________________________________________________ shared_det_2_conv_1 (Conv1D) (None, 512, 32) 32800 max_pooling1d_8[0][0] __________________________________________________________________________________________________ shared_det_2_conv_1_batchnorm ( (None, 512, 32) 128 shared_det_2_conv_1[0][0] __________________________________________________________________________________________________ max_pooling1d_9 (MaxPooling1D) (None, 256, 32) 0 leaky_re_lu[9][0] __________________________________________________________________________________________________ shared_det_2_conv_2 (Conv1D) (None, 256, 32) 16416 max_pooling1d_9[0][0] __________________________________________________________________________________________________ shared_det_2_conv_2_batchnorm ( (None, 256, 32) 128 shared_det_2_conv_2[0][0] __________________________________________________________________________________________________ max_pooling1d_10 (MaxPooling1D) (None, 128, 32) 0 leaky_re_lu[10][0] __________________________________________________________________________________________________ shared_det_2_conv_3 (Conv1D) (None, 64, 32) 8224 max_pooling1d_10[0][0] __________________________________________________________________________________________________ shared_det_2_conv_3_batchnorm ( (None, 64, 32) 128 shared_det_2_conv_3[0][0] __________________________________________________________________________________________________ input_3 (InputLayer) [(None, 32)] 0 __________________________________________________________________________________________________ max_pooling1d_3 (MaxPooling1D) (None, 32, 32) 0 leaky_re_lu[3][0] __________________________________________________________________________________________________ max_pooling1d_7 (MaxPooling1D) (None, 32, 32) 0 leaky_re_lu[7][0] __________________________________________________________________________________________________ max_pooling1d_11 (MaxPooling1D) (None, 32, 32) 0 leaky_re_lu[11][0] __________________________________________________________________________________________________ flatten_1 (Flatten) (None, 32) 0 input_3[0][0] __________________________________________________________________________________________________ tf.expand_dims (TFOpLambda) (None, 32, 32, 1) 0 max_pooling1d_3[0][0] __________________________________________________________________________________________________ tf.expand_dims_1 (TFOpLambda) (None, 32, 32, 1) 0 max_pooling1d_7[0][0] 
__________________________________________________________________________________________________
tf.expand_dims_2 (TFOpLambda)    (None, 32, 32, 1)    0           max_pooling1d_11[0][0]
__________________________________________________________________________________________________
r2_expand (Dense)                (None, 1024)         33792       flatten_1[0][0]
__________________________________________________________________________________________________
concatenate (Concatenate)        (None, 32, 32, 3)    0           tf.expand_dims[0][0]
                                                                  tf.expand_dims_1[0][0]
                                                                  tf.expand_dims_2[0][0]
__________________________________________________________________________________________________
tf.reshape_1 (TFOpLambda)        (None, 32, 32, 1)    0           r2_expand[0][0]
__________________________________________________________________________________________________
concatenate_4 (Concatenate)      (None, 32, 32, 4)    0           concatenate[0][0]
                                                                  tf.reshape_1[0][0]
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D)        (None, 38, 38, 4)    0           concatenate_4[0][0]
__________________________________________________________________________________________________
conv1_conv (Conv2D)              (None, 16, 16, 64)   12608       conv1_pad[0][0]
__________________________________________________________________________________________________
conv1_bn (BatchNormalization)    (None, 16, 16, 64)   256         conv1_conv[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation)          (None, 16, 16, 64)   0           conv1_bn[0][0]
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D)        (None, 18, 18, 64)   0           conv1_relu[0][0]
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D)        (None, 8, 8, 64)     0           pool1_pad[0][0]
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D)     (None, 8, 8, 64)     4160        pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali  (None, 8, 8, 64)     256         conv2_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation  (None, 8, 8, 64)     0           conv2_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D)     (None, 8, 8, 64)     36928       conv2_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali  (None, 8, 8, 64)     256         conv2_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation  (None, 8, 8, 64)     0           conv2_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D)     (None, 8, 8, 256)    16640       pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D)     (None, 8, 8, 256)    16640       conv2_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali  (None, 8, 8, 256)    1024        conv2_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_3_bn
(BatchNormali (None, 8, 8, 256) 1024 conv2_block1_3_conv[0][0] __________________________________________________________________________________________________ conv2_block1_add (Add) (None, 8, 8, 256) 0 conv2_block1_0_bn[0][0] conv2_block1_3_bn[0][0] __________________________________________________________________________________________________ conv2_block1_out (Activation) (None, 8, 8, 256) 0 conv2_block1_add[0][0] __________________________________________________________________________________________________ conv2_block2_1_conv (Conv2D) (None, 8, 8, 64) 16448 conv2_block1_out[0][0] __________________________________________________________________________________________________ conv2_block2_1_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block2_1_conv[0][0] __________________________________________________________________________________________________ conv2_block2_1_relu (Activation (None, 8, 8, 64) 0 conv2_block2_1_bn[0][0] __________________________________________________________________________________________________ conv2_block2_2_conv (Conv2D) (None, 8, 8, 64) 36928 conv2_block2_1_relu[0][0] __________________________________________________________________________________________________ conv2_block2_2_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block2_2_conv[0][0] __________________________________________________________________________________________________ conv2_block2_2_relu (Activation (None, 8, 8, 64) 0 conv2_block2_2_bn[0][0] __________________________________________________________________________________________________ conv2_block2_3_conv (Conv2D) (None, 8, 8, 256) 16640 conv2_block2_2_relu[0][0] __________________________________________________________________________________________________ conv2_block2_3_bn (BatchNormali (None, 8, 8, 256) 1024 conv2_block2_3_conv[0][0] __________________________________________________________________________________________________ conv2_block2_add (Add) (None, 8, 8, 256) 0 conv2_block1_out[0][0] conv2_block2_3_bn[0][0] __________________________________________________________________________________________________ conv2_block2_out (Activation) (None, 8, 8, 256) 0 conv2_block2_add[0][0] __________________________________________________________________________________________________ conv2_block3_1_conv (Conv2D) (None, 8, 8, 64) 16448 conv2_block2_out[0][0] __________________________________________________________________________________________________ conv2_block3_1_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block3_1_conv[0][0] __________________________________________________________________________________________________ conv2_block3_1_relu (Activation (None, 8, 8, 64) 0 conv2_block3_1_bn[0][0] __________________________________________________________________________________________________ conv2_block3_2_conv (Conv2D) (None, 8, 8, 64) 36928 conv2_block3_1_relu[0][0] __________________________________________________________________________________________________ conv2_block3_2_bn (BatchNormali (None, 8, 8, 64) 256 conv2_block3_2_conv[0][0] __________________________________________________________________________________________________ conv2_block3_2_relu (Activation (None, 8, 8, 64) 0 conv2_block3_2_bn[0][0] __________________________________________________________________________________________________ conv2_block3_3_conv (Conv2D) (None, 8, 8, 256) 16640 conv2_block3_2_relu[0][0] __________________________________________________________________________________________________ conv2_block3_3_bn 
(BatchNormali (None, 8, 8, 256) 1024 conv2_block3_3_conv[0][0] __________________________________________________________________________________________________ conv2_block3_add (Add) (None, 8, 8, 256) 0 conv2_block2_out[0][0] conv2_block3_3_bn[0][0] __________________________________________________________________________________________________ conv2_block3_out (Activation) (None, 8, 8, 256) 0 conv2_block3_add[0][0] __________________________________________________________________________________________________ conv3_block1_1_conv (Conv2D) (None, 4, 4, 128) 32896 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block1_1_conv[0][0] __________________________________________________________________________________________________ conv3_block1_1_relu (Activation (None, 4, 4, 128) 0 conv3_block1_1_bn[0][0] __________________________________________________________________________________________________ conv3_block1_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block1_1_relu[0][0] __________________________________________________________________________________________________ conv3_block1_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block1_2_conv[0][0] __________________________________________________________________________________________________ conv3_block1_2_relu (Activation (None, 4, 4, 128) 0 conv3_block1_2_bn[0][0] __________________________________________________________________________________________________ conv3_block1_0_conv (Conv2D) (None, 4, 4, 512) 131584 conv2_block3_out[0][0] __________________________________________________________________________________________________ conv3_block1_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block1_2_relu[0][0] __________________________________________________________________________________________________ conv3_block1_0_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block1_0_conv[0][0] __________________________________________________________________________________________________ conv3_block1_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block1_3_conv[0][0] __________________________________________________________________________________________________ conv3_block1_add (Add) (None, 4, 4, 512) 0 conv3_block1_0_bn[0][0] conv3_block1_3_bn[0][0] __________________________________________________________________________________________________ conv3_block1_out (Activation) (None, 4, 4, 512) 0 conv3_block1_add[0][0] __________________________________________________________________________________________________ conv3_block2_1_conv (Conv2D) (None, 4, 4, 128) 65664 conv3_block1_out[0][0] __________________________________________________________________________________________________ conv3_block2_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block2_1_conv[0][0] __________________________________________________________________________________________________ conv3_block2_1_relu (Activation (None, 4, 4, 128) 0 conv3_block2_1_bn[0][0] __________________________________________________________________________________________________ conv3_block2_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block2_1_relu[0][0] __________________________________________________________________________________________________ conv3_block2_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block2_2_conv[0][0] __________________________________________________________________________________________________ 
conv3_block2_2_relu (Activation (None, 4, 4, 128) 0 conv3_block2_2_bn[0][0] __________________________________________________________________________________________________ conv3_block2_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block2_2_relu[0][0] __________________________________________________________________________________________________ conv3_block2_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block2_3_conv[0][0] __________________________________________________________________________________________________ conv3_block2_add (Add) (None, 4, 4, 512) 0 conv3_block1_out[0][0] conv3_block2_3_bn[0][0] __________________________________________________________________________________________________ conv3_block2_out (Activation) (None, 4, 4, 512) 0 conv3_block2_add[0][0] __________________________________________________________________________________________________ conv3_block3_1_conv (Conv2D) (None, 4, 4, 128) 65664 conv3_block2_out[0][0] __________________________________________________________________________________________________ conv3_block3_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block3_1_conv[0][0] __________________________________________________________________________________________________ conv3_block3_1_relu (Activation (None, 4, 4, 128) 0 conv3_block3_1_bn[0][0] __________________________________________________________________________________________________ conv3_block3_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block3_1_relu[0][0] __________________________________________________________________________________________________ conv3_block3_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block3_2_conv[0][0] __________________________________________________________________________________________________ conv3_block3_2_relu (Activation (None, 4, 4, 128) 0 conv3_block3_2_bn[0][0] __________________________________________________________________________________________________ conv3_block3_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block3_2_relu[0][0] __________________________________________________________________________________________________ conv3_block3_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block3_3_conv[0][0] __________________________________________________________________________________________________ conv3_block3_add (Add) (None, 4, 4, 512) 0 conv3_block2_out[0][0] conv3_block3_3_bn[0][0] __________________________________________________________________________________________________ conv3_block3_out (Activation) (None, 4, 4, 512) 0 conv3_block3_add[0][0] __________________________________________________________________________________________________ conv3_block4_1_conv (Conv2D) (None, 4, 4, 128) 65664 conv3_block3_out[0][0] __________________________________________________________________________________________________ conv3_block4_1_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block4_1_conv[0][0] __________________________________________________________________________________________________ conv3_block4_1_relu (Activation (None, 4, 4, 128) 0 conv3_block4_1_bn[0][0] __________________________________________________________________________________________________ conv3_block4_2_conv (Conv2D) (None, 4, 4, 128) 147584 conv3_block4_1_relu[0][0] __________________________________________________________________________________________________ conv3_block4_2_bn (BatchNormali (None, 4, 4, 128) 512 conv3_block4_2_conv[0][0] 
__________________________________________________________________________________________________ conv3_block4_2_relu (Activation (None, 4, 4, 128) 0 conv3_block4_2_bn[0][0] __________________________________________________________________________________________________ conv3_block4_3_conv (Conv2D) (None, 4, 4, 512) 66048 conv3_block4_2_relu[0][0] __________________________________________________________________________________________________ conv3_block4_3_bn (BatchNormali (None, 4, 4, 512) 2048 conv3_block4_3_conv[0][0] __________________________________________________________________________________________________ conv3_block4_add (Add) (None, 4, 4, 512) 0 conv3_block3_out[0][0] conv3_block4_3_bn[0][0] __________________________________________________________________________________________________ conv3_block4_out (Activation) (None, 4, 4, 512) 0 conv3_block4_add[0][0] __________________________________________________________________________________________________ conv4_block1_1_conv (Conv2D) (None, 2, 2, 256) 131328 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block1_1_conv[0][0] __________________________________________________________________________________________________ conv4_block1_1_relu (Activation (None, 2, 2, 256) 0 conv4_block1_1_bn[0][0] __________________________________________________________________________________________________ conv4_block1_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block1_1_relu[0][0] __________________________________________________________________________________________________ conv4_block1_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block1_2_conv[0][0] __________________________________________________________________________________________________ conv4_block1_2_relu (Activation (None, 2, 2, 256) 0 conv4_block1_2_bn[0][0] __________________________________________________________________________________________________ conv4_block1_0_conv (Conv2D) (None, 2, 2, 1024) 525312 conv3_block4_out[0][0] __________________________________________________________________________________________________ conv4_block1_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block1_2_relu[0][0] __________________________________________________________________________________________________ conv4_block1_0_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block1_0_conv[0][0] __________________________________________________________________________________________________ conv4_block1_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block1_3_conv[0][0] __________________________________________________________________________________________________ conv4_block1_add (Add) (None, 2, 2, 1024) 0 conv4_block1_0_bn[0][0] conv4_block1_3_bn[0][0] __________________________________________________________________________________________________ conv4_block1_out (Activation) (None, 2, 2, 1024) 0 conv4_block1_add[0][0] __________________________________________________________________________________________________ conv4_block2_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block1_out[0][0] __________________________________________________________________________________________________ conv4_block2_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block2_1_conv[0][0] __________________________________________________________________________________________________ conv4_block2_1_relu (Activation (None, 2, 2, 256) 0 
conv4_block2_1_bn[0][0] __________________________________________________________________________________________________ conv4_block2_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block2_1_relu[0][0] __________________________________________________________________________________________________ conv4_block2_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block2_2_conv[0][0] __________________________________________________________________________________________________ conv4_block2_2_relu (Activation (None, 2, 2, 256) 0 conv4_block2_2_bn[0][0] __________________________________________________________________________________________________ conv4_block2_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block2_2_relu[0][0] __________________________________________________________________________________________________ conv4_block2_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block2_3_conv[0][0] __________________________________________________________________________________________________ conv4_block2_add (Add) (None, 2, 2, 1024) 0 conv4_block1_out[0][0] conv4_block2_3_bn[0][0] __________________________________________________________________________________________________ conv4_block2_out (Activation) (None, 2, 2, 1024) 0 conv4_block2_add[0][0] __________________________________________________________________________________________________ conv4_block3_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block2_out[0][0] __________________________________________________________________________________________________ conv4_block3_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block3_1_conv[0][0] __________________________________________________________________________________________________ conv4_block3_1_relu (Activation (None, 2, 2, 256) 0 conv4_block3_1_bn[0][0] __________________________________________________________________________________________________ conv4_block3_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block3_1_relu[0][0] __________________________________________________________________________________________________ conv4_block3_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block3_2_conv[0][0] __________________________________________________________________________________________________ conv4_block3_2_relu (Activation (None, 2, 2, 256) 0 conv4_block3_2_bn[0][0] __________________________________________________________________________________________________ conv4_block3_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block3_2_relu[0][0] __________________________________________________________________________________________________ conv4_block3_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block3_3_conv[0][0] __________________________________________________________________________________________________ conv4_block3_add (Add) (None, 2, 2, 1024) 0 conv4_block2_out[0][0] conv4_block3_3_bn[0][0] __________________________________________________________________________________________________ conv4_block3_out (Activation) (None, 2, 2, 1024) 0 conv4_block3_add[0][0] __________________________________________________________________________________________________ conv4_block4_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block3_out[0][0] __________________________________________________________________________________________________ conv4_block4_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block4_1_conv[0][0] __________________________________________________________________________________________________ conv4_block4_1_relu 
(Activation (None, 2, 2, 256) 0 conv4_block4_1_bn[0][0] __________________________________________________________________________________________________ conv4_block4_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block4_1_relu[0][0] __________________________________________________________________________________________________ conv4_block4_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block4_2_conv[0][0] __________________________________________________________________________________________________ conv4_block4_2_relu (Activation (None, 2, 2, 256) 0 conv4_block4_2_bn[0][0] __________________________________________________________________________________________________ conv4_block4_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block4_2_relu[0][0] __________________________________________________________________________________________________ conv4_block4_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block4_3_conv[0][0] __________________________________________________________________________________________________ conv4_block4_add (Add) (None, 2, 2, 1024) 0 conv4_block3_out[0][0] conv4_block4_3_bn[0][0] __________________________________________________________________________________________________ conv4_block4_out (Activation) (None, 2, 2, 1024) 0 conv4_block4_add[0][0] __________________________________________________________________________________________________ conv4_block5_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block4_out[0][0] __________________________________________________________________________________________________ conv4_block5_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block5_1_conv[0][0] __________________________________________________________________________________________________ conv4_block5_1_relu (Activation (None, 2, 2, 256) 0 conv4_block5_1_bn[0][0] __________________________________________________________________________________________________ conv4_block5_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block5_1_relu[0][0] __________________________________________________________________________________________________ conv4_block5_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block5_2_conv[0][0] __________________________________________________________________________________________________ conv4_block5_2_relu (Activation (None, 2, 2, 256) 0 conv4_block5_2_bn[0][0] __________________________________________________________________________________________________ conv4_block5_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block5_2_relu[0][0] __________________________________________________________________________________________________ conv4_block5_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block5_3_conv[0][0] __________________________________________________________________________________________________ conv4_block5_add (Add) (None, 2, 2, 1024) 0 conv4_block4_out[0][0] conv4_block5_3_bn[0][0] __________________________________________________________________________________________________ conv4_block5_out (Activation) (None, 2, 2, 1024) 0 conv4_block5_add[0][0] __________________________________________________________________________________________________ conv4_block6_1_conv (Conv2D) (None, 2, 2, 256) 262400 conv4_block5_out[0][0] __________________________________________________________________________________________________ conv4_block6_1_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block6_1_conv[0][0] 
__________________________________________________________________________________________________ conv4_block6_1_relu (Activation (None, 2, 2, 256) 0 conv4_block6_1_bn[0][0] __________________________________________________________________________________________________ conv4_block6_2_conv (Conv2D) (None, 2, 2, 256) 590080 conv4_block6_1_relu[0][0] __________________________________________________________________________________________________ conv4_block6_2_bn (BatchNormali (None, 2, 2, 256) 1024 conv4_block6_2_conv[0][0] __________________________________________________________________________________________________ conv4_block6_2_relu (Activation (None, 2, 2, 256) 0 conv4_block6_2_bn[0][0] __________________________________________________________________________________________________ conv4_block6_3_conv (Conv2D) (None, 2, 2, 1024) 263168 conv4_block6_2_relu[0][0] __________________________________________________________________________________________________ conv4_block6_3_bn (BatchNormali (None, 2, 2, 1024) 4096 conv4_block6_3_conv[0][0] __________________________________________________________________________________________________ conv4_block6_add (Add) (None, 2, 2, 1024) 0 conv4_block5_out[0][0] conv4_block6_3_bn[0][0] __________________________________________________________________________________________________ conv4_block6_out (Activation) (None, 2, 2, 1024) 0 conv4_block6_add[0][0] __________________________________________________________________________________________________ conv5_block1_1_conv (Conv2D) (None, 1, 1, 512) 524800 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block1_1_conv[0][0] __________________________________________________________________________________________________ conv5_block1_1_relu (Activation (None, 1, 1, 512) 0 conv5_block1_1_bn[0][0] __________________________________________________________________________________________________ conv5_block1_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block1_1_relu[0][0] __________________________________________________________________________________________________ conv5_block1_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block1_2_conv[0][0] __________________________________________________________________________________________________ conv5_block1_2_relu (Activation (None, 1, 1, 512) 0 conv5_block1_2_bn[0][0] __________________________________________________________________________________________________ conv5_block1_0_conv (Conv2D) (None, 1, 1, 2048) 2099200 conv4_block6_out[0][0] __________________________________________________________________________________________________ conv5_block1_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block1_2_relu[0][0] __________________________________________________________________________________________________ conv5_block1_0_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block1_0_conv[0][0] __________________________________________________________________________________________________ conv5_block1_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block1_3_conv[0][0] __________________________________________________________________________________________________ conv5_block1_add (Add) (None, 1, 1, 2048) 0 conv5_block1_0_bn[0][0] conv5_block1_3_bn[0][0] __________________________________________________________________________________________________ conv5_block1_out (Activation) (None, 1, 1, 
2048) 0 conv5_block1_add[0][0] __________________________________________________________________________________________________ conv5_block2_1_conv (Conv2D) (None, 1, 1, 512) 1049088 conv5_block1_out[0][0] __________________________________________________________________________________________________ conv5_block2_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block2_1_conv[0][0] __________________________________________________________________________________________________ conv5_block2_1_relu (Activation (None, 1, 1, 512) 0 conv5_block2_1_bn[0][0] __________________________________________________________________________________________________ conv5_block2_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block2_1_relu[0][0] __________________________________________________________________________________________________ conv5_block2_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block2_2_conv[0][0] __________________________________________________________________________________________________ conv5_block2_2_relu (Activation (None, 1, 1, 512) 0 conv5_block2_2_bn[0][0] __________________________________________________________________________________________________ conv5_block2_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block2_2_relu[0][0] __________________________________________________________________________________________________ conv5_block2_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block2_3_conv[0][0] __________________________________________________________________________________________________ conv5_block2_add (Add) (None, 1, 1, 2048) 0 conv5_block1_out[0][0] conv5_block2_3_bn[0][0] __________________________________________________________________________________________________ conv5_block2_out (Activation) (None, 1, 1, 2048) 0 conv5_block2_add[0][0] __________________________________________________________________________________________________ conv5_block3_1_conv (Conv2D) (None, 1, 1, 512) 1049088 conv5_block2_out[0][0] __________________________________________________________________________________________________ conv5_block3_1_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block3_1_conv[0][0] __________________________________________________________________________________________________ conv5_block3_1_relu (Activation (None, 1, 1, 512) 0 conv5_block3_1_bn[0][0] __________________________________________________________________________________________________ conv5_block3_2_conv (Conv2D) (None, 1, 1, 512) 2359808 conv5_block3_1_relu[0][0] __________________________________________________________________________________________________ conv5_block3_2_bn (BatchNormali (None, 1, 1, 512) 2048 conv5_block3_2_conv[0][0] __________________________________________________________________________________________________ conv5_block3_2_relu (Activation (None, 1, 1, 512) 0 conv5_block3_2_bn[0][0] __________________________________________________________________________________________________ conv5_block3_3_conv (Conv2D) (None, 1, 1, 2048) 1050624 conv5_block3_2_relu[0][0] __________________________________________________________________________________________________ conv5_block3_3_bn (BatchNormali (None, 1, 1, 2048) 8192 conv5_block3_3_conv[0][0] __________________________________________________________________________________________________ conv5_block3_add (Add) (None, 1, 1, 2048) 0 conv5_block2_out[0][0] conv5_block3_3_bn[0][0] __________________________________________________________________________________________________ 
conv5_block3_out (Activation)    (None, 1, 1, 2048)   0           conv5_block3_add[0][0]
__________________________________________________________________________________________________
avg_pool (GlobalAveragePooling2  (None, 2048)         0           conv5_block3_out[0][0]
__________________________________________________________________________________________________
JointM1M2_mean (Dense)           (None, 2)            4098        avg_pool[0][0]
__________________________________________________________________________________________________
JointM1M2_logvar (Dense)         (None, 2)            4098        avg_pool[0][0]
__________________________________________________________________________________________________
VonMises_mean (Dense)            (None, 10)           20490       avg_pool[0][0]
__________________________________________________________________________________________________
VonMises_logvar (Dense)          (None, 5)            10245       avg_pool[0][0]
__________________________________________________________________________________________________
TruncatedNormal_mean (Dense)     (None, 6)            12294       avg_pool[0][0]
__________________________________________________________________________________________________
TruncatedNormal_logvar (Dense)   (None, 6)            12294       avg_pool[0][0]
__________________________________________________________________________________________________
JointVonMisesFisher_mean (Dense  (None, 3)            6147        avg_pool[0][0]
__________________________________________________________________________________________________
JointVonMisesFisher_logvar (Den  (None, 1)            2049        avg_pool[0][0]
__________________________________________________________________________________________________
concatenate_5 (Concatenate)      (None, 35)           0           JointM1M2_mean[0][0]
                                                                  JointM1M2_logvar[0][0]
                                                                  VonMises_mean[0][0]
                                                                  VonMises_logvar[0][0]
                                                                  TruncatedNormal_mean[0][0]
                                                                  TruncatedNormal_logvar[0][0]
                                                                  JointVonMisesFisher_mean[0][0]
                                                                  JointVonMisesFisher_logvar[0][0]
==================================================================================================
Total params: 23,876,451
Trainable params: 23,822,563
Non-trainable params: 53,888
__________________________________________________________________________________________________
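
The two summaries above are easiest to read back into code at their tails, so the following is a minimal tf.keras sketch of how those tails could be wired. It is reconstructed from the summaries alone, not taken from the authors' code: the conv{2..5}_block*/avg_pool layer names match a stock keras.applications.ResNet50 trunk, which is assumed here with weights=None, and the det_feature_maps input (standing in for the three stacked (32, 32, 1) per-detector maps produced upstream), the function names, and the variable names are illustrative only.

import tensorflow as tf
from tensorflow.keras import layers, Model

def build_encoder_r1_head(feat_dim=2048, z_dim=32):
    # encoder_r1 tail: 32-d mean and log-variance heads on the pooled ResNet
    # feature, concatenated into the 64-d output listed as concatenate_3 above.
    feat = layers.Input(shape=(feat_dim,), name="avg_pool_features")
    q_mean = layers.Dense(z_dim, name="q_mean_dense")(feat)        # 2048*32 + 32 = 65568 params
    q_logvar = layers.Dense(z_dim, name="q_logvar_dense")(feat)    # 65568 params
    out = layers.Concatenate(name="concatenate_3")([q_mean, q_logvar])  # (None, 64)
    return Model(feat, out, name="encoder_r1_head")

def build_decoder_r2_tail(z_dim=32):
    # Illustrative stand-in for the three stacked (32, 32, 1) per-detector
    # feature maps (the "concatenate" output in the decoder_r2 summary).
    det_in = layers.Input(shape=(32, 32, 3), name="det_feature_maps")
    z_in = layers.Input(shape=(z_dim,), name="z_sample")

    z = layers.Flatten(name="flatten_1")(z_in)                     # no-op on a flat vector, kept to mirror the summary
    z = layers.Dense(32 * 32, name="r2_expand")(z)                 # (None, 1024), 33792 params
    z_img = layers.Reshape((32, 32, 1))(z)                         # the summary uses a raw tf.reshape here

    x = layers.Concatenate(name="concatenate_4")([det_in, z_img])  # (None, 32, 32, 4)

    # The conv{2..5}_block*/avg_pool names above match a stock ResNet50 trunk,
    # so a randomly initialised one is assumed here.
    backbone = tf.keras.applications.ResNet50(
        include_top=False, weights=None,
        input_shape=(32, 32, 4), pooling="avg")                    # -> (None, 2048)
    feat = backbone(x)

    heads = [
        layers.Dense(2, name="JointM1M2_mean")(feat),
        layers.Dense(2, name="JointM1M2_logvar")(feat),
        layers.Dense(10, name="VonMises_mean")(feat),
        layers.Dense(5, name="VonMises_logvar")(feat),
        layers.Dense(6, name="TruncatedNormal_mean")(feat),
        layers.Dense(6, name="TruncatedNormal_logvar")(feat),
        layers.Dense(3, name="JointVonMisesFisher_mean")(feat),
        layers.Dense(1, name="JointVonMisesFisher_logvar")(feat),
    ]
    params = layers.Concatenate(name="concatenate_5")(heads)       # (None, 35) = 2+2+10+5+6+6+3+1
    return Model([det_in, z_in], params, name="decoder_r2_tail")

if __name__ == "__main__":
    build_encoder_r1_head().summary()
    build_decoder_r2_tail().summary()

With these widths the Dense head parameter counts line up with the tables above (for example 2048*2 + 2 = 4098 for JointM1M2_mean, and the concatenated decoder output is 2+2+10+5+6+6+3+1 = 35 units).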