
I have saved models for a large number of autoencoders that I am using for my project. They were saved using the autoencoder.save(outdir + "autoencoder_"+params) function.

Is there any way for me to extract the encoder and decoder components of each of these saved models, or would I need to rerun the script and add in the encoder = Model(input, bottleneck) and decoder = Model(bottleneck, output) lines and save those models?
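For reference, this is roughly how I load one of the saved models back (the path is assembled the same way it was saved):

import tensorflow as tf

autoencoder = tf.keras.models.load_model(outdir + "autoencoder_" + params)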

Here is the autoencoder structure I am attempting to retrieve:

autoencoder.summary()

Model: "model_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_1 (InputLayer)         [(None, 3593, 4)]         0         
_________________________________________________________________
flatten (Flatten)            (None, 14372)             0         
_________________________________________________________________
dense (Dense)                (None, 1797)              25828281  
_________________________________________________________________
dense_1 (Dense)              (None, 719)               1292762   
_________________________________________________________________
dense_2 (Dense)              (None, 180)               129600    
_________________________________________________________________
dense_3 (Dense)              (None, 719)               130139    
_________________________________________________________________
dense_4 (Dense)              (None, 1797)              1293840   
_________________________________________________________________
dense_5 (Dense)              (None, 14372)             25840856  
_________________________________________________________________
reshape (Reshape)            multiple                  0         
=================================================================
Total params: 54,515,478
Trainable params: 54,515,478
Non-trainable params: 0
_________________________________________________________________

1 Answer


You can transfer the weights into two separate models. All you need to do is identify the index of the bottleneck layer, which you can easily find by running model.summary().

Here is a snippet that can help you copy the encoder part of the model:

import tensorflow as tf

# ae_model is the autoencoder loaded with tf.keras.models.load_model(...)
bottleneck_index = ...  # this you need to identify

encoder_model = tf.keras.Sequential()
for layer in ae_model.layers[:bottleneck_index]:
    layer_config = layer.get_config()  # get all of the layer's parameters (units, activation, etc.)
    copied_layer = type(layer).from_config(layer_config)  # initialize the same layer class with the same parameters
    copied_layer.build(layer.input_shape)  # build the layer to initialize the weights
    copied_layer.set_weights(layer.get_weights())  # transfer the trained parameters
    encoder_model.add(copied_layer)  # add it to the encoder model

Do the same for the decoder, using ae_model.layers[bottleneck_index:] (note that with this slicing the bottleneck layer itself ends up at the start of the decoder); a sketch is shown below.
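A rough sketch of the decoder loop could look like this (again assuming ae_model is the loaded autoencoder). Layers that have been called more than once, like a reused Reshape, reject .input_shape, and Keras points you to get_input_shape_at(node_index) in that case, so the sketch falls back to the first node's input shape:

decoder_model = tf.keras.Sequential()
for layer in ae_model.layers[bottleneck_index:]:
    layer_config = layer.get_config()
    copied_layer = type(layer).from_config(layer_config)
    try:
        input_shape = layer.input_shape
    except AttributeError:
        # layers with multiple inbound nodes have no single input shape;
        # use the shape seen at the first inbound node
        input_shape = layer.get_input_shape_at(0)
    copied_layer.build(input_shape)
    copied_layer.set_weights(layer.get_weights())
    decoder_model.add(copied_layer)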

Of course, you can also identify the bottleneck index programmatically, by looking for the Dense layer whose number of units is smaller than those of its neighbouring layers; see the sketch below.
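For example, a small helper along these lines (just an illustration, not part of the original code) would pick out the narrowest Dense layer, i.e. the 180-unit dense_2 in the summary above:

def find_bottleneck_index(model):
    # indices of all Dense layers in the autoencoder
    dense_indices = [i for i, layer in enumerate(model.layers)
                     if isinstance(layer, tf.keras.layers.Dense)]
    # the bottleneck is the Dense layer with the fewest units
    return min(dense_indices, key=lambda i: model.layers[i].units)

bottleneck_index = find_bottleneck_index(ae_model)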

  • It's throwing me an error for the line copied_layer.set_weights(layer.get_weights()): ValueError: You called "set_weights(weights)" on layer "dense_2" with a weight list of length 2, but the layer was expecting 0 weights.
    – Whitehot
    Commented Apr 28, 2021 at 14:21
  • 1
This is strange; it might have failed because the layer didn't initialize its weights. I have updated my code to build the layer before transferring the weights, can you give it a try? Check the line: copied_layer.build(layer.input_shape)
    – Coderji
    Commented Apr 28, 2021 at 15:13
That does fix that issue, but now it is struggling with the reshape layer. I added the summary to the question, as it seems like it could be helpful. Here's the error message: AttributeError: The layer "reshape" has multiple inbound nodes, with different input shapes. Hence the notion of "input shape" is ill-defined for the layer. Use get_input_shape_at(node_index) instead.
    – Whitehot
    Commented Apr 28, 2021 at 15:30
