Transfer Learning with Fine-Tuning


I’m trying to do it by myself, modifying the VGG16 component. I’m slow because I’m just a beginner.

For now it doesn’t work :slight_smile:


Hi Clarilla

Nice to see another new face - hope you find what you want here: we’re all beginners at something :slight_smile:

I’m curious about your VGG16 modifications because it’s a pre-trained component. What are you planning to do to it/with it?


Feel free to ask for input when you feel like it. If something isn’t working right, or you don’t know how to get the best from PL itself, one of the PL guys will help out; if it’s ML in general, there are others around who might be able to help, depending on what you are trying to do :slight_smile:

Keep us in the loop - we’re all keen to learn new tricks and to benefit from new ideas & ways of doing things!


Yes, VGG16 is a pretrained model. What I want to do is retrain the last layers of the model with my data.

The model is trained on a particular set of images. As I understand it, the first layers learn basic shapes and colors, and the last layers can identify objects. So if my images are different from the images VGG16 was trained on, maybe it won’t be a good model for me.

So fine-tuning refers to retraining the last layers with my data.

Supposedly it can be done; I have done it with code copied from a book. I wanted to try it here with PL.

In the VGG16 component, where it says:

    self.vgg16.trainable = False

I changed it to:

    self.vgg16.trainable = True

    for layer in self.vgg16.layers:
        if set_trainable:

I’m not skilled at using classes and I don’t see all the code. I’m supposed to be specifying not to train the first layers, and to train the layers from the one whose name is ‘block5_conv1’ onwards. But I’m not sure if I have done it where it has to be done, or if I should modify more things.

I will try again with something easier, but my goal is to do this retraining of the last layers of the model.
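The freezing pattern I’m aiming for can be sketched with plain-Python stand-ins (the names below mimic the tail of Keras’s VGG16 layer naming, but these are not real Keras layers - just fakes to illustrate the flag logic):

    # Stand-in for a Keras layer: just a name and a trainable flag.
    class FakeLayer:
        def __init__(self, name):
            self.name = name
            self.trainable = True

    layers = [FakeLayer(n) for n in [
        'block4_conv3', 'block4_pool',
        'block5_conv1', 'block5_conv2', 'block5_conv3', 'block5_pool',
    ]]

    # Freeze everything before 'block5_conv1'; from that layer onwards,
    # leave layers trainable.
    set_trainable = False
    for layer in layers:
        if layer.name == 'block5_conv1':
            set_trainable = True
        layer.trainable = set_trainable

The key point is that the flag flips once, when the loop reaches ‘block5_conv1’, so everything from there to the end stays trainable and everything before it is frozen.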

Thanks for your answer! :slight_smile:

PS: PerceptiLabs for me looks really cool :sunglasses:

Hi Clarilla

I’ll tag @robertl for confirmation, but I think you can use the VGG16 component as it is: it is provided pre-trained so that you benefit from the pre-training, i.e. the “transfer learning” in the title - you just need to provide your labels etc. to give it the specific knowledge of your dataset. This page in the PL documentation refers to these things.

Apologies if I have misunderstood what you are trying to do.

BTW your code:

  • Typo: you have “trainalbe” for “trainable” in a couple of places
  • That code depends on the layer iteration order - once ‘block5_conv1’ has been encountered, it and all subsequent iterations will have layer.trainable = False; I would check the assumption that this does what you expect, i.e. confirm iteration order and that it is robust.
  • However, reading your last comment that your aim is to “train layers from the one whose name is ‘block5_conv1’”, I don’t think your code does that (the second conditional is always executed). I think you would need something like this, since each layer needs to be turned off explicitly because self.vgg16.trainable = True enables all of them:

    for layer in self.vgg16.layers:
        if layer.name == 'block5_conv1':
            layer.trainable = True
        else:
            layer.trainable = False

or even just access that specific layer to set it trainable rather than use a loop :smiley:
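A sketch of that direct-access idea, again with stand-in objects rather than real Keras layers (a real Keras model exposes model.get_layer(name) for this; the dict below just fakes that lookup):

    # Stand-in Keras layer with a name and a trainable flag.
    class FakeLayer:
        def __init__(self, name, trainable=False):
            self.name = name
            self.trainable = trainable

    # Pretend these are model.layers after freezing everything.
    layers = [FakeLayer(n) for n in
              ['block5_conv1', 'block5_conv2', 'block5_conv3']]

    # Name -> layer lookup, like Keras's model.get_layer(name),
    # then flip just the one layer you care about.
    by_name = {layer.name: layer for layer in layers}
    by_name['block5_conv1'].trainable = True

No loop needed: only the named layer is touched, and everything else keeps whatever trainable state it already had.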

(i.e. the code depends on the details of self.vgg16.trainable and layer.trainable and their interactions, which I’m not familiar with.)

1 Like

Hey, welcome to the forum @clarilla! :wave:

Sounds like you want to make a larger part of the model trainable; it will be cool to see the results of it :slight_smile:

The most common way to retrain a VGG is to just retrain the last 3 Dense layers. You can do that in PerceptiLabs by using the standard settings of the VGG component and then dragging out 3 Dense layers behind it. This could be a good start to see how well it will train.

1 Like