Custom Component Code observations & comments

PL 0.13.7

The current default code for a custom component is shown at the bottom of this post. Apart from some unnecessary spacing (which also seems to get duplicated when pasting here), a couple of observations:

  • the call docstring is incorrect (it says the layer one-hot encodes its input, but the code just passes input through)
  • the preview variable in call is never used (the 'preview' output is set to output)
  • there is no build method

And,

  • because I have never quite got my head around Python’s “call by object reference” approach, I also wanted to confirm that e.g. input_ = inputs['input'] is efficient, i.e. that it does not actually copy any data (see the sketch after this list)
  • how do the visualized_trainables work? It doesn’t reference any self variables - should there be self._visualizable_weights and self._visualizable_biases (maybe initialised to tf.constant(0) in __init__)?
  • for info, how is the class X inheriting from Tf2xLayer related to the class XKeras? I note that super().__init__ passes keras_class=... What does that do?
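As a quick illustration of the “no copy” point (a minimal sketch, independent of PerceptiLabs; the array size and names here are made up for demonstration):

import numpy as np

inputs = {'input': np.zeros((1024, 1024))}  # ~8 MB of float64 data
input_ = inputs['input']  # binds a new name to the same object; nothing is copied

print(input_ is inputs['input'])                  # True: the very same object
print(np.shares_memory(input_, inputs['input']))  # True: same underlying buffer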

NB I referred to the Dense component for some example code, and note in passing that self._n_neurons = 128 is set in __init__ but not used in build(), where the specific value 128 is hard-coded instead. (A sketch of the presumably intended pattern follows.)
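For illustration, the presumably intended pattern would be something like this (a sketch only - the actual Dense component code may differ; the dict-keyed input_shape follows the convention used in the code below):

import tensorflow as tf

class DenseSketch(tf.keras.layers.Layer):
    def __init__(self):
        super().__init__()
        self._n_neurons = 128  # single source of truth for the layer width

    def build(self, input_shape):
        # use the attribute rather than hard-coding 128 a second time
        self.kernel = self.add_weight(
            name='kernel',
            shape=(input_shape['input'][-1], self._n_neurons),
            initializer='glorot_uniform',
            trainable=True,
        )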

Question: is the following a reasonable expansion of the default custom component code? (If so, I’ll use this as my template.) Thanks to @birdstream for the excellent example, which I have borrowed from below.

class LayerCustom_LayerCustom_1Keras(tf.keras.layers.Layer, PerceptiLabsVisualizer):
    """ This custom layer code includes additional code commentary and explanation for the benefit of users.
        Some type hinting has been provided for clarity
        The default component just passes input to output, so the structure of a model can be built before
        attending to the detailed code of each custom component
    """

    # copied from Dense example
    def __init__(self):
        """ Initialise from the base classes and add specific properties (attribute is the collective term for properties, methods, variables)  """
        super()._init__() # call the init methods of all parents
        self._variables = {} # variables belonging to this layer that should be rendered in the frontend
        
        # if the layer does not have weights and biases, then visualized_trainables are always zero
        self._visualizable_weights = tf.constant(0) # update in call() from the kernel/weights & biases 
        self._visualizable_biases = tf.constant(0)

    def call(self, inputs, training=True):
        """ Perform the user-defined input to output transformation and set up the component preview"""
        input_ = inputs['input']
        # do some work on the input...
        # output = self.do_something(input_) # do_something is defined in build() 
        # if desirable, preview can be a transform of the output; otherwise just copy output to preview
        # preview = output
        # then delete the line below ("output = preview = input_"), which only exists to make the default custom component behave as a pass-through
        output = preview = input_
        self._variables = {k: v for k, v in locals().items() if can_serialize(v)}
        # there may be other elements to _outputs; see a Dense component for an example
        self._outputs = {            
            'output': output,
            'preview': preview, # changed from output to preview; preview might have extra processing
        }
        return self._outputs

    # copied from Dense example
    def build(self, input_shape: Dict[str, tf.TensorShape]) -> None:
        """ Called by the Keras framework upon the first invocation of the call method
            Args:
                input_shape: A dictionary with layer id and its tensor shape        
        """
        # example from @birdstream (pp presumably aliases tf.keras.layers.experimental.preprocessing)
        # self.do_something = tf.keras.Sequential([
        #     pp.Rescaling(1./127.5, -1),
        #     pp.RandomContrast(0.1),
        #     pp.RandomZoom(0.1, 0.1),
        #     pp.RandomTranslation(0.1, 0.1),
        #     pp.RandomFlip(),
        #     pp.RandomRotation(0.1)
        # ])        

    # copied from Dense example
    def get_config(self) -> Dict[str, Picklable]:
        """ Any variables belonging to this layer that should be rendered in the frontend.
        Returns:
            A dictionary with tensor names for keys and picklable for values.
        """
        return self._variables

    @property
    def visualized_trainables(self):
        """ Returns two tf.Variables (weights, biases) to be visualized in the frontend """
        return self._visualizable_weights, self._visualizable_biases

class LayerCustom_LayerCustom_1(Tf2xLayer):
    def __init__(self):
        super().__init__(
            keras_class=LayerCustom_LayerCustom_1Keras
        )
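For anyone else reading: a rough sanity check of the pass-through behaviour (this assumes the names used above, e.g. PerceptiLabsVisualizer and can_serialize, are available in scope as they are inside the tool):

layer = LayerCustom_LayerCustom_1Keras()
out = layer({'input': tf.ones((2, 4))})  # Keras calls build(), then call()
print(out['output'])   # identical to the input, since output = preview = input_
print(out['preview'])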

Current Default Custom Component Code

class LayerCustom_LayerCustom_1Keras(tf.keras.layers.Layer, PerceptiLabsVisualizer):
    def call(self, inputs, training=True):
        """ Takes a tensor and one-hot encodes it """
        input_ = inputs['input']
        output = preview = input_
        self._outputs = {
            'output': output,
            'preview': output,
        }
        return self._outputs

    def get_config(self):
        """Any variables belonging to this layer that should be rendered in the frontend.

        Returns:
            A dictionary with tensor names for keys and picklable for values.
        """
        return {}

    @property
    def visualized_trainables(self):
        """ Returns two tf.Variables (weights, biases) to be visualized in the frontend """
        return tf.constant(0), tf.constant(0)

class LayerCustom_LayerCustom_1(Tf2xLayer):
    def __init__(self):
        super().__init__(
            keras_class=LayerCustom_LayerCustom_1Keras
        )

Thanks a ton for the feedback @JulianSMoore! :pray:
We have a task to update the custom code, I’ve added your comments to it :slight_smile:

For your questions, I’ll come back with some more detailed answers after checking with the devs, will make an update here then.

OK thanks; I’ll build other ResNet bits while waiting :slight_smile: - I wanted to go custom for the zero padding etc. and that can wait a little.

Hi @JulianSMoore,

Thanks for the detailed feedback! I’ll try to answer your questions:

  • About the efficiency of input_ = inputs['input']: in Python, nothing is copied unless you take explicit steps to do so. So this should be OK in terms of efficiency :slight_smile:

  • Visualized trainables: its purpose is to tell PerceptiLabs which tensor should be plotted as weights and which tensor should be plotted as biases (and their respective gradients). Without it, the layer won’t show any values for weights/biases/gradients. (Sidenote: PerceptiLabs will still include all tensors in training, since that doesn’t require distinguishing between them.) A simplified sketch follows after this list.

  • About the class inheriting from Tf2xLayer: sorry about that. That’s a compatibility wrapper we used to speed up delivery of our TensorFlow 2 version of the tool and doesn’t really need to be visible to the user. We’re looking into removing it before it starts causing too much confusion. :slight_smile: (A rough sketch of the general idea also follows after this list.)

  • The number of neurons in the dense layer looks like a bug. I’ll have that removed asap, thanks!

  • About the layer template: as far as I can tell, that looks good to me. We could use that as a guide when we polish the custom code in the task @robertl mentioned.
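To make the visualized_trainables point concrete, here is a simplified sketch of a layer with real trainables wired into the property (the class name and layer width are made up for illustration, and PerceptiLabsVisualizer is assumed in scope; this is not the actual Dense component code):

import tensorflow as tf

class MyDenseLikeKeras(tf.keras.layers.Layer, PerceptiLabsVisualizer):
    def build(self, input_shape):
        # real trainables are created here; these are what the frontend should plot
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape['input'][-1], 16),
                                      trainable=True)
        self.bias = self.add_weight(name='bias', shape=(16,), trainable=True)

    def call(self, inputs, training=True):
        output = tf.matmul(inputs['input'], self.kernel) + self.bias
        self._outputs = {'output': output, 'preview': output}
        return self._outputs

    def get_config(self):
        return {}

    @property
    def visualized_trainables(self):
        # point the frontend at the actual tensors so the weights/biases
        # (and gradient) plots show real values rather than constants
        return self.kernel, self.bias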
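And very roughly, the general idea behind the wrapper and its keras_class argument (a much-simplified sketch of the pattern, not the actual Tf2xLayer implementation):

class Tf2xLayerSketch:
    """ Sketch only: a thin adapter that instantiates the supplied Keras class
        and delegates calls to it, so the rest of the tool can treat layers uniformly """
    def __init__(self, keras_class):
        self._keras_layer = keras_class()

    def __call__(self, inputs, training=True):
        return self._keras_layer(inputs, training=training)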


Hi @anton.k

Thanks for the detailed info

  • Efficiency - that’s what I thought… but I’m not a dev & wanted to understand better - thx!
  • Visualized trainables - ah, so those feed the graphs. Got it.
  • Tf2xLayer - ok, I’ll just ignore it.
  • #neurons :+1:
  • Template - good to know, and I’m sure it’ll look even better after the pros have worked on it :slight_smile:

:pray: