Recurrent: "Return Sequence" not working as intended?

When stacking RNN layers, according to the documentation, one should set return_sequences=True to provide a 3D output for the subsequent RNN layer. This is not what PL does when enabling “Return Sequence” in the component settings. Instead, it sets return_state=True, which doesn’t provide the 3D output that would go into the next RNN layer. Is this perhaps a bug, or am I misunderstanding things here? :sweat_smile:
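To illustrate the difference (a standalone Keras sketch, not PerceptiLabs code): the default LSTM output is 2D, return_sequences=True makes it 3D so a stacked LSTM can consume it, while return_state=True only adds the state tensors and keeps the output 2D.

```python
import tensorflow as tf

x = tf.random.normal((1, 24, 8))  # (batch, timesteps, features)

# Default: only the last hidden state -> 2D output (batch, units).
last_only = tf.keras.layers.LSTM(16)(x)
print(last_only.shape)  # (1, 16)

# return_sequences=True: the full sequence -> 3D output
# (batch, timesteps, units), which a stacked LSTM expects.
seq = tf.keras.layers.LSTM(16, return_sequences=True)(x)
print(seq.shape)  # (1, 24, 16)
stacked = tf.keras.layers.LSTM(16)(seq)  # works: input is 3D

# return_state=True: returns [output, hidden_state, cell_state];
# the output itself is still 2D, so it cannot feed another LSTM directly.
out, h, c = tf.keras.layers.LSTM(16, return_state=True)(x)
print(out.shape, h.shape, c.shape)  # (1, 16) (1, 16) (1, 16)
```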

Workaround: edit the code to include return_sequences=True, and I can stack the layers :slight_smile:

2 Likes

Hi @birdstream,
Thanks for highlighting this! It is indeed a bug; I’ve added a ticket to fix it :slight_smile:

2 Likes

@birdstream, where in the code did you edit? I assumed it would be this line:

self.recurrent = tf.keras.layers.LSTM(units=self._neurons, activation=self._activation, kernel_initializer='glorot_uniform', return_state=False)

But when I edit that line to include return_sequences=True I still receive a dimension error when I try to stack the layers.

@robertl,

Thanks for your response on Slack, here is the error that I receive in PL:

Traceback (most recent call last):
  File "perceptilabs\lwcore\strategies\tf2x.py", line 34, in perceptilabs.lwcore.strategies.tf2x.Tf2xInnerStrategy._run_internal
  File "perceptilabs\lwcore\strategies\tf2x.py", line 53, in perceptilabs.lwcore.strategies.tf2x.Tf2xInnerStrategy._make_results
  File "perceptilabs\lwcore\strategies\tf2x.py", line 43, in perceptilabs.lwcore.strategies.tf2x.Tf2xInnerStrategy._make_results
  File "perceptilabs\layers\legacy.py", line 19, in perceptilabs.layers.legacy.Tf2xLayer.__call__
  File "C:\Users\James\anaconda3\envs\myenv\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 1030, in __call__
    outputs = call_fn(inputs, *args, **kwargs)
  File "<rendered-code: 1640101720401 [DeepLearningRecurrent]>", line 24, in call

  File "C:\Users\James\anaconda3\envs\myenv\lib\site-packages\tensorflow\python\keras\layers\recurrent.py", line 668, in __call__
    return super(RNN, self).__call__(inputs, **kwargs)
  File "C:\Users\James\anaconda3\envs\myenv\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 1013, in __call__
    input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
  File "C:\Users\James\anaconda3\envs\myenv\lib\site-packages\tensorflow\python\keras\engine\input_spec.py", line 215, in assert_input_compatibility
    raise ValueError('Input ' + str(input_index) + ' of layer ' +
ValueError: Input 0 of layer lstm_30 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: (1, 24)

Here is my model:

I did try using a Reshape layer after the recurrent layer and it did work, but I’m not sure that achieves what we want.

Just add return_sequences=True as an extra argument at the end. And do NOT select “Return Sequence” in the options prior to that; it does some extra stuff that changes the dimensions :slight_smile:
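For clarity, here is a runnable sketch of what the edited line looks like in context. The RecurrentLayer wrapper class, neuron count, and input shape are hypothetical stand-ins, not PerceptiLabs internals; only the LSTM constructor line mirrors the snippet quoted earlier in the thread, with return_sequences=True appended and return_state left as False.

```python
import tensorflow as tf

class RecurrentLayer(tf.keras.layers.Layer):
    """Hypothetical stand-in for the PerceptiLabs recurrent component."""

    def __init__(self, neurons=16, activation='tanh'):
        super().__init__()
        self._neurons = neurons
        self._activation = activation
        # The edited line: return_sequences=True added at the end,
        # return_state kept as False (NOT toggled in the GUI).
        self.recurrent = tf.keras.layers.LSTM(
            units=self._neurons,
            activation=self._activation,
            kernel_initializer='glorot_uniform',
            return_state=False,
            return_sequences=True,
        )

    def call(self, inputs):
        return self.recurrent(inputs)

layer = RecurrentLayer()
y = layer(tf.zeros((1, 24, 8)))  # (batch, timesteps, features)
print(y.shape)  # 3D output: (1, 24, 16), stackable into another LSTM
```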

2 Likes

Thank you, it is running now. I was mistakenly changing return_state to return_sequences and also turning “Return Sequence” on in the GUI.