PyTorch support

Hi, is there support for PyTorch?

Hi Yashovardhan,

The current code generated by the components is TensorFlow 1.15.
We are currently working on updating it to TensorFlow 2.x. However, our goal is to make PerceptiLabs as open as possible for integrations and we have gotten requests about PyTorch before.

In what way is PyTorch most important for you - for working in the code editor, or to be able to export the model in PyTorch format (or both)?

For both. I think it might also make sense to add the ability to export to ONNX, as it is open source and can be converted to both TensorFlow and PyTorch.
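
Even without converting all the way to another framework, an exported .onnx file can be run directly with onnxruntime, which is framework-agnostic. Just a rough sketch of what that could look like (the file name, input name, and shape below are only placeholders):

```python
import numpy as np
import onnxruntime as ort

# Load an exported ONNX model ("model.onnx" is a placeholder path).
session = ort.InferenceSession("model.onnx")

# Run inference with a NumPy array shaped like the model's expected input;
# the input name "input" and the (1, 784) shape are placeholders too.
x = np.random.rand(1, 784).astype(np.float32)
outputs = session.run(None, {"input": x})
print(outputs[0].shape)
```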

Thank you @ysh for letting us know! We are working on the ONNX export feature, so it’s good to hear that it resonates with you too :slight_smile:

Hi there! Just checking in on this. Has support for PyTorch been integrated yet? Would love to see it.

Hi @carsondwilber,
Welcome to the forum!

We are still based on TensorFlow. I’m sure we will integrate PyTorch at some point, but as we are a layer on top of the frameworks, the experience would be almost identical while inside the tool. The main differences (that I can see) would be the pre-trained/pre-built models that can be pulled in and used in PerceptiLabs, or whether there is a specific export format you are interested in.

What would be the biggest benefit of having PyTorch supported for you? :slight_smile:

Hi @carsondwilber

Seeing this activity on your question prompted me to investigate - out of curiosity!

There seem to be some standard capabilities now for ONNX, e.g. TensorFlow-ONNX - and, in reverse, ONNX-TensorFlow.

I haven’t used these tools, but even if they are not (yet?) integrated into PL, they could certainly be added to an env and used manually to take advantage of PL capabilities, e.g. if you wanted to take work from PL and develop/transfer it elsewhere.
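
For instance, converting a Keras/TF2 model to ONNX with tensorflow-onnx could look roughly like this - an untested sketch on my part, where the model, shapes, and opset are just placeholders (for a TF 1.x graph, the `python -m tf2onnx.convert` CLI is the usual route instead):

```python
import tensorflow as tf
import tf2onnx

# A tiny stand-in model; in practice this would be whatever you built/trained in PL.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert the Keras model to ONNX and write it to disk.
spec = (tf.TensorSpec((None, 784), tf.float32, name="input"),)
onnx_model, _ = tf2onnx.convert.from_keras(
    model, input_signature=spec, opset=13, output_path="model.onnx"
)
```

The resulting file could then be picked up by any ONNX-aware stack (onnxruntime, ONNX-TensorFlow, etc.).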

Hope that helps!

(MMdnn doesn’t seem to have progressed as much - no releases for >1 year)