Model View wait circles - 40s wait?

SeeingInTheDark model - with valid images :slight_smile:

Config: Win 10 Home 64-bit Build 20H2, perceptilabs (CPU) 0.11.7 running in latest Google Chrome

Switch from the model view to the model hub and back again and it takes 40s on this fast machine for the waiting circles to disappear, and “Save” does not work until they have.

It’s as though it is initialising something… reading files… etc., and I don’t understand why it should need to do anything at all after it has been initialised once.

(Reporting for a particular model in order to be specific, but this waiting issue applies to ~all models I have worked with)

Can this waiting be avoided? That would make using the model hub, among other things, much more practical.

Hi,

This is likely going to be smoother in the next release, which features TF2, as we will then use eager mode rather than graph mode to get all the previews.
I’ll have more updates as we get a little closer to it :slight_smile:
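
(As a rough illustration of the difference, with made-up tensors rather than our actual preview code: in eager mode each op returns concrete values immediately, whereas in graph mode the whole graph has to be traced and run before any values exist.)

```python
import tensorflow as tf

# Eager mode (the TF2 default): ops run immediately and return concrete values,
# so a per-component preview can be produced as soon as the op executes.
x = tf.constant([[1.0, 2.0]])
w = tf.constant([[0.5], [0.25]])
y = tf.matmul(x, w)
print(y.numpy())  # [[1.]] - value available right away

# Graph mode: the function is traced into a graph first, and concrete values
# only exist once the whole graph is run, which makes previews slower to fetch.
@tf.function
def forward(inp):
    return tf.matmul(inp, w)

print(forward(x).numpy())
```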

All the best,
Robert

@robertl Great, thx for the info. So the previews will update as changes propagate, but otherwise be shown instantly?

@JulianSMoore Exactly :slight_smile:

Just an observation: I am trying to play with the Textile demo to look at your ResNet approach (a specific comment about that to come in a new thread).

Somehow the model.json got into a state where the node displays would never update - the circles just spun forever. I restored the original model.json and all was OK again.

Now, I had previously noted that it did not seem to be possible to save until the node displays had updated (though that may have been in 0.11.6.1 rather than 0.11.7). However, in the “defective” model.json file I noticed the attribute “chartDataIsLoading”: 1, suggesting that the model had been saved before the chart data had finished loading.

Just mentioning it in case screen updating, model state, etc. can get out of sync and that is part of the problem.

I also note that there are attributes that may or may not appear in the json, such as “isTrained”, which takes Boolean values. It would seem more natural for such attributes to always be present with an appropriate value (e.g. False), rather than only being introduced once they acquire a value.

NB: chartDataIsLoading is another attribute that only appears sometimes, when one might naively expect it to always be present with an appropriate value. Note also that chartDataIsLoading has a numeric value while isTrained is Boolean. I believe Python handles such cases naturally, but “is…” attributes do naturally suggest Booleans.
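
To make that concrete, here is a rough sketch (mine, not PerceptiLabs code) of what I mean by treating those flags as always present and always Boolean when reading model.json; the key names are the ones I observed, the defaulting and coercion are my assumptions:

```python
import json

def load_model_state(path="model.json"):
    """Illustrative only: read model.json and normalise the status flags
    so they are always present and always Boolean."""
    with open(path) as f:
        state = json.load(f)

    # Supply defaults if the keys are missing, and coerce numeric 0/1
    # (e.g. "chartDataIsLoading": 1) to proper Booleans.
    state["isTrained"] = bool(state.get("isTrained", False))
    state["chartDataIsLoading"] = bool(state.get("chartDataIsLoading", False))
    return state
```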

Thanks for highlighting this.
Do you happen to have the broken model.json still so we can compare it with a working one?

It would seem more natural for such attributes to always be present with an appropriate value (e.g. False), rather than only being introduced once they acquire a value.

Hmm, I’m not quite following; do you mean that it should always exist as an internal value inside the kernel rather than being saved in the json? :slight_smile:

@robertl I have attached, as a zip, my Textile model.json and the original model.json file; I think “model.json” is the one that doesn’t refresh properly on screen. However, I had to restore these from backup (since I had deleted the old files once I had restored the original model and trained it successfully), so I can’t guarantee that model.json is defective.

To clarify the other points: I was trying to say (not very well) that IMO chartDataIsLoading and isTrained should exist ab initio in the model.json (because in the beginning the model is not trained), that isTrained will start off as False, and that chartDataIsLoading could well be False initially as well (and, obviously, that both should be of the same datatype, Boolean).
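
In other words, something like this fragment existing from the moment the model is created (again, just my sketch of those two keys, not the full schema):

```python
import json

# Hypothetical initial state: both flags present from the start, both Boolean.
initial_flags = {
    "isTrained": False,           # a new model has not been trained yet
    "chartDataIsLoading": False,  # no previews are being fetched yet
}

print(json.dumps(initial_flags, indent=2))
```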

Textile model jsons.zip (5.3 KB)

Ah now I’m with you, that would make sense!

Thanks for the models, I’ll take a look.