Need a bit of help getting the prediction output I want

I tried setting up a quick prediction after doing some reading around, and the output I have been getting wasn't what I expected. I was expecting an array of floats covering all the classes, like [float, float, float], but it returns the class name of the highest-predicted class instead.

Did a bunch of reading into it and tried some stuff, but I'm still lost. Any ideas?
image

It’s doing exactly what I would have expected - returning the single most probable label for the input image - but I know what you are after: an array of the softmax float values (i.e. probabilities totalling 1) for all the potential classes. I have seen info on this somewhere, but I will need some time to track it down.
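To make that concrete, here's a minimal sketch of what softmax does to a model's raw scores (the logit values here are made up, not from your model):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating
    exps = np.exp(logits - np.max(logits))
    return exps / exps.sum()

logits = np.array([2.0, 1.0, 0.1])  # hypothetical raw scores for 3 classes
probs = softmax(logits)
print(probs)        # one float per class, e.g. highest for class 0
print(probs.sum())  # ~1.0 - they always total 1
```

That per-class array is what gets collapsed to a single label when the model only reports the top class.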

For info, how many classes do you have apart from SFW/NSFW? (Because if it's only 2, do you really want that array - or is this just a warmup for a more nuanced classification?)

UPDATE

Not sure if this will help (it’s a bit old, TF1) but just in case: https://stackoverflow.com/questions/47599436/returning-probabilities-in-a-classification-prediction-in-keras

See also this: https://stackoverflow.com/questions/56119773/my-cnn-always-result-in-0-or-1-and-never-a-percentage-why?noredirect=1&lq=1

What activation function do you have at the end? If not softmax, it may be worth trying that; if it is already softmax, someone more knowledgeable needs to speak up! (? @robertl)

Yes! That’s exactly what I want - every prebuilt model I played with for NSFW classification works like this.

My last layer is Softmax, unless I'm missing something… image

I have 3 classes. Yes, I do want that output regardless of the number of classes (they don't have to total 1) - it lets me code math functions around the values, like adding thresholds, instead of just getting the "all or nothing" result my screenshot shows. It also allows stacking multiple predictors with different models and applying more math to create a scoring system for decisions, but that's getting way ahead of myself.
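The thresholding idea above could be sketched like this (the class index, threshold value, and probabilities are all hypothetical, just to show the shape of the logic):

```python
import numpy as np

# Hypothetical distribution over 3 classes (indices 0, 1, 2)
probs = np.array([0.7, 0.1, 0.2])

NSFW_CLASS = 0    # hypothetical index of the class we care about
THRESHOLD = 0.6   # tune to taste

# Instead of an all-or-nothing argmax, act only when
# the confidence for that class clears a bar
if probs[NSFW_CLASS] >= THRESHOLD:
    decision = "flag"
else:
    decision = "pass"
print(decision)  # flag
```

With the raw distribution in hand, stacking models would just mean combining several such `probs` arrays (e.g. averaging or weighting them) before applying the threshold.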

I haven’t found the answer yet, but I did stumble upon this piece of advice on the use of softmax:

Softmax assumes that each example is a member of exactly one class. Some examples, however, can simultaneously be a member of multiple classes. For such examples:

  • You may not use Softmax.
  • You must rely on multiple logistic regressions.

So, some caution is in order depending on your categories.
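For the multi-class-membership case that advice describes, the usual alternative is independent sigmoid outputs (one logistic regression per class) instead of softmax. A tiny sketch, with made-up logit values:

```python
import numpy as np

def sigmoid(x):
    # One independent logistic output per class
    return 1.0 / (1.0 + np.exp(-x))

logits = np.array([2.0, -1.0, 0.5])  # hypothetical raw scores for 3 classes
probs = sigmoid(logits)              # each in (0, 1), independently

print(probs)               # these do NOT need to total 1
labels = probs >= 0.5      # an example can belong to several classes at once
print(labels)
```

This only matters if an image can genuinely belong to more than one of your categories; for mutually exclusive classes, softmax is the right call.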

I know I’ve seen exactly what you want, but I can’t find the right query to dig it up again.

UPDATE

This seems to output your sort of thing… a Jupyter notebook on GitHub, via a Medium article.

After playing around, I don't think I can get that kind of output even if I wanted to. I didn't notice this before, but there's something happening with the labeling: these two should reflect each other 1:1 in their values, and they don't. It also looks like it's non-editable :frowning:
Unless I'm having a big misunderstanding…

image

Hey :wave:

@RBR, for your latest comment, the last Dense layer in the model you are looking at is untrained, so it’s very unlikely that it will 100% reflect your label (the green component).
What you are seeing from the Dense layer is its output distribution - how probable it thinks each output is. When you do inference on it, it will automatically pick the highest value (through ArgMax), which in this case would be label 1 (as opposed to 0).
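As a tiny illustration of that post-processing step (the mapping and the distribution values here are made up):

```python
import numpy as np

mapping = {0: "No", 1: "Yes"}         # hypothetical label mapping
distribution = np.array([0.3, 0.7])   # what the Dense layer outputs

# The automatic post-processing is essentially just an argmax + label lookup
predicted = mapping[int(np.argmax(distribution))]
print(predicted)  # Yes
```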
I might have misunderstood your concern though, so let me know if I didn’t answer it properly :slight_smile:
Also, what would you want to edit with the labels? They are automatically set based on the datatype and CSV which you are using.

As for the earlier question in this thread - how to get the output distribution rather than the single output value:
We have a feature coming soon where you can toggle pre- and post-processing on or off, and without the post-processing you would get the output distribution.
Until then though, you can do something like this:

import tensorflow as tf
import os
from PIL import Image
import numpy as np
from tensorflow import keras
from keras import backend as K

mapping = {0: "No", 1: "Yes"}
path_to_model = "Brain_tumor/model"

def load_image(path):
    image = Image.open(path)
    image = np.array(image, dtype=np.float32)
    image = np.expand_dims(image, axis=0)
    return image

# Load some image
image = load_image("C:/Users/lundb/Documents/PerceptiLabs/Default/Code/Brain_tumor/dataset/no/1 no.jpeg")

# Load the model
model = keras.models.load_model(path_to_model)

# Take the input to the post-processing layer as the output of the model
inp = model.input
input_to_postprocessing = model.layers[-1].input
model_func = K.function(inp, input_to_postprocessing)

# Make a prediction and categorize it
prediction1 = model_func(image)
print(prediction1)  # This prints the output distribution
print(mapping[np.asarray(prediction1).argmax()])  # This prints the value you would get with post-processing

Hope that helps! :slight_smile:


So in PL there is (currently) an argmax for this sort of thing! That’s what I wasn’t seeing - all the examples I could find online were just doing prediction and getting a distribution.

NB These resources may also be useful for future reference

However, I can’t find documentation for this in TF2.

@RBR I finally found the examples I was looking for on distributions but there was no code with them so it wouldn’t have helped :expressionless:

I see - so the final graph represents the final output, not the prior output before the layer's operations have been applied. I had it backwards :smiley:

I don't want to edit labels or names - I wasn't even expecting them to be retained in the prediction output, just their int representations.

What I was looking for was a list-like output [0.7000000, 0.1000000, 0.2000000] where each class is represented by an int 0, 1, 2, which I can assign a name to in code if need be.

As for your code, it threw an exception, unfortunately.
image

EDIT: nvm, I got the code to work. Yes, thank you - that's closer to what I was looking for.

I’m glad you’ve got it going - would you mind sharing how you resolved the exception?

Instead of

from keras import backend as K

I changed it to

from tensorflow.keras import backend as K