John Ladasky

I answered my own question. I'm posting the solution for anyone who may come across this same problem.

I tried using my TF loss function directly in Keras, as was independently suggested by Matias Valdenegro. Doing so did not provoke any errors from Keras; however, the loss value immediately went to NaN.

Eventually I identified the problem. The calling convention for a Keras loss function is y_true first (which I called tgt), then y_pred (my pred). But the calling convention for a TensorFlow loss function is pred first, then tgt. So if you want to keep a TensorFlow-native version of the loss function around, this fix works:

def keras_l2_angle_distance(tgt, pred):
    # Keras passes (y_true, y_pred); swap them to match the
    # TensorFlow-native (pred, tgt) argument order.
    return l2_angle_distance(pred, tgt)

<snip>

model.compile(loss=keras_l2_angle_distance, optimizer="something")

Maybe Theano or CNTK uses the same parameter order as Keras; I don't know. But I'm back in business.
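To see why the silent argument swap matters, here is a minimal sketch using a hypothetical asymmetric loss (my actual l2_angle_distance is not shown here; this stand-in just weights the squared error by the target, so its two arguments are not interchangeable):

```python
import numpy as np

def tf_style_loss(pred, tgt):
    # TensorFlow-style argument order: predictions first, targets second.
    # Squared error weighted by the target magnitude -> asymmetric,
    # so swapping the arguments changes the result.
    return float(np.mean(tgt * (pred - tgt) ** 2))

def keras_style_loss(tgt, pred):
    # Keras calls loss(y_true, y_pred); this wrapper restores the
    # (pred, tgt) order that the TF-style function expects.
    return tf_style_loss(pred, tgt)

tgt = np.array([1.0, 2.0, 3.0])
pred = np.array([1.5, 1.5, 2.5])

# Correct value via the wrapper:
print(keras_style_loss(tgt, pred))   # 0.5
# What Keras would compute if handed the TF-order function directly
# (arguments silently swapped) -- a different, wrong number:
print(tf_style_loss(tgt, pred))      # ~0.4583
```

With a purely symmetric loss (e.g. plain mean squared error) the swap would go unnoticed, which is exactly why this bug produces no error message, only wrong loss values.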
