I am currently trying to modify how the error is computed for one of the variables my network predicts. I still want to use MSE, but I would like to change the "difference" part of the equation, because the variable represents an angle in degrees.
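To make the problem concrete: the naive difference between 359° and 1° is 358°, while the true angular error is only 2°. A minimal sketch of the wrap-around correction I am after, in plain Python:

```python
def angular_error(a, b):
    """Wrap-around absolute difference between two angles in degrees."""
    raw = abs(a - b) % 360
    return min(raw, 360 - raw)

print(angular_error(359, 1))   # → 2, not 358
print(angular_error(10, 50))   # → 40 (unchanged when no wrap is needed)
```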
I have tried a few things, but none has worked yet. I first tried a somewhat naive iterative approach:
```python
def custom_mean_squared_loss(y_true, y_pred):
    diff = y_true - y_pred
    data_shape = y_pred.get_shape()
    for sample in range(35):  # batch size
        for timestep in range(data_shape[1]):
            # wrap-around error for the angle feature (index 6)
            error1 = tf.abs(diff[sample][timestep][6])
            error2 = 360 - error1
            corrected_err = tf.minimum(error1, error2)
            test = tf.gather_nd(diff, [[sample, timestep, 6]])
            test.assign(corrected_err)
```
But as far as I understand, TensorFlow needs the operations to be clearly stated as part of the graph in order to evaluate them and compute the gradient of the loss function, so I tried to remove the loops and let TensorFlow do the job:
```python
def custom_mean_squared_loss(y_true, y_pred):
    diff = y_true - y_pred
    data_shape = y_pred.get_shape()
    # wrap-around error for the angle feature (index 6)
    error1 = tf.abs(diff[:, :, 6])
    error2 = 360 - error1
    corrected_err = tf.minimum(error1, error2)
    diff[:, :, 6].assign(corrected_err)  # this line raises the error below
    return tf.reduce_mean(tf.square(diff), axis=-1)
```
However, I can't manage to get the assignment line to run:
```
ValueError: Sliced assignment is only supported for variables
```
Is there a way to apply this correction when all I have access to inside the loss function are the tensors y_true and y_pred?
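For reference, here is a sketch of the behavior I am trying to get, written with pure element-wise ops and no sliced assignment. It is in NumPy so it is easy to check by hand, but the same structure should translate to TensorFlow by swapping `np.where`/`np.minimum`/`np.mean` for `tf.where`/`tf.minimum`/`tf.reduce_mean` (the angle channel at feature index 6 and the 360° range are the assumptions from my snippets above):

```python
import numpy as np

def custom_mean_squared_loss(y_true, y_pred):
    """MSE where feature index 6 of the last axis is an angle in degrees."""
    diff = y_true - y_pred                          # (batch, time, features)
    abs_diff = np.abs(diff)
    # Wrap-around error, computed for every channel but only used for the angle
    wrapped = np.minimum(abs_diff, 360.0 - abs_diff)
    # Boolean mask selecting feature 6 along the last axis
    is_angle = np.arange(diff.shape[-1]) == 6
    # Select wrapped error on the angle channel, plain difference elsewhere
    corrected = np.where(is_angle, wrapped, diff)
    return np.mean(np.square(corrected), axis=-1)
```

The idea is that instead of mutating `diff` in place (which only works on variables), a new tensor is built by selecting between the corrected and uncorrected values, which keeps everything differentiable.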