
As the title asks: can I penalize some kinds of misclassification more heavily than others?

For example, consider a model that predicts the probability of a stock price rising or falling. Treat this as a three-class classification problem (RISING / NO CHANGE / FALLING).

If the model predicts RISING while the truth is NO CHANGE, I want it to incur a normal loss;

if it predicts RISING but the truth is FALLING, I want it to incur a larger loss.

How can I do this?


1 Answer


Yes, but it would help to know why the model misclassifies in the first place. Often the cause is imbalanced data, so start by looking at the target class distribution.

Depending on the model, you can use a regularization technique or assign weights to the classes. For example, in sklearn with a decision tree:

from sklearn.tree import DecisionTreeClassifier

# class 1 gets weight 0, so clf will only ever predict class 0
clf = DecisionTreeClassifier(random_state=0, class_weight={0: 1, 1: 0})

https://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html
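The same idea carries over to PyTorch: `nn.CrossEntropyLoss` accepts a per-class `weight` tensor that scales each sample's loss by the weight of its true class. A minimal sketch (the class ordering 0 = FALLING, 1 = NO CHANGE, 2 = RISING and the weight values are assumptions for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical class order: 0 = FALLING, 1 = NO CHANGE, 2 = RISING.
# `weight` scales each sample's loss by the weight of its TRUE class,
# so mistakes on costly classes count for more in the average.
weights = torch.tensor([2.0, 1.0, 1.0])   # mistakes on FALLING count double
criterion = nn.CrossEntropyLoss(weight=weights)

logits = torch.tensor([[0.1, 0.2, 2.5],   # model strongly favors RISING
                       [0.1, 0.2, 2.5]])
targets = torch.tensor([0, 1])            # truths: FALLING, NO CHANGE
loss = criterion(logits, targets)
```

Note that `weight` only conditions on the true class, not on which wrong class was predicted, so it cannot by itself distinguish "RISING when NO CHANGE" from "RISING when FALLING".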

  • Thanks for your answer! I'm using a GRU model built with PyTorch. I have resolved the class imbalance by randomly dropping negative samples. Could you please give some hints for RNN/GRU-like models?
    – EvilRoach
    Commented Sep 26, 2022 at 1:08
  • I can't give you any specific tips, as I know little about GRUs; I've only used an LSTM once. But I recommend using a hyperparameter optimizer like Optuna (which I currently use) or Hyperopt. Then you could test different hyperparameters, including the regularization.
    – gaspar
    Commented Sep 26, 2022 at 1:44
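Since the comments mention PyTorch, one way to get the asymmetric penalty the question asks for, i.e. a penalty that depends on the (true, predicted) pair rather than only the true class, is to weight the loss by a cost matrix. This is a sketch, not part of the accepted approach above; the class ordering and cost values are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

# Hypothetical cost matrix: rows = true class, columns = predicted class.
# Classes: 0 = FALLING, 1 = NO CHANGE, 2 = RISING.
# Predicting RISING when the truth is FALLING (cost[0, 2] = 2.0) costs more
# than predicting RISING when the truth is NO CHANGE (cost[1, 2] = 1.0).
cost = torch.tensor([[0.0, 1.0, 2.0],
                     [1.0, 0.0, 1.0],
                     [2.0, 1.0, 0.0]])

def cost_sensitive_loss(logits, targets, cost):
    # Expected cost: probability the model assigns to each class,
    # weighted by the cost of predicting that class given the true class,
    # averaged over the batch. Correct predictions (diagonal) cost 0.
    probs = F.softmax(logits, dim=1)              # (batch, n_classes)
    per_sample = (probs * cost[targets]).sum(dim=1)
    return per_sample.mean()

logits = torch.tensor([[0.1, 0.2, 2.5]])  # model strongly favors RISING
targets = torch.tensor([0])               # truth is FALLING -> large penalty
loss = cost_sensitive_loss(logits, targets, cost)
```

Because the loss is differentiable in the logits, it can replace `nn.CrossEntropyLoss` in an ordinary training loop for a GRU or any other classifier head.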
