Questions tagged [feature-importances]
The feature-importances tag has no usage guidance.
42 questions
0 votes · 0 answers · 16 views
Feature importance and classes importance
I have the following trained time-series classification TensorFlow model:
...
0 votes · 0 answers · 16 views
Understanding most important features from an additional column
I'm fairly new to data science in general and I'm doing some analysis. Let us say I have N rows and D features, and I have a ...
0 votes · 0 answers · 32 views
How to get Shap feature importance values for a random forest classifier at local level
I have a random forest classifier (binary) model that I'm using to run prediction on unseen test data. For each observation in the test data, I want to get the feature importance (which feature was ...
0 votes · 0 answers · 20 views
How to get feature importance on unseen test data
I trained a random forest classifier with a set of features and saved the model. (the features were selected based on their correlations with the response variable. Only those features with ...
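For questions like this one, permutation importance computed on the held-out split is a common answer. A minimal sketch, assuming an sklearn `RandomForestClassifier`; the synthetic dataset and all parameters here are illustrative, not from the question:

```python
# Sketch: permutation importance on *unseen* test data, so the scores
# reflect each feature's contribution to generalization performance.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle one feature at a time in the test set and measure the score drop.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)
ranking = np.argsort(result.importances_mean)[::-1]
print(ranking[:3])  # indices of the three most influential features
```

Unlike the impurity-based `feature_importances_` attribute, which is fixed at training time, this can be recomputed on any labeled dataset the saved model is applied to.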
0 votes · 0 answers · 26 views
Which method is better to understand key drivers/feature importance in prediction?
After applying two different classifiers (EBM Classifier and Random Forest Classifier) and getting similar scores, I used InterpretML functionality to identify the most relevant features in each model....
0 votes · 1 answer · 36 views
Effect of feature selection when coupled with XGB models
I ran Boruta feature selection prior to the XGB training/testing step and didn't see any difference, although ~30/200 features were rejected before going into training.
Can it be that internal ...
0 votes · 0 answers · 32 views
Measuring features effect and importance in Partial Least Square (PLS) regression
Context: it is possible to assess feature importance and effect for a model using model-independent scoring techniques such as Partial Dependence (PD) profiles, Accumulated Local Effect (ALE) profiles, ...
1 vote · 0 answers · 212 views
Feature importance using random forest vs. SHAP
I recently came across SHAP while looking for feature-importance methods. To use SHAP, first a model needs to be created, and then based on the predictions made by the model, SHAP values are ...
1 vote · 3 answers · 224 views
Feature importance score for a feature that contains mostly 0's in XGBoost
I have read that the feature importance scores are calculated based on how a split on that feature improves performance.
I have a binary classification dataset and am running XGBoost classifier on it. ...
1 vote · 0 answers · 64 views
Feature Importance in Stacked Model
I have built a stacked model using mlxtend's StackingCVClassifier. I want to know the feature importance scores now. Is there any way I can calculate feature importance scores for the stacked model? If ...
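Because permutation importance is model-agnostic, it works for any fitted estimator, including stacked ensembles. A sketch of that idea, using sklearn's `StackingClassifier` as a stand-in for mlxtend's `StackingCVClassifier` (the data and base learners here are made up for illustration):

```python
# Sketch: model-agnostic feature importance for a stacked classifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=400, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two base learners feeding a logistic-regression meta-learner.
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000),
).fit(X_tr, y_tr)

# Shuffle one feature at a time and measure the drop in the stack's score;
# this treats the whole ensemble as a black box.
result = permutation_importance(stack, X_te, y_te,
                                n_repeats=5, random_state=0)
print(result.importances_mean.round(3))
```

The same call should work unchanged on an mlxtend stack, since `permutation_importance` only needs a fitted estimator with a `score` (or `predict`) method.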
1 vote · 1 answer · 262 views
Why is the marginal contribution of a feature the difference between the feature effect and the average effect?
In several sources the marginal contribution is defined as
the difference between the prediction with and without the feature.
However, recently I read an article where the marginal contribution was ...
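For reference, the standard Shapley formulation behind the first definition quoted above (notation here is the usual game-theoretic one, not taken from the question):

```latex
% Shapley value of feature i over player set N, |N| = n; v is the value
% function (e.g. the model's expected prediction given the features in S).
% The bracketed term is the marginal contribution of i to coalition S.
\phi_i = \sum_{S \subseteq N \setminus \{i\}}
         \frac{|S|!\,(n - |S| - 1)!}{n!}
         \left[ v(S \cup \{i\}) - v(S) \right]
```

The "prediction with minus prediction without" phrasing refers to the bracketed marginal term for a single coalition; the Shapley value itself is the weighted average of that term over all coalitions.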
2 votes · 2 answers · 464 views
Feature importance in xgboost
I've been reading that feature importance in xgboost is computed the same way as in random forests. However, the learning rate reduces the effect of downstream trees. Is the learning rate taken into ...
0 votes · 1 answer · 963 views
Shapley Values - How to interpret each value for each feature for a specific instance?
I am using SHAP values (the 'shap' module in Python) to help me understand a bit better the relation between my features and my target. I am currently working on a binary classification problem.
I know ...
3 votes · 0 answers · 479 views
Capturing the 'direction' of feature importances using TreeSHAP?
I'm a machine learning / Python novice, so apologies if my question is simple, but I haven't been able to find it addressed.
I'm very interested in using ML to determine the most important features ...
0 votes · 1 answer · 3k views
Decision tree vs logistic regression feature importances
I have trained logistic regression and a decision tree in sklearn on the same standardized dataset (binary classification).
Top important coefficients for the decision tree are (sorted by ...
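The two rankings being compared in questions like this come from different mechanisms: |coefficients| of a linear model vs. impurity-based importances of a tree. A minimal sketch of the comparison, assuming both models are fit on the same standardized data (dataset and rankings here are illustrative only):

```python
# Sketch: compare logistic-regression |coef_| with a decision tree's
# impurity-based feature_importances_ on standardized data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X = StandardScaler().fit_transform(X)  # standardize so |coef_| is comparable

lr = LogisticRegression(max_iter=1000).fit(X, y)
dt = DecisionTreeClassifier(random_state=0).fit(X, y)

lr_rank = np.argsort(np.abs(lr.coef_[0]))[::-1]      # global linear effect
dt_rank = np.argsort(dt.feature_importances_)[::-1]  # total impurity reduction
print("LR ranking:", lr_rank)
print("DT ranking:", dt_rank)
```

The rankings can legitimately disagree: coefficients measure a global, additive effect, while tree importances reward features used in high-impurity splits and can capture interactions and non-linearities the linear model misses.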