
Evaluating SHAP explanation consistency

Evaluate the consistency of feature importance explanations using SHAP values across two different subsets of the insurance dataset.

The subsets X1, X2, y1, and y2 have been pre-loaded for you along with model1 trained on the first subset and model2 trained on the second subset.
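If you want to reproduce a similar setup outside the course environment, a minimal sketch is shown below. The file name insurance.csv, the target column charges, and the choice of RandomForestRegressor are assumptions; in the exercise itself these objects are already pre-loaded.

# Minimal sketch of how the pre-loaded objects could be built locally.
# Assumptions: "insurance.csv" with a numeric target column "charges",
# and tree-based models so that shap.TreeExplainer applies.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

insurance = pd.read_csv("insurance.csv")
X = pd.get_dummies(insurance.drop(columns=["charges"]))  # one-hot encode categoricals
y = insurance["charges"]

# Split the data into two disjoint subsets and train one model per subset
X1, X2, y1, y2 = train_test_split(X, y, test_size=0.5, random_state=42)
model1 = RandomForestRegressor(random_state=42).fit(X1, y1)
model2 = RandomForestRegressor(random_state=42).fit(X2, y2)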

This exercise is part of the course

Explainable AI in Python


Exercise instructions

  • Calculate shap_values1 and feature_importance1 for model1.
  • Calculate shap_values2 and feature_importance2 for model2.
  • Calculate the consistency between the two feature importance vectors.

Hands-on interactive exercise

Try this exercise by completing the following sample code.

# Calculate SHAP values and feature importance for model1
explainer1 = shap.TreeExplainer(model1)
shap_values1 = ____
feature_importance1 = ____

# Calculate SHAP values and feature importance for model2
explainer2 = shap.TreeExplainer(model2)
shap_values2 = ____
feature_importance2 = ____

# Consistency calculation
consistency = ____
print("Consistency between SHAP values:", consistency)
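One way to fill in the blanks is sketched below; this is an assumption about the intended solution, not the official answer. It assumes shap and numpy are available, that model1 and model2 are tree-based models, and it measures consistency with the Pearson correlation between the two importance vectors (cosine similarity would be an equally reasonable choice).

# Possible solution sketch (assumptions noted in the lead-in above)
import numpy as np
import shap

# SHAP values and mean-absolute-SHAP feature importance for model1
explainer1 = shap.TreeExplainer(model1)
shap_values1 = explainer1.shap_values(X1)
feature_importance1 = np.abs(shap_values1).mean(axis=0)

# SHAP values and mean-absolute-SHAP feature importance for model2
explainer2 = shap.TreeExplainer(model2)
shap_values2 = explainer2.shap_values(X2)
feature_importance2 = np.abs(shap_values2).mean(axis=0)

# Consistency: Pearson correlation between the two per-feature importance vectors
consistency = np.corrcoef(feature_importance1, feature_importance2)[0, 1]
print("Consistency between SHAP values:", consistency)

A consistency value close to 1 indicates that the two models, trained on different subsets, rank features in a very similar way; values near 0 suggest the explanations are unstable across subsets.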