I am using an LSTM to predict monthly runoff from 9 feature variables (air temperature, precipitation, etc.). There are 746 months of data in total, split 80% training / 20% test, with a sliding window of 12. I want to use SHAP to explain the model, but computing the SHAP values raises an error.
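For context, here is a minimal sketch of the setup described above (the variable names, the windowing helper, and the 64 LSTM units are my own assumptions, not the exact code):

import numpy as np
import tensorflow as tf

WINDOW = 12       # 12-month sliding window
N_FEATURES = 9    # temperature, precipitation, etc.

def make_windows(features, target, window=WINDOW):
    # turn a (746, 9) feature matrix and a (746,) runoff series into
    # samples of shape (window, 9) labelled with the next month's runoff
    X, y = [], []
    for i in range(len(features) - window):
        X.append(features[i:i + window])
        y.append(target[i + window])
    return np.array(X), np.array(y)

# X, y = make_windows(climate_features, runoff)
# split = int(len(X) * 0.8)
# train_dataset, test_dataset = X[:split], X[split:]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(WINDOW, N_FEATURES)),  # 64 units is an assumption
    tf.keras.layers.Dense(1),                                    # predicted runoff
])
model.compile(optimizer="adam", loss="mse")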
The SHAP code that fails (the background set is currently the whole test set, shape (n_test_samples, 12, 9)):

import shap

# use the test windows as the background data for DeepExplainer
background = test_dataset[0:,]
explainer = shap.DeepExplainer(model, background)
shap_values = explainer.shap_values(test_dataset[0:,])

Running this produces the following traceback:
AttributeError Traceback (most recent call last)
C:\Users\BIANLE~1\AppData\Local\Temp/ipykernel_25780/2367290389.py in <module>
5 #eval_model = load_model('best_model.hdf5')
6 explainer=shap.DeepExplainer(model,background)
-> 7 shap_values=explainer.shap_values(test_dataset[0:,])
E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\__init__.py in shap_values(self, X, ranked_outputs, output_rank_order, check_additivity)
122 were chosen as "top"
123
-> 124 return self.explainer.shap_values(X, ranked_outputs, output_rank_order, check_additivity=check_additivity)
E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\deep_tf.py in shap_values(self, X, ranked_outputs, output_rank_order, check_additivity)
306 # run attribution computation graph
307 feature_ind = model_output_ranks[j,i]
-> 308 sample_phis = self.run(self.phi_symbolic(feature_ind), self.model_inputs, joint_input)
309
310 # assign the attributions to the right part of the output arrays
E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\deep_tf.py in run(self, out, model_inputs, X)
363
364 return final_out
-> 365 return self.execute_with_overridden_gradients(anon)
366
367 def custom_grad(self, op, *grads):
E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\deep_tf.py in execute_with_overridden_gradients(self, f)
399 # define the computation graph for the attribution values using a custom gradient-like computation
400 try:
-> 401 out = f()
402 finally:
403 # reinstate the backpropagatable check
E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\deep_tf.py in anon()
359 v = tf.constant(data, dtype=self.model_inputs[i].dtype)
360 inputs.append(v)
-> 361 final_out = out(inputs)
362 tf_execute.record_gradient = tf_backprop._record_gradient
363
E:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\util\traceback_utils.py in error_handler(*args, **kwargs)
151 except Exception as e:
152 filtered_tb = _process_traceback_frames(e.__traceback__)
-> 153 raise e.with_traceback(filtered_tb) from None
154 finally:
155 del filtered_tb
E:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\framework\func_graph.py in autograph_handler(*args, **kwargs)
1145 except Exception as e: # pylint:disable=broad-except
1146 if hasattr(e, "ag_error_metadata"):
-> 1147 raise e.ag_error_metadata.to_exception(e)
1148 else:
1149 raise
AttributeError: in user code:
File "E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\deep_tf.py", line 243, in grad_graph *
out = self.model(shap_rAnD)
File "E:\Anaconda\envs\tensorflow\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler **
raise e.with_traceback(filtered_tb) from None
File "E:\Anaconda\envs\tensorflow\lib\site-packages\shap\explainers\_deep\deep_tf.py", line 26, in custom_record_gradient
out = tf_backprop._record_gradient("shap_"+op_name, inputs, attrs, results)
AttributeError: Exception encountered when calling layer "LSTM" (type LSTM).
module 'tensorflow.python.eager.backprop' has no attribute '_record_gradient'
Call arguments received:
• inputs=tf.Tensor(shape=(280, 12, 9), dtype=float32)
• mask=None
• training=False
• initial_state=None
This code had been failing with other errors; after those were resolved, this is the error that remains.
I want to compute the SHAP values so that I can see which feature variables have the greatest influence on the predicted runoff.
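For reference, this is roughly how I plan to summarize the results once shap_values can be computed. It is only a sketch, assuming the explainer returns attributions of shape (n_samples, 12, 9); averaging over the 12-month window and the placeholder feature names are my own choices:

import numpy as np
import shap

feature_names = [f"feature_{i}" for i in range(9)]  # placeholder; replace with the real predictor names

# DeepExplainer usually returns a list with one array per model output;
# a single-output regression model gives one array of shape (n_samples, 12, 9)
sv = shap_values[0] if isinstance(shap_values, list) else shap_values

# collapse the time dimension: mean |SHAP| over the 12-month window -> (n_samples, 9)
sv_per_feature = np.abs(sv).mean(axis=1)
x_per_feature = test_dataset.mean(axis=1)  # matching 2-D feature matrix for the plot

shap.summary_plot(sv_per_feature, x_per_feature, feature_names=feature_names)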