Getting ValueError: Unsupported explanation type when calling - ResponsibleAIDashboard(rai_insights) #2599
It is part of the example notebook notebooks/responsibleaidashboard/text/responsibleaidashboard-blbooksgenre-binary-text-classification-model-debugging.ipynb.
@bandipavan sorry about the issue you are encountering. Can you please run:
and downgrade/upgrade shap:
There were some breaking changes in shap, and this issue looks like it might be related to that package.
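To gather the version information being asked for, a small sketch like the following can report what is installed (the package list here is an assumption; adjust it to your environment):

```python
from importlib import metadata

def pkg_version(name):
    """Return the installed version of a distribution, or a placeholder."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return "not installed"

# Packages relevant to this thread; extend as needed.
for pkg in ["shap", "interpret-community", "slicer", "numpy", "raiwidgets"]:
    print(f"{pkg}: {pkg_version(pkg)}")
```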
Thank you for the prompt response. I applied the changes, but there is no difference.

```
pip install --upgrade "shap<=0.44.0"
Installing collected packages: shap
python .\ValidateModel.py
```
This looks like a problem with using an older version of the interpret package.
@bandipavan The key error in the stack trace above is:
This is coming from numpy>1.26 and older interpret package code.
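The underlying numpy behavior change can be seen in isolation: since numpy 1.24, building an array from a ragged (inhomogeneous) nested sequence raises ValueError unless `dtype=object` is passed explicitly, which is what the `numpy.array(inner)` call inside slicer trips over. A minimal sketch:

```python
import numpy as np

def build_array(rows):
    """Construct an array, falling back to an object array for ragged
    input, which newer numpy no longer does implicitly."""
    try:
        return np.array(rows)
    except ValueError:
        # numpy >= 1.24 raises here for inhomogeneous shapes
        return np.array(rows, dtype=object)

homogeneous = build_array([[1, 2], [3, 4]])   # regular 2-D array
ragged = build_array([[1, 2, 3], [4]])        # object array of lists
```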
@imatiach-msft I applied the suggested changes, but unfortunately I am still getting the same error. Here is the list of packages with their versions (truncated):

```
Package            Version
aiohappyeyeballs   2.5.0
```

Stack trace:

```
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:317, in ExplainerManager._compute_global_importances(self, explanation)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\shap\_explanation.py:422, in Explanation.__getitem__(self, item)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer.py:112, in Slicer.__getitem__(self, item)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:69, in AtomicSlicer.__getitem__(self, item)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:583, in UnifiedDataHandler.slice(cls, o, index_tup, max_dim)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:443, in ArrayHandler.tail_slice(cls, o, tail_index, max_dim, flatten)
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (20,) + inhomogeneous part.

The above exception was the direct cause of the following exception:

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard.py:40, in ResponsibleAIDashboard.__init__(self, analysis, public_ip, port, locale, cohort_list, is_private_link, **kwargs)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard_input.py:41, in ResponsibleAIDashboardInput.__init__(self, analysis, cohort_list)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\rai_text_insights\rai_text_insights.py:490, in RAITextInsights.get_data(self)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in ExplainerManager.get_data(self)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in <listcomp>(.0)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:303, in ExplainerManager._get_interpret(self, explanation)
ValueError: Unsupported explanation type
```
@bandipavan can you try upgrading slicer to 0.0.8:
I think this commit may have fixed it: Or at least it seems like the package should work better with the latest version of numpy. I'm hoping that the latest 0.0.8 will work.
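A small guard along these lines (the helper name is illustrative, not part of any library) can check that the installed slicer meets the suggested minimum before building the dashboard:

```python
from importlib import metadata

def installed_at_least(package, min_version):
    """Return True if `package` is installed at `min_version` or newer.

    Only numeric dotted versions are handled; a missing package or an
    unparseable version counts as too old.
    """
    try:
        parts = metadata.version(package).split(".")
        installed = tuple(int(p) for p in parts[:3])
    except (metadata.PackageNotFoundError, ValueError):
        return False
    return installed >= min_version

if not installed_at_least("slicer", (0, 0, 8)):
    print("slicer < 0.0.8 detected; try: pip install --upgrade slicer==0.0.8")
```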
Describe the bug
Getting ValueError: Unsupported explanation type when calling - ResponsibleAIDashboard(rai_insights)
To Reproduce
This is part of responsibleaidashboard-blbooksgenre-binary-text-classification-model-debugging.ipynb.
Stack trace
ValueError Traceback (most recent call last)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:290, in ExplainerManager._get_interpret(self, explanation)
289 importances = FeatureImportance()
--> 290 features, scores, intercept = self._compute_global_importances(
291 explanation)
292 importances.featureNames = features
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:317, in ExplainerManager._compute_global_importances(self, explanation)
316 if is_classif_task:
--> 317 global_exp = explanation[:, :, :].mean(0)
318 features = convert_to_list(global_exp.feature_names)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\shap\_explanation.py:422, in Explanation.__getitem__(self, item)
421 new_self = copy.copy(self)
--> 422 new_self._s = new_self._s.__getitem__(item)
423 new_self.op_history.append({
424 "name": "__getitem__",
425 "args": (item,),
426 "prev_shape": self.shape
427 })
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer.py:112, in Slicer.__getitem__(self, item)
111 slicer_index = index_slicer[tracked.dim]
--> 112 sliced_o = tracked[slicer_index]
113 sliced_dim = resolve_dim(index_tup, tracked.dim)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:69, in AtomicSlicer.__getitem__(self, item)
68 # Slice according to object type.
---> 69 return UnifiedDataHandler.slice(self.o, index_tup, self.max_dim)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:583, in UnifiedDataHandler.slice(cls, o, index_tup, max_dim)
582 is_element, sliced_o, cut = head_slice(o, index_tup, max_dim)
--> 583 out = tail_slice(sliced_o, index_tup[cut:], max_dim - cut, is_element)
584 return out
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:443, in ArrayHandler.tail_slice(cls, o, tail_index, max_dim, flatten)
441 import numpy
--> 443 return numpy.array(inner)
444 elif _safe_isinstance(o, "torch", "Tensor"):
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (20,) + inhomogeneous part.
The above exception was the direct cause of the following exception:
ValueError Traceback (most recent call last)
Cell In[1], line 98
94 rai_insights.error_analysis.add()
96 rai_insights.compute()
---> 98 ResponsibleAIDashboard(rai_insights)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard.py:40, in ResponsibleAIDashboard.__init__(self, analysis, public_ip, port, locale, cohort_list, is_private_link, **kwargs)
36 def __init__(self, analysis: RAIInsights,
37 public_ip=None, port=None, locale=None,
38 cohort_list=None, is_private_link=False,
39 **kwargs):
---> 40 self.input = ResponsibleAIDashboardInput(
41 analysis, cohort_list=cohort_list)
43 super(ResponsibleAIDashboard, self).__init__(
44 dashboard_type="ResponsibleAI",
45 model_data=self.input.dashboard_input,
(...) 50 is_private_link=is_private_link,
51 **kwargs)
53 def predict():
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard_input.py:41, in ResponsibleAIDashboardInput.__init__(self, analysis, cohort_list)
39 model = analysis.model
40 self._is_classifier = is_classifier(model)
---> 41 self.dashboard_input = analysis.get_data()
43 self._validate_cohort_list(cohort_list)
45 self._feature_length = len(self.dashboard_input.dataset.feature_names)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\rai_text_insights\rai_text_insights.py:490, in RAITextInsights.get_data(self)
488 data = RAIInsightsData()
489 data.dataset = self._get_dataset()
--> 490 data.modelExplanationData = self.explainer.get_data()
491 data.errorAnalysisData = self.error_analysis.get_data()
492 return data
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in ExplainerManager.get_data(self)
257 def get_data(self):
258 """Get explanation data
259
260 :return: A array of ModelExplanationData.
261 :rtype: List[ModelExplanationData]
262 """
--> 263 return [self._get_interpret(i) for i in self.get()]
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in <listcomp>(.0)
257 def get_data(self):
258 """Get explanation data
259
260 :return: A array of ModelExplanationData.
261 :rtype: List[ModelExplanationData]
262 """
--> 263 return [self._get_interpret(i) for i in self.get()]
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:303, in ExplainerManager._get_interpret(self, explanation)
301 interpretation.precomputedExplanations = precomputedExplanations
302 except Exception as ex:
--> 303 raise ValueError(
304 "Unsupported explanation type") from ex
305 return interpretation
ValueError: Unsupported explanation type
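The generic "Unsupported explanation type" message hides the real failure because `_get_interpret` wraps any exception via `raise ... from ex`, which is why the root cause (the slicer/numpy error) only appears above the "direct cause" line. A minimal sketch of that chaining pattern, with simplified stand-in functions rather than the actual library code:

```python
def compute_global_importances():
    # Stand-in for the slicer/numpy failure deep in the call stack.
    raise ValueError("setting an array element with a sequence.")

def get_interpret():
    try:
        compute_global_importances()
    except Exception as ex:
        # Wrapping loses the specific message in the headline error but
        # keeps the original exception reachable via __cause__.
        raise ValueError("Unsupported explanation type") from ex

try:
    get_interpret()
except ValueError as err:
    print(err)            # the generic message users see
    print(err.__cause__)  # the real root cause
```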