Getting ValueError: Unsupported explanation type when calling - ResponsibleAIDashboard(rai_insights) #2599

Open
bandipavan opened this issue Mar 6, 2025 · 7 comments

@bandipavan

Describe the bug
Getting ValueError: Unsupported explanation type when calling ResponsibleAIDashboard(rai_insights)

To Reproduce
This error occurs while running part of the example notebook responsibleaidashboard-blbooksgenre-binary-text-classification-model-debugging.ipynb.

Stack trace
ValueError Traceback (most recent call last)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:290, in ExplainerManager._get_interpret(self, explanation)
289 importances = FeatureImportance()
--> 290 features, scores, intercept = self._compute_global_importances(
291 explanation)
292 importances.featureNames = features

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:317, in ExplainerManager._compute_global_importances(self, explanation)
316 if is_classif_task:
--> 317 global_exp = explanation[:, :, :].mean(0)
318 features = convert_to_list(global_exp.feature_names)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\shap\_explanation.py:422, in Explanation.__getitem__(self, item)
421 new_self = copy.copy(self)
--> 422 new_self._s = new_self._s.__getitem__(item)
423 new_self.op_history.append({
424 "name": "getitem",
425 "args": (item,),
426 "prev_shape": self.shape
427 })

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer.py:112, in Slicer.__getitem__(self, item)
111 slicer_index = index_slicer[tracked.dim]
--> 112 sliced_o = tracked[slicer_index]
113 sliced_dim = resolve_dim(index_tup, tracked.dim)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:69, in AtomicSlicer.__getitem__(self, item)
68 # Slice according to object type.
---> 69 return UnifiedDataHandler.slice(self.o, index_tup, self.max_dim)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:583, in UnifiedDataHandler.slice(cls, o, index_tup, max_dim)
582 is_element, sliced_o, cut = head_slice(o, index_tup, max_dim)
--> 583 out = tail_slice(sliced_o, index_tup[cut:], max_dim - cut, is_element)
584 return out

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:443, in ArrayHandler.tail_slice(cls, o, tail_index, max_dim, flatten)
441 import numpy
--> 443 return numpy.array(inner)
444 elif _safe_isinstance(o, "torch", "Tensor"):

ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (20,) + inhomogeneous part.

The above exception was the direct cause of the following exception:

ValueError Traceback (most recent call last)
Cell In[1], line 98
94 rai_insights.error_analysis.add()
96 rai_insights.compute()
---> 98 ResponsibleAIDashboard(rai_insights)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard.py:40, in ResponsibleAIDashboard.__init__(self, analysis, public_ip, port, locale, cohort_list, is_private_link, **kwargs)
36 def __init__(self, analysis: RAIInsights,
37 public_ip=None, port=None, locale=None,
38 cohort_list=None, is_private_link=False,
39 **kwargs):
---> 40 self.input = ResponsibleAIDashboardInput(
41 analysis, cohort_list=cohort_list)
43 super(ResponsibleAIDashboard, self).__init__(
44 dashboard_type="ResponsibleAI",
45 model_data=self.input.dashboard_input,
(...) 50 is_private_link=is_private_link,
51 **kwargs)
53 def predict():

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard_input.py:41, in ResponsibleAIDashboardInput.__init__(self, analysis, cohort_list)
39 model = analysis.model
40 self._is_classifier = is_classifier(model)
---> 41 self.dashboard_input = analysis.get_data()
43 self._validate_cohort_list(cohort_list)
45 self._feature_length = len(self.dashboard_input.dataset.feature_names)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\rai_text_insights\rai_text_insights.py:490, in RAITextInsights.get_data(self)
488 data = RAIInsightsData()
489 data.dataset = self._get_dataset()
--> 490 data.modelExplanationData = self.explainer.get_data()
491 data.errorAnalysisData = self.error_analysis.get_data()
492 return data

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in ExplainerManager.get_data(self)
257 def get_data(self):
258 """Get explanation data
259
260 :return: A array of ModelExplanationData.
261 :rtype: List[ModelExplanationData]
262 """
--> 263 return [self._get_interpret(i) for i in self.get()]

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in <listcomp>(.0)
257 def get_data(self):
258 """Get explanation data
259
260 :return: A array of ModelExplanationData.
261 :rtype: List[ModelExplanationData]
262 """
--> 263 return [self._get_interpret(i) for i in self.get()]

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:303, in ExplainerManager._get_interpret(self, explanation)
301 interpretation.precomputedExplanations = precomputedExplanations
302 except Exception as ex:
--> 303 raise ValueError(
304 "Unsupported explanation type") from ex
305 return interpretation

ValueError: Unsupported explanation type

@bandipavan (Author)

It is part of the example notebook notebooks/responsibleaidashboard/text/responsibleaidashboard-blbooksgenre-binary-text-classification-model-debugging.ipynb.

@imatiach-msft (Contributor) commented Mar 6, 2025

@bandipavan sorry about the issue you are encountering. Can you please run:

pip show shap

and then downgrade/upgrade shap with:

pip install --upgrade "shap<=0.44.0"

There were some breaking changes in shap, and this issue looks like it might be related to that package.
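
As a quick sanity check after reinstalling (and restarting the notebook kernel), a minimal sketch like this, using the standard-library importlib.metadata, confirms which shap version the kernel actually loaded:

# Minimal version check from inside Python, equivalent to `pip show shap`.
# Run this in the notebook after restarting the kernel.
from importlib.metadata import version
print(version("shap"))  # should report 0.44.0 or lower after the pin above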

@bandipavan (Author) commented Mar 7, 2025

Thank you for the prompt response. I applied the changes, but there is no difference.

pip install --upgrade "shap<=0.44.0"

Installing collected packages: shap
Attempting uninstall: shap
Found existing installation: shap 0.43.0
Uninstalling shap-0.43.0:
Successfully uninstalled shap-0.43.0
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
econml 0.15.1 requires shap<0.44.0,>=0.38.1, but you have shap 0.44.0 which is incompatible.
responsibleai 0.36.0 requires pandas<2.0.0,>=0.25.1, but you have pandas 2.2.3 which is incompatible.
Successfully installed shap-0.44.0

python .\ValidateModel.py
Model download attempt 1 of 4
Device set to use cpu
C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\pipelines\text_classification.py:106: UserWarning: return_all_scores is now deprecated, if want a similar functionality use top_k=None instead of return_all_scores=True or top_k=1 instead of return_all_scores=False.
warnings.warn(
number of errors on test dataset: 1
Dataset download attempt 1 of 4
20it [00:00, 48.26it/s]
PartitionExplainer explainer: 21it [07:54, 23.73s/it]

Error Analysis
Current Status: Generating error analysis reports.
Current Status: Finished generating error analysis reports.
Time taken: 0.0 min 0.19244809999509016 sec

Traceback (most recent call last):
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py", line 290, in _get_interpret
features, scores, intercept = self._compute_global_importances(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py", line 317, in _compute_global_importances
global_exp = explanation[:, :, :].mean(0)
~~~~~~~~~~~^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\shap\_explanation.py", line 418, in __getitem__
new_self._s = new_self._s.__getitem__(item)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer.py", line 112, in __getitem__
sliced_o = tracked[slicer_index]
~~~~~~~^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py", line 69, in __getitem__
return UnifiedDataHandler.slice(self.o, index_tup, self.max_dim)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py", line 583, in slice
out = tail_slice(sliced_o, index_tup[cut:], max_dim - cut, is_element)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py", line 443, in tail_slice
return numpy.array(inner)
^^^^^^^^^^^^^^^^^^
ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (20,) + inhomogeneous part.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\Users\Pavan.Bandi\AIValidator\ValidateModel.py", line 98, in <module>
ResponsibleAIDashboard(rai_insights)
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard.py", line 40, in __init__
self.input = ResponsibleAIDashboardInput(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard_input.py", line 41, in __init__
self.dashboard_input = analysis.get_data()
^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\rai_text_insights\rai_text_insights.py", line 490, in get_data
data.modelExplanationData = self.explainer.get_data()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py", line 263, in get_data
return [self._get_interpret(i) for i in self.get()]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py", line 263, in <listcomp>
return [self._get_interpret(i) for i in self.get()]
^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Pavan.Bandi\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py", line 303, in _get_interpret
raise ValueError(
ValueError: Unsupported explanation type

@imatiach-msft (Contributor)

This looks like a problem with using an older version of the interpret package.
I believe this is fixed in the latest interpret-community 0.32.0, which was released a month ago and updated to support pandas>2.0 and numpy>2.0, but this repository hasn't been updated yet:
https://github.com/interpretml/interpret-community/releases/tag/v0.32.0
For now I would install:
pip install "pandas<2.0" "numpy<1.26.0"
We will upgrade this repository to support the >2.0 versions for all workflows, though there is no confirmed ETA yet.
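
After reinstalling and restarting the notebook kernel, a quick check along these lines (a minimal sketch, not part of the notebook) confirms the versions the running session actually picked up:

# Verify that the pins took effect in the current kernel.
import numpy
import pandas
print("numpy:", numpy.__version__)    # expect a version below 1.26.0
print("pandas:", pandas.__version__)  # expect a version below 2.0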

@imatiach-msft (Contributor) commented Mar 10, 2025

@bandipavan The key error in the stack trace above is:

ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (20,) + inhomogeneous part.

This is coming from numpy>1.26 combined with older interpret package code.
I fixed a similar issue here:
#2594
The quick fix is to install an older numpy (<1.26.0). I also noticed from the logs that pandas>2.0 is installed, and that might cause some issues as well.
The other possibility is to update the interpret-core package to the latest version, but there could be other issues there, so I wouldn't recommend it.
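
For reference, the underlying failure can be reproduced outside the dashboard. Recent numpy releases refuse to build a regular array from ragged (different-length) rows, which is what slicer's numpy.array(inner) call hits when the per-document token importances have different lengths. A minimal, illustrative sketch (not the library code itself):

import numpy as np

# Text explanations are ragged: each document has a different number of tokens,
# so the per-document SHAP value arrays have different lengths.
ragged = [np.zeros(3), np.zeros(5)]

# Older numpy silently produced an object array here (with a deprecation warning);
# recent versions raise "ValueError: setting an array element with a sequence.
# The requested array has an inhomogeneous shape after 1 dimensions."
np.array(ragged)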

@bandipavan (Author)

@imatiach-msft I applied the suggested changes, but unfortunately I am still getting the same error. Here is the list of installed packages and their versions:

Package Version
------- -------
aiohappyeyeballs 2.5.0
aiohttp 3.11.13
aiosignal 1.3.2
annotated-types 0.7.0
anyio 4.8.0
argon2-cffi 23.1.0
argon2-cffi-bindings 21.2.0
arrow 1.3.0
asttokens 3.0.0
async-lru 2.0.4
attrs 25.1.0
babel 2.17.0
beautifulsoup4 4.13.3
bidict 0.23.1
bleach 6.2.0
blinker 1.9.0
blis 1.2.0
catalogue 2.0.10
certifi 2025.1.31
cffi 1.17.1
charset-normalizer 3.4.1
click 8.1.8
cloudpathlib 0.21.0
cloudpickle 3.1.1
colorama 0.4.6
comm 0.2.2
confection 0.1.5
cymem 2.0.11
datasets 3.3.2
debugpy 1.8.13
decorator 5.2.1
defusedxml 0.7.1
dice-ml 0.11
dill 0.3.8
econml 0.15.1
en_core_web_sm 3.8.0
erroranalysis 0.5.5
executing 2.2.0
fairlearn 0.7.0
fastjsonschema 2.21.1
filelock 3.17.0
Flask 3.1.0
flask-cors 5.0.1
Flask-SocketIO 5.5.1
fqdn 1.5.1
frozenlist 1.5.0
fsspec 2024.12.0
gender-guesser 0.4.0
gevent 24.11.1
greenlet 3.1.1
h11 0.14.0
httpcore 1.0.7
httpx 0.28.1
huggingface-hub 0.29.2
idna 3.10
interpret_community 0.32.0
interpret-core 0.6.9
ipykernel 6.29.5
ipython 9.0.1
ipython_pygments_lexers 1.1.1
isoduration 20.11.0
itsdangerous 2.2.0
jedi 0.19.2
Jinja2 3.1.6
joblib 1.4.2
json5 0.10.0
jsonpointer 3.0.0
jsonschema 4.23.0
jsonschema-specifications 2024.10.1
jupyter_client 8.6.3
jupyter_core 5.7.2
jupyter-events 0.12.0
jupyter-lsp 2.2.5
jupyter_server 2.15.0
jupyter_server_terminals 0.5.3
jupyterlab 4.3.5
jupyterlab_pygments 0.3.0
jupyterlab_server 2.27.3
langcodes 3.5.0
language_data 1.3.0
lightgbm 4.6.0
llvmlite 0.41.1
marisa-trie 1.2.1
markdown-it-py 3.0.0
MarkupSafe 3.0.2
matplotlib-inline 0.1.7
mdurl 0.1.2
mistune 3.1.2
ml_wrappers 0.6.0
mpmath 1.3.0
multidict 6.1.0
multiprocess 0.70.16
murmurhash 1.0.12
nbclient 0.10.2
nbconvert 7.16.6
nbformat 5.10.4
negspacy 1.0.4
nest-asyncio 1.6.0
networkx 2.5
nlp-feature-extractors 0.1.0
notebook 7.3.2
notebook_shim 0.2.4
numba 0.58.1
numpy 1.25.2
overrides 7.7.0
packaging 24.2
pandas 1.5.3
pandocfilters 1.5.1
parso 0.8.4
patsy 1.0.1
pillow 11.1.0
pip 25.0.1
platformdirs 4.3.6
preshed 3.0.9
prometheus_client 0.21.1
prompt_toolkit 3.0.50
propcache 0.3.0
psutil 7.0.0
pure_eval 0.2.3
pyarrow 19.0.1
pycparser 2.22
pydantic 2.10.6
pydantic_core 2.27.2
Pygments 2.19.1
python-dateutil 2.9.0.post0
python-dotenv 1.0.1
python-engineio 4.11.2
python-json-logger 3.2.1
python-socketio 5.12.1
pytz 2025.1
pywin32 308
pywinpty 2.0.15
PyYAML 6.0.2
pyzmq 26.2.1
rai_core_flask 0.7.6
raiutils 0.4.2
raiwidgets 0.36.0
referencing 0.36.2
regex 2024.11.6
requests 2.32.3
responsibleai 0.36.0
responsibleai_text 0.2.7
rfc3339-validator 0.1.4
rfc3986-validator 0.1.1
rich 13.9.4
rpds-py 0.23.1
safetensors 0.5.3
scikit-learn 1.5.1
scipy 1.15.2
semver 2.13.0
Send2Trash 1.8.3
sentencepiece 0.2.0
setuptools 65.5.0
shap 0.43.0
shellingham 1.5.4
simple-websocket 1.1.0
six 1.17.0
slicer 0.0.7
smart-open 7.1.0
sniffio 1.3.1
soupsieve 2.6
spacy 3.8.4
spacy-legacy 3.0.12
spacy-loggers 1.0.5
sparse 0.15.5
srsly 2.5.1
stack-data 0.6.3
statsmodels 0.13.5
sympy 1.13.1
terminado 0.18.1
thinc 8.3.4
threadpoolctl 3.5.0
tinycss2 1.4.0
tokenizers 0.21.0
torch 2.6.0
tornado 6.4.2
tqdm 4.67.1
traitlets 5.14.3
transformers 4.49.0
typer 0.15.2
types-python-dateutil 2.9.0.20241206
typing_extensions 4.12.2
tzdata 2025.1
uri-template 1.3.0
urllib3 2.3.0
wasabi 1.1.3
wcwidth 0.2.13
weasel 0.4.1
webcolors 24.11.1
webencodings 0.5.1
websocket-client 1.8.0
Werkzeug 3.1.3
wrapt 1.17.2
wsproto 1.2.0
xxhash 3.5.0
yarl 1.18.3
zope.event 5.0
zope.interface 7.2

Stack trace
ValueError Traceback (most recent call last)
File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:290, in ExplainerManager._get_interpret(self, explanation)
289 importances = FeatureImportance()
--> 290 features, scores, intercept = self._compute_global_importances(
291 explanation)
292 importances.featureNames = features

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:317, in ExplainerManager._compute_global_importances(self, explanation)
316 if is_classif_task:
--> 317 global_exp = explanation[:, :, :].mean(0)
318 features = convert_to_list(global_exp.feature_names)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\shap\_explanation.py:422, in Explanation.__getitem__(self, item)
421 new_self = copy.copy(self)
--> 422 new_self._s = new_self._s.__getitem__(item)
423 new_self.op_history.append({
424 "name": "getitem",
425 "args": (item,),
426 "prev_shape": self.shape
427 })

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer.py:112, in Slicer.__getitem__(self, item)
111 slicer_index = index_slicer[tracked.dim]
--> 112 sliced_o = tracked[slicer_index]
113 sliced_dim = resolve_dim(index_tup, tracked.dim)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:69, in AtomicSlicer.__getitem__(self, item)
68 # Slice according to object type.
---> 69 return UnifiedDataHandler.slice(self.o, index_tup, self.max_dim)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:583, in UnifiedDataHandler.slice(cls, o, index_tup, max_dim)
582 is_element, sliced_o, cut = head_slice(o, index_tup, max_dim)
--> 583 out = tail_slice(sliced_o, index_tup[cut:], max_dim - cut, is_element)
584 return out

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\slicer\slicer_internal.py:443, in ArrayHandler.tail_slice(cls, o, tail_index, max_dim, flatten)
441 import numpy
--> 443 return numpy.array(inner)
444 elif _safe_isinstance(o, "torch", "Tensor"):

ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (20,) + inhomogeneous part.

The above exception was the direct cause of the following exception:

ValueError Traceback (most recent call last)
Cell In[1], line 98
94 rai_insights.error_analysis.add()
96 rai_insights.compute()
---> 98 ResponsibleAIDashboard(rai_insights)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard.py:40, in ResponsibleAIDashboard.__init__(self, analysis, public_ip, port, locale, cohort_list, is_private_link, **kwargs)
36 def __init__(self, analysis: RAIInsights,
37 public_ip=None, port=None, locale=None,
38 cohort_list=None, is_private_link=False,
39 **kwargs):
---> 40 self.input = ResponsibleAIDashboardInput(
41 analysis, cohort_list=cohort_list)
43 super(ResponsibleAIDashboard, self).__init__(
44 dashboard_type="ResponsibleAI",
45 model_data=self.input.dashboard_input,
(...) 50 is_private_link=is_private_link,
51 **kwargs)
53 def predict():

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\raiwidgets\responsibleai_dashboard_input.py:41, in ResponsibleAIDashboardInput.__init__(self, analysis, cohort_list)
39 model = analysis.model
40 self._is_classifier = is_classifier(model)
---> 41 self.dashboard_input = analysis.get_data()
43 self._validate_cohort_list(cohort_list)
45 self._feature_length = len(self.dashboard_input.dataset.feature_names)

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\rai_text_insights\rai_text_insights.py:490, in RAITextInsights.get_data(self)
488 data = RAIInsightsData()
489 data.dataset = self._get_dataset()
--> 490 data.modelExplanationData = self.explainer.get_data()
491 data.errorAnalysisData = self.error_analysis.get_data()
492 return data

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in ExplainerManager.get_data(self)
257 def get_data(self):
258 """Get explanation data
259
260 :return: A array of ModelExplanationData.
261 :rtype: List[ModelExplanationData]
262 """
--> 263 return [self._get_interpret(i) for i in self.get()]

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:263, in <listcomp>(.0)
257 def get_data(self):
258 """Get explanation data
259
260 :return: A array of ModelExplanationData.
261 :rtype: List[ModelExplanationData]
262 """
--> 263 return [self._get_interpret(i) for i in self.get()]

File ~\AppData\Local\Programs\Python\Python311\Lib\site-packages\responsibleai_text\managers\explainer_manager.py:303, in ExplainerManager._get_interpret(self, explanation)
301 interpretation.precomputedExplanations = precomputedExplanations
302 except Exception as ex:
--> 303 raise ValueError(
304 "Unsupported explanation type") from ex
305 return interpretation

ValueError: Unsupported explanation type

@imatiach-msft (Contributor)

@bandipavan can you try upgrading slicer to 0.0.8:

pip install --upgrade slicer

I think this commit may have fixed it:
interpretml/slicer@74b3683

Or at least it seems like that package should work better with the latest version of numpy.

I'm hoping that the latest 0.0.8 release will work:
https://pypi.org/project/slicer/0.0.8/
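
If the upgrade helps, the slicing step that failed in the traceback should now succeed on the already-computed insights, without recomputing anything. Roughly, as a sketch based only on the calls shown in the stack trace (explainer.get() and the explanation slicing from explainer_manager.py), with rai_insights being the object from the notebook:

# Exercise the same slicing path that previously raised the ValueError.
explanation = rai_insights.explainer.get()[0]  # precomputed SHAP text explanation
global_exp = explanation[:, :, :].mean(0)      # the call that failed before
print(global_exp.shape)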
