wallaroo.explainability
WindowRequestInput(start: Union[str, NoneType], end: Union[str, NoneType], num_samples: Union[int, NoneType])
Built-in mutable sequence.
If no argument is given, the constructor creates a new empty list. The argument must be an iterable if specified.
Inherited Members
- builtins.list
- list
- clear
- copy
- append
- insert
- extend
- pop
- remove
- index
- count
- reverse
- sort
Adds a _repr_html_ to a list of explainability requests.
Inherited Members
- builtins.list
- list
- clear
- copy
- append
- insert
- extend
- pop
- remove
- index
- count
- reverse
- sort
FeatureBounds(min: float, max: float, xs: List[float])
This class specifies an explainability configuration that can later be used to submit explainability requests, which cause the server to perform the analysis and create explainability results.
An ExplainabilityConfig is necessary to ensure the explainability pipeline is created and deployed, and so that the various requests are processed in the same manner and can be compared.
id, status, feature_bounds and reference_pipeline_version are optional and will be filled out when processed and saved to the database.
workspace_id must match the user's/pipeline's workspace, and reference_pipeline_version must refer to a valid pipeline version that the user has access to.
num_points specifies how many samples to take when varying the values of a feature for the PDP/ICE analysis through the feature_bounds.
feature_names is a convenience for the user. output_names is not currently used.
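As a sketch of the shapes described above, hypothetical dataclasses mirroring the documented fields might look like the following. These are illustrative stand-ins, not the actual wallaroo classes; the `evenly_spaced_bounds` helper is an assumption about how num_points and feature_bounds relate.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FeatureBounds:
    # Range a feature is varied through, plus the concrete sample points.
    min: float
    max: float
    xs: List[float]

@dataclass
class ExplainabilityConfig:
    # Hypothetical mirror of the documented fields, not the real class.
    workspace_id: int                           # must match the user's/pipeline's workspace
    reference_pipeline_version: str             # must be a pipeline version the user can access
    num_points: int = 10                        # samples per feature for PDP/ICE
    feature_names: Optional[List[str]] = None   # convenience labels for the user
    output_names: Optional[List[str]] = None    # not currently used
    id: Optional[str] = None                    # filled in when saved to the database
    status: Optional[str] = None                # filled in by processing
    feature_bounds: Optional[dict] = None       # filled in by processing

def evenly_spaced_bounds(lo: float, hi: float, num_points: int) -> FeatureBounds:
    """Build bounds with num_points evenly spaced sample values in [lo, hi]."""
    step = (hi - lo) / (num_points - 1)
    return FeatureBounds(min=lo, max=hi, xs=[lo + i * step for i in range(num_points)])

cfg = ExplainabilityConfig(workspace_id=1, reference_pipeline_version="v1", num_points=5)
fb = evenly_spaced_bounds(0.0, 1.0, cfg.num_points)
```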
Get the full explainability result, whether completed or not.
Submit an analysis on reference or adhoc data using a particular config.
This class outlines what should be submitted to start the explainability analysis with a particular config.
The request can be to analyze reference data (historical data from the reference pipeline), new adhoc data submitted with the request, or both.
id and status are optional and are filled in by the processing steps.
If the request has use_reference_data = True, num_samples inference logs are sampled from between the start and end dates, or, if no dates are given, from the entire set of (at most the last 100,000) inferences.
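The sampling rule above can be sketched in plain Python. This is an assumption about the behavior, not the server's actual implementation; `sample_reference_logs` and its signature are hypothetical.

```python
import random
from datetime import datetime
from typing import List, Optional, Tuple

MAX_LOGS = 100_000  # documented cap on how far back the reference log reaches

def sample_reference_logs(
    logs: List[Tuple[datetime, dict]],
    start: Optional[datetime],
    end: Optional[datetime],
    num_samples: Optional[int],
) -> List[Tuple[datetime, dict]]:
    """Pick num_samples inference logs between start and end (sketch only)."""
    # Restrict to the last MAX_LOGS entries, then to the requested window;
    # a missing start or end leaves that side of the window open.
    window = [
        (ts, rec) for ts, rec in logs[-MAX_LOGS:]
        if (start is None or ts >= start) and (end is None or ts <= end)
    ]
    if num_samples is None or num_samples >= len(window):
        return window
    return random.sample(window, num_samples)

logs = [(datetime(2023, 1, d), {"x": d}) for d in range(1, 11)]
picked = sample_reference_logs(logs, datetime(2023, 1, 3), datetime(2023, 1, 8), 3)
```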
Get the full explainability result, whether completed or not.
This class holds the PDP/ICE part of the results. PDP/ICE results are generated for each observation by holding all but one feature constant, varying that feature and analyzing that prediction. Thus the results are per inference per feature.
feature_name is the feature that this result is for. xs is the list of x values that the feature was varied through.
pdp_vals is the list of resulting prediction values. The model, shap and feature expected values are the mean/expected values for the model output, the shap values and the feature, respectively.
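The per-observation procedure described above can be sketched in plain Python: hold every feature fixed except one, sweep that feature through the xs sample points, and record the prediction at each point (an ICE curve); averaging the ICE curves over observations gives the PDP. The helper names and toy model here are illustrative, not the wallaroo implementation.

```python
from typing import Callable, List, Sequence

def ice_curve(
    predict: Callable[[Sequence[float]], float],
    observation: Sequence[float],
    feature_idx: int,
    xs: Sequence[float],
) -> List[float]:
    """Vary one feature of a single observation through xs, predict at each point."""
    curve = []
    for x in xs:
        row = list(observation)
        row[feature_idx] = x  # all other features stay fixed
        curve.append(predict(row))
    return curve

def pdp_curve(predict, observations, feature_idx, xs) -> List[float]:
    """Average the per-observation ICE curves pointwise to get the PDP."""
    curves = [ice_curve(predict, obs, feature_idx, xs) for obs in observations]
    return [sum(col) / len(col) for col in zip(*curves)]

# Toy linear model: prediction = 2*x0 + x1.
model = lambda row: 2 * row[0] + row[1]
pdp = pdp_curve(model, [[0.0, 1.0], [0.0, 3.0]], feature_idx=0, xs=[0.0, 1.0, 2.0])
# pdp → [2.0, 4.0, 6.0]
```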
WindowResult(data: numpy.ndarray, shap_values: numpy.ndarray, base_values: numpy.ndarray, pdp_results: List[wallaroo.explainability.PDPResult])
Gets the pdp result object for the specified feature.
This class holds the explainability result created by processing an explainability request.
id and status are optional and will be filled in by processing. The id will be the same as the request id since the results are stored in minio.
num_inferences and num_batches are nice-to-know status information and could be brought into the status object in the future.
reference and adhoc data are the actual inferences used in the analysis.
reference and adhoc shap values are the shap values for each feature for each prediction.
base_values are the expected value for each prediction. These values are all the same, so they may be changed to a single float in the future.
pdp results are a list of pdp/ice results for each feature.
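The shap bookkeeping described above can be illustrated with toy numbers (this is not the actual result object): each prediction decomposes into the shared base (expected) value plus one shap adjustment per feature, and averaging absolute shap values per feature gives a feature-effects summary of the kind the dataframe methods below return.

```python
# Toy data: one shared base value, one row of shap values per inference,
# one column per feature (illustrative numbers only).
base_value = 0.5
shap_values = [
    [0.2, -0.1, 0.05],
    [-0.3, 0.4, 0.00],
]

# Each prediction is the base (expected) value plus its shap adjustments.
predictions = [base_value + sum(row) for row in shap_values]

# Mean absolute shap value per feature across all inferences,
# a simple "feature effects" summary.
n = len(shap_values)
mean_abs_effects = [
    sum(abs(row[j]) for row in shap_values) / n
    for j in range(len(shap_values[0]))
]
```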
Returns a dataframe summarizing the mean feature effects of the reference data as well as the feature effects for each adhoc inference.
Returns a dataframe with the expected/mean values and the shap adjustments.
Creates a bar plot of the mean or max absolute feature effects.
Creates a combination ICE plot showing the adhoc data, if any, in custom colors and the reference data, if any, in translucent blue.
Convenience function to parse JSON into the full result object.