shap.AdditiveExplainer

class shap.AdditiveExplainer(model, masker)

Computes SHAP values for generalized additive models.

This assumes that the model has only first-order effects. Extending it to second- and third-order effects is future work; if you apply it to such models right now you will get incorrect answers that fail the additivity check.
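A minimal usage sketch, assuming a toy additive model written as a plain Python callable and a background data matrix passed as the masker (the names additive_model, background, and X are illustrative, not part of the API):

    import numpy as np
    import shap

    # Toy generalized additive model: output is a sum of per-feature effects
    def additive_model(X):
        return np.sin(X[:, 0]) + 2.0 * X[:, 1] + X[:, 2] ** 2

    rng = np.random.default_rng(0)
    background = rng.normal(size=(200, 3))  # background data used for masking
    X = rng.normal(size=(10, 3))            # rows to explain

    # Passing the background matrix uses the standard masking shortcut
    explainer = shap.AdditiveExplainer(additive_model, background)

    # Calling the explainer returns a shap.Explanation for the given rows
    shap_values = explainer(X)
    print(shap_values.values.shape)  # (10, 3): one attribution per feature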

__init__(model, masker)

Build an Additive explainer for the given model using the given masker object.

Parameters
model : function

A callable python object that executes the model given a set of input data samples.

masker : function or numpy.array or pandas.DataFrame

A callable python object used to "mask" out hidden features, of the form masker(mask, *fargs). It takes a single binary mask and an input sample and returns a matrix of masked samples. These masked samples are evaluated using the model function and the outputs are then averaged. As a shortcut for the standard masking used by SHAP, you can pass a background data matrix instead of a function, and that matrix will be used for masking. To use a clustering game structure you can pass a shap.maskers.Tabular(data, hclustering="correlation") object, but note that this structure information has no effect on the explanations of additive models.
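A sketch of the masker protocol described above, assuming independent masking against a background matrix (masker and background are illustrative names):

    import numpy as np

    background = np.random.randn(200, 3)  # illustrative background data

    def masker(mask, x):
        # mask: boolean vector over features; x: a single input row.
        # Hidden features (mask == False) keep their background values,
        # while visible features are replaced by the values from x.
        out = background.copy()
        out[:, mask] = x[mask]
        return out

Passing a background matrix directly is the shortcut for exactly this kind of independent masking, so an explicit function is only needed for non-standard masking schemes.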

Methods

__init__(model, masker)

Build an Additive explainer for the given model using the given masker object.

explain_row(*row_args, max_evals, silent)

Explains a single row and returns the tuple (row_values, row_expected_values, row_mask_shapes).

supports_model(model)

Determines if this explainer can handle the given model.

explain_row(*row_args, max_evals, silent)

Explains a single row and returns the tuple (row_values, row_expected_values, row_mask_shapes).
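In normal use this method is invoked indirectly by calling the explainer on a batch of rows; the per-row results surface as attributes of the returned Explanation object (a sketch, reusing the illustrative explainer and X from the example above):

    shap_values = explainer(X)
    shap_values.values       # row_values: one attribution per feature, per row
    shap_values.base_values  # row_expected_values: the expected model output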

static supports_model(model)

Determines if this explainer can handle the given model.

This is an abstract static method meant to be implemented by each subclass.
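A quick check of whether a candidate model can be handled by this explainer (a sketch; model and background are illustrative names):

    # `model` is any candidate model object
    if shap.AdditiveExplainer.supports_model(model):
        explainer = shap.AdditiveExplainer(model, background)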