This release is intended only to push better auto-deploy bundles out of Travis and AppVeyor.
This release includes:
- Support for the sklearn IsolationForest courtesy of JiechengZhao
- New check_additivity tests to ensure no errors in DeepExplainer and TreeExplainer
- Fix issues 861 and 860
- Fix a missing README example HTML file
- Support for the Spark decision tree regressor courtesy of QuentinAmbard
- Better safe isinstance checking courtesy of parsatorb
- Fix eager execution in TF < 2 courtesy of bottydim
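The safe isinstance checking mentioned above avoids importing optional libraries just to test an object's type. A minimal sketch of the idea (an illustrative reimplementation, not shap's actual code):

```python
def safe_isinstance(obj, class_path_str):
    """Check an object's type by fully qualified class name, without
    importing the (possibly uninstalled) library that defines it."""
    for klass in type(obj).__mro__:
        if klass.__module__ + "." + klass.__qualname__ == class_path_str:
            return True
    return False

# Works even when pandas is not installed: the string is never imported
print(safe_isinstance({}, "builtins.dict"))                # True
print(safe_isinstance({}, "pandas.core.frame.DataFrame"))  # False
```

This lets type dispatch over many optional model libraries (XGBoost, LightGBM, PySpark, ...) stay import-free until a matching object actually shows up.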
This release contains several new features and bug fixes:
- GradientExplainer now supports TensorFlow 2.0.
- We now lazily load the plotting dependencies, so a pip install no longer needs to pull in matplotlib, skimage, and ipython. This should make installs much lighter, especially for users who don't need plotting :)
- Added a new BruteForceExplainer for easy testing and comparison on small problems.
- Added a new partial_dependence_plot function. This function will be used to illustrate the close connections between partial dependence plots and SHAP values in future example notebooks.
- Handle the multiclass case with no intercept in LinearExplainer courtesy of gabrieltseng
- New extras_require options for the pip install courtesy of AbdealiJK
- Other small bug fixes and updates
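The lazy loading mentioned above defers a heavy import until first use. A generic sketch of the pattern, using a stdlib module as a stand-in for matplotlib (this is an illustration of the technique, not shap's actual loader):

```python
import importlib
import sys

class LazyModule:
    """Defer importing a heavy dependency until it is first used."""
    def __init__(self, name):
        self._name = name

    def __getattr__(self, attr):
        # The real import happens only on first attribute access
        module = importlib.import_module(self._name)
        return getattr(module, attr)

sys.modules.pop("wave", None)  # clean slate for the demo
wave = LazyModule("wave")      # 'wave' stands in for matplotlib here
print("wave" in sys.modules)   # False: nothing imported yet
wave.open                      # first use triggers the real import
print("wave" in sys.modules)   # True
```

With this pattern, importing the package stays cheap, and users who never call a plotting function never pay for the plotting stack.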
This release is primarily to remove a dependency on dill that was not in setup.py. It also includes:
- A typo fix in force.py courtesy of jonlwowski012
- Test code cleanup courtesy of jorgecarleitao
- Fix floating point rounding mismatches in recent sklearn versions of tree models
- An update to allow easier loading of custom tree ensemble models by TreeExplainer.
- `decision_plot` documentation updates courtesy of floidgilbert
- New decision_plot function courtesy of floidgilbert
- Add alpha version of the new model agnostic PartitionExplainer
- Ensure data is all on the same device for PyTorch in DeepExplainer courtesy of gabrieltseng
- Fix a LightGBM edge case issue courtesy of imatiach-msft
- Create a Binder setup for shap courtesy of jamesmyatt
- Allow for multiple inputs in GradientExplainer courtesy of gabrieltseng
- New KernelExplainer unit tests courtesy of jorgecarleitao
- Add Python 2/3 trove classifiers courtesy of proinsias
- Support for PySpark trees courtesy of QuentinAmbard
- Many other bug fixes courtesy of Rygu, Kylecrif, trams, imatiach-msft, yunchuankong, invokermain, lupusomniator, satyarta, jotsif, parkerzf, jaller94, gabrieltseng, and others
- Fixes an issue in DeepExplainer caused by a change in TensorFlow 1.14.
Various bug fixes and improvements including:
- Add SHAP values for binary classification to CatBoost courtesy of dvpolyakov
- Integer division fix for plots courtesy of pmeier-tiplu
- Support passing an Axes object to dependence_plot courtesy of mqk
- Add adaptive average pooling and conv transpose layers courtesy of gabrieltseng
- Fix import errors on a missing matplotlib backend courtesy of hchandola
- Fix a TreeExplainer GradientBoostingClassifier bug courtesy of prempiyush
- Make tqdm play nicer with notebooks courtesy of KOLANICH
- Allow deep_pytorch to use CUDA models courtesy of juliusbierk
- Fix a sklearn GradientBoostingRegressor bug courtesy of nasir-bhanpuri
- Add sparse support to the SHAP LinearExplainer courtesy of imatiach-msft
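The integer division fix above is a classic Python 2 → 3 pitfall: `/` became true division, so grid indices computed with it turn into floats. A minimal illustration (not the actual plotting code):

```python
# In Python 3, "/" always returns a float, so an index computed with it
# breaks APIs that expect integers (e.g. subplot grid positions)
n_plots, n_cols = 7, 2
bad_row = n_plots / n_cols    # 3.5  -> float, unusable as an index
good_row = n_plots // n_cols  # 3    -> floor division keeps an int
print(bad_row, good_row)
```

The fix is simply to use `//` wherever the result feeds an index or a grid dimension.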
Fixes to support changes in the most recent version of sklearn
A few contribution highlights of this release (in chronological order):
- Better testing courtesy of jorgecarleitao
- Image plot customizations courtesy of verdimrc
- Batch norm support for PyTorch in DeepExplainer courtesy of JiechengZhao
- Leaky ReLU and other conv layer support for the PyTorch DeepExplainer courtesy of gabrieltseng
- Fixed Keras multi-input support in GradientExplainer and improved random seeds courtesy of moritzaugustin
- Support for the CatBoost ranker courtesy of doramir
- Added XGBRanker and LGBMRanker to TreeExplainer courtesy of imatiach-msft
- Fix embedding lookup with tf.keras in DeepExplainer courtesy of andriy-nikolov
- Custom dependence_plot colors maps courtesy of rcarneva
- Fix divide by zero issues possible with CatBoost models courtesy of dvpolyakov
- Lots of other bug fixes/improvements!
This release is just to refresh the Windows builds on AppVeyor that didn't complete for 0.28.4
- Fixes a memory corruption error in TreeExplainer (courtesy of imatiach-msft)
- Adds support for skopt Random Forest and ExtraTrees regressors (courtesy of Bacoknight)
- Adds support for matplotlib force_plot with text rotation (courtesy of vatsan)
- Adds a save_html function
- Fix some plot coloring issues introduced by 0.28 (such as issue 406)
- Downgrade numpy API usage to support older versions.
- Fixes a byte-alignment issue on Windows when loading XGBoost models.
- Now matches tree_limit use in XGBoost models courtesy of HughChen
- Fix an issue with the expected_value of transformed model outputs in TreeExplainer
- Add support for rank-based feature selection in `KernelExplainer`.
- Deprecate `l1_reg="auto"` in `KernelExplainer` in favor of eventually defaulting to `l1_reg="num_features(10)"`
- New color scales based on the Lch color space.
- Better auto-color choices for multi-class summary plots.
- Better plotting of NaN values in dependence_plots
- Updates for Pytorch 1.0 courtesy of gabrieltseng
- Fix the sklearn DecisionTreeClassifier handling to correctly normalize to a probability output
- Enable multi-output model support for `TreeExplainer` when `feature_dependence="independent"`
- Correctly load the objective of LightGBM models for use in explaining the model loss.
- Fix numerical precision mismatch with sklearn models.
- Fix numerical precision mismatch with XGBoost models by now directly loading from memory instead of JSON.
- Better hierarchical clustering orderings that now rotate subtrees to give more continuity.
- Work around XGBoost JSON issue.
- Account for NaNs when doing auto interaction detection.
- PyTorch fixes.
- Updated LinearExplainer.
- Complete refactor of TreeExplainer to support deeper C++ integration
- The ability to explain transformed outputs of tree models in TreeExplainer, including the loss. In collaboration with HughChen
- Allow for a dynamic reference value in DeepExplainer courtesy of AvantiShri
- Add `x_jitter` option for categorical dependence plots courtesy of ihopethiswillfi
- Added support for GradientBoostingRegressor with quantile loss courtesy of dmilad
- Better plotting support for NaN values
- Fixes several bugs.
- Allows ordering_keys to be given to force_plot courtesy of JasonTam
- Fixes sparse nonzero background issue with KernelExplainer courtesy of imatiach-msft
- Fix to support tf.concat in DeepExplainer.
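The `x_jitter` option mentioned above spreads overlapping categorical x positions so stacked points become visible. The idea can be sketched as follows (an illustrative helper, not the actual plotting code):

```python
import random

def add_jitter(values, x_jitter=0.5, seed=0):
    """Offset each x position by uniform noise in [-x_jitter/2, x_jitter/2]
    so points stacked on the same category separate visually."""
    rng = random.Random(seed)
    return [v + rng.uniform(-x_jitter / 2, x_jitter / 2) for v in values]

xs = add_jitter([0, 0, 0, 1, 1, 2])  # jittered positions near 0, 1, and 2
```

Keeping the jitter amount below the spacing between categories ensures points stay visually attached to their own category.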
Fixes a problem where tree_shap.h was not included in the pip bundle.
- Support for PyTorch in GradientExplainer and preliminary support for PyTorch in DeepExplainer courtesy of gabrieltseng.
- A matplotlib version of the single sample force_plot courtesy of jverre.
- Support functional Keras models in GradientExplainer.
- KernelExplainer speed improvements.
- Various performance improvements and bug fixes.
New improvements include:
- Faster KernelExplainer execution for sparse inputs
- Support for sklearn gradient boosting classifiers
- DeepExplainer extended to support very deep models
This fixes numerical stability issues with the softmax operator for DeepExplainer. It also fixes a minor alignment issue with image_plot.
This release includes a nice update courtesy of imatiach-msft for KernelExplainer. KernelExplainer now runs faster and supports sparse data matrices!
We have also refactored DeepExplainer and made it compatible with TensorFlow 1.10. There are still a few issues to track down, but DeepExplainer is getting more complete :)
Fixes a problem with DeepExplainer on TensorFlow >= 1.9. Fixes a bar plotting issue.
Fix a pip packaging error with `other` explainers.
Fix an import error introduced in the last release when installing from pip.
Integrates the JS code from `iml` into `shap` to simplify dependencies. Adds support for more TensorFlow components in `DeepExplainer`. Refactors the plotting functions and removes some long-deprecated functions. Fixes an error in KernelExplainer when using a non-zero reference value (issue 192).
A new LinearExplainer that can estimate SHAP values for linear models while accounting for correlations among the input features.
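For a linear model with independent features, SHAP values have a closed form, phi_i = w_i * (x_i - E[x_i]); the correlation-aware estimation in this release generalizes that baseline. A minimal sketch of the independent-features case (an illustrative helper, not the LinearExplainer implementation):

```python
def linear_shap_values(coefs, x, background_mean):
    """SHAP values of f(x) = b + sum_i w_i * x_i, assuming independent
    features: phi_i = w_i * (x_i - E[x_i])."""
    return [w * (xi - mu) for w, xi, mu in zip(coefs, x, background_mean)]

coefs = [2.0, -1.0]   # model weights
x = [3.0, 1.0]        # instance to explain
mean = [1.0, 2.0]     # background (expected) feature values
phis = linear_shap_values(coefs, x, mean)
print(phis)  # [4.0, 1.0]; the phis sum to f(x) - f(mean) = 5.0
```

The additivity check at the end (SHAP values summing to the difference between the prediction and the expected prediction) holds for any SHAP explainer, not just the linear case.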
Fixes some issues with categorical features in LightGBM. Also fixes some issues created by the v0.20 API changes.