Generates a SHAP summary plot and a text report highlighting top feature importance for model interpretation.
The NntSHAPSummaryNode is designed to provide a comprehensive analysis of model predictions using SHAP (SHapley Additive exPlanations) values, which are a popular method for interpreting machine learning models. This node generates a visual summary of feature importance, helping you understand which features most significantly impact the model's predictions. By leveraging SHAP values, the node offers insights into the model's decision-making process, making it easier to identify key features and their contributions. This is particularly beneficial for AI artists and developers who want to ensure their models are not only accurate but also interpretable. The node's primary function is to create a SHAP summary plot and a text report that highlights the top features by importance, providing a clear and concise overview of the model's behavior.
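The node's exact internals are not reproduced here, but the kind of analysis it automates can be sketched directly with the shap library. In the sketch below, the names model, X_train_sample, and X_test_sample mirror the node's inputs and are illustrative assumptions, not the node's verbatim code:

```python
import shap
import matplotlib.pyplot as plt

# Assumed inputs mirroring the node's parameters (illustrative only):
# model           - a trained model exposing .predict
# X_train_sample  - training data used to build the SHAP background
# X_test_sample   - data for which explanations are generated

background = shap.sample(X_train_sample, 100)   # background_sample_size = 100
explainer = shap.KernelExplainer(model.predict, background)
shap_values = explainer.shap_values(X_test_sample)

# Summary plot of feature importance (the node renders this to an image tensor)
shap.summary_plot(shap_values, X_test_sample, plot_type="dot", show=False)
plt.tight_layout()
plt.savefig("shap_summary.png")
```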
The model parameter represents the machine learning model you wish to analyze. It is crucial as it determines the context in which SHAP values are calculated, directly affecting the interpretation of feature importance. There are no specific minimum or maximum values for this parameter, but it should be a trained model compatible with SHAP's KernelExplainer.
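KernelExplainer is model-agnostic: it only needs a prediction callable and a background dataset. As a hedged illustration, any estimator with a scikit-learn-style predict method should work; the RandomForestRegressor and the X_train/y_train names below are assumed stand-ins:

```python
import shap
from sklearn.ensemble import RandomForestRegressor

# Train any model exposing a predict(X) -> predictions callable
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)  # X_train, y_train assumed to exist

# KernelExplainer wraps the prediction function, not the model object itself
explainer = shap.KernelExplainer(model.predict, shap.sample(X_train, 50))
```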
The X_train_sample parameter is a sample of your training data used to establish a baseline for SHAP value calculations. It impacts the accuracy and relevance of the SHAP explanations. Its size should be equal to or greater than the background_sample_size parameter to avoid errors.
The X_test_sample parameter is the data sample for which you want to generate SHAP explanations. It is essential for determining how the model's predictions are influenced by different features. There are no specific size constraints, but it should be representative of the data you are interested in analyzing.
The plot_type parameter specifies the type of plot to generate for visualizing SHAP values. Options include "dot", "bar", and other SHAP-supported plot types. The choice of plot type affects how feature importance is visually represented, with "dot" being the default option for a detailed view.
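For reference, these options correspond to the plot_type argument of shap.summary_plot. The snippet below is a sketch that assumes shap_values and X_test_sample have already been computed as shown earlier:

```python
# "dot": one point per sample per feature, colored by feature value (detailed view)
shap.summary_plot(shap_values, X_test_sample, plot_type="dot", show=False)

# "bar": mean absolute SHAP value per feature (compact ranking)
shap.summary_plot(shap_values, X_test_sample, plot_type="bar", show=False)

# "violin": distribution of SHAP values per feature
shap.summary_plot(shap_values, X_test_sample, plot_type="violin", show=False)
```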
The background_sample_size parameter defines the number of samples from X_train_sample to use as a background for SHAP value calculations. It influences the stability and accuracy of the SHAP explanations. The default value is 100, and it should not exceed the size of X_train_sample.
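A common pattern for building such a background, shown here as an assumption about how it might be drawn rather than the node's exact code, is to subsample or cluster the training data down to background_sample_size rows:

```python
import shap

background_sample_size = 100  # default value

# Random subsample of the training data...
background = shap.sample(X_train_sample, background_sample_size)

# ...or k-means cluster centers, which often give more stable explanations
# for the same number of background points
background = shap.kmeans(X_train_sample, background_sample_size)

explainer = shap.KernelExplainer(model.predict, background)
```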
The image_tensor output is a visual representation of the SHAP summary plot, converted into a tensor format. This output is crucial for visually interpreting the feature importance and understanding the model's decision-making process. It provides a normalized image tensor that can be easily integrated into further analysis or visualization workflows.
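The conversion from a matplotlib figure to an image tensor can be sketched roughly as follows; the [batch, height, width, channels] float layout is an assumption based on ComfyUI's usual IMAGE format, not a guarantee about this node's output:

```python
import io
import numpy as np
import torch
from PIL import Image
import matplotlib.pyplot as plt

# Render the current summary plot figure to an in-memory PNG
buf = io.BytesIO()
plt.savefig(buf, format="png", bbox_inches="tight")
plt.close()
buf.seek(0)

# Convert to a normalized [1, H, W, C] float tensor in the 0-1 range
image = np.array(Image.open(buf).convert("RGB")).astype(np.float32) / 255.0
image_tensor = torch.from_numpy(image).unsqueeze(0)
```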
The text_report output is a textual summary of the top features by importance, based on SHAP values. This report is essential for quickly identifying which features have the most significant impact on the model's predictions, offering a concise and interpretable overview of feature importance.
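A plausible way to derive such a report from SHAP values (an illustrative sketch, not the node's verbatim implementation) is to rank features by mean absolute SHAP value; feature_names below is an assumed list of column names for X_test_sample:

```python
import numpy as np

# shap_values: array of shape [n_samples, n_features] from the explainer
# feature_names: list of column names for X_test_sample (assumed available)
mean_abs = np.abs(shap_values).mean(axis=0)
order = np.argsort(mean_abs)[::-1]

lines = ["Top features by mean |SHAP| value:"]
for rank, idx in enumerate(order[:10], start=1):
    lines.append(f"{rank}. {feature_names[idx]}: {mean_abs[idx]:.4f}")
text_report = "\n".join(lines)
print(text_report)
```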
Ensure that X_train_sample is large enough to accommodate the background_sample_size to avoid errors and ensure accurate SHAP value calculations. Use the plot_type parameter to customize the visualization according to your preferences and the specific insights you wish to gain from the SHAP summary.

A common error occurs when background_sample_size is larger than the number of samples in X_train_sample; to resolve it, adjust background_sample_size to be equal to or less than the size of X_train_sample. Any other failure during SHAP analysis is surfaced with its <specific error message>.
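A simple guard before building the explainer avoids the size mismatch above; the check below is a sketch, and the exact error text raised by the node may differ:

```python
import shap

if background_sample_size > len(X_train_sample):
    raise ValueError(
        f"background_sample_size ({background_sample_size}) must not exceed "
        f"the number of rows in X_train_sample ({len(X_train_sample)})"
    )
background = shap.sample(X_train_sample, background_sample_size)
```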