SmoothGrad was introduced by D. Smilkov et al. (2017) and is an extension of the classical vanilla gradient method: instead of a single gradient, it averages the gradients over $$n$$ perturbed versions of each data point, i.e., with $$\epsilon_i \sim N(0, \sigma^2)$$, $$\frac{1}{n} \sum_{i=1}^n \frac{\partial f(x + \epsilon_i)_j}{\partial x_j}.$$ Analogous to the Gradient$$\times$$Input method, you can also use the argument times_input to multiply the gradients by the inputs before averaging (SmoothGrad$$\times$$Input).
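The averaging above can be sketched in a few lines of plain R. The following is only an illustration on a hypothetical scalar function f, using central finite differences for the gradient; the package itself computes exact gradients via automatic differentiation.

```r
# Minimal SmoothGrad sketch for a scalar function f: R^p -> R.
# Finite differences stand in for autograd here; not the package implementation.
smooth_grad <- function(f, x, n = 50, noise_level = 0.1, h = 1e-6) {
  sigma <- (max(x) - min(x)) * noise_level  # noise scale, analogous to noise_level
  grads <- sapply(seq_len(n), function(i) {
    x_pert <- x + rnorm(length(x), mean = 0, sd = sigma)
    # central-difference gradient of f at the perturbed point
    sapply(seq_along(x_pert), function(j) {
      e <- replace(numeric(length(x_pert)), j, h)
      (f(x_pert + e) - f(x_pert - e)) / (2 * h)
    })
  })
  rowMeans(matrix(grads, nrow = length(x)))  # average over the n samples
}

# Example: for f(x) = sum(x^2) the smoothed gradient approximates 2 * x
set.seed(42)
smooth_grad(function(x) sum(x^2), c(1, -2, 3))
```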

The R6 class can also be initialized using the run_smoothgrad() helper function, so that no prior knowledge of R6 classes is required.

References

D. Smilkov et al. (2017) SmoothGrad: removing noise by adding noise. CoRR, abs/1706.03825

Other methods: ConnectionWeights, DeepLift, DeepSHAP, ExpectedGradient, Gradient, IntegratedGradient, LIME, LRP, SHAP

Super classes

innsight::InterpretingMethod -> innsight::GradientBased -> SmoothGrad

Public fields

n

(integer(1))
Number of perturbations of the input data (default: 50).

noise_level

(numeric(1))
The standard deviation of the Gaussian perturbation, i.e., $$\sigma = (\max(x) - \min(x)) \cdot$$ noise_level.
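As a quick illustration of this scaling for a hypothetical input vector, a noise_level of 0.1 and an input range of $$[-1, 3]$$ give $$\sigma = (3 - (-1)) \cdot 0.1 = 0.4$$:

```r
x <- c(-1, 0.5, 3)  # example input values
noise_level <- 0.1
sigma <- (max(x) - min(x)) * noise_level
sigma  # 0.4
```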

Methods

Public methods

Inherited methods

Method new()

Create a new instance of the SmoothGrad R6 class. When initialized, the method SmoothGrad or SmoothGrad$$\times$$Input is applied to the given data and the results are stored in the field result.

Method clone()

The objects of this class are cloneable with this method.

Arguments

deep

Whether to make a deep clone.

Examples

# ------------------------- Example 1: Torch -------------------------------
library(torch)

# Create nn_sequential model and data
model <- nn_sequential(
  nn_linear(5, 10),
  nn_relu(),
  nn_linear(10, 2),
  nn_sigmoid()
)
data <- torch_randn(25, 5)

# Create Converter
converter <- convert(model, input_dim = c(5))

# Apply the SmoothGrad method
smoothgrad <- SmoothGrad$new(converter, data)

# You can also use the helper function run_smoothgrad for initializing
smoothgrad <- run_smoothgrad(converter, data)

# Print the result as a data.frame for first 5 rows
head(get_result(smoothgrad, type = "data.frame"), 5)
#>     data model_input model_output feature output_node       value      pred
#> 1 data_1     Input_1     Output_1      X1          Y1  0.07527412 0.6016443
#> 2 data_2     Input_1     Output_1      X1          Y1  0.02443361 0.5961083
#> 3 data_3     Input_1     Output_1      X1          Y1  0.02494310 0.6162397
#> 4 data_4     Input_1     Output_1      X1          Y1  0.09630632 0.6364036
#> 5 data_5     Input_1     Output_1      X1          Y1 -0.04847851 0.5773661
#>    decomp_sum decomp_goal input_dimension
#> 1 -0.05964340   0.4123209               1
#> 2 -0.05971592   0.3892754               1
#> 3 -0.21266545   0.4736177               1
#> 4 -0.04296483   0.5597886               1
#> 5 -0.06627831   0.3119700               1

# Plot the result for both classes
plot(smoothgrad, output_idx = 1:2)

# Plot the boxplot of all datapoints
boxplot(smoothgrad, output_idx = 1:2)

# ------------------------- Example 2: Neuralnet ---------------------------
if (require("neuralnet")) {
  library(neuralnet)
  data(iris)

  # Train a neural network
  nn <- neuralnet(Species ~ ., iris,
    linear.output = FALSE,
    hidden = c(10, 5), act.fct = "logistic", rep = 1
  )

  # Convert the trained model
  converter <- convert(nn)

  # Calculate the smoothed gradients
  smoothgrad <- run_smoothgrad(converter, iris[, -5])

  # Plot the result for the first and 60th data point and all classes
  plot(smoothgrad, data_idx = c(1, 60), output_idx = 1:3)

  # Calculate SmoothGrad x Input and do not ignore the last activation
  smoothgrad <- run_smoothgrad(converter, iris[, -5],
    ignore_last_act = FALSE,
    times_input = TRUE
  )

  # Plot the result again
  plot(smoothgrad, data_idx = c(1, 60), output_idx = 1:3)
}

# ------------------------- Example 3: Keras -------------------------------
if (require("keras") & keras::is_keras_available()) {
  library(keras)

  # Make sure keras is installed properly
  is_keras_available()

  data <- array(rnorm(64 * 60 * 3), dim = c(64, 60, 3))

  model <- keras_model_sequential()
  model %>%
    layer_conv_1d(
      input_shape = c(60, 3), kernel_size = 8, filters = 8,
      activation = "softplus", padding = "valid"
    ) %>%
    layer_conv_1d(
      kernel_size = 8, filters = 4, activation = "tanh",
      padding = "same"
    ) %>%
    layer_conv_1d(
      kernel_size = 4, filters = 2, activation = "relu",
      padding = "valid"
    ) %>%
    layer_flatten() %>%
    layer_dense(units = 64, activation = "relu") %>%
    layer_dense(units = 16, activation = "relu") %>%
    layer_dense(units = 3, activation = "softmax")

  # Convert the model
  converter <- convert(model)

  # Apply the SmoothGrad method
  smoothgrad <- run_smoothgrad(converter, data, channels_first = FALSE)

  # Plot the result for the first datapoint and all classes
  plot(smoothgrad, output_idx = 1:3)

  # Plot the result as boxplots for first two classes
  boxplot(smoothgrad, output_idx = 1:2)
}

#------------------------- Plotly plots ------------------------------------
if (require("plotly")) {
  # You can also create an interactive plot with plotly.
  # This is a suggested package, so make sure that it is installed
  library(plotly)

  # Result as boxplots
  boxplot(smoothgrad, as_plotly = TRUE)