Calculate SmoothGrad by averaging gradients over noisy samples of the input. This function provides a lightweight alternative to run_smoothgrad and SmoothGrad that is more efficient for torch models, as it avoids model conversion overhead and uses native torch autograd directly. It can therefore be applied to any torch::nn_module, without restrictions on architecture or layers.

torch_smoothgrad(
  model,
  data,
  output_idx = NULL,
  n = 50,
  noise_level = 0.1,
  times_input = FALSE,
  dtype = "float",
  return_object = FALSE
)

Arguments

model

(nn_module)
A torch model.

data

(torch_tensor, array, or matrix)
Input data.

output_idx

(integer)
Index or indices of output nodes. Default: NULL (all outputs).

n

(integer(1))
Number of noisy samples. Default: 50.

noise_level

(numeric(1))
Standard deviation of the noise relative to the input range: \(\sigma = (\max(x) - \min(x)) \cdot\) noise_level. Default: 0.1.

times_input

(logical(1))
If TRUE, multiplies gradients by input (SmoothGrad×Input). Default: FALSE.

dtype

(character(1))
Data type: "float" or "double". Default: "float".

return_object

(logical(1))
If TRUE, returns an InterpretingMethod object with methods such as plot() and get_result(). If FALSE (default), returns a raw torch_tensor.

Value

If return_object = FALSE (default): A torch_tensor containing the smoothed gradients with shape (batch_size, ..., n_outputs). If return_object = TRUE: An InterpretingMethod object.

Details

SmoothGrad computes gradients for n noisy versions of each input and averages them. With \(\epsilon_i \sim \mathcal{N}(0, \sigma^2)\):

$$1/n \sum_{i=1}^n \frac{\partial f(x+ \epsilon_i)}{\partial x}$$

This reduces noise in gradient-based explanations.
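The averaging above can be sketched directly with native torch autograd. This is a minimal illustration of the formula, not the package's implementation; `sigma` follows the noise_level definition given under Arguments, and the model output is summed to obtain a scalar target for differentiation:

```r
library(torch)

# Illustrative SmoothGrad sketch (not the package implementation).
smoothgrad_sketch <- function(model, x, n = 50, noise_level = 0.1) {
  # sigma = (max(x) - min(x)) * noise_level, as defined above
  sigma <- (torch_max(x) - torch_min(x)) * noise_level
  grads <- torch_zeros_like(x)
  for (i in seq_len(n)) {
    # Draw one noisy sample x + eps_i and track gradients w.r.t. it
    x_noisy <- (x + sigma * torch_randn_like(x))$requires_grad_(TRUE)
    out <- torch_sum(model(x_noisy))  # scalar target over all outputs
    grads <- grads + autograd_grad(out, x_noisy)[[1]]
  }
  grads / n  # average over the n noisy gradients
}
```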

References

D. Smilkov et al. (2017) SmoothGrad: removing noise by adding noise. arXiv:1706.03825

See also

run_smoothgrad, SmoothGrad

Examples

library(torch)

model <- nn_sequential(nn_linear(10, 3))
data <- torch_randn(5, 10)

# Standard SmoothGrad
smooth_grads <- torch_smoothgrad(model, data)

# SmoothGrad×Input
smooth_grads <- torch_smoothgrad(model, data, times_input = TRUE)

# More samples for smoother result
smooth_grads <- torch_smoothgrad(model, data, n = 100)
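
# Return an InterpretingMethod object instead of a raw tensor
# (get_result() and plot() are invoked here as generics; the exact
# calling convention depends on the InterpretingMethod class)
smooth_obj <- torch_smoothgrad(model, data, return_object = TRUE)
result <- get_result(smooth_obj)
plot(smooth_obj)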