DPPerLayerOptimizer
class opacus.optimizers.perlayeroptimizer.DPPerLayerOptimizer(optimizer, *, noise_multiplier, max_grad_norm, expected_batch_size, loss_reduction='mean', generator=None, secure_mode=False)

A DPOptimizer that implements the per-layer clipping strategy.

Parameters:

- optimizer (Optimizer) – wrapped optimizer.
- noise_multiplier (float) – noise multiplier.
- max_grad_norm (List[float]) – per-layer max grad norms used for gradient clipping.
- expected_batch_size (Optional[int]) – batch size used for averaging gradients. When using Poisson sampling, the averaging denominator can't be inferred from the actual batch size. Required if loss_reduction="mean"; ignored if loss_reduction="sum".
- loss_reduction (str) – indicates whether the loss reduction (for aggregating the gradients) is a sum or a mean operation. Can take values "sum" or "mean".
- generator – torch.Generator() object used as a source of randomness for the noise.
- secure_mode (bool) – if True, uses a noise generation approach robust to floating point arithmetic attacks. See _generate_noise() for details.
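A minimal usage sketch, not taken from the official docs: it assumes the model is wrapped in GradSampleModule so that per-sample gradients (p.grad_sample) are available for clipping, and uses illustrative layer sizes, norms, and batch size. In typical training code this optimizer is obtained via PrivacyEngine.make_private(..., clipping="per_layer") rather than constructed directly.

```python
import torch
from torch import nn
from opacus.grad_sample import GradSampleModule
from opacus.optimizers.perlayeroptimizer import DPPerLayerOptimizer

# Wrap the model so backward() records per-sample gradients,
# which per-layer clipping requires.
model = GradSampleModule(nn.Linear(16, 2))

base_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
optimizer = DPPerLayerOptimizer(
    base_optimizer,
    noise_multiplier=1.0,
    # One clipping norm per trainable parameter (illustrative values).
    max_grad_norm=[1.0 for _ in model.parameters()],
    # Required because loss_reduction defaults to "mean".
    expected_batch_size=32,
    loss_reduction="mean",
)

x = torch.randn(32, 16)
y = torch.randint(0, 2, (32,))
criterion = nn.CrossEntropyLoss()

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()   # populates p.grad_sample on each parameter
optimizer.step()  # clips each layer to its norm, adds noise, then steps SGD
```

The design point of the per-layer strategy is visible in max_grad_norm being a list: each layer's gradient is clipped to its own bound, rather than clipping the single flattened gradient vector to one global norm.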