# Clipping Utils

class opacus.utils.clipping.ClippingMethod(value)[source]

An enumeration of the available gradient-clipping methods.

class opacus.utils.clipping.ConstantFlatClipper(flat_value)[source]

A clipper that scales gradients so that their total norm is at most a specified value, shared by all layers in a model. Clipping works by multiplying every gradient by a scaling factor; if that factor would exceed 1.0, it is capped at 1.0, so gradients are never scaled up. In that case the final norm of the scaled gradients ends up below the specified value, so it is best to think of the specified value as an upper bound on the norm of the final clipped gradients.

Parameters

flat_value (float) – Constant threshold used to scale gradients so that their total norm is at most this value. The same threshold is used for all layers.

calc_clipping_factors(norms)[source]

Calculates the clipping factor based on the given norm of gradients for all layers, so that the new norm of clipped gradients is at most equal to self.flat_value.

Parameters

norms (List[Tensor]) – List containing a single tensor of dimension (1,) with the total norm of all gradients.

Returns

Tensor containing the single clipping factor to be applied to all layers.
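The computation described above can be sketched as follows. `flat_clipping_factor` is a hypothetical stand-in for the method's core logic, not the actual Opacus implementation:

```python
import torch

def flat_clipping_factor(norms, flat_value, eps=1e-6):
    # norms: list with a single (1,)-tensor holding the total gradient norm.
    # Returns the single scaling factor shared by all layers.
    total_norm = norms[0]
    # Never scale gradients up: cap the factor at 1.0.
    return (flat_value / (total_norm + eps)).clamp(max=1.0)
```

For example, a total norm of 10.0 with a threshold of 1.0 yields a factor of roughly 0.1, while a norm already below the threshold yields 1.0 (the gradients are left unchanged).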

property is_per_layer: bool

Indicates whether different clipping is applied to each layer in the model. For this clipper, it is False.

Return type

bool

Returns

Flag with value False

property thresholds: torch.Tensor

Returns singleton tensor of dimension (1,) containing the common threshold value used for clipping all layers in the model.

Return type

Tensor

Returns

Threshold values
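Putting the pieces together, here is a minimal end-to-end sketch of what flat clipping does to a list of per-layer gradients. `clip_flat` is a hypothetical helper illustrating the behavior, not part of the Opacus API:

```python
import torch

def clip_flat(grads, flat_value, eps=1e-6):
    # Total norm across all layers' gradients.
    total_norm = torch.linalg.norm(torch.stack([g.norm() for g in grads]))
    # One shared factor, capped at 1.0 so gradients are never scaled up.
    factor = (flat_value / (total_norm + eps)).clamp(max=1.0)
    return [g * factor for g in grads]
```

After clipping, the combined norm of all gradients is at most `flat_value`; gradients whose combined norm is already below the threshold pass through unchanged.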

class opacus.utils.clipping.ConstantPerLayerClipper(flat_values)[source]

A clipper that scales gradients so that their norm is at most a specified value, with a separate value for each layer in a model. Clipping works by multiplying every gradient by a scaling factor; if that factor would exceed 1.0, it is capped at 1.0, so gradients are never scaled up. In that case the final norm of the scaled gradients ends up below the specified value, so it is best to think of the specified value as an upper bound on the norm of the final clipped gradients.

Parameters

flat_values (List[float]) – List of per-layer thresholds used to scale gradients so that each layer's norm is at most the corresponding value.

calc_clipping_factors(norms)[source]

Calculates a separate clipping factor for each layer from that layer's gradient norm, so that the new norm is at most equal to the flat value specified for that layer when the ConstantPerLayerClipper was instantiated.

Parameters

norms (List[Tensor]) – List containing the norm of gradients for each layer.

Returns

List of tensors, each containing a single value specifying the clipping factor for the corresponding layer.
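The per-layer computation can be sketched like this; `per_layer_clipping_factors` is a hypothetical helper mirroring the description above, not the actual Opacus code:

```python
import torch

def per_layer_clipping_factors(norms, flat_values, eps=1e-6):
    # One clipping factor per layer, each capped at 1.0
    # so no layer's gradients are ever scaled up.
    return [
        (value / (norm + eps)).clamp(max=1.0)
        for norm, value in zip(norms, flat_values)
    ]
```

Each layer is scaled independently: a layer whose norm exceeds its threshold gets a factor below 1.0, while a layer already within its threshold gets exactly 1.0.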

property is_per_layer: bool

Indicates whether different clipping is applied to each layer in the model. For this clipper, it is True.

Return type

bool

Returns

Flag with value True

property thresholds: torch.Tensor

Returns a tensor of per-layer threshold values used to scale gradients so that each layer's norm is at most the corresponding value.

Return type

Tensor

Returns

Tensor of thresholds

class opacus.utils.clipping.NormClipper[source]

An abstract base class for calculating clipping factors.

calc_clipping_factors(norms)[source]

Calculates the clipping factor(s) based on the given gradient norms. A concrete subclass must implement this.

Returns

The clipping factors

property is_per_layer: bool

Depending on the type of clipper, indicates whether different clipping is applied to each layer in the model.

Return type

bool

Returns

Flag indicating whether different clipping is applied to each layer in the model.

property thresholds: torch.Tensor

Depending on the type of clipper, returns threshold values.

Return type

Tensor

Returns

The threshold values
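The interface above can be sketched as an abstract base class. This is a simplified reconstruction from the documentation, not the actual Opacus source; the concrete `FlatClipper` subclass is a hypothetical example of implementing the flat strategy:

```python
import abc
from typing import List

import torch

class NormClipper(abc.ABC):
    """Abstract clipper interface, as described above (simplified sketch)."""

    @abc.abstractmethod
    def calc_clipping_factors(self, norms: List[torch.Tensor]):
        """Return the clipping factor(s) for the given gradient norms."""

    @property
    @abc.abstractmethod
    def is_per_layer(self) -> bool:
        """Whether different clipping is applied to each layer."""

    @property
    @abc.abstractmethod
    def thresholds(self) -> torch.Tensor:
        """The threshold value(s) used for clipping."""

class FlatClipper(NormClipper):
    """Hypothetical concrete subclass: one shared threshold for all layers."""

    def __init__(self, flat_value: float):
        self.flat_value = float(flat_value)

    def calc_clipping_factors(self, norms):
        # Single factor from the total norm, capped at 1.0.
        return (self.flat_value / (norms[0] + 1e-6)).clamp(max=1.0)

    @property
    def is_per_layer(self) -> bool:
        return False

    @property
    def thresholds(self) -> torch.Tensor:
        return torch.tensor([self.flat_value])
```

A subclass only has to supply the factor computation plus the two properties; callers can then treat flat and per-layer clippers uniformly through this interface.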