Module Utils
- opacus.utils.module_utils.are_state_dict_equal(sd1, sd2)[source]
Compares two state dicts, logging any discrepancies.
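A minimal usage sketch (assuming the function returns a boolean, with mismatches reported through the logger rather than raised):

```python
import torch.nn as nn
from opacus.utils.module_utils import are_state_dict_equal

model_a = nn.Linear(4, 2)
model_b = nn.Linear(4, 2)  # same architecture, independent random init

# Differing weight values are logged as discrepancies.
print(are_state_dict_equal(model_a.state_dict(), model_b.state_dict()))  # False

model_b.load_state_dict(model_a.state_dict())
print(are_state_dict_equal(model_a.state_dict(), model_b.state_dict()))  # True
```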
- opacus.utils.module_utils.clone_module(module)[source]
Handy utility to clone an nn.Module. PyTorch doesn’t always support copy.deepcopy(), so it is often easier to serialize the model to a BytesIO buffer and read it back from there.
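A short sketch of the documented behavior: the clone is a fully independent copy, so mutating one module leaves the other untouched.

```python
import torch
import torch.nn as nn
from opacus.utils.module_utils import clone_module

original = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 1))
clone = clone_module(original)

# Zero out the clone's first layer; the original keeps its weights.
with torch.no_grad():
    clone[0].weight.zero_()

print(original[0].weight.abs().sum().item() > 0)  # True: original untouched
print(clone[0].weight.abs().sum().item() == 0)    # True: clone was modified
```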
- opacus.utils.module_utils.get_submodule(module, target)[source]
Returns the submodule given by target if it exists, otherwise throws an error.
This is copy-pasta of PyTorch 1.9’s get_submodule() implementation, and is included here to also support PyTorch 1.8. This function can be removed in favour of module.get_submodule() once Opacus abandons support for torch 1.8. See more details at https://pytorch.org/docs/stable/generated/torch.nn.Module.html?highlight=get_submodule#torch.nn.Module.get_submodule
- Parameters:
module (nn.Module) – root module to search in
target (str) – fully-qualified string name of the submodule to look for
- Return type:
Module
- Returns:
The submodule given by target if it exists
- Raises:
AttributeError – If the submodule doesn’t exist
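A quick illustration of the dotted-path lookup, mirroring torch.nn.Module.get_submodule (the model here is just an example):

```python
import torch.nn as nn
from opacus.utils.module_utils import get_submodule

model = nn.Sequential(
    nn.Linear(8, 4),
    nn.Sequential(nn.Linear(4, 2)),  # nested container
)

# Children of nn.Sequential are named "0", "1", ...; "1.0" is the inner Linear.
inner = get_submodule(model, "1.0")
print(inner)  # Linear(in_features=4, out_features=2, bias=True)

# An invalid path raises AttributeError, as documented above.
try:
    get_submodule(model, "1.7")
except AttributeError as e:
    print(e)
```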
- opacus.utils.module_utils.parametrized_modules(module)[source]
Recursively iterates over all submodules, returning those that have parameters (as opposed to “wrapper modules” that just organize modules).
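A sketch of the documented behavior: containers such as nn.Sequential and parameter-free layers such as nn.ReLU are skipped, and only modules that directly own parameters are yielded. (The exact item type yielded, a module or a (name, module) pair, may depend on the Opacus version, so the sketch prints items as-is.)

```python
import torch.nn as nn
from opacus.utils.module_utils import parametrized_modules

model = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))

# Expect only the two Linear layers; the Sequential wrapper and the ReLU
# own no parameters of their own and are skipped.
for item in parametrized_modules(model):
    print(item)
```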
- opacus.utils.module_utils.requires_grad(module, *, recurse=False)[source]
Checks if any parameters in a specified module require gradients.
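A minimal sketch, assuming the default recurse=False restricts the check to the module’s own (direct) parameters, in line with nn.Module.parameters(recurse=...):

```python
import torch.nn as nn
from opacus.utils.module_utils import requires_grad

layer = nn.Linear(4, 2)
print(requires_grad(layer))  # True: fresh parameters are trainable

for p in layer.parameters():
    p.requires_grad_(False)
print(requires_grad(layer))  # False: every parameter is frozen

# A container owns no direct parameters, so pass recurse=True to
# descend into its children (an assumption based on the signature above).
model = nn.Sequential(nn.Linear(4, 2))
print(requires_grad(model, recurse=True))  # True
```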