Module Utils
- opacus.utils.module_utils.are_state_dict_equal(sd1, sd2)
- Compares two state dicts, while logging discrepancies.
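A minimal usage sketch, assuming the function returns a bool indicating whether the two state dicts match:

```python
import torch.nn as nn
from opacus.utils.module_utils import are_state_dict_equal

# Two layers with identical shapes but independently initialized weights.
a = nn.Linear(4, 2)
b = nn.Linear(4, 2)
print(are_state_dict_equal(a.state_dict(), b.state_dict()))  # False: values differ

# After copying the weights over, the state dicts match.
b.load_state_dict(a.state_dict())
print(are_state_dict_equal(a.state_dict(), b.state_dict()))  # True
```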
- opacus.utils.module_utils.clone_module(module)
- Handy utility to clone an nn.Module. PyTorch doesn’t always support copy.deepcopy(), so it is easier to serialize the model to a BytesIO and read it back from there. When weights_only=False, torch.load() implicitly uses the pickle module, which is known to be insecure. Only load models you trust.
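A short sketch of cloning a model; the clone should be an independent copy whose parameters are equal in value but stored in separate tensors:

```python
import torch
import torch.nn as nn
from opacus.utils.module_utils import clone_module

original = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))
clone = clone_module(original)

# The clone is a distinct object; its parameters match the original's
# values but do not share storage.
assert clone is not original
for p_orig, p_clone in zip(original.parameters(), clone.parameters()):
    assert torch.equal(p_orig, p_clone)
    assert p_orig is not p_clone
```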
- opacus.utils.module_utils.get_submodule(module, target)
- Returns the submodule given by target if it exists, otherwise throws an error.
- This is copy-pasta of PyTorch 1.9’s get_submodule() implementation, included here to also support PyTorch 1.8. This function can be removed in favour of module.get_submodule() once Opacus abandons support for torch 1.8.
- See more details at https://pytorch.org/docs/stable/generated/torch.nn.Module.html?highlight=get_submodule#torch.nn.Module.get_submodule
- Parameters:
- module – The module in which to search for the submodule
- target – The fully-qualified string name of the submodule to look for
- Return type:
- Module
- Returns:
- The submodule given by target if it exists
- Raises:
- AttributeError – If the submodule doesn’t exist
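A sketch of looking up a nested submodule by its fully-qualified dotted name (nn.Sequential names its children "0", "1", ...):

```python
import torch.nn as nn
from opacus.utils.module_utils import get_submodule

model = nn.Sequential(nn.Linear(8, 4), nn.Sequential(nn.Linear(4, 2)))

inner = get_submodule(model, "1.0")  # second child, then its first child
assert isinstance(inner, nn.Linear)

try:
    get_submodule(model, "1.7")  # no such path
except AttributeError:
    print("submodule does not exist")
```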
 
- opacus.utils.module_utils.parametrized_modules(module)
- Recursively iterates over all submodules, returning those that have parameters (as opposed to “wrapper modules” that just organize modules).
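An illustrative sketch; the exact items yielded (modules alone or (name, module) pairs) are not pinned down here, so the example simply prints them:

```python
import torch.nn as nn
from opacus.utils.module_utils import parametrized_modules

model = nn.Sequential(
    nn.Linear(8, 4),
    nn.ReLU(),        # has no parameters of its own
    nn.Linear(4, 2),
)

# Only the two Linear layers should be yielded; the outer Sequential
# and the ReLU own no parameters and are skipped.
for item in parametrized_modules(model):
    print(item)
```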
- opacus.utils.module_utils.requires_grad(module, *, recurse=False)
- Checks if any parameters in a specified module require gradients.
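A minimal sketch, assuming the function returns a bool and that the default recurse=False inspects only the module's own (immediate) parameters:

```python
import torch.nn as nn
from opacus.utils.module_utils import requires_grad

layer = nn.Linear(4, 2)
assert requires_grad(layer)  # freshly created parameters require grad

# Freeze the layer: no parameter requires grad anymore.
for p in layer.parameters():
    p.requires_grad_(False)
assert not requires_grad(layer)

# An nn.Sequential has no parameters of its own, so the default
# non-recursive check is False; recurse=True also inspects submodules.
model = nn.Sequential(layer, nn.Linear(2, 1))
assert not requires_grad(model)
assert requires_grad(model, recurse=True)
```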