Opacus

Train PyTorch models with Differential Privacy

Key Features

Scalable

Vectorized per-sample gradient computation that is 10x faster than microbatching (see the sketch after the feature list).

Built on PyTorch

Supports most types of PyTorch models and can be used with minimal modification to the original neural network.

Extensible

Open source, modular API for differential privacy research. Everyone is welcome to contribute.
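
For context, the microbatching baseline that this speedup is measured against computes per-sample gradients with one backward pass per example. The plain-PyTorch sketch below (toy model and data, purely illustrative and not part of Opacus) shows that baseline; Opacus obtains the same per-sample gradients in a single vectorized backward pass over the whole batch.

import torch
from torch import nn

# Toy model and batch, purely for illustration.
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
x = torch.randn(32, 10)
y = torch.randn(32, 1)

# Microbatching baseline: one forward/backward pass per example
# to collect per-sample gradients.
per_sample_grads = []
for i in range(x.shape[0]):
    model.zero_grad()
    loss = criterion(model(x[i:i + 1]), y[i:i + 1])
    loss.backward()
    per_sample_grads.append([p.grad.detach().clone() for p in model.parameters()])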

Get Started

  1. Installing Opacus:

    Via pip:
    pip install opacus
    
    From source:
    
    git clone https://github.com/pytorch/opacus.git
    cd opacus
    pip install -e .
                    
    
  2. Getting started

    Training with differential privacy is as simple as instantiating a PrivacyEngine and attaching it to your optimizer (a sketch of the subsequent training loop follows below):
    from torch.optim import SGD
    from opacus import PrivacyEngine

    model = Net()  # your model: any PyTorch nn.Module
    optimizer = SGD(model.parameters(), lr=0.05)
    privacy_engine = PrivacyEngine(
        model,
        batch_size,             # number of examples per training batch
        sample_size,            # total number of examples in the training set
        alphas=[1, 10, 100],    # Rényi orders used for privacy accounting
        noise_multiplier=1.3,   # ratio of noise std. dev. to the clipping norm
        max_grad_norm=1.0,      # per-sample gradient clipping threshold
    )
    privacy_engine.attach(optimizer)
    # Now it's business as usual
    
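    Once the engine is attached, the training loop itself is unchanged: on every optimizer step the engine clips each per-sample gradient and adds calibrated noise. Below is a minimal sketch of such a loop, assuming the legacy 0.x API shown above; data_loader, criterion, and the delta value 1e-5 are placeholders, not part of Opacus.
    for x, y in data_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()  # gradients are clipped and noised by the attached engine

    # Privacy budget spent so far: returns (epsilon, best_alpha) at the given delta.
    epsilon, best_alpha = privacy_engine.get_privacy_spent(1e-5)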