A benchmark and implementations for differentially private convex optimization algorithms.
The algorithms implemented in this repository are as follows:
- Approximate Minima Perturbation - An original algorithm proposed in our paper.
- Hyperparameter-free Approximate Minima Perturbation - A hyperparameter-free version of Approximate Minima Perturbation.
- Private Stochastic Gradient Descent from BST'14, ACTMMTZ'16
- Private Convex Perturbation-Based Stochastic Gradient Descent from WLKCJN'17
- Private Strongly Convex Perturbation-Based Stochastic Gradient Descent from WLKCJN'17
- Private Frank-Wolfe from TTZ'16
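To illustrate the gradient-perturbation idea behind the private SGD variants, here is a minimal NumPy sketch. This is not the repository's implementation: the clipping threshold, noise scale, and plain logistic loss are simplifying assumptions, and `sigma` is assumed to be pre-calibrated to the desired privacy budget.

```python
import numpy as np

def private_sgd(X, y, epochs=5, lr=0.1, clip=1.0, sigma=4.0, seed=0):
    """Toy DP-SGD: clip each per-example gradient, then add Gaussian noise.

    Labels y are in {-1, +1}; sigma is assumed pre-calibrated to the
    target (epsilon, delta) -- calibration is omitted here.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * X[i].dot(w)
            # Gradient of the logistic loss log(1 + exp(-y w.x)).
            g = -y[i] * X[i] / (1.0 + np.exp(margin))
            norm = np.linalg.norm(g)
            if norm > clip:
                g = g * (clip / norm)  # bound the L2 sensitivity
            g = g + rng.normal(0.0, sigma * clip, size=d)  # Gaussian noise
            w -= lr * g
    return w
```

The clipping step bounds each example's influence on the update, which is what makes the added Gaussian noise sufficient for a differential privacy guarantee.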
These instructions will get a copy of the project up and running on your local machine for development and testing purposes. The code is implemented using NumPy and requires Python 3.5 or newer. You will also need to install the dependencies listed in the requirements.txt file in this repository; the recommended way to do so is through a Python virtual environment.
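Since Python 3.5 or newer is required, you can confirm the interpreter version before installing anything; a minimal check:

```python
import sys

# The repository requires Python 3.5 or newer.
if sys.version_info < (3, 5):
    raise RuntimeError("Python 3.5+ is required; found %d.%d"
                       % sys.version_info[:2])
print("Python version OK:", sys.version.split()[0])
```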
You can set up a virtual environment as follows:
- Navigate to the directory into which you checked out this repository.
- Create a virtual environment named venv by running:
python3 -m venv venv
If any needed packages are missing, you should get an error message telling you which ones to install.
- Activate the virtual environment by running the following command on Posix systems:
source venv/bin/activate
On Windows, the activation script is located at venv/Scripts/activate; note, however, that none of the code in this repository has been tested on Windows.
The requirements.txt file pins the following dependency versions:
cycler==0.10.0
matplotlib==2.0.2
numpy==1.13.0
pyparsing==2.2.0
python-dateutil==2.6.0
pytz==2017.2
scipy==0.19.0
scikit-learn==0.18.1
six==1.10.0
xlrd==1.0.0
tensorflow==1.11.0
- Navigate to the root of this repository.
- Install the dependencies by running:
pip install -r requirements.txt
- Alternatively, navigate to the root of this repository, open requirements.txt, and run `pip install` for each of the prerequisites in order.
- Navigate to the `datasets` directory.
- Run the following command to download and preprocess all of the benchmark datasets automatically:
python main_preprocess.py all
- To download just one of the datasets, replace `all` with the name of that dataset. The available datasets are listed as follows:
adult, covertype, gisette, kddcup99, mnist, realsim, rcv1
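If you prefer to script the preprocessing step, the same command can be driven from Python. A sketch: the dataset list is copied from above, and the subprocess call is commented out so the snippet only prints the commands it would run (run it from the `datasets` directory to actually execute them).

```python
import subprocess  # used when actually launching the preprocessing

DATASETS = ["adult", "covertype", "gisette", "kddcup99",
            "mnist", "realsim", "rcv1"]

# Build one main_preprocess.py invocation per dataset.
for name in DATASETS:
    cmd = ["python", "main_preprocess.py", name]
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to run for real
```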
- Navigate to the root of this repository.
- Run the algorithms on a dataset using the following command:
python gridsearch.py [ALG_NAME] [DATASET_NAME] [MODEL_NAME]
- Available values for ALG_NAME:
ALL: all the algorithms
AMP: Approximate Minima Perturbation
AMP-NT: Hyperparameter-free Approximate Minima Perturbation
PSGD: Private Stochastic Gradient Descent
PPSGD: Private Convex Perturbation-Based Stochastic Gradient Descent
PPSSGD: Private Strongly Convex Perturbation-Based Stochastic Gradient Descent
FW: Private Frank-Wolfe
- Available values for DATASET_NAME:
adult, covertype, gisette, kddcup99, mnist, realsim, rcv1
- Available values for MODEL_NAME:
LR: Logistic Regression
SVM: Huber SVM without kernel functions
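The gridsearch.py invocations above can also be scripted to sweep several combinations in sequence. A hedged sketch: the `benchmark_commands` helper below is ours, not part of the repository, and the actual run line is commented out so the snippet only enumerates the commands.

```python
import itertools
import subprocess  # used when actually launching the runs

def benchmark_commands(algs, datasets, models):
    """Build one gridsearch.py command per (algorithm, dataset, model)."""
    return [["python", "gridsearch.py", a, d, m]
            for a, d, m in itertools.product(algs, datasets, models)]

# Example: enumerate the runs; uncomment the subprocess line to execute them.
for cmd in benchmark_commands(["AMP", "PSGD"], ["adult"], ["LR"]):
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)
```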
- The results are stored as CSV files in `dpml-algorithms/results/rough_results`.
- Navigate to the root of this repository.
- Run the following command to generate the graph after running the corresponding benchmark:
python draw.py [DATASET_NAME] [ALG_NAME] [MODEL_NAME]
- The graphs are written to `dpml-algorithms/results/graphs`.