On the Equivalence between Neural Network and Support Vector Machine

Codes for NeurIPS 2021 paper "On the Equivalence between Neural Network and Support Vector Machine".

Cite our paper

Yilan Chen, Wei Huang, Lam M. Nguyen, Tsui-Wei Weng, "On the Equivalence between Neural Network and Support Vector Machine", NeurIPS 2021.

@inproceedings{chen2021equiv,
  title={On the equivalence between neural network and support vector machine},
  author={Yilan Chen and Wei Huang and Lam M. Nguyen and Tsui-Wei Weng},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}

Overview

In this paper, we prove an equivalence between neural networks (NNs) and support vector machines (SVMs): specifically, between an infinitely wide NN trained by the soft margin loss and a standard soft margin SVM with the neural tangent kernel (NTK), both trained by subgradient descent. Our main theoretical results establish the equivalence between NNs and a broad family of L2-regularized kernel machines (KMs) with finite-width bounds, which prior work could not handle, and show that every finite-width NN trained by such regularized loss functions is approximately a KM.
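
For intuition, the training objective in both cases is an L2-regularized soft margin (hinge) loss. Below is a minimal PyTorch sketch of that objective; the function name and the regularization weight lam are illustrative and not taken from this repository.

import torch

def soft_margin_objective(outputs, labels, params, lam=1e-3):
    # soft margin (hinge) loss: mean over the batch of max(0, 1 - y * f(x))
    hinge = torch.clamp(1.0 - labels * outputs.squeeze(), min=0.0).mean()
    # L2 regularization over all trainable parameters
    l2 = sum((p ** 2).sum() for p in params)
    return hinge + lam * l2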

Furthermore, we demonstrate that our theory enables three practical applications, including

  • a non-vacuous generalization bound for the NN via the corresponding KM;
  • a non-trivial robustness certificate for the infinite-width NN (where existing robustness verification methods, e.g. IBP, Fast-Lin, and CROWN, give vacuous bounds);
  • infinite-width NNs that are intrinsically more robust than those obtained from previous kernel regression.

See our paper and slides for details.

Equivalence between infinite-width NNs and a family of KMs

Code overview

  • train_sgd.py: train the NN and the SVM with NTK using stochastic subgradient descent, and plot the results to verify the equivalence.

  • generalization.py: compute the non-vacuous generalization bound of the NN via the corresponding KM.

  • regression.py: kernel ridge regression with NTK.

  • robust_svm.py:

    • test(): evaluate the robustness of the NN using IBP, or of the SVM using our method in the paper.
    • test_regressions(): evaluate the robustness of kernel ridge regression models using our method.
    • bound_ntk(): calculate the lower and upper bounds for the NTK of a two-layer fully-connected NN.
  • ibp.py: functions to calculate IBP bounds, specialized for the NTK parameterization (see the interval propagation sketch after this list).

  • models/model.py: code for constructing fully-connected neural networks with the NTK parameterization.

  • config/:

    • svm_sgd.yaml: configurations and hyper-parameters to train NN and SVM.
    • svm_gene.yaml: configurations and hyper-parameters to calculate generalization bound.
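
As referenced in the ibp.py item above, the core of IBP is interval arithmetic through each layer. Here is a minimal sketch of that propagation step for a single linear layer; ibp.py additionally handles the NTK parameterization (e.g., width-dependent scaling), which this sketch omits, and the function name is illustrative.

import torch

def ibp_linear(W, b, lower, upper):
    # Propagate the box [lower, upper] through the layer y = x @ W.T + b.
    center = (upper + lower) / 2.0
    radius = (upper - lower) / 2.0
    new_center = center @ W.t() + b
    new_radius = radius @ W.t().abs()  # |W| maps input radii to output radii
    return new_center - new_radius, new_center + new_radius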

Required environment

This code was tested with the following environment:

python==3.8.8
torch==1.8.1
neural-tangents==0.3.6

Other required packages can be installed with Conda as follows:

conda create -n equiv-nn-svm python=3.8
conda activate equiv-nn-svm
conda install numpy tqdm matplotlib seaborn pyyaml

For the installation of PyTorch, please refer to the instructions at https://pytorch.org/get-started/locally/. For the installation and usage of neural-tangents, please refer to the instructions at https://github.com/google/neural-tangents.
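
For reference, the NTK of a two-layer fully-connected network can be computed in closed form with neural-tangents roughly as follows. The widths and data below are placeholders; the architecture actually used here is defined in models/model.py.

import numpy as np
from neural_tangents import stax

# Two-layer fully-connected network: one hidden ReLU layer, scalar output.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(), stax.Dense(1)
)

x1 = np.random.randn(10, 784)  # placeholder inputs
x2 = np.random.randn(5, 784)

ntk = kernel_fn(x1, x2, 'ntk')  # closed-form NTK between the two batches
print(ntk.shape)                # (10, 5)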

Experiments

Train NN and SVM to verify the equivalence

python train_sgd.py
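
On the SVM side, the script optimizes the regularized hinge objective over a kernel expansion f = sum_j alpha_j k(x_j, .). A single full-batch subgradient step in the coefficients alpha might look like the sketch below; the step size, regularization weight, and function name are illustrative rather than the exact implementation in train_sgd.py.

import numpy as np

def svm_subgradient_step(alpha, K, y, lr=0.1, lam=1e-3):
    # f(x_i) = (K @ alpha)_i for f = sum_j alpha_j k(x_j, .)
    f = K @ alpha
    # the hinge subgradient is active where the margin is violated
    active = (y * f < 1.0).astype(K.dtype)
    grad = -(active * y) / len(y) + 2.0 * lam * alpha
    return alpha - lr * grad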

Example of the SGD results

[Figure: SGD results]

Example of the GD results

[Figure: GD results]

Compute the non-vacuous generalization bound of NN via the corresponding KM

python generalization.py
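
As a rough illustration of the kind of quantity computed, here is a generic margin-based (Rademacher-complexity) bound for a kernel machine f(x) = sum_i alpha_i k(x_i, x). This is a standard textbook bound, not necessarily the exact bound implemented in generalization.py.

import numpy as np

def margin_bound(K, y, alpha, rho=1.0, delta=0.01):
    # empirical margin error + Rademacher complexity term + confidence slack
    n = K.shape[0]
    margins = y * (K @ alpha)
    emp_margin_err = np.mean(margins < rho)
    B = np.sqrt(alpha @ K @ alpha)  # RKHS norm of f
    rademacher = 2.0 * B * np.sqrt(np.trace(K)) / (rho * n)
    slack = 3.0 * np.sqrt(np.log(2.0 / delta) / (2.0 * n))
    return emp_margin_err + rademacher + slack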

Example of the generalization bound results

[Figure: generalization bound results]

Robustness verification of NN

Add the paths to your NN models in the code, separated by width, and specify the widths of the models you want to verify. Then run the test() function in robust_svm.py.

python -c "import robust_svm; robust_svm.test('nn')"

Robustness verification of SVM

Add the paths to your SVM models in the code. Then run the test() function in robust_svm.py.

python -c "import robust_svm; robust_svm.test('svm')"

[Figure: robustness verification results]

Train kernel ridge regression models with NTK

python regression.py
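
Kernel ridge regression with the NTK has a closed-form solution, alpha = (K + reg * n * I)^{-1} y. A minimal numpy sketch is below; the regularization convention and function name are illustrative, and regression.py may differ in details.

import numpy as np

def krr_fit_predict(K_train, y_train, K_test, reg=1e-3):
    # solve (K + reg * n * I) alpha = y for the training coefficients
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + reg * n * np.eye(n), y_train)
    # predictions at test points: K(test, train) @ alpha
    return K_test @ alpha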

Robustness verification of kernel ridge regression models

Run the test_regressions() function in robust_svm.py.

python -c "import robust_svm; robust_svm.test_regressions()"

[Figure: robustness verification results]
