Experiments for Neural Flows paper

Overview

Neural Flows: Efficient Alternative to Neural ODEs [arxiv]

TL;DR: We directly model the solutions of neural ODEs with neural flows, which is much faster and achieves better results on time series applications since it avoids expensive numerical solvers.

Marin Biloš, Johanna Sommer, Syama Sundar Rangapuram, Tim Januschowski, Stephan Günnemann

Abstract: Neural ordinary differential equations describe how values change in time. This is the reason why they gained importance in modeling sequential data, especially when the observations are made at irregular intervals. In this paper we propose an alternative by directly modeling the solution curves - the flow of an ODE - with a neural network. This immediately eliminates the need for expensive numerical solvers while still maintaining the modeling capability of neural ODEs. We propose several flow architectures suitable for different applications by establishing precise conditions on when a function defines a valid flow. Apart from computational efficiency, we also provide empirical evidence of favorable generalization performance via applications in time series modeling, forecasting, and density estimation.

This repository acts as supplementary material which implements the models and experiments described in the main paper. The model definitions rely on the stribor package for normalizing flows and neural flows. The baselines use the torchdiffeq package for differentiable ODE solvers.

Installation

Install the local package nfe (which will also install all the dependencies):

pip install -e .
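
To verify the installation, a quick import check (uses only the classes shown in the examples below):

python -c "from nfe import CouplingFlow, GRUFlow; print('nfe OK')"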

Download data

Download and preprocess real-world data and generate synthetic data (or run commands in download_all.sh manually):

. scripts/download_all.sh

Many experiments will automatically download the data if it is not already present, so this step is optional.

Note: MIMIC-III and MIMIC-IV have to be downloaded manually. Use the notebooks in data_preproc to preprocess the data.

After downloading everything, your directory tree should look like this:

├── nfe
│   ├── experiments
│   │   ├── base_experiment.py
│   │   ├── data
│   │   │   ├── activity
│   │   │   ├── hopper
│   │   │   ├── mimic3
│   │   │   ├── mimic4
│   │   │   ├── physionet
│   │   │   ├── stpp
│   │   │   ├── synth
│   │   │   └── tpp
│   │   ├── gru_ode_bayes
│   │   ├── latent_ode
│   │   ├── stpp
│   │   ├── synthetic
│   │   └── tpp
│   ├── models
│   └── train.py
├── scripts
│   ├── download_all.sh
│   └── run_all.sh
└── setup.py

Models

Models are located in nfe/models, which contains the implementations of CouplingFlow and ResNetFlow. The ODE models and the continuous (ODE- or flow-based) GRU and LSTM layers can be found in the same directory.

Example: Coupling flow

import torch
from nfe import CouplingFlow

dim = 4
model = CouplingFlow(
    dim,
    n_layers=2, # Number of flow layers
    hidden_dims=[32, 32], # Hidden layers in single flow
    time_net='TimeLinear', # Time embedding network
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x0 = torch.randn(3, 1, dim) # Initial conditions at t=0

xt = model(x0, t) # IVP solutions at t given x0
xt.shape # torch.Size([3, 10, 4])

Example: GRU flow

import torch
from nfe import GRUFlow

dim = 4
model = GRUFlow(
    dim,
    n_layers=2, # Number of flow layers
    hidden_dims=[32, 32], # Hidden layers in single flow
    time_net='TimeTanh', # Time embedding network
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x = torch.randn(3, 10, dim) # Initial conditions, RNN inputs

xt = model(x, t) # IVP solutions at t_i given x_{1:i}
xt.shape # torch.Size([3, 10, 4])
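
Example: ResNet flow

ResNetFlow (mentioned above) has no example in the original README; the following sketch assumes it mirrors the CouplingFlow interface and may not match the package exactly.

import torch
from nfe import ResNetFlow # Assumed to be exported like CouplingFlow

dim = 4
model = ResNetFlow(
    dim,
    n_layers=2, # Number of flow layers (assumed)
    hidden_dims=[32, 32], # Hidden layers in single flow (assumed)
    time_net='TimeTanh', # Time embedding network (assumed)
)

t = torch.rand(3, 10, 1) # Time points at which IVP is evaluated
x0 = torch.randn(3, 1, dim) # Initial conditions at t=0

xt = model(x0, t) # IVP solutions at t given x0
xt.shape # Expected: torch.Size([3, 10, 4])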

Experiments

Run all experiments with . scripts/run_all.sh, or run the individual commands below manually.

Synthetic

Example:

python -m nfe.train --experiment synthetic --data [ellipse|sawtooth|sink|square|triangle] --model [ode|flow] --flow-model [coupling|resnet] --solver [rk4|dopri5]
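
For instance, to train a coupling flow on the sink data (assuming flags that do not apply, such as --solver for flow models, can be omitted):

python -m nfe.train --experiment synthetic --data sink --model flow --flow-model coupling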

Smoothing

Example:

python -m nfe.train --experiment latent_ode --data [activity|hopper|physionet] --classify [0|1] --model [ode|flow] --flow-model [coupling|resnet]
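
For instance, to fit a coupling flow on PhysioNet without the classification task:

python -m nfe.train --experiment latent_ode --data physionet --classify 0 --model flow --flow-model coupling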

Reference:

  • Yulia Rubanova, Ricky Chen, David Duvenaud. "Latent ODEs for Irregularly-Sampled Time Series" (2019) [paper]. We adapted the code from here.

Filtering

Request access to the MIMIC-III and MIMIC-IV data and download it locally. Use the notebooks in data_preproc to preprocess the data.

Example:

python -m nfe.train --experiment gru_ode_bayes --data [mimic3|mimic4] --model [ode|flow] --odenet gru --flow-model [gru|resnet]
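
For instance, to run a GRU flow on MIMIC-III (assuming --odenet is only needed for the ODE baseline):

python -m nfe.train --experiment gru_ode_bayes --data mimic3 --model flow --flow-model gru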

Reference:

  • Edward De Brouwer, Jaak Simm, Adam Arany, Yves Moreau. "GRU-ODE-Bayes: Continuous modeling of sporadically-observed time series" (2019) [paper]. We adapted the code from here.

Temporal point process

Example:

python -m nfe.train --experiment tpp --data [poisson|renewal|hawkes1|hawkes2|mooc|reddit|wiki] --model [rnn|ode|flow] --flow-model [coupling|resnet] --decoder [continuous|mixture] --rnn [gru|lstm] --marks [0|1]
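
For instance, to train a coupling-flow model with the continuous decoder on the first Hawkes dataset, without marks:

python -m nfe.train --experiment tpp --data hawkes1 --model flow --flow-model coupling --decoder continuous --rnn gru --marks 0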

Reference:

  • Junteng Jia, Austin R. Benson. "Neural Jump Stochastic Differential Equations" (2019) [paper]. We adapted the code from here.

Spatio-temporal

Example:

python -m nfe.train --experiment stpp --data [bike|covid|earthquake] --model [ode|flow] --density-model [independent|attention]
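
For instance, to train a flow-based model with the attention density model on the earthquake data:

python -m nfe.train --experiment stpp --data earthquake --model flow --density-model attention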

Reference:

  • Ricky T. Q. Chen, Brandon Amos, Maximilian Nickel. "Neural Spatio-Temporal Point Processes" (2021) [paper]. We adapted the code from here.

Citation

@article{bilos2021neuralflows,
  title={{N}eural Flows: {E}fficient Alternative to Neural {ODE}s},
  author={Bilo{\v{s}}, Marin and Sommer, Johanna and Rangapuram, Syama Sundar and Januschowski, Tim and G{\"u}nnemann, Stephan},
  journal={Advances in Neural Information Processing Systems},
  year={2021}
}