PyTorch implementation of "Density-aware Chamfer Distance as a Comprehensive Metric for Point Cloud Completion" (NeurIPS 2021)

Overview

Density-aware Chamfer Distance

This repository contains the official PyTorch implementation of our paper:

Density-aware Chamfer Distance as a Comprehensive Metric for Point Cloud Completion, NeurIPS 2021

Tong Wu, Liang Pan, Junzhe Zhang, Tai Wang, Ziwei Liu, Dahua Lin


We present a new point cloud similarity measure named Density-aware Chamfer Distance (DCD). It is derived from Chamfer Distance (CD) and enjoys several desirable properties: 1) it detects disparities in density distributions and is therefore a more comprehensive measure of similarity than CD; 2) it is stricter with detailed structures while being significantly more computationally efficient than Earth Mover's Distance (EMD); 3) its bounded value range encourages a more stable and reasonable evaluation over the whole test set. DCD can be used both as an evaluation metric and as a training loss. In our paper, we mainly validate its performance on point cloud completion.
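Concretely, DCD replaces CD's raw nearest-neighbour distances with bounded, density-weighted terms. The snippet below is a minimal, brute-force PyTorch sketch of that idea; it is not the repository's calc_dcd() (see utils/model_utils.py for the official, CUDA-accelerated version), and details such as the distance form (plain vs. squared L2) and the default temperature alpha are assumptions that should be checked against the paper.

import torch

def dcd_sketch(x, y, alpha=1000.0):
    # x: (N, 3) predicted point cloud, y: (M, 3) reference point cloud
    dist = torch.cdist(x, y)                # (N, M) pairwise L2 distances
    d_xy, idx_xy = dist.min(dim=1)          # nearest neighbour of each x in y
    d_yx, idx_yx = dist.min(dim=0)          # nearest neighbour of each y in x

    # n_y[j]: how many points of x chose y[j] as their nearest neighbour (and vice versa)
    n_y = torch.bincount(idx_xy, minlength=y.shape[0]).clamp(min=1).float()
    n_x = torch.bincount(idx_yx, minlength=x.shape[0]).clamp(min=1).float()

    # each term lies in [0, 1); matches to "over-used" targets are down-weighted
    loss_xy = (1.0 - torch.exp(-alpha * d_xy) / n_y[idx_xy]).mean()
    loss_yx = (1.0 - torch.exp(-alpha * d_yx) / n_x[idx_yx]).mean()
    return 0.5 * (loss_xy + loss_yx)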

This repository includes:

  • Implementation of Density-aware Chamfer Distance (DCD).
  • Implementation of our method for point cloud completion and the pre-trained model.

Installation

Requirements

  • PyTorch 1.2.0
  • Open3D 0.9.0
  • Other dependencies are listed in requirements.txt.

Install

Install PyTorch 1.2.0 first, and then get the other requirements by running the following command:

bash setup.sh

Dataset

We use the MVP Dataset. Please download the train set and test set, and then modify the data path in data/mvp_new.py to your own data location. Please refer to their codebase for further instructions.

Usage

Density-aware Chamfer Distance

The DCD calculation is implemented as calc_dcd() in utils/model_utils.py.

Users of newer PyTorch versions may try calc_dcd() in utils_v2/model_utils.py, which has been tested on PyTorch 1.6.0.
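A hypothetical usage sketch is shown below. It assumes calc_dcd(pred, gt) accepts two (B, N, 3) point cloud tensors on the GPU (the bundled chamfer kernel is CUDA-only); the exact signature, keyword arguments (e.g. the temperature alpha), and return values should be taken from utils/model_utils.py.

import torch
from utils.model_utils import calc_dcd   # or utils_v2.model_utils on newer PyTorch

pred = torch.rand(8, 2048, 3).cuda()      # completed point clouds, (B, N, 3)
gt = torch.rand(8, 2048, 3).cuda()        # ground-truth point clouds, (B, N, 3)

result = calc_dcd(pred, gt)               # density-aware Chamfer distance term(s)
print(result)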

Model training and evaluation

  • To train a model: run python train.py ./cfgs/*.yaml, for example:
python train.py ./cfgs/vrc_plus.yaml
  • To test a model: run python train.py ./cfgs/*.yaml --test_only, for example:
python train.py ./cfgs/vrc_plus_eval.yaml --test_only
  • Config for each algorithm can be found in cfgs/.
  • run_train.sh and run_test.sh are provided for SLURM users.

We provide the following config files:

  • pcn.yaml: PCN trained with CD loss.
  • vrc.yaml: VRCNet trained with CD loss.
  • pcn_dcd.yaml: PCN trained with DCD loss.
  • vrc_dcd.yaml: VRCNet trained with DCD loss.
  • vrc_plus.yaml: training with our method.
  • vrc_plus_eval.yaml: evaluation of our method with guided down-sampling.

Attention: we empirically find that training with DP or DDP slightly hurts performance, so multi-GPU training is currently not well supported.

Pre-trained models

We provide the pre-trained model that reproduces the results in our paper. Download and extract it to the ./log/pretrained/ directory, then evaluate it with cfgs/vrc_plus_eval.yaml. The setting prob_sample: True turns on guided down-sampling. We also provide the model for VRCNet trained with DCD loss here.

Citation

If you find our code or paper useful, please cite our paper:

@inproceedings{wu2021densityaware,
  title={Density-aware Chamfer Distance as a Comprehensive Metric for Point Cloud Completion},
  author={Wu, Tong and Pan, Liang and Zhang, Junzhe and Wang, Tai and Liu, Ziwei and Lin, Dahua},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2021}
}

Acknowledgement

The code is based on the VRCNet implementation and includes the following third-party PyTorch libraries: ChamferDistancePytorch, emd, expansion_penalty, MDS, and Pointnet2.PyTorch. Thanks to these great projects.

Contact

Please contact @wutong16 with questions, comments, or bug reports.
