(NeurIPS 2021) Realistic Evaluation of Transductive Few-Shot Learning

Introduction

This repo contains the code for our NeurIPS 2021 paper "Realistic evaluation of transductive few-shot learning". The framework brings together all of the methods evaluated in our paper except SIB and LR-ICI, and the results reported in the paper can be reproduced with it. The code was developed under Python 3.8.3 and PyTorch 1.4.0.

1. Getting started

1.1 Quick installation (recommended): download datasets and models

To download the datasets and pre-trained models (checkpoints), follow instructions 1.1.1 to 1.1.2 of the public implementation of the NeurIPS 2020 paper "TIM: Transductive Information Maximization" (https://github.com/mboudiaf/TIM).

1.1.1 Place datasets

Make sure to place the downloaded datasets (data/ folder) at the root of the directory.

1.1.2 Place models

Make sure to place the downloaded pre-trained models (checkpoints/ folder) at the root of the directory.

1.2 Manual installation

If you face issues with the previous steps, follow instruction 1.2 of the public implementation of the NeurIPS 2020 paper "TIM: Transductive Information Maximization" (https://github.com/mboudiaf/TIM). Make sure to place the data/ and checkpoints/ folders at the root of the directory.

2. Requirements

To install requirements:

conda create --name <env> --file requirements.txt

where <env> is the name of your environment.

3. Reproducing the main results

Before running anything, activate the environment:

source activate <env>

3.1 Results of Tables 1 and 2 in the paper

Evaluation on mini-ImageNet with a ResNet-18 backbone (Table 1 in the paper):

Method         1-shot        5-shot        10-shot       20-shot
SimpleShot     63.0          80.1          84.0          86.1
PT-MAP         60.1 (↓16.8)  67.1 (↓18.2)  68.8 (↓18.0)  70.4 (↓17.4)
LaplacianShot  65.4 (↓4.7)   81.6 (↓0.5)   84.1 (↓0.2)   86.0 (↑0.5)
BDCSPN         67.0 (↓2.4)   80.2 (↓1.8)   82.7 (↓1.4)   84.6 (↓1.1)
TIM            67.3 (↓4.5)   79.8 (↓4.1)   82.3 (↓3.8)   84.2 (↓3.7)
α-TIM          67.4          82.5          85.9          87.9

To reproduce the results from Tables 1 and 2 in the paper, execute the following Python command from the root of the directory:

python3 -m src.main --base_config <path_to_base_config_file> --method_config <path_to_method_config_file> 

The <path_to_base_config_file> follows this hierarchy:

config/<balanced or dirichlet>/base_config/<resnet18 or wideres>/<mini or tiered or cub>/base_config.yaml

The <path_to_method_config_file> follows this hierarchy:

config/<balanced or dirichlet>/methods_config/<alpha_tim or baseline or baseline_pp or bdcspn or entropy_min or laplacianshot or protonet or pt_map or simpleshot or tim>.yaml

For instance, to reproduce the results in the balanced setting on mini-ImageNet, using ResNet-18 with the α-TIM method, go to the root of the directory and execute:

python3 -m src.main --base_config config/balanced/base_config/resnet18/mini/base_config.yaml --method_config config/balanced/methods_config/alpha_tim.yaml

To reproduce the results in the realistic Dirichlet setting on mini-ImageNet, using ResNet-18 with the α-TIM method, go to the root of the directory and execute:

python3 -m src.main --base_config config/dirichlet/base_config/resnet18/mini/base_config.yaml --method_config config/dirichlet/methods_config/alpha_tim.yaml
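
The same pattern works for any other combination of backbone, dataset, and method listed in the config hierarchy above. For example, assuming the corresponding config files exist under the names shown above, evaluating TIM on tiered-ImageNet with the wideres backbone in the Dirichlet setting would be:

python3 -m src.main --base_config config/dirichlet/base_config/wideres/tiered/base_config.yaml --method_config config/dirichlet/methods_config/tim.yaml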

Reusable data sampler module

One of our main contributions is our realistic task-sampling method, in which tasks are sampled following a Dirichlet distribution.

Our realistic sampler can be found in the sampler.py file. It is implemented following PyTorch conventions, so that it can easily be reused and integrated into other projects.

The notebook exemple_realistic_sampler.ipynb shows how to initialize and use our realistic category sampler.
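
For intuition only, here is a minimal, self-contained sketch of the idea behind realistic category sampling. The class name, arguments, and defaults below are illustrative assumptions and do not reflect the actual interface of sampler.py; see the notebook above for the real usage.

# Illustrative sketch only -- not the interface of the repo's sampler.py.
# Idea: query-set class proportions are drawn from a Dirichlet distribution
# instead of being perfectly balanced across the sampled classes.
import numpy as np
from torch.utils.data import Sampler


class DirichletTaskSampler(Sampler):
    """Yields one list of dataset indices per few-shot task."""

    def __init__(self, labels, n_way=5, n_shot=5, n_query=75, alpha=2.0, n_tasks=1000):
        self.labels = np.asarray(labels)      # class label of every dataset item
        self.classes = np.unique(self.labels)
        self.n_way, self.n_shot, self.n_query = n_way, n_shot, n_query
        self.alpha, self.n_tasks = alpha, n_tasks

    def __len__(self):
        return self.n_tasks

    def __iter__(self):
        for _ in range(self.n_tasks):
            chosen = np.random.choice(self.classes, self.n_way, replace=False)
            # Realistic setting: query proportions follow a symmetric Dirichlet.
            proportions = np.random.dirichlet([self.alpha] * self.n_way)
            queries_per_class = np.random.multinomial(self.n_query, proportions)
            task_indices = []
            for cls, n_q in zip(chosen, queries_per_class):
                pool = np.where(self.labels == cls)[0]
                # Assumes each class has at least n_shot + n_q examples.
                picked = np.random.choice(pool, self.n_shot + int(n_q), replace=False)
                task_indices.extend(picked.tolist())
            yield task_indices

Such a sampler could, for example, be passed to a torch.utils.data.DataLoader through its batch_sampler argument, so that each batch corresponds to one task.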

Contact

For further questions or details, reach out to Olivier Veilleux ([email protected])

Acknowledgements

Special thanks to the authors of the NeurIPS 2020 paper "TIM: Transductive Information Maximization" (https://github.com/mboudiaf/TIM) for publicly sharing their pre-trained models and source code, from which this repo is inspired.
