Dense Graph Propagation

This repository contains the code for the paper Rethinking Knowledge Graph Propagation for Zero-Shot Learning (CVPR 2019).

Citation

@inproceedings{kampffmeyer2019rethinking,
  title={Rethinking knowledge graph propagation for zero-shot learning},
  author={Kampffmeyer, Michael and Chen, Yinbo and Liang, Xiaodan and Wang, Hao and Zhang, Yujia and Xing, Eric P},
  booktitle={Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition},
  pages={11487--11496},
  year={2019}
}

Requirements

At minimum: Python 3, PyTorch (the pretrained ResNet50 checkpoint below is in torchvision format), and NLTK with the WordNet corpus (the graph-construction step builds on the WordNet hierarchy).

Instructions

Materials Preparation

There is a folder materials/, which already contains some metadata and helper programs.

Glove Word Embedding

  1. Download: http://nlp.stanford.edu/data/glove.6B.zip
  2. Unzip it, then move glove.6B.300d.txt into materials/ (a sketch of the file format follows this list).
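The GloVe file is plain text: one token per line, followed by its 300-dimensional vector. As a rough illustration of the format (not the repo's own loader; the path is the one set up above):

  # Minimal sketch: parse glove.6B.300d.txt into a {word: vector} dict.
  # Not the repo's code; for format illustration only.
  import numpy as np

  def load_glove(path='materials/glove.6B.300d.txt'):
      vectors = {}
      with open(path, encoding='utf-8') as f:
          for line in f:
              parts = line.rstrip().split(' ')
              # First field is the token; the remaining 300 are floats.
              vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
      return vectors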

Graphs

  1. cd materials/
  2. Run python make_induced_graph.py to get imagenet-induced-graph.json (a sketch of the induced-graph idea follows this list)
  3. Run python make_dense_graph.py to get imagenet-dense-graph.json
  4. Run python make_dense_grouped_graph.py to get imagenet-dense-grouped-graph.json
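These scripts build graphs over the WordNet hierarchy that the ImageNet classes live in. As a minimal sketch of the induced-graph idea (assuming NLTK with the WordNet corpus downloaded; the repo's scripts handle the real class lists and the JSON output format):

  # Sketch: collect hypernym edges reachable from some ImageNet wnids.
  # Illustrative only; make_induced_graph.py does the real work.
  from nltk.corpus import wordnet as wn

  def synset_of(wnid):
      # wnids look like 'n02084071': pos tag 'n' plus an 8-digit offset.
      return wn.synset_from_pos_and_offset(wnid[0], int(wnid[1:]))

  def induced_edges(wnids):
      edges, seen = set(), set()
      stack = [synset_of(w) for w in wnids]
      while stack:
          s = stack.pop()
          if s in seen:
              continue
          seen.add(s)
          for h in s.hypernyms():
              edges.add((s.name(), h.name()))  # child -> parent
              stack.append(h)
      return edges

  print(len(induced_edges(['n02084071'])))  # 'dog' and its ancestors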

Pretrained ResNet50

  1. Download: https://download.pytorch.org/models/resnet50-19c8e357.pth
  2. Rename it and place it at materials/resnet50-raw.pth
  3. cd materials/ and run python process_resnet.py to get fc-weights.json and resnet50-base.pth (a sketch of this split follows)
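Judging by the output file names, this step separates the 1K-class classifier weights from the backbone. A rough sketch of that split (an assumption about what process_resnet.py does, not its actual code):

  # Sketch: split a torchvision ResNet50 checkpoint into backbone weights
  # and the final fc layer. An assumption about process_resnet.py's job.
  import json
  import torch

  state = torch.load('materials/resnet50-raw.pth', map_location='cpu')
  fc = {k: state.pop(k) for k in ['fc.weight', 'fc.bias']}

  torch.save(state, 'materials/resnet50-base.pth')  # backbone only
  with open('materials/fc-weights.json', 'w') as f:
      json.dump({k: v.tolist() for k, v in fc.items()}, f)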

ImageNet and AwA2

Download ImageNet and AwA2, then create softlinks (with ln -s) at materials/datasets/imagenet and materials/datasets/awa2 pointing to the root directory of each dataset.
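For example (the dataset locations are placeholders for wherever you unpacked them):

  ln -s /path/to/ImageNet materials/datasets/imagenet
  ln -s /path/to/AwA2 materials/datasets/awa2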

An ImageNet root directory should contain one image folder per class, named by the class's WordNet id.

An AwA2 root directory should contain the folder JPEGImages.
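Concretely, the expected layouts look roughly like this (class ids and file names are illustrative):

  materials/datasets/imagenet/
    n02084071/
      n02084071_0001.JPEG
      ...
    n02121808/
      ...

  materials/datasets/awa2/
    JPEGImages/
      antelope/
        antelope_10001.jpg
        ...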

Training

Make a directory save/ for saving models.

In most programs, use --gpu to specify which device(s) to run on (default: GPU 0).

Train Graph Networks

Each training run writes its results to a folder under save/ (a sketch of the propagation idea follows).
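The graph networks take per-class word embeddings as input and are trained to regress the ResNet classifier weights by propagating information over the graph. As a minimal sketch of one propagation layer (an illustration in the spirit of the paper, not the repo's model code; the normalization and dimensions are assumptions):

  # Sketch of one propagation layer: x' = relu(row_normalize(A) @ x @ W).
  # Illustrative; the repo's training scripts define the real models.
  import torch
  import torch.nn as nn

  class PropagationLayer(nn.Module):
      def __init__(self, in_dim, out_dim):
          super().__init__()
          self.linear = nn.Linear(in_dim, out_dim)

      def forward(self, x, adj):
          # x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops.
          deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
          return torch.relu((adj / deg) @ self.linear(x))

  # Toy usage: 300-d GloVe embeddings in, classifier-sized vectors out.
  layer = PropagationLayer(300, 2048)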

Finetune ResNet

Run python train_resnet_fit.py with the appropriate args.

(In the paper's setting, --train-dir is the folder composed of the 1K training classes from fall2011.tar, with the one missing class, "teddy bear", taken from ILSVRC2012.)
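A possible invocation, using only the flags this README mentions (the training-set path is a placeholder, and the script may take further required args not listed here):

  python train_resnet_fit.py --gpu 0 --train-dir /path/to/imagenet-1k-train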

Testing

ImageNet

Run python evaluate_imagenet.py with the appropriate args.
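The paper reports top-k hit accuracy over the unseen ImageNet classes. A minimal sketch of that metric (names and shapes are illustrative; evaluate_imagenet.py implements the actual protocol):

  # Sketch: top-k hit accuracy given scores over the candidate classes.
  # Illustrative only; not the script's interface.
  import torch

  def topk_accuracy(logits, labels, k=5):
      # logits: (N, C) class scores; labels: (N,) true class indices.
      topk = logits.topk(k, dim=1).indices              # (N, k)
      hits = (topk == labels.unsqueeze(1)).any(dim=1)   # label in top k?
      return hits.float().mean().item()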

AwA2

Run python evaluate_awa2.py with the appropriate args.
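Zero-shot results on AwA2 are commonly reported as mean per-class (macro-averaged) accuracy. A sketch of that average, assuming predicted and true class indices (an assumption about the metric, not the script's code):

  # Sketch: mean per-class (macro-averaged) top-1 accuracy.
  # Illustrative only; evaluate_awa2.py implements the real protocol.
  import torch

  def per_class_accuracy(pred, labels):
      # pred, labels: (N,) predicted and true class indices.
      accs = [(pred[labels == c] == c).float().mean() for c in labels.unique()]
      return torch.stack(accs).mean().item()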