# CheXNet-with-localization

ADLxMLDS 2017 fall final

Team: XD

黃晴 (R06922014), 王思傑 (R06922019), 曹爗文 (R06922022), 傅敏桓 (R06922030), 湯忠憲 (R06946003)

## Weakly supervised localization

In this task, we plot bounding boxes for each disease finding in a single chest X-ray, without ground-truth boxes (x, y, width, height) in the training set. The workflow is shown below.

### Workflow

1. Predict findings.
2. Use the classifier to plot a heat map (Grad-CAM).
3. Plot the bounding box based on the Grad-CAM heat map (see the sketch after the package list).

### Package

`Pytorch==0.2.0` `torchvision==0.2.0` `scikit-image==0.13.1` `opencv_python==3.4.0.12` `numpy==1.13.3` `matplotlib==2.1.1` `scipy==1.0.0` `sklearn==0.19.1`
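As a rough illustration of workflow steps 2 and 3, here is a minimal Grad-CAM-to-bounding-box sketch in PyTorch. It is not the exact code in `denseNet_localization.py`; the torchvision DenseNet-121, the helper name `grad_cam_bbox`, and the fixed threshold are illustrative assumptions.

```python
# Minimal Grad-CAM -> bounding-box sketch (illustrative, not this repo's exact code).
# Assumes a torchvision DenseNet-121; `grad_cam_bbox` and the threshold are hypothetical.
import numpy as np
import torch
import torch.nn.functional as F
import torchvision

model = torchvision.models.densenet121(pretrained=True).eval()

saved = {}

def _save_features(module, inputs, output):
    # Keep the last convolutional feature maps and catch their gradient on backward.
    saved['maps'] = output
    output.register_hook(lambda grad: saved.update(grad=grad))

model.features.register_forward_hook(_save_features)

def grad_cam_bbox(image, class_idx, threshold=0.7):
    """Return (x, y, width, height) around the Grad-CAM activation for `class_idx`."""
    logits = model(image)                          # image: (1, 3, H, W) float tensor
    model.zero_grad()
    logits[0, class_idx].backward()

    weights = saved['grad'].mean(dim=(2, 3), keepdim=True)        # GAP over gradients
    cam = F.relu((weights * saved['maps']).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=image.shape[2:], mode='bilinear', align_corners=False)
    cam = cam[0, 0].detach().cpu().numpy()
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)      # normalize to [0, 1]

    ys, xs = np.where(cam >= threshold)            # pixels above the heat-map threshold
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max() - xs.min()), int(ys.max() - ys.min())
```

The threshold that turns the heat map into a box is a tuning knob; the repository's actual post-processing may differ.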

## Environment

* OS: Linux
* Python 3.5
* GPU: 1080 Ti
* CPU: Xeon(R) E5-2667 v4
* RAM: 500 GB

## Experiments process

1. Preprocessing:
   `python3 preprocessing.py [path of images folder] [path to data_entry] [path to bbox_list_path] [path to train_txt] [path to valid_txt] [path of preprocessed output (folder)]`
2. Training:
   `python3 train.py [path of preprocessed output (folder)]`
3. Local testing:
   `python3 denseNet_localization.py [path to test.txt] [path of images folder]`
4. Output txt format: after running `denseNet_localization.py`, you get a txt file in the format shown below (a small parser sketch follows this list):

       [image_path] [number_of_detection]
       [disease] [x] [y] [width] [height]
       [disease] [x] [y] [width] [height]
       ...
       [image_path] [number_of_detection]
       [disease] [x] [y] [width] [height]
       [disease] [x] [y] [width] [height]
       ...
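To sanity-check the output file, a small parser sketch (a hypothetical helper, not shipped with this repo) could read the format above back into a dict. It assumes single-token disease names and whitespace-separated fields.

```python
# Hypothetical helper (not part of this repo): parse the detection txt described above
# into {image_path: [(disease, x, y, width, height), ...]}.
def parse_detections(txt_path):
    results = {}
    with open(txt_path) as f:
        lines = [line.strip() for line in f if line.strip()]
    i = 0
    while i < len(lines):
        image_path, num_detections = lines[i].rsplit(maxsplit=1)
        count = int(num_detections)
        boxes = []
        for line in lines[i + 1:i + 1 + count]:
            disease, x, y, w, h = line.split()
            boxes.append((disease, float(x), float(y), float(w), float(h)))
        results[image_path] = boxes
        i += 1 + count
    return results

# Example usage (path is illustrative):
# detections = parse_detections("output.txt")
```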

## For DeepQ platform testing

Upload `deepQ_25.zip` to the platform, then run the following command:

`python3 inference.py`

For visualization, please refer to the issue. Credit to Sadam1195.

## Note

In our .py scripts, we use the following snippet to run the task on GPU 0:

    import os
    # Expose only GPU 0 to CUDA so the job runs on that device.
    os.environ['CUDA_VISIBLE_DEVICES'] = "0"
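Note that `CUDA_VISIBLE_DEVICES` only takes effect if it is set before the first CUDA call, so the snippet should sit near the top of the script, before any model or tensor is moved to the GPU.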

## Model

Image is modified from Ref [2].

## Result

### Prediction

### Heat map per disease

Visualization of some heat maps with their ground-truth labels (red) and predictions (blue), selected from each disease class (from top-left to bottom: Atelectasis, Cardiomegaly, Effusion, Infiltration, Mass, Nodule, Pneumonia, and Pneumothorax).

### Bounding box per patient

Visualization of some images with their ground-truth boxes (red) and predicted boxes (blue), selected from each disease class.

Refer to the report for more experiment results.

## Reference

1. ChestX-ray8: Hospital-scale Chest X-ray Database and Benchmarks on Weakly-Supervised Classification and Localization of Common Thorax Diseases [arXiv]
2. Learning to Diagnose from Scratch by Exploiting Dependencies among Labels [arXiv]
3. CheXNet: Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning [arXiv]
4. Grad-CAM: Visual Explanations from Deep Networks via Gradient-based Localization [arXiv]

## Contact

Feel free to contact me ([email protected]) if you have any problems.