AttentionBoost: Learning What to Attend by Boosting Fully Convolutional Networks

This code is the implementation of AttentionBoost, an error-driven multi-attention learning model that we proposed for instance segmentation. AttentionBoost designs a multi-stage network and adaptively learns, directly from image data, which image parts (pixels) each stage needs to attend to and the level of this attention. To this end, it introduces a new loss adjustment mechanism that applies adaptive boosting to a dense prediction model. This mechanism modulates the attention of each stage to correct the mistakes of the previous stages, adjusting the loss weight of each pixel separately according to how confident the previous stages are in their predictions for that pixel.
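To illustrate the idea, the per-pixel loss-weight adjustment can be sketched as follows. This is a minimal, hypothetical sketch (the function name, normalization, and weighting formula are assumptions for illustration, not the exact scheme used in the released code): pixels on which the previous stage was confidently correct receive small weights, while pixels on which it was confidently wrong receive large weights, so the next stage attends to the previous stage's mistakes.

```python
import numpy as np

def adjust_pixel_weights(prev_probs, labels, eps=1e-7):
    """Hypothetical sketch of a boosting-style per-pixel loss-weight update.

    prev_probs: (H, W) foreground probabilities predicted by the previous stage.
    labels:     (H, W) binary ground-truth mask.
    Returns a (H, W) array of loss weights for the next stage.
    """
    # Probability the previous stage assigned to the TRUE class of each pixel.
    p_true = np.where(labels == 1, prev_probs, 1.0 - prev_probs)
    # Large weight where the previous stage was confidently wrong,
    # small weight where it was confidently right.
    weights = 1.0 - p_true
    # Normalize so the average weight stays at 1 and the loss scale is preserved.
    weights = weights / (weights.mean() + eps)
    return weights
```

In this sketch, the weights of a later stage depend only on the confidence of the preceding stage; the paper's mechanism integrates this adjustment into end-to-end training of all stages.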

The source code is provided here.

NOTE: The following source code is provided for research purposes only. The authors accept no responsibility for any consequences arising from its use. If you use any part of this code, please cite the following paper.

  • G.N. Gunesli, C. Sokmensuer, and C. Gunduz-Demir, “AttentionBoost: Learning what to attend for gland segmentation in histopathological images by boosting fully convolutional networks,” IEEE Transactions on Medical Imaging, 39(12):4262-4273, 2020.

Source code

The provided zip file contains four files that implement the necessary functionality:

  • This Python code includes the network architectures of the AttentionBoost model and the base model, together with the parameter settings used in these architectures.

  • This Python code includes the function to train the proposed AttentionBoost model.

  • This Python code includes the function that estimates the probability maps on a given test set ‘x_test’ using a trained AttentionBoost model and returns them. It also includes a function to save the obtained probability maps to a specified directory ‘path’.

  • segmentGlands.m: This Matlab code locates the glands given the average probability map.
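The average probability map passed to segmentGlands.m is obtained by averaging the per-stage probability maps. A minimal sketch of this averaging step (the function name is an assumption; the actual release may compute the average inside the estimation code) is:

```python
import numpy as np

def average_probability_maps(stage_maps):
    """Average the per-stage probability maps into the single map that
    segmentGlands.m expects. `stage_maps` is a list of (H, W) arrays,
    one per AttentionBoost stage."""
    # Stack the stage outputs along a new axis and take the pixelwise mean.
    return np.mean(np.stack(stage_maps, axis=0), axis=0)
```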