This repository contains the code for the preprint:
"Transfer learning improves performance in volumetric electron microscopy organelle segmentation across tissues"
Authors: Ronald Xie, Ben Mulcahy, Ali Darbandi, Sagar Marwah, Fez Ali, Yuna Lee, Gunes Parlakgul, Gokhan Hotamisligil, Bo Wang, Sonya MacParland, Mei Zhen, and Gary D. Bader
The primary dataset can be temporarily accessed via this Zenodo link; access restrictions will be lifted after the review process. The following public datasets were also used:
- SNEMI3D: https://snemi3d.grand-challenge.org/
- UroCell: https://github.com/MancaZerovnikMekuc/UroCell
- Mouse Liver: https://www.ebi.ac.uk/empiar/EMPIAR-10791/
The preprocessing steps, including scaling and normalization, are detailed in preprocess.ipynb. These steps ensure consistency across datasets and prepare the data for model training. Preprocessed data can be saved to disk for efficiency and loaded via the relevant classes in dataset.py.
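The notebook itself is the authoritative reference; as a rough sketch, intensity normalization for EM volumes often looks like the following (percentile clipping and min-max rescaling are illustrative assumptions here, not necessarily what preprocess.ipynb does):

```python
import numpy as np

def preprocess_volume(vol: np.ndarray) -> np.ndarray:
    """Clip extreme intensities at the 1st/99th percentile, then rescale to [0, 1].

    Hypothetical helper for illustration; see preprocess.ipynb for the
    actual pipeline used in the paper.
    """
    lo, hi = np.percentile(vol, (1, 99))
    clipped = np.clip(vol.astype(np.float32), lo, hi)
    return ((clipped - lo) / (hi - lo + 1e-8)).astype(np.float32)
```

Saving the normalized arrays to disk once, rather than normalizing on the fly, avoids recomputing percentiles over large volumes at every epoch.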
Use the train.py script to pretrain or fine-tune models. Update the training and testing datasets in the script as needed. For fine-tuning, specify the path to the pretrained model weights.
Example command with default parameters:
python train.py -e 'finetune1' --subsample_frac '1' --epochs 51 --rotation_augs --contrast_augs --loss_type 'DiceCE' --loss_weights 5 --model_loc pretrain1/trained_model_epoch50.pth
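The --loss_type 'DiceCE' --loss_weights 5 flags suggest a Dice loss combined with a weighted cross-entropy term. A minimal sketch of such a combined binary loss, written in NumPy for clarity (the function name, the exact weighting scheme, and treating --loss_weights as the cross-entropy multiplier are assumptions about how train.py implements it):

```python
import numpy as np

def dice_ce_loss(pred: np.ndarray, target: np.ndarray,
                 ce_weight: float = 5.0, eps: float = 1e-6) -> float:
    """Combined Dice + weighted binary cross-entropy loss (illustrative sketch).

    pred:   predicted foreground probabilities in [0, 1]
    target: binary ground-truth mask (0/1)
    """
    pred = np.clip(pred, eps, 1.0 - eps)
    # Binary cross-entropy, averaged over all voxels
    ce = -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))
    # Soft Dice loss: 1 - Dice coefficient
    inter = (pred * target).sum()
    dice = 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)
    return float(dice + ce_weight * ce)
```

A perfect prediction drives both terms toward zero, while the weight on the cross-entropy term lets the class-imbalance-robust Dice term dominate less or more as needed.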
The models folder contains all evaluated models, corresponding to the pretraining and fine-tuning combinations tested in Table 1 of the manuscript.
Run results_analysis.ipynb to reproduce Table 1 from the manuscript.
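The notebook contains the full evaluation; for reference, a minimal sketch of the Dice score commonly used to evaluate organelle segmentation (this standalone function is illustrative and assumes binary masks; the notebook may compute additional or different metrics):

```python
import numpy as np

def dice_score(pred_mask: np.ndarray, gt_mask: np.ndarray,
               eps: float = 1e-8) -> float:
    """Dice coefficient between a predicted and a ground-truth binary mask."""
    pred_mask = pred_mask.astype(bool)
    gt_mask = gt_mask.astype(bool)
    inter = np.logical_and(pred_mask, gt_mask).sum()
    return float((2.0 * inter + eps) / (pred_mask.sum() + gt_mask.sum() + eps))
```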
If you use this code or data in your research, please cite the preprint.