# SurgX

This repository contains the official implementation of the SurgX paper (MICCAI 2025): **SurgX: Neuron–Concept Association for Explainable Surgical Phase Recognition**.
## Installation

Create a new conda environment and install all dependencies:

```shell
conda create -n SurgX python=3.8
conda activate SurgX
pip install -r requirements.txt
```

## Neuron–Concept Annotation

Annotate neurons of ASFormer's penultimate layer.
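At a high level, the annotation pipeline matches each neuron to the concept whose text embedding (from SurgVLP) best agrees with the video segments that most strongly activate that neuron. A minimal sketch of the matching step, using hypothetical names and toy 2-D embeddings (this is an illustrative assumption, not the repository's actual API):

```python
# Hypothetical sketch of neuron-concept matching via cosine similarity.
# Names, vector shapes, and the similarity criterion are illustrative
# assumptions, not the repository's actual API.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def annotate_neurons(neuron_reprs, concept_embeddings):
    """Assign each neuron the concept whose text embedding is most similar
    to the neuron's representative (e.g., activation-weighted) video feature."""
    annotation = {}
    for neuron_id, repr_vec in neuron_reprs.items():
        best = max(concept_embeddings,
                   key=lambda c: cosine(repr_vec, concept_embeddings[c]))
        annotation[neuron_id] = best
    return annotation

# Toy example: 2-D embeddings for two neurons and two concepts.
neurons = {0: [1.0, 0.1], 1: [0.0, 1.0]}
concepts = {"grasper": [0.9, 0.2], "cutting": [0.1, 0.8]}
print(annotate_neurons(neurons, concepts))  # → {0: 'grasper', 1: 'cutting'}
```

In the repository, the two inputs to this comparison are presumably produced by `0_extract_sequence_features.py` (neuron-side features) and `0_extract_text_features.py` (concept text embeddings) below.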
```shell
cd spr_models/ASFormer
```

There are four options for `--action`:

```shell
python main.py --action [train|extract_activations|predict|extract_contributions]
```

- Train the model:

  ```shell
  python main.py --action train
  ```

- Save activations of the train dataset:

  ```shell
  python main.py --action extract_activations
  ```

- Run the neuron–concept annotation pipeline:
  ```shell
  cd ../../
  python 0_extract_sequence_features.py
  python 0_extract_text_features.py
  python 1_neuron_concept_annotation.py
  ```

- (Optional) Visualize which concepts neurons learn:
  ```shell
  python 2_visualize_neuron_concepts.py
  python 3_make_videos.py
  python 4_make_videos_with_info.py
  python 5_make_integrated_videos_with_info.py
  ```

## Explanation Generation

Generate explanations for ASFormer predictions.
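Conceptually, an explanation pairs the neurons that contribute most to a predicted phase with their annotated concepts. A toy sketch with hypothetical contribution scores and a hypothetical neuron-to-concept map (illustrative only, not the repository's actual API):

```python
# Hypothetical sketch of turning neuron contributions into an explanation.
# The contribution values, neuron-to-concept map, and top-k criterion are
# illustrative assumptions, not the repository's actual API.
def explain_prediction(contributions, neuron_to_concept, top_k=3):
    """Return (neuron id, concept, score) for the top-k neurons by
    contribution to the predicted phase, highest first."""
    ranked = sorted(contributions.items(), key=lambda kv: kv[1], reverse=True)
    return [(nid, neuron_to_concept[nid], score) for nid, score in ranked[:top_k]]

# Toy example: per-neuron contribution scores for one predicted frame.
contribs = {0: 0.7, 1: 0.1, 2: 0.5}
concept_map = {0: "calot triangle dissection", 1: "irrigation", 2: "clipping"}
for nid, concept, score in explain_prediction(contribs, concept_map, top_k=2):
    # Prints the two most contributing neurons with their concepts.
    print(f"neuron {nid}: {concept} (contribution {score:.2f})")
```

In the repository, the contribution scores would come from the `extract_contributions` action and the concept map from the annotation pipeline above.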
```shell
cd spr_models/ASFormer
```

- Run prediction on the test dataset:

  ```shell
  python main.py --action predict
  ```

- Save contributions of the test dataset:

  ```shell
  python main.py --action extract_contributions
  ```

- Create explanations:
  ```shell
  cd ../../
  python 6_explain_prediction.py
  ```

- (Optional) Visualize the explanations as MP4:

  ```shell
  python 7_make_mp4.py
  ```

## Concept Sets

Concept sets are located under the `concept_sets/` folder:

- CholecT45-W
- CholecT45-S
- ChoLec-270
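If a concept set is stored as a plain-text file with one concept per line (an assumption for illustration; check the actual files under `concept_sets/`), it could be loaded like this:

```python
# Hypothetical loader, assuming each concept set is a plain-text file with
# one concept per line; the actual file format in concept_sets/ may differ.
from pathlib import Path

def load_concept_set(path):
    """Read non-empty, stripped lines from a concept-set file."""
    return [line.strip() for line in Path(path).read_text().splitlines()
            if line.strip()]
```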
## Acknowledgments

The surgical phase recognition models are based on TeCNO and ASFormer. The vision–language model used for neuron–concept annotation is SurgVLP. We thank all the authors for their efforts and open-source contributions.