A Knowledge Distillation Method
for Genomic Survival Analysis by Handling Censoring

Survival analysis is a critical tool for cancer research, but censored data pose a significant challenge to survival prediction: the observed censoring times yield inaccurate hazard targets and therefore bias the supervision signal for machine learning approaches. We propose a simple but effective method, termed KD, that performs knowledge distillation from uncensored data to rectify the supervision bias inherent in censored data, and then leverages the combined power of the rectified censored data and the uncensored data to improve survival prediction accuracy. Our KD method not only harnesses censored data effectively but also better reflects clinical reality, demonstrating its value for survival analysis.
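To make the idea concrete, the sketch below illustrates the general distillation-and-rectification pattern described above. It is not the repository's implementation: it uses plain regression on survival times as a stand-in for hazard modeling, and all model and variable names (`teacher`, `student`, `rectified`) are illustrative assumptions.

```python
# Minimal sketch of distillation-style label rectification for censored data.
# NOT the repository's implementation; names and modeling choices are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy genomic features, observed times, and event indicators (1 = event observed, 0 = censored).
X = rng.normal(size=(200, 50))
time = rng.exponential(scale=10.0, size=200)
event = rng.integers(0, 2, size=200)

# 1) Train a "teacher" on uncensored samples only, where the observed time is the true survival time.
teacher = GradientBoostingRegressor().fit(X[event == 1], time[event == 1])

# 2) Rectify the targets of censored samples.  A censoring time is only a lower bound on
#    the true survival time, so keep the larger of the observed time and the teacher's estimate.
rectified = time.copy()
rectified[event == 0] = np.maximum(time[event == 0], teacher.predict(X[event == 0]))

# 3) Train the final "student" on all samples (uncensored + rectified censored) together.
student = GradientBoostingRegressor().fit(X, rectified)
```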


Quick start

You can install KD with:

$ git clone git@github.com:HiangX/KD.git 
$ cd KD
$ pip install -r requirements.txt
$ python setup.py install

For detailed installation instructions, please refer to the installation guide.

KD provides several usage options for different users.