
Semi-supervised Transfer Learning for Evaluation of Model Classification Performance

Published 16 Aug 2022 in stat.ME (arXiv:2208.07927v2)

Abstract: In modern machine learning applications, frequent covariate shift and label scarcity pose challenges to robust model training and evaluation. Numerous transfer learning methods have been developed to robustly adapt a model to an unlabeled target population using labeled data from a source population. However, there is a paucity of literature on transferring the performance metrics of a trained model. In this paper, we aim to evaluate the performance of a trained binary classifier on an unlabeled target population based on receiver operating characteristic (ROC) analysis. We propose $\bf S$emi-supervised $\bf T$ransfer l$\bf E$arning of $\bf A$ccuracy $\bf M$easures (STEAM), an efficient three-step estimation procedure that employs 1) double-index modeling to construct calibrated density ratio weights and 2) robust imputation to leverage the large amount of unlabeled data to improve estimation efficiency. We establish the consistency and asymptotic normality of the proposed estimators under correct specification of either the density ratio model or the outcome model, and we use cross-validation to correct for potential finite-sample overfitting bias in the estimators. Through simulations, we show that the proposed estimators achieve lower bias and higher efficiency than existing methods. We illustrate the practical utility of the method by evaluating the prediction performance of a phenotyping model for Rheumatoid Arthritis (RA) on a temporally evolving EHR cohort.
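The core idea of reweighting source-population observations by density ratio weights to evaluate accuracy on a shifted target population can be illustrated with a minimal sketch. The snippet below is not the authors' STEAM estimator (which adds double-index calibration and robust imputation); it only shows how, given already-estimated weights `w`, one computes an importance-weighted AUC, the probability that a randomly chosen positive outranks a randomly chosen negative under the target distribution.

```python
import numpy as np

def weighted_auc(y, scores, w):
    """Importance-weighted AUC estimate.

    y      : binary labels (1 = positive) in the labeled source sample
    scores : classifier scores for the same observations
    w      : density ratio weights (target density / source density),
             assumed to have been estimated elsewhere

    Each positive-negative pair (i, j) is weighted by w_i * w_j; ties
    in the scores count as half a win, as in the usual Mann-Whitney
    formulation of the AUC.
    """
    y, scores, w = map(np.asarray, (y, scores, w))
    pos, neg = (y == 1), (y == 0)
    # Pairwise score differences between positives (rows) and negatives (cols)
    diff = scores[pos][:, None] - scores[neg][None, :]
    pair_w = w[pos][:, None] * w[neg][None, :]
    wins = (diff > 0) + 0.5 * (diff == 0)
    return float((wins * pair_w).sum() / pair_w.sum())
```

With all weights equal to one this reduces to the standard empirical AUC on the source sample; non-uniform weights tilt the estimate toward the target population.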
