
Adaptive label thresholding methods for online multi-label classification

Published 4 Dec 2021 in cs.LG | (2112.02301v1)

Abstract: Existing online multi-label classification methods do not handle the online label thresholding problem well and lack regret analyses for their online algorithms. This paper proposes a novel framework of adaptive label thresholding algorithms for online multi-label classification that aims to overcome these drawbacks. The key feature of the framework is that both the scoring and thresholding models are treated as components of the online multi-label classifier and are incorporated into a single online optimization problem. To establish the relationship between the scoring and thresholding models, a novel multi-label classification loss function is derived, which measures the extent to which the classifier can distinguish relevant labels from irrelevant ones for an incoming instance. Based on this framework and loss function, a first-order linear algorithm and a second-order one are presented; both enjoy closed-form updates but rely on different techniques for updating the classifier. Both algorithms are proved to achieve sub-linear regret. Using Mercer kernels, the first-order algorithm is extended to nonlinear multi-label prediction tasks. Experiments demonstrate the advantage of the linear and nonlinear algorithms across various multi-label performance metrics.
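To make the joint scoring/thresholding idea concrete, here is a minimal sketch of a first-order online learner that maintains a scoring matrix and an instance-specific thresholding vector and updates both with a closed-form subgradient step. This is an illustrative approximation, not the paper's actual algorithm: the class name, the hinge-style loss, and the specific update rule are assumptions for the sketch, and the paper's derived loss function and regret-bearing updates differ in their details.

```python
import numpy as np

class OnlineMultiLabelThresholding:
    """Hypothetical first-order online learner (illustrative only).

    Jointly maintains a scoring model W (one weight row per label) and a
    thresholding model v that produces an instance-specific threshold
    v @ x, mirroring the abstract's idea of putting both models into a
    single online optimization problem.
    """

    def __init__(self, n_features, n_labels, lr=0.1):
        self.W = np.zeros((n_labels, n_features))  # scoring model
        self.v = np.zeros(n_features)              # thresholding model
        self.lr = lr

    def predict(self, x):
        # A label is predicted relevant if its score clears the
        # instance-specific threshold.
        return (self.W @ x > self.v @ x).astype(int)

    def update(self, x, y):
        # y: binary label vector, 1 = relevant, 0 = irrelevant.
        scores = self.W @ x
        threshold = self.v @ x
        # Hinge-style margin loss (an assumed stand-in for the paper's
        # derived loss): relevant labels should score at least a unit
        # margin above the threshold, irrelevant ones below it.
        sign = np.where(y == 1, 1.0, -1.0)
        violated = sign * (scores - threshold) < 1.0
        # Closed-form first-order (subgradient) step on both models.
        g = sign * violated
        self.W += self.lr * np.outer(g, x)
        self.v -= self.lr * g.sum() * x
```

On a separable toy stream, repeated calls to `update` drive the relevant labels' scores above the learned threshold; a second-order variant would additionally maintain per-coordinate curvature information, as in confidence-weighted online learning.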

Citations (8)
