
Positive-Unlabeled Node Classification with Structure-aware Graph Learning

Published 20 Oct 2023 in cs.LG and cs.AI (arXiv:2310.13538v1)

Abstract: Node classification on graphs is an important research problem with many applications. Real-world graph data sets may not be as balanced and accurately labeled as most existing works assume. A challenging setting is positive-unlabeled (PU) node classification, where labeled nodes are restricted to positive nodes. It has diverse applications, e.g., pandemic prediction and network anomaly detection. Existing works on PU node classification overlook information in the graph structure, which can be critical. In this paper, we propose to better utilize graph structure for PU node classification. We first propose a distance-aware PU loss that uses homophily in graphs to introduce more accurate supervision. We also propose a regularizer to align the model with graph structure. Theoretical analysis shows that minimizing the proposed loss also minimizes the expected loss under both positive and negative labels. Extensive empirical evaluation on diverse graph data sets demonstrates its superior performance over existing state-of-the-art methods.
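The paper's distance-aware loss is not specified in this abstract, but PU losses of this kind typically build on the standard non-negative PU (nnPU) risk estimator, which trains a classifier from positive and unlabeled examples alone given a class prior. As a point of reference, here is a minimal NumPy sketch of that baseline estimator; the function names, the sigmoid surrogate loss, and the toy inputs are illustrative assumptions, not the paper's method:

```python
import numpy as np

def sigmoid_loss(z, y):
    # Surrogate loss l(z, y) = sigmoid(-y * z); small when sign(z) == y.
    return 1.0 / (1.0 + np.exp(y * z))

def nnpu_risk(scores_pos, scores_unl, prior):
    """Non-negative PU risk estimate from classifier scores.

    scores_pos: model outputs f(x) on labeled-positive nodes
    scores_unl: model outputs f(x) on unlabeled nodes
    prior:      assumed class prior pi = P(y = +1)
    """
    r_p_pos = sigmoid_loss(scores_pos, +1).mean()  # positives labeled +1
    r_p_neg = sigmoid_loss(scores_pos, -1).mean()  # positives labeled -1
    r_u_neg = sigmoid_loss(scores_unl, -1).mean()  # unlabeled treated as -1
    # The implicit negative risk (r_u_neg - prior * r_p_neg) is clamped at
    # zero so the estimate cannot go negative, which curbs overfitting.
    return prior * r_p_pos + max(0.0, r_u_neg - prior * r_p_neg)
```

The paper's contribution can be read as replacing the uniform treatment of unlabeled nodes in an estimator like this with weights informed by graph distance and homophily, plus a structure-alignment regularizer.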

