Rank Supervised Contrastive Learning for Time Series Classification

Published 31 Jan 2024 in cs.LG (arXiv:2401.18057v2)

Abstract: Recently, various contrastive learning techniques have been developed to classify time series data and have exhibited promising performance. The general paradigm is to apply appropriate augmentations and construct feasible positive samples so that the encoder yields robust and discriminative representations, mapping similar data points closer together in the feature space while pushing dissimilar data points farther apart. Despite its efficacy, this paradigm largely ignores the fine-grained relative similarity (e.g., rank) information of positive samples, especially when labeled samples are limited. To this end, we present Rank Supervised Contrastive Learning (RankSCL) for time series classification. Unlike conventional contrastive learning frameworks, RankSCL augments raw data in a targeted way in the embedding space and adopts certain filtering rules to select more informative positive and negative pairs of samples. Moreover, a novel rank loss is developed to assign different weights to different levels of positive samples, enabling the encoder to extract fine-grained information within the same class and to produce a clear boundary among different classes. Thorough empirical studies on 128 UCR datasets and 30 UEA datasets demonstrate that the proposed RankSCL achieves state-of-the-art performance compared to existing baseline methods.
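The rank-weighted loss idea described in the abstract can be sketched in code. The following is an illustrative NumPy toy, not the paper's actual formulation: the function name, the linear rank-to-weight scheme, and the per-anchor batch construction are all assumptions made for clarity. It ranks each anchor's in-batch positives by cosine similarity and down-weights the contrastive term for lower-ranked (less similar) positives.

```python
import numpy as np

def rank_weighted_scl_loss(z, labels, temperature=0.5):
    """Toy rank-weighted supervised contrastive loss (illustrative only).

    z: (N, D) array of embeddings; labels: (N,) integer class labels.
    Positives of each anchor are ranked by similarity to the anchor;
    closer positives get larger weights via a simple linear scheme
    (an assumption, not necessarily RankSCL's exact weighting).
    """
    # L2-normalize so the dot product is cosine similarity
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature
    n = len(labels)
    losses = []
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        neg = [j for j in range(n) if labels[j] != labels[i]]
        if not pos or not neg:
            continue  # anchor needs at least one positive and one negative
        # rank positives by similarity to the anchor, most similar first
        order = sorted(pos, key=lambda j: -sim[i, j])
        denom = np.sum(np.exp(sim[i, pos + neg]))
        for rank, j in enumerate(order):
            # linearly decaying weight: top-ranked positive gets weight 1
            w = (len(order) - rank) / len(order)
            losses.append(-w * np.log(np.exp(sim[i, j]) / denom))
    return float(np.mean(losses))
```

A well-separated batch (same-class embeddings clustered together) should produce a lower loss than a batch where classes overlap, which is the behavior the weighting is meant to encourage.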
