
Quantum Time Series Similarity Measures and Quantum Temporal Kernels

Published 4 Dec 2023 in quant-ph (arXiv:2312.01602v3)

Abstract: This article presents a quantum computing approach to designing similarity measures and kernels for the classification of stochastic symbolic time series. In machine learning, kernels are important components of various similarity-based classification, clustering, and regression algorithms. An effective strategy for devising problem-specific kernels is to leverage an existing generative model of the example space. In this study, we assume that a quantum generative model, known as a quantum hidden Markov model (QHMM), describes the underlying distributions of the examples. The sequence structure and probability are determined by transitions within the model's density operator space. Consequently, the QHMM defines a mapping from the example space into the broader quantum space of density operators. Sequence similarity is evaluated using divergence measures such as the trace and Bures distances between quantum states. We conducted extensive simulations to explore the relationship between the distribution of kernel-estimated similarity and the dimensionality of the QHMM's Hilbert space. As anticipated, a higher dimension of the Hilbert space corresponds to greater sequence distances and a more distinct separation of the examples. To empirically evaluate the performance of the kernels, we defined classification tasks based on a simplified generative model of directional price movement in the stock market. We implemented two widely used kernel-based algorithms, support vector machines and k-nearest neighbors, using both classical and quantum kernels. Across all classification task scenarios, the quantum kernels consistently demonstrated superior performance compared to their classical counterparts.
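The abstract names trace and Bures distances between density operators as the divergence measures underlying the kernels. The paper's QHMM mapping itself is not reproduced here, but the two distances have standard closed forms, and a minimal NumPy sketch of them (plus one illustrative way to turn a distance into a kernel value; the Gaussian form below is an assumption, not necessarily the paper's construction) could look like:

```python
import numpy as np

def _psd_sqrt(rho):
    """Matrix square root of a positive semidefinite (density) matrix."""
    w, v = np.linalg.eigh(rho)
    w = np.clip(w, 0.0, None)  # guard against tiny negative eigenvalues
    return (v * np.sqrt(w)) @ v.conj().T

def trace_distance(rho, sigma):
    """T(rho, sigma) = 0.5 * ||rho - sigma||_1 (half the sum of |eigenvalues|)."""
    return 0.5 * float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

def fidelity(rho, sigma):
    """Uhlmann-Jozsa fidelity F = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    s = _psd_sqrt(rho)
    w = np.linalg.eigvalsh(s @ sigma @ s)
    return float(np.sum(np.sqrt(np.clip(w, 0.0, None))) ** 2)

def bures_distance(rho, sigma):
    """Bures distance D_B = sqrt(2 * (1 - sqrt(F)))."""
    return float(np.sqrt(max(0.0, 2.0 * (1.0 - np.sqrt(fidelity(rho, sigma))))))

def distance_kernel(d, gamma=1.0):
    """Illustrative Gaussian-style kernel value from a state divergence."""
    return float(np.exp(-gamma * d ** 2))
```

A pairwise matrix of such kernel values over a set of sequences (each mapped to a density operator by the generative model) can then be fed to kernel algorithms directly, e.g. scikit-learn's `SVC(kernel="precomputed")`.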

