
Sequence Prediction With Sparse Distributed Hyperdimensional Coding Applied to the Analysis of Mobile Phone Use Patterns.

Okko J. Räsänen, Jukka P. Saarinen.

Abstract

Modeling and prediction of temporal sequences is central to many signal processing and machine learning applications. Prediction based on sequence history is typically performed with parametric models such as fixed-order Markov chains (n-grams), with approximations of high-order Markov processes such as mixed-order Markov models or mixtures of lagged bigram models, or with other machine learning techniques. This paper presents a method for sequence prediction based on sparse hyperdimensional coding of the sequence structure and describes how higher-order temporal structures can be utilized in sparse coding in a balanced manner. The method is purely incremental, allowing real-time online learning and prediction with limited computational resources. Experiments on the prediction of mobile phone use patterns, including the next launched application, the next GPS location of the user, and the next artist played with the phone media player, show that the proposed method captures the relevant variable-order structure in the sequences. Compared with the n-grams and the mixed-order Markov models, the sparse hyperdimensional predictor clearly outperforms its peers in terms of unweighted average recall and achieves a level of weighted average recall equal to that of the mixed-order Markov chain, but without the batch training the mixed-order model requires.
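The abstract's core idea, incremental sequence prediction with sparse hyperdimensional vectors, can be illustrated with a minimal sketch. This is not the authors' exact algorithm; the dimensionality, sparsity, history length, and the permutation-based context encoding below are all illustrative assumptions. Each symbol gets a random sparse binary hypervector, the recent history is encoded by cyclically shifting each symbol's vector by its lag and bundling (summing) the results, and an associative memory accumulates context evidence for each observed next symbol, one update per step:

```python
import numpy as np

DIM = 10000      # hypervector dimensionality (assumption)
SPARSITY = 0.02  # fraction of active bits (assumption)
K = 3            # number of history symbols in the context (assumption)

rng = np.random.default_rng(0)

def random_hv():
    """Random sparse binary hypervector with a fixed number of active bits."""
    hv = np.zeros(DIM, dtype=np.int8)
    active = rng.choice(DIM, size=int(DIM * SPARSITY), replace=False)
    hv[active] = 1
    return hv

class SparseHDPredictor:
    def __init__(self):
        self.item = {}    # symbol -> item hypervector
        self.memory = {}  # symbol -> accumulated context evidence

    def _hv(self, s):
        if s not in self.item:
            self.item[s] = random_hv()
        return self.item[s]

    def _context(self, history):
        """Bundle the last K symbols; a cyclic shift by the lag encodes position."""
        ctx = np.zeros(DIM, dtype=np.int64)
        for lag, s in enumerate(reversed(history[-K:]), start=1):
            ctx += np.roll(self._hv(s), lag)
        return ctx

    def train(self, sequence):
        """Purely incremental: a single pass, one memory update per observed symbol."""
        for t in range(1, len(sequence)):
            ctx = self._context(sequence[:t])
            nxt = sequence[t]
            self.memory[nxt] = self.memory.get(nxt, np.zeros(DIM, dtype=np.int64)) + ctx

    def predict(self, history):
        """Return the symbol whose accumulated trace best matches the current context."""
        ctx = self._context(history)
        return max(self.memory, key=lambda s: np.dot(self.memory[s], ctx))

predictor = SparseHDPredictor()
predictor.train(list("abcabcabcabc"))
print(predictor.predict(list("ab")))  # a repeating pattern makes "c" the best match
```

Because sparse random hypervectors are nearly orthogonal in high dimensions, the dot product between a query context and a stored trace is dominated by matching (symbol, lag) pairs, which is how variable-order structure can be read out from a single bundled memory.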


Year: 2015    PMID: 26285224    DOI: 10.1109/TNNLS.2015.2462721

Source DB: PubMed    Journal: IEEE Trans Neural Netw Learn Syst    ISSN: 2162-237X    Impact factor: 10.451


Related articles: 5 in total

1.  Stochastic-HD: Leveraging Stochastic Computing on the Hyper-Dimensional Computing Pipeline.

Authors:  Justin Morris; Yilun Hao; Saransh Gupta; Behnam Khaleghi; Baris Aksanli; Tajana Rosing
Journal:  Front Neurosci       Date:  2022-05-30       Impact factor: 5.152

2.  Cellular Automata Can Reduce Memory Requirements of Collective-State Computing.

Authors:  Denis Kleyko; Edward Paxon Frady; Friedrich T Sommer
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2022-06-01       Impact factor: 14.255

3.  Memory-inspired spiking hyperdimensional network for robust online learning.

Authors:  Zhuowen Zou; Haleh Alimohamadi; Ali Zakeri; Farhad Imani; Yeseong Kim; M Hassan Najafi; Mohsen Imani
Journal:  Sci Rep       Date:  2022-05-10       Impact factor: 4.996

4.  GrapHD: Graph-Based Hyperdimensional Memorization for Brain-Like Cognitive Learning.

Authors:  Prathyush Poduval; Haleh Alimohamadi; Ali Zakeri; Farhad Imani; M Hassan Najafi; Tony Givargis; Mohsen Imani
Journal:  Front Neurosci       Date:  2022-02-04       Impact factor: 4.677

5.  EventHD: Robust and efficient hyperdimensional learning with neuromorphic sensor.

Authors:  Zhuowen Zou; Haleh Alimohamadi; Yeseong Kim; M Hassan Najafi; Narayan Srinivasa; Mohsen Imani
Journal:  Front Neurosci       Date:  2022-07-27       Impact factor: 5.152

