Employing an LSTM model with a self-attention mechanism to model variations in running pace
DOI: https://doi.org/10.71451/sndcxa82
Keywords: Neural Network, LSTM, Running pace change fitting, Self-attention mechanism, Enhancing performance
Abstract
During a run, pace varies over time, and experienced runners adjust it by modifying stride length and stride frequency. To model the relationship between stride length, stride frequency, and pace over the course of a run with neural network methods, we exploit the temporal patterns present in the data. The memory capability of Long Short-Term Memory (LSTM) networks makes them well suited to this task. To improve the predictive accuracy of the LSTM model, we incorporate a self-attention mechanism that strengthens the model's ability to relate information from different positions in the input sequence. Self-attention lets the model identify which parts of the data are most significant and allocate greater focus to them during prediction, ultimately improving performance. The LSTM augmented with self-attention accurately fits changes in running pace from historical exercise data, providing useful guidance for running practitioners. Experimental results show that the model achieves a fitting error on the order of 10⁻⁴.
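To make the described architecture concrete, below is a minimal sketch (not the authors' released code) of an LSTM whose hidden states are reweighted by a self-attention layer before a linear head predicts per-step pace. The input features (stride length and stride frequency per time step), layer sizes, and loss are illustrative assumptions based on the abstract.

```python
# Minimal sketch of the abstract's architecture: LSTM over per-step
# (stride length, stride frequency) features, followed by self-attention
# over the hidden states, then a linear head predicting pace.
# All dimensions and shapes here are assumptions, not the paper's settings.
import torch
import torch.nn as nn


class PaceLSTMWithAttention(nn.Module):
    def __init__(self, input_dim=2, hidden_dim=64, num_heads=4):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        # Self-attention lets each time step attend to every other step,
        # so the model can weight the more significant parts of the run.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # per-step pace estimate

    def forward(self, x):
        # x: (batch, seq_len, 2) -- stride length and stride frequency
        h, _ = self.lstm(x)                # (batch, seq_len, hidden_dim)
        ctx, _ = self.attn(h, h, h)        # self-attention: Q = K = V = h
        return self.head(ctx).squeeze(-1)  # (batch, seq_len) predicted pace


if __name__ == "__main__":
    model = PaceLSTMWithAttention()
    x = torch.randn(8, 120, 2)             # 8 runs, 120 time steps each
    pace = model(x)
    # The "fitting error" is taken here to be mean-squared error (an assumption).
    loss = nn.MSELoss()(pace, torch.randn(8, 120))
    loss.backward()
    print(pace.shape, loss.item())
```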
Copyright (c) 2025 International Scientific Technical and Economic Research

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.