Resource Demand Prediction and Optimization Based on Time Series Analysis in Cloud Computing Platform

Authors

  • Jiaying Huang, EC2 Core Platform, Amazon.com Services LLC, Seattle, Washington, 98121, United States

DOI:

https://doi.org/10.71222/em27t277

Keywords:

cloud computing, resource prediction, time series analysis

Abstract

In cloud computing environments, dynamic load changes place higher demands on the predictive capability of resource scheduling. To improve resource utilization and reduce the risk of Service-Level Agreement (SLA) violations, this paper proposes a multi-model integrated resource demand prediction framework that combines ARIMA and LSTM to capture the linear and nonlinear features of time series data. The framework uses Alibaba Cloud Tianchi workload data, including CPU, memory, and network metrics, as samples for model training and evaluation. Experiments show that the ARIMA-LSTM hybrid model outperforms either single model on both the RMSE and MAE indicators, with a minimum RMSE of 7.32. To further improve the efficiency of prediction-driven resource allocation, the study introduces a LightGBM-based ensemble learning method, a hierarchical time series analysis mechanism, and a dynamic allocation strategy based on ridge regression combined with L1 regularization. Deployed on a hybrid cloud platform, the framework increased resource utilization by 19.6% and reduced SLA violation events by more than 50%, mainly because the improved prediction accuracy minimizes over-provisioning and prediction deviations. The results provide practical and effective methods and examples for the accurate and reliable resource scheduling of cloud service platforms and the implementation of cloud services.
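The hybrid idea in the abstract — a linear time series model captures the regular structure of the load signal, and a second model corrects its residuals — can be sketched as follows. For a self-contained example, a least-squares AR(p) fit stands in for the full ARIMA component and a quadratic lag-1 residual regression stands in for the LSTM; both substitutions, along with the synthetic CPU-load series, are illustrative assumptions and not the paper's implementation.

```python
import numpy as np

def lagged_design(series, p):
    """Design matrix with an intercept column and p lagged values of the series."""
    n = len(series) - p
    lags = np.column_stack([series[p - i - 1:p - i - 1 + n] for i in range(p)])
    return np.column_stack([np.ones(n), lags])

def fit_ar(series, p):
    """Least-squares AR(p): y_t ~ c + a_1*y_{t-1} + ... + a_p*y_{t-p}."""
    X = lagged_design(series, p)
    coef, *_ = np.linalg.lstsq(X, series[p:], rcond=None)
    return coef

# Synthetic CPU-load-like series: linear trend + daily seasonality + noise
rng = np.random.default_rng(0)
t = np.arange(240)
load = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 24) + rng.normal(0.0, 0.2, t.size)

# Stage 1: linear one-step-ahead forecasts (ARIMA stand-in)
p = 3
coef = fit_ar(load, p)
linear_pred = lagged_design(load, p) @ coef
residuals = load[p:] - linear_pred  # nonlinear part left for stage 2

# Stage 2: nonlinear residual corrector (LSTM stand-in),
# fitting residual_t as a quadratic function of residual_{t-1}
poly = np.polyfit(residuals[:-1], residuals[1:], 2)
hybrid_pred = linear_pred[1:] + np.polyval(poly, residuals[:-1])

rmse = lambda err: float(np.sqrt(np.mean(err ** 2)))
print("linear-only RMSE:", rmse(residuals[1:]))
print("hybrid RMSE:     ", rmse(load[p + 1:] - hybrid_pred))
```

Because the stage-2 fit is trained on exactly the errors that stage 1 leaves behind, the in-sample hybrid RMSE can be no worse than the linear-only RMSE; the paper's reported gains over single models follow the same division of labor, with ARIMA and LSTM in the two stages.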

Published

25 June 2025

Section

Article

How to Cite

Huang, J. (2025). Resource Demand Prediction and Optimization Based on Time Series Analysis in Cloud Computing Platform. Journal of Computer, Signal, and System Research, 2(5), 1-7. https://doi.org/10.71222/em27t277