References
- Liang, Y., Wen, H., Nie, Y., Jiang, Y., Jin, M., Song, D., ... & Wen, Q. (2024, August). Foundation models for time series analysis: A tutorial and survey. In Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (pp. 6555-6565).
- Rasul, K., Ashok, A., Williams, A. R., Khorasani, A., Adamopoulos, G., Bhagwatkar, R., ... & Rish, I. (2023). Lag-Llama: Towards foundation models for time series forecasting. arXiv preprint arXiv:2310.08278.
- Jin, M., Wang, S., Ma, L., Chu, Z., Zhang, J. Y., Shi, X., ... & Wen, Q. (2023). Time-LLM: Time series forecasting by reprogramming large language models. arXiv preprint arXiv:2310.01728.
- Liu, Y., Qin, G., Huang, X., Wang, J., & Long, M. (2024). AutoTimes: Autoregressive time series forecasters via large language models. arXiv preprint arXiv:2402.02370.
- Liu, X., Hu, J., Li, Y., Diao, S., Liang, Y., Hooi, B., & Zimmermann, R. (2024, May). UniTime: A language-empowered unified model for cross-domain time series forecasting. In Proceedings of the ACM on Web Conference 2024 (pp. 4095-4106).
- Huang, X., Tang, J., & Shen, Y. (2024). Long time series of ocean wave prediction based on PatchTST model. Ocean Engineering, 301, 117572.
- Zhou, H., Zhang, S., Peng, J., Zhang, S., Li, J., Xiong, H., & Zhang, W. (2021, May). Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI conference on artificial intelligence (Vol. 35, No. 12, pp. 11106-11115).
- Wu, H., Xu, J., Wang, J., & Long, M. (2021). Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Advances in neural information processing systems, 34, 22419-22430.
- Wang, Y., Wu, H., Dong, J., Liu, Y., Qiu, Y., Zhang, H., ... & Long, M. (2024). TimeXer: Empowering transformers for time series forecasting with exogenous variables. arXiv preprint arXiv:2402.19072.