Adaptive Learning and Integrated Use of Information Flow Forecasting Methods
DOI: 10.28991/ESJ-2023-07-03-03
References
Lu, J., Liu, A., Dong, F., Gu, F., Gama, J., & Zhang, G. (2019). Learning under Concept Drift: A Review. IEEE Transactions on Knowledge and Data Engineering, 31(12), 2346–2363. doi:10.1109/TKDE.2018.2876857.
Marinin, M., Karasev, M., Pospehov, G., Pomortseva, A., Kondakova, V., & Sushkova, V. (2023). Comprehensive study of filtration properties of pelletized sandy clay ores and filtration modes in the heap leaching stack. Journal of Mining Institute, 259, 30–40. doi:10.31897/pmi.2023.7.
Blyth, C. R. (1972). On Simpson’s Paradox and the Sure-Thing Principle. Journal of the American Statistical Association, 67(338), 364. doi:10.2307/2284382.
Tsai, S.-Y., & Chang, J.-Y. (2018). Parametric study and design of deep learning on leveling system for smart manufacturing. 2018 IEEE International Conference on Smart Manufacturing, Industrial & Logistics Engineering (SMILE). doi:10.1109/smile.2018.8353980.
Dang, Q., & Yuan, J. (2023). A Kalman filter-based prediction strategy for multiobjective multitasking optimization. Expert Systems with Applications, 213(B), 119025. doi:10.1016/j.eswa.2022.119025.
Pouyanfar, S., Sadiq, S., Yan, Y., Tian, H., Tao, Y., Reyes, M. P., Shyu, M. L., Chen, S. C., & Iyengar, S. S. (2018). A survey on deep learning: Algorithms, techniques, and applications. ACM Computing Surveys, 51(5), 1–36. doi:10.1145/3234150.
Pedersen, T. (2000). A simple approach to building ensembles of naive Bayesian classifiers for word sense disambiguation. arXiv preprint cs/0005006. doi:10.48550/arXiv.cs/0005006.
Sethi, T. S., & Kantardzic, M. (2018). Handling adversarial concept drift in streaming data. Expert Systems with Applications, 97, 18–40. doi:10.1016/j.eswa.2017.12.022.
Cerqueira, V., Torgo, L., & Soares, C. (2019). Machine Learning vs Statistical Methods for Time Series Forecasting: Size Matters. arXiv preprint arXiv:1909.13316. doi:10.48550/arXiv.1909.13316.
Liu, C., Fu, L., Li, H., & Chen, B. (2023). Dynamic Prediction Algorithm for Low-Voltage Distribution Network Power Loss in a Smart City Based on Classification Decision Tree and Marketing Data. Journal of Testing and Evaluation, 51(3), 20220096. doi:10.1520/JTE20220096.
Ye, X., & Zhao, J. (2023). Heterogeneous clustering via adversarial deep Bayesian generative model. Frontiers of Computer Science, 17(3), 173322. doi:10.1007/s11704-022-1376-2.
Zhou, Z. H., & Feng, J. (2019). Deep forest. National Science Review, 6(1), 74–86. doi:10.1093/nsr/nwy108.
Muksin, U., Riana, E., Rudyanto, A., Bauer, K., Simanjuntak, A. V. H., & Weber, M. (2023). Neural network-based classification of rock properties and seismic vulnerability. Global Journal of Environmental Science and Management, 9(1), 15–30. doi:10.22034/gjesm.2023.01.02.
Kim, J. Y., Kim, D., Li, Z. J., Dariva, C., Cao, Y., & Ellis, N. (2023). Predicting and optimizing syngas production from fluidized bed biomass gasifiers: A machine learning approach. Energy, 263, 125900. doi:10.1016/j.energy.2022.125900.
Park, J., & Kim, S. (2020). Machine Learning-Based Activity Pattern Classification Using Personal PM2.5 Exposure Information. International Journal of Environmental Research and Public Health, 17(18), 6573. doi:10.3390/ijerph17186573.
Ren, S., He, K., Girshick, R., & Sun, J. (2017). Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(6), 1137–1149. doi:10.1109/TPAMI.2016.2577031.
Oikarinen, E., Tiittanen, H., Henelius, A., & Puolamäki, K. (2021). Detecting virtual concept drift of regressors without ground truth values. Data Mining and Knowledge Discovery, 35(3), 726–747. doi:10.1007/s10618-021-00739-7.
Takacs, A., Toledano-Ayala, M., Dominguez-Gonzalez, A., Pastrana-Palma, A., Velazquez, D. T., Ramos, J. M., & Rivas-Araiza, E. A. (2020). Descriptor Generation and Optimization for a Specific Outdoor Environment. IEEE Access, 8, 52550–52565. doi:10.1109/ACCESS.2020.2975474.
Widmer, G., & Kubat, M. (1993). Effective learning in dynamic environments by explicit context tracking. Machine Learning: ECML-93, Lecture Notes in Computer Science, vol. 667. Springer, Berlin, Heidelberg. doi:10.1007/3-540-56602-3_139.
Hulten, G., Spencer, L., & Domingos, P. (2001). Mining time-changing data streams. Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. doi:10.1145/502512.502529.
Black, M., & Hickey, R. J. (1999). Maintaining the performance of a learned classifier under concept drift. Intelligent Data Analysis, 3(6), 453–474. doi:10.3233/IDA-1999-3604.
Jia, R., Dao, D., Wang, B., Hubis, F. A., Gurel, N. M., Li, B., ... & Song, D. (2019). Efficient task-specific data valuation for nearest neighbor algorithms. arXiv Preprint arXiv:1908.08619. doi:10.48550/arXiv.1908.08619.
Klinkenberg, R. (2001). Using labeled and unlabeled data to learn drifting concepts. Workshop notes of the IJCAI-01 Workshop on Learning from Temporal and Spatial Data, 16-24. Held in conjunction with the International Joint Conference on Artificial Intelligence (IJCAI): AAAI Press, 4-6 August, 2001, Menlo Park, United States.
Liu, Y., Liu, Y., Yu, B. X. B., Zhong, S., & Hu, Z. (2023). Noise-robust oversampling for imbalanced data classification. Pattern Recognition, 133, 109008. doi:10.1016/j.patcog.2022.109008.
Maletzke, A., Dos Reis, D., Cherman, E., & Batista, G. (2019). DyS: A Framework for Mixture Models in Quantification. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4552–4560. doi:10.1609/aaai.v33i01.33014552.
Li, P., Wu, X., & Hu, X. (2012). Mining recurring concept drifts with limited labeled streaming data. ACM Transactions on Intelligent Systems and Technology, 3(2). doi:10.1145/2089094.2089105.
Djouzi, K., Beghdad-Bey, K., & Amamra, A. (2022). A new adaptive sampling algorithm for big data classification. Journal of Computational Science, 61, 101653. doi:10.1016/j.jocs.2022.101653.
Nan, L. (2013). Classification algorithm for data streams with concept drift and its applications. Master Thesis, Fujian Normal University, Minhou, China.
Wang, G., & Wang, Y. (2023). Self-attention network for few-shot learning based on nearest-neighbor algorithm. Machine Vision and Applications, 34(2), 28. doi:10.1007/s00138-023-01375-5.
Wu, Z., Efros, A. A., & Yu, S. X. (2018). Improving generalization via scalable neighborhood component analysis. Proceedings of the European Conference on Computer Vision (ECCV), 685–701. doi:10.48550/arXiv.1808.04699.
Maillo, J., Ramírez, S., Triguero, I., & Herrera, F. (2017). kNN-IS: An Iterative Spark-based design of the k-Nearest Neighbors classifier for big data. Knowledge-Based Systems, 117, 3–15. doi:10.1016/j.knosys.2016.06.012.
Deng, Z., Zhu, X., Cheng, D., Zong, M., & Zhang, S. (2016). Efficient KNN classification algorithm for big data. Neurocomputing, 195, 143–148. doi:10.1016/j.neucom.2015.08.112.
Ou, G., He, Y., Fournier-Viger, P., & Huang, J. Z. (2022). A Novel Mixed-Attribute Fusion-Based Naive Bayesian Classifier. Applied Sciences (Switzerland), 12(20), 10443. doi:10.3390/app122010443.
Karegowda, A. G., V, P., Jayaram, M. A., & Manjunath, A. S. (2012). Rule based Classification for Diabetic Patients using Cascaded K-Means and Decision Tree C4.5. International Journal of Computer Applications, 45(12), 45–50. doi:10.5120/6836-9460.
Khan, S., & Yairi, T. (2018). A review on the application of deep learning in system health management. Mechanical Systems and Signal Processing, 107, 241–265. doi:10.1016/j.ymssp.2017.11.024.
Maletzke, A. G., dos Reis, D. M., & Batista, G. E. A. P. A. (2018). Combining instance selection and self-training to improve data stream quantification. Journal of the Brazilian Computer Society, 24(1). doi:10.1186/s13173-018-0076-0.
Gama, J., Zliobaite, I., Bifet, A., Pechenizkiy, M., & Bouchachia, A. (2014). A survey on concept drift adaptation. ACM Computing Surveys, 46(4), 1–37. doi:10.1145/2523813.
Huang, D. T. J., Koh, Y. S., Dobbie, G., & Pears, R. (2014). Detecting Volatility Shift in Data Streams. 2014 IEEE International Conference on Data Mining. doi:10.1109/icdm.2014.50.
Zheng, X., Aragam, B., Ravikumar, P. K., & Xing, E. P. (2018). Dags with no tears: Continuous optimization for structure learning. 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montréal, Canada.
Di Franco, G., & Santurro, M. (2021). Machine learning, artificial neural networks and social research. Quality and Quantity, 55(3), 1007–1025. doi:10.1007/s11135-020-01037-y.
Scanagatta, M., Corani, G., Zaffalon, M., Yoo, J., & Kang, U. (2018). Efficient learning of bounded-treewidth Bayesian networks from complete and incomplete data sets. International Journal of Approximate Reasoning, 95, 152–166. doi:10.1016/j.ijar.2018.02.004.
Trevizan, B., Chamby-Diaz, J., Bazzan, A. L. C., & Recamonde-Mendoza, M. (2020). A comparative evaluation of aggregation methods for machine learning over vertically partitioned data. Expert Systems with Applications, 152, 113406. doi:10.1016/j.eswa.2020.113406.
Guo, Y., Chen, Q., Chen, J., Wu, Q., Shi, Q., & Tan, M. (2019). Auto-embedding generative adversarial networks for high resolution image synthesis. IEEE Transactions on Multimedia, 21(11), 2726–2737. doi:10.1109/TMM.2019.2908352.
Lee, M. H., Kim, N., Yoo, J., Kim, H. K., Son, Y. D., Kim, Y. B., Oh, S. M., Kim, S., Lee, H., Jeon, J. E., & Lee, Y. J. (2021). Multitask fMRI and machine learning approach improve prediction of differential brain activity pattern in patients with insomnia disorder. Scientific Reports, 11(1), 9402. doi:10.1038/s41598-021-88845-w.
Zhu, X. (2010). Power Supply dataset. Stream Data Mining Repository: Florida Atlantic University. Available online: http://www.cse.fau.edu/~xqzhu/stream.html (accessed on April 2023).
Kaggle. (2023). Energy generation dataset. Available online: https://www.kaggle.com/nicholasjhana/energy-consumption-generation-prices-and-weather/data?select=energy_dataset.csv (accessed on April 2023).
UCI. (2023). Steel Industry Energy Consumption Dataset. Machine Learning Repository: Center for Machine Learning and Intelligent Systems. Available online: http://archive.ics.uci.edu/ml/datasets/Steel+Industry+Energy+Consumption+Dataset (accessed on April 2023).
UCI. (2014). Combined Cycle Power Plant Data Set. Machine Learning Repository: Center for Machine Learning and Intelligent Systems. Available online: https://archive.ics.uci.edu/ml/datasets/combined+cycle+power+plant# (accessed on April 2023).
Meiseles, A., & Rokach, L. (2020). Source Model Selection for Deep Learning in the Time Series Domain. IEEE Access, 8, 6190–6200. doi:10.1109/ACCESS.2019.2963742.
Rousseeuw, P. J. (1987). Silhouettes: A graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics, 20, 53–65. doi:10.1016/0377-0427(87)90125-7.
Copyright (c) 2023 Ilya Lebedev, Mikhail Sukhoparov