UDC 004.8
This paper addresses the systematic design of neural network architectures for tabular data processing under the stringent resource constraints typical of embedded and edge systems. The work aims to develop a methodology that formalizes this task as a constrained multi-objective optimization problem over a discrete hyperparameter space, framing the synthesis process as decision support under uncertainty. A methodology for multi-criteria evolutionary synthesis is proposed, based on a Multi-Island Genetic Algorithm (MIGA) that combines an island model of evolution, which maintains population diversity, with the NSGA-II selection mechanism, which constructs an approximation of the Pareto front. The conflicting optimization criteria are classification accuracy, the memory footprint required to store the model, and inference latency. For experimental validation of the methodology, three public tabular datasets representing different application scenarios and levels of complexity were selected. A software framework with a three-tier architecture was developed and implemented, supporting the full cycle of automated design, from adaptive data analysis to visualization of results. A comparative analysis with baseline methods (logistic regression, decision tree, gradient boosting) demonstrated that the proposed methodology can synthesize models that, at comparable accuracy, are orders of magnitude more compact and faster than gradient boosting models. On problems involving complex nonlinear dependencies and small sample sizes, the synthesized models outperform the baselines in accuracy with statistical significance. The results confirm the practical value of the methodology for reducing design effort and for providing developers with a set of quantitatively justified trade-off solutions that satisfy given hardware constraints.
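The core selection idea described above, retaining only non-dominated candidates over the three criteria (classification error, memory footprint, inference latency, all minimized), can be illustrated with a minimal sketch. The architecture names and objective values below are hypothetical and do not come from the paper's framework; the snippet shows only the Pareto-dominance test that underlies NSGA-II-style selection.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b:
    a is no worse on every criterion and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of (name, objectives) pairs."""
    return [c for c in candidates
            if not any(dominates(o[1], c[1]) for o in candidates if o is not c)]

# Hypothetical MLP architectures: (error, memory_kb, latency_ms)
candidates = [
    ("mlp_8",    (0.12, 4.0,  0.3)),
    ("mlp_32",   (0.08, 16.0, 0.9)),
    ("mlp_64",   (0.08, 40.0, 2.1)),  # dominated by mlp_32 on all criteria
    ("mlp_16x2", (0.10, 10.0, 0.7)),
]
front = pareto_front(candidates)
# front keeps mlp_8, mlp_32, and mlp_16x2: each wins on at least one criterion
```

In the full methodology this filter would be applied within each island's population, with NSGA-II additionally ranking dominated individuals by non-domination level and crowding distance.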
Multi-objective optimization, evolutionary architecture synthesis, neural networks, tabular data, resource-constrained systems, multilayer perceptron, systems analysis, decision-making, Pareto optimality, automated design
1. Elsken T., Metzen J. H., Hutter F. Neural architecture search: A survey // Journal of Machine Learning Research. – 2019. – Vol. 20. – No. 55. – P. 1–21.
2. Chen T., Guestrin C. XGBoost: A Scalable Tree Boosting System // Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '16). – 2016. – P. 785–794. – DOI: 10.1145/2939672.2939785.
3. Grinsztajn L., Oyallon E., Varoquaux G. Why do tree-based models still outperform deep learning on typical tabular data? // Advances in Neural Information Processing Systems. – 2022. – Vol. 35. – P. 507–520.
4. Dorogush A. V., Ershov V., Gulin A. CatBoost: gradient boosting with categorical features support // arXiv:1810.11363 [cs.LG]. – 2018. – DOI: 10.48550/arXiv.1810.11363.
5. Prokhorenkova L. et al. CatBoost: unbiased boosting with categorical features // Advances in Neural Information Processing Systems. – 2018. – Vol. 31. – P. 6637–6647.
6. Banbury C. et al. MLPerf Tiny Benchmark // arXiv:2106.07597 [cs.LG]. – 2021. – DOI: 10.48550/arXiv.2106.07597.
7. Sarker Z., Mitra S., Gandhi S. A Comprehensive Review on Compiler in Neural Networks // 2023 IEEE International Conference on Contemporary Computing and Communications (InC4). – IEEE, 2023. – Vol. 1. – P. 1–6. – DOI: 10.1109/InC457730.2023.10263112.
8. Aivazyan S. A. Applied Statistics. Fundamentals of Econometrics. – Moscow: UNITY-DANA, 2001. – 432 p.
9. Gorishniy Y. et al. Revisiting deep learning models for tabular data // Advances in Neural Information Processing Systems. – 2021. – Vol. 34. – P. 18932–18943.
10. Chasovskikh A. A. et al. Automation of Neural Network Architecture Search // Intelligent Systems. Theory and Applications. – 2023. – Vol. 27. – No. 4. – P. 5–27.
11. Feurer M., Hutter F. Hyperparameter Optimization // Hutter F., Kotthoff L., Vanschoren J. (eds.) Automated Machine Learning: Methods, Systems, Challenges. The Springer Series on Challenges in Machine Learning. – Springer, Cham, 2019. – P. 3–38.
12. Warden P., Situnayake D. TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers. – Sebastopol, CA: O'Reilly Media, 2020. – 504 p.
13. Liu H., Simonyan K., Yang Y. DARTS: Differentiable architecture search // arXiv preprint arXiv:1806.09055. – 2018. – DOI: 10.48550/arXiv.1806.09055.
14. Xing N. et al. Anytime neural architecture search on tabular data // arXiv preprint arXiv:2403.10318. – 2024. – DOI: 10.48550/arXiv.2403.10318.
15. Deb K. et al. A fast and elitist multiobjective genetic algorithm: NSGA-II // IEEE Transactions on Evolutionary Computation. – 2002. – Vol. 6. – No. 2. – P. 182–197. – DOI: 10.1109/4235.996017.
16. He X., Zhao K., Chu X. AutoML: A Survey of the State-of-the-Art // Knowledge-Based Systems. – 2021. – Vol. 212. – P. 106622. – DOI: 10.1016/j.knosys.2020.106622.
17. Hutter F., Kotthoff L., Vanschoren J. (eds.) Automated Machine Learning: Methods, Systems, Challenges. – Springer, 2019. – 223 p.
18. Voronin A., Ziatdinov Yu., Antonyuk A. Multi-criteria Optimization of Neural Network Classifier Architectures // Classification, Forecasting, Data Mining. – 2009. – P. 32–39.
19. Arik S. Ö., Pfister T. TabNet: Attentive interpretable tabular learning // Proceedings of the AAAI Conference on Artificial Intelligence. – 2021. – Vol. 35. – No. 8. – P. 6679–6687. – DOI: 10.1609/aaai.v35i8.16826.
20. Gorishniy Y. et al. Revisiting deep learning models for tabular data // Advances in Neural Information Processing Systems. – 2021. – Vol. 34. – P. 18932–18943.
21. He Y. et al. Filter pruning via geometric median for deep convolutional neural networks acceleration // Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. – 2019. – P. 4340–4349.
22. Jacob B. et al. Quantization and training of neural networks for efficient integer-arithmetic-only inference // Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. – 2018. – P. 2704–2713.
23. Skvortsov A. A., Anuryeva M. S., Solodovnikov A. N. Intelligent Text Classification System under Linguistic Uncertainty // Software Engineering. – 2025. – Vol. 16. – No. 11. – P. 583–593. – DOI: 10.17587/prin.16.583-593.
24. Whitley D., Rana S., Heckendorn R. B. The island model genetic algorithm: On separability, population size and convergence // Journal of Computing and Information Technology. – 1999. – Vol. 7. – No. 1. – P. 33–47.
25. Lu Z. et al. NSGA-Net: neural architecture search using multi-objective genetic algorithm // Proceedings of the Genetic and Evolutionary Computation Conference. – 2019. – P. 419–427. – DOI: 10.1145/3321707.3321729.
26. Deb K. et al. A fast and elitist multiobjective genetic algorithm: NSGA-II // IEEE Transactions on Evolutionary Computation. – 2002. – Vol. 6. – No. 2. – P. 182–197. – DOI: 10.1109/4235.996017.
27. IoT Occupancy Detection Dataset [Electronic resource] // UCI Machine Learning Repository. – URL: https://archive.ics.uci.edu/ml/datasets/occupancy+detection (accessed: 20.11.2025).
28. Airline Passenger Satisfaction Dataset [Electronic resource] // Kaggle. – URL: https://www.kaggle.com/datasets/teejmahal20/airline-passenger-satisfaction (accessed: 20.11.2025).
29. Wine Quality (Red Wine) Dataset [Electronic resource] // UCI Machine Learning Repository. – URL: https://archive.ics.uci.edu/ml/datasets/wine+quality (accessed: 20.11.2025).