Staff member, 01.01.2025 to 01.01.2025
FSBEI HE "Melitopol State University" (Department of Applied Mathematics and Information Technologies named after Professor V.M. Naidysh), Professor
Russia
The article addresses the integration of green computing, digital twins, and immersive analytics into a sustainable development management system. The aim of the study is the theoretical justification of a multi-level cyber-physical system architecture that transforms environmental monitoring from passive data collection into a tool for predictive risk management. The methodological basis of the work rests on the systems approach, a comparative analysis of technological generations, and the case study method. The main results show that the transition to decentralized computing (Edge Computing) and the principles of Green Software Engineering substantially reduces infrastructure energy consumption. The use of digital twins enables a paradigm shift from reactive consequence remediation to proactive scenario modeling. Particular attention is paid to the introduction of VR/AR interfaces, which reduce the cognitive load on experts and create a shared environment for interagency collaboration. The study confirms the effectiveness of the proposed model for improving the responsiveness and transparency of environmental control.
Keywords: green computing, digital twins, immersive analytics, sustainable development, environmental risk assessment, cyber-physical system, Edge Computing.
Introduction
The modern paradigm of sustainable development is characterized by a high degree of uncertainty driven by climate change and growing anthropogenic pressure. Under these conditions, traditional environmental management methods relying on retrospective analysis and static data prove inadequate, particularly in the context of destructive natural phenomena.
Existing studies typically address the issues of computing energy efficiency and data visualization in isolation, failing to offer a comprehensive approach to their convergence. The need to resolve these contradictions defined the aim of this study: the development and theoretical justification of an environmental monitoring architecture that transforms passive data collection into a predictive risk management system. To achieve this objective, the following tasks were addressed: systematizing the principles of "green computing" to reduce the energy consumption of monitoring; justifying the transition from static maps to dynamic digital twins; and assessing the effectiveness of immersive interfaces as a tool for cognitive decision support.
Methods
The methodological framework of the study is based on the systems approach, which enabled the consideration of the environmental monitoring infrastructure as a multi-level cyber-physical system (Fig. 1). Comparative analysis of technological generations was employed to assess the effectiveness of the proposed solutions. The theoretical model was verified using the case study method, based on an analysis of contemporary practices in implementing digital twins and immersive interfaces for natural risk management.
In this study, the environmental monitoring architecture is viewed as a multi-level cyber-physical system. This approach allows for the analysis of data collection, processing, and interpretation not in isolation, but as a unified control loop linking physical environmental parameters with digital predictive models.

Fig. 1. Multi-level cyber-physical system
At the first system level, designated as the "computational foundation," a comparative analysis of architectural approaches to Big Data processing was conducted regarding their potential impact on energy consumption and carbon footprint. The principles of Green Software Engineering, formulated by S. Naumann and E. Kern, served as the conceptual framework for this comparison; these principles dictate that energy efficiency must be addressed as early as the software design phase [1].
Within the second level, "dynamic simulation," the study examines the feasibility of transitioning from the predominantly static risk assessment methods enshrined in various industry regulations (specifically, DVGW standards) [2] to dynamic probabilistic models, such as digital twins, capable of updating with incoming data in near real-time [3].
The third level of analysis, the "cognitive interface," focused on the challenges of complex data interpretation by end users. The concept of psychological presence in virtual environments was utilized as the theoretical framework.
Results
An analysis of contemporary scientific literature [1,4,5,6] indicates the increasing significance of optimizing computational infrastructure as a key condition for the sustainability of environmental monitoring systems. Works dedicated to this issue emphasize that the rapid growth in data volumes generated by sensor networks (IoT) and distributed observation systems can lead to a so-called "energy paradox," wherein the aggregate energy consumption of the IT infrastructure begins to negate the environmental benefits derived from its use.
The examined studies suggest that transitioning from traditional design and operation models of hardware-software systems to the principles of Green Software Engineering, as well as the utilization of decentralized data processing architectures (Edge Computing), offers the potential to enhance overall system energy efficiency and reduce the load on centralized computational resources.
A comparison of the characteristics of traditional and optimized approaches to organizing computational processes is presented in Table 1.
Table 1. Comparative analysis of traditional and optimized approaches to computing organization

| Comparison Criterion | Traditional Approach | Green HPC / Edge Approach | Expected Effect and Source |
|---|---|---|---|
| Software Design Metrics | Sole focus on functionality and execution speed (performance-driven). | Integration of energy consumption metrics at the architectural design stage (energy-driven). | Reduction of data center energy consumption by 15–20% during climate simulations. |
| Data Processing Architecture | Centralized (cloud-centric): transmission of all raw data to the cloud for processing. | Decentralized (edge-centric): primary filtering and processing of data on edge devices (IoT). | Reduction of load on backbone communication channels by up to 40%; reduced latency. |
| Infrastructure Scalability | Static resource allocation (over-provisioning) to cover peak loads. | Dynamic scaling and use of energy-efficient hardware (e.g., ARM architectures). | Minimization of energy consumption during idle periods. |
The practical applicability of decentralized data processing architectures (Fig. 2) extends beyond addressing purely technical challenges; it can also be regarded as an instrument for optimizing operational costs in industries characterized by geographically distributed infrastructure. From an economic perspective, such architectures enable the reduction of transaction costs associated with the transmission, storage, and processing of redundant volumes of routine data.

Fig. 2. Optimization of transaction costs in information flows
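To make this cost mechanism concrete, the following minimal sketch (Python is assumed here; the paper prescribes no implementation language) shows how an edge node might collapse a window of routine readings into a single aggregate message while escalating anomalous samples immediately. All thresholds, names, and message formats are illustrative assumptions, not a protocol from the cited literature.

```python
# Minimal sketch of edge-side filtering: forward only aggregates and
# anomalies upstream instead of streaming every raw reading.
from statistics import mean

WINDOW = 60            # raw readings per aggregation window (assumed: one per second)
ANOMALY_DELTA = 5.0    # deviation (sensor units) escalated at once (assumed)

def process_window(readings: list[float], baseline: float) -> list[dict]:
    """Reduce a raw window to the messages actually worth transmitting."""
    messages = []
    # 1. Immediate escalation of anomalous samples.
    for i, value in enumerate(readings):
        if abs(value - baseline) > ANOMALY_DELTA:
            messages.append({"type": "anomaly", "index": i, "value": value})
    # 2. One aggregate instead of WINDOW raw points of routine traffic.
    messages.append({"type": "aggregate", "mean": mean(readings),
                     "min": min(readings), "max": max(readings)})
    return messages

# Example: 60 routine readings collapse into a single aggregate message.
window = [20.1 + 0.05 * i for i in range(WINDOW)]
print(process_window(window, baseline=20.0))
```

Under this scheme, routine traffic shrinks roughly by the window size, which is the transaction-cost effect discussed above; real deployments would add buffering, retransmission, and security layers omitted here.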
The empirical validation of this approach is reflected in a series of case studies demonstrating the effectiveness of implementing energy-driven metrics at various levels of data management.
The work by Hölbling et al. focuses on the infrastructural level of the problem and analyzes the operation of high-performance computing (HPC) clusters based at the Universities of Graz and Vienna (Austria). This study confirms the proposition presented in Table 1 regarding the necessity of a paradigm shift from a performance-driven to an energy-driven approach. The authors demonstrate that the traditional focus exclusively on computational speed results in an excessive carbon footprint, whereas the implementation of "energy-aware job scheduling" strategies and software code optimization can reduce energy consumption by 10–30% without hardware upgrades. The paper introduces the concept of a "user-centric carbon footprint," enabling a transition from static resource allocation to dynamic workload management, thereby minimizing energy waste during idle periods and during the execution of resource-intensive climate simulations [7].
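The scheduling idea can be illustrated with a toy example: given an hourly forecast of grid carbon intensity, a deferrable batch job is shifted into the window with the lowest summed emissions. The forecast values and the function below are invented for illustration and are not drawn from [7].

```python
# Toy "energy/carbon-aware job scheduling": pick the start hour that
# minimizes the summed carbon intensity over a deferrable job's runtime.

def best_start_hour(intensity_forecast: list[float], job_hours: int) -> int:
    """Return the start hour minimizing total gCO2/kWh over the job's duration."""
    best_start, best_cost = 0, float("inf")
    for start in range(len(intensity_forecast) - job_hours + 1):
        cost = sum(intensity_forecast[start:start + job_hours])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start

# Hypothetical 24 h grid-carbon forecast (gCO2/kWh), midnight to midnight.
forecast = [320, 310, 290, 270, 260, 250, 260, 300, 380, 420, 430, 410,
            390, 370, 350, 340, 360, 400, 440, 430, 400, 370, 350, 330]
print(best_start_hour(forecast, job_hours=4))  # -> 3, i.e. the 03:00-07:00 window
```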
The scientific-practical approach described in the study by Khatua et al. illustrates the application of Green Software Engineering principles at the algorithmic level within the context of industrial logistics in India. Within the framework of the "Twin Transition" concept, the authors developed a route optimization model wherein the minimization of CO2 emissions is not a by-product, but a mathematical constraint of the algorithm's objective function (Sustainability by Design). This approach demonstrates the practical applicability of decentralized computing for reducing transaction costs in geographically distributed systems. Instead of simple data aggregation on a central server, the implementation of intelligent algorithms at the supply chain management level allows for the optimization of operational processes in real-time, which correlates with the expected effect of infrastructure load reduction presented in Table 1 [8].
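The "constraint, not by-product" formulation can be sketched as a small linear program (assuming SciPy's solver; the routes, costs, and emission factors below are invented): cost is minimized while the CO2 budget enters as a hard inequality.

```python
# Sketch of emissions-constrained route optimization: CO2 is a hard
# constraint of the objective, not an after-the-fact report.
from scipy.optimize import linprog

cost      = [100, 80, 60]   # $/tonne on hypothetical routes A, B, C
emissions = [ 20, 35, 50]   # kg CO2/tonne on routes A, B, C
CO2_CAP   = 30              # kg CO2 allowed per tonne shipped (design constraint)

res = linprog(
    c=cost,                            # minimize transport cost
    A_ub=[emissions], b_ub=[CO2_CAP],  # emissions cap as a mathematical constraint
    A_eq=[[1, 1, 1]], b_eq=[1],        # ship exactly one tonne in total
    bounds=[(0, 1)] * 3,
)
print(res.x)  # cost-optimal split across routes that still respects the CO2 cap
```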
Thus, the implementation of "Green Computing" principles creates the necessary infrastructural foundation for environmentally neutral monitoring. However, the mere presence of energy-efficient capacities is a necessary but not sufficient condition. The key challenge lies in how exactly to utilize this computational resource to transition from simply capturing the current state of the environment to forecasting its future changes. This leads us to the second level of the architecture: dynamic simulation technologies.
An analysis of the research field [2,3,9,10] demonstrates that traditional methods, which function effectively under stable conditions, prove inadequate in the face of modern destructive changes, such as climate anomalies. To assess the effectiveness of implementing dynamic simulation, a comparative analysis of three generations of management systems was conducted. The results of the study are presented in Table 2.
Table 2. Evolution of environmental risk management systems: from statics to digital twins

| Evaluation Criterion | Generation 1: Traditional Approach (Static/Manual) | Generation 2: Telemetry and SCADA (Monitoring) | Generation 3: Digital Twin (Dynamic Simulation) | Managerial Impact |
|---|---|---|---|---|
| 1. Data Time Horizon | Discrete. Manual (laboratory) sampling once a week or month. High latency. | Real-time (observation). Continuous data stream from sensors; captures the state "now". | Real-time + forecast. Data "now" are used to simulate the state "in 6 hours". | Elimination of cash-flow gaps in resource planning and prevention of equipment downtime. |
| 2. Risk Assessment Methodology | Retrospective. Analysis of past accidents; extrapolation of historical trends into the future. | Threshold-based. Alarm triggering when limits are exceeded (e.g., water level). | Scenario-based (what-if). Probabilistic modeling of events that have not yet occurred (anomalies). | Reduction of insurance risks and of the costs of remediating force majeure consequences. |
| 3. Data Integration | Fragmented. Data stored in disjointed paper reports or Excel spreadsheets (silos). | Partial. Aggregation of sensors within a single system (e.g., water only, but not weather). | Holistic. Synthesis of hydrology, meteorology, and infrastructure data in a unified model. | Optimization of operating expenses (OpEx) via a comprehensive situational view. |
| 4. Decision-Making Mode | Reactive. Actions taken after visually observable damage has occurred. | Responsive. Actions taken upon sensor triggering (minimum reaction time). | Proactive. Preventive maneuvers prior to the event (based on simulation). | Minimization of damage to assets and the environment; prevention of regulatory penalties. |
| 5. Role of Automation | Zero. Complete reliance on the intuition and qualifications of the duty dispatcher. | Medium. Automated data collection, but manual interpretation and decision-making. | High. AI algorithms propose ready-made response scenarios (decision support). | Reduction of human-factor influence and of the costs of maintaining a large staff of analysts. |
The data in Table 2 clearly demonstrate the qualitative leap associated with the transition to Generation 3. While telemetry systems (Generation 2) answer the question "What is happening right now?", Digital Twins allow for forecasting: "What will happen if...?". Dynamic simulation based on digital twins is illustrated in Fig. 3.

Fig. 3. Dynamic simulation based on digital twins
The mechanism of dynamic Water Safety Plans is described in detail in the works of Gottwalt and Sturm [3,9]. Unlike static guidelines, such plans are generated algorithmically in real-time. For instance, when simulating a chemical spill upstream, the system automatically calculates the arrival time of the contamination wave at the water intake and suggests the optimal moment for the operator to close the sluice gates. This ensures not only the safety of the population but also conserves water purification reagents by limiting the supply shutdown to the strictly necessary time interval.
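A deliberately simplified sketch of this what-if logic, assuming plug-flow transport (travel time = distance / flow velocity) and a fixed safety margin; the systems described in [3,9] use far richer hydrodynamic models, so the numbers and names here are illustrative only.

```python
# Toy what-if calculation: when should the sluice gates close and reopen
# after a chemical spill is detected upstream of the water intake?
def sluice_closure_window(distance_m: float, flow_velocity_ms: float,
                          safety_margin_s: float = 1800.0) -> tuple[float, float]:
    """Return (close_after_s, reopen_after_s) relative to spill detection."""
    arrival_s = distance_m / flow_velocity_ms         # plug-flow travel time
    close_at = max(arrival_s - safety_margin_s, 0.0)  # close before the wave arrives
    reopen_at = arrival_s + safety_margin_s           # keep the shutdown interval minimal
    return close_at, reopen_at

# Spill detected 12 km upstream, river flowing at 0.8 m/s:
close_at, reopen_at = sluice_closure_window(12_000, 0.8)
print(f"close in {close_at / 3600:.1f} h, reopen after {reopen_at / 3600:.1f} h")
```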
The foundation of the dynamic simulation subsystem is the data integration layer, described in the review by Lehtola et al. The authors justify the necessity of transitioning from simple 3D models to semantically enriched twins capable of merging heterogeneous streams, from photogrammetry and laser scanning (LiDAR) to IoT sensors. This creates the necessary environment for deploying analytics [11].
Based on this environment, the level of algorithmic resilience is implemented, a detailed example of which can be examined in the study by Dui et al. Here, static data are transformed into dynamic strategies: mathematical models calculate resilience indices and automatically generate infrastructure recovery scenarios in case of accidents, thereby providing the "intelligence" of the system [12].
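The resilience-index idea admits a compact illustration on a toy network (assuming the networkx package): the index below, the share of demand nodes still reachable from the source after failures, is a simplified stand-in for the richer indices developed in [12].

```python
# Toy resilience index for a water network: fraction of demand nodes that
# remain connected to the supply source after a set of node failures.
import networkx as nx

def resilience_index(g: nx.Graph, source: str, failed: set[str]) -> float:
    """Share of surviving demand nodes still reachable from the source."""
    surviving = g.subgraph(n for n in g if n not in failed)
    demand_nodes = {n for n in g if n != source and n not in failed}
    if source in failed or not demand_nodes:
        return 0.0
    reachable = nx.descendants(surviving, source)   # all nodes reachable from source
    return len(reachable & demand_nodes) / len(demand_nodes)

g = nx.Graph([("plant", "j1"), ("j1", "j2"), ("j1", "j3"), ("j3", "j4")])
print(resilience_index(g, "plant", failed={"j3"}))  # j4 is cut off -> 0.67
```

A recovery-scenario generator would then rank repairs by how much each restored node raises this index, which is the "dynamic strategy" layer described above.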
Although digital twins provide high mathematical accuracy of forecasts and are capable of generating resilience scenarios in real-time, their implementation engenders a new problem: the "last mile" of analytics. The generated arrays of multidimensional probabilistic data often prove too complex for rapid interpretation by the decision-maker, especially under the stressful conditions of an emergency. The gap between the complexity of the machine model and human cognitive capabilities necessitates the introduction of the third level of the architecture: adaptive interaction interfaces.
The solution to this problem lies in the implementation of immersive analytics technologies [13] and the transition to collaborative learning in multi-user virtual environments (MUVEs). Empirical studies, based on Slater's theory of "psychological presence" [14], confirm that immersion in VR activates spatial reasoning mechanisms that are inaccessible when working with a traditional monitor.
To systematize the effects of implementing immersive technologies, a comparative analysis of interaction interfaces was conducted, the results of which are presented in Table 3.
Table 3. Effectiveness of cognitive interfaces in environmental management

| Evaluation Criterion | Traditional 2D Methods (GIS/Maps, Desktop) | Single-User VR | Collaborative VR (MUVE) |
|---|---|---|---|
| 1. Spatial Data Perception | Abstract (2D). Requires high qualification to interpret topographic maps and cross-sections. | Direct (3D, 1:1). Intuitive understanding of scale and distance due to stereoscopic vision. | Shared (3D). Participants view the object from different angles while co-located in a unified virtual space. |
| 2. Visualization of Invisible Threats | Symbolic. Color schemes (heat maps) on a 2D plane; low visibility of pollution volume. | Volumetric. Gases, radiation, or noise are visualized as physical 3D objects (clouds, domes). | Interactive. A hidden anomaly can be pointed out to colleagues with a virtual laser pointer. |
| 3. Expert Communication Format | Asynchronous. Sequential exchange of reports, screenshots, and emails. High latency. | Isolated. The expert is immersed alone; communication with the outside world during the session is hindered. | Synchronous. Real-time interaction via avatars and voice chat, centered around the model. |
| 4. Cognitive Load on Decision-Makers | High. The brain expends resources on "translating" numbers and maps into a mental model of the situation. | Medium. Load is shifted to the interface; sensory conflict (motion sickness) is possible during prolonged use. | Optimal. Distributed-cognition effect: the group solves a complex task collectively, reducing stress. |
As the data in Table 3 indicate, the most promising direction is the creation of multi-user virtual environments that allow for synchronizing situational perception among various agencies (e.g., emergency services, ecologists, and municipal authorities).
Furthermore, this level closes the feedback loop with the public through Citizen Science mechanisms [15]. The integration of Gamification, described by Reiners [16], enables the use of citizens' mobile devices as a distributed sensor network, while Augmented Reality (AR) technologies serve as a tool for providing feedback to residents regarding the environmental conditions in their district. The functional diagram of the cognitive interface and socially-oriented interaction is presented in Fig. 4.

Fig. 4. Functional diagram of the cognitive interface and socially-oriented interaction
The empirical validation of the proposed architecture is supported by a series of contemporary studies demonstrating the effectiveness of immersive technologies in overcoming cognitive barriers when working with complex data. An analysis of practical implementations allows for the identification of three key vectors of management process transformation: from the visualization of hidden infrastructure to the organization of distributed expert interaction.
The first vector is associated with the challenge of interpreting hidden parameters of the technosphere. As shown in the work of Park et al. [17], the traditional representation of water supply networks as two-dimensional schematics fails to provide operators with sufficient contextual awareness. The solution was the integration of digital twin algorithms (based on EPANET) with Augmented Reality (AR) tools. This approach superimposes the virtual network topology directly onto the physical urban environment, visualizing invisible parameters, such as reagent concentration or flow directions, in situ. This eliminates the need for mental reconstruction of the spatial position of utilities and minimizes errors during on-site operational decision-making.
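On the digital-twin side, the hydraulic computation that such an AR overlay consumes can be sketched with the open-source WNTR package, which wraps EPANET (an assumption made for illustration; [17] specifies EPANET but not the surrounding tooling, and 'network.inp' is a placeholder file name).

```python
# Hedged sketch: run an EPANET hydraulic simulation and extract per-node
# values that an AR client could anchor onto the physical network.
import wntr

wn = wntr.network.WaterNetworkModel("network.inp")  # placeholder .inp file
sim = wntr.sim.EpanetSimulator(wn)
results = sim.run_sim()

# Pressures at the final timestep, keyed by junction ID; an AR overlay
# would pin these values to the geo-referenced junction assets in situ.
pressure = results.node["pressure"].iloc[-1]
for node_id, value in pressure.items():
    print(f"junction {node_id}: {value:.1f} m")
```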
The second vector aims to improve the perception of dynamic geophysical processes. The study by Alene et al. [18] confirms that the direct interpretation of numerical data (Eulerian numerical models) requires significant cognitive effort from the expert. The VR framework developed by the authors transforms raw data arrays regarding avalanches or floods into intuitively understandable dynamic 3D scenarios. The immersive environment ensures a direct semantic link between the mathematical model and its physical manifestation, allowing decision-makers to assess the scale of threats (flow velocity, inundation depth) based on natural mechanisms of spatial perception, rather than through the abstraction of tables and graphs.
The third vector addresses the task of synchronizing actions across geographically distributed expert groups. The problem of asynchronous communication is effectively mitigated by Multi-User Virtual Environments (MUVEs), exemplified by the GeospatialVR platform described by Sermet & Demir [19]. The creation of a unified digital space allows stakeholders, from scientists to administration representatives, to interact within the simulation as avatars. This approach implements the principle of "distributed cognition," wherein the development of a response strategy occurs during synchronous observation of the unfolding emergency, which is critical for eliminating interagency barriers and increasing reaction speed.
The examined cases confirm that the transition to cognitive interfaces is not merely a visual improvement, but represents a fundamental shift in decision-making methodology, ensuring the "transparency" of complex data and a shared context for all participants in the management process.
The results of the study confirm the initial hypothesis that the effectiveness of modern environmental monitoring systems is determined not so much by the quantity of collected data as by the architecture of its processing and interpretation. The proposed multi-level cyber-physical system demonstrates a synergistic effect resulting from the integration of disparate technological trends (Green HPC, Digital Twins, and Immersive Interfaces) into a unified management loop.
Discussion
The analysis of the implementation of Edge Computing at the first level of the model demonstrated the possibility of significantly reducing operational costs by filtering routine traffic at the network periphery, which resolves the problem of "information noise" characteristic of traditional centralized architectures.
Moreover, the transition to dynamic simulation at the second level of the model allows for the transformation of environmental management from reactive to proactive, where a key role is played by scenario-based risk analysis and preventive resource maneuvering, rather than the remediation of accident consequences.
However, despite the theoretical justification and demonstrated advantages, the proposed cyber-physical system possesses a number of significant limitations preventing its immediate and widespread implementation. A critical drawback of the current version of the framework is the high capital intensity of the transition to Level I. The large-scale replacement of legacy infrastructure (legacy sensors) with intelligent Edge devices requires significant initial investments, which may be economically unfeasible for enterprises in developing regions or for sectors with low margins.
The second significant limitation is related to the reliability of Digital Twins at Level II. The effectiveness of predictive models directly depends on the quality and continuity of input data. Currently, the model is not sufficiently resilient to the degradation of sensor networks, intentional data distortion, or connection breaks in remote locations. Reliance on inaccurate data during scenario modeling can lead to the adoption of erroneous managerial decisions with potentially catastrophic consequences.
The third block of limitations concerns the cognitive level. Despite the proven effectiveness of VR technologies in reducing cognitive load on experts, their wide application is constrained by high hardware requirements and the lack of standardized protocols for interagency interaction in virtual environments. Furthermore, the integration of Citizen Science data is associated with the problem of validating heterogeneous information coming from unqualified users, which creates a risk of adding noise to professional models.
In light of the identified limitations, subsequent research should focus on increasing the economic affordability and technological resilience of the proposed solutions. A priority direction is the development of lightweight machine learning algorithms capable of running on low-power and legacy microcontrollers, which would lower the entry barrier to Edge Computing without requiring complete equipment replacement.
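The sketch below shows how lightweight such a detector can be: a constant-memory exponential-moving-average test that ports readily to a microcontroller. The smoothing factor, threshold, and warm-up length are illustrative assumptions.

```python
# O(1)-memory anomaly detector of the kind that fits low-power hardware:
# no history buffer, a few multiply-accumulates per reading.
class EwmaDetector:
    def __init__(self, alpha: float = 0.05, threshold: float = 4.0, warmup: int = 20):
        self.alpha, self.threshold, self.warmup = alpha, threshold, warmup
        self.mean, self.var, self.n = 0.0, 0.0, 0

    def update(self, x: float) -> bool:
        """Feed one reading; return True if it deviates anomalously."""
        self.n += 1
        if self.n == 1:
            self.mean = x                # first sample seeds the baseline
            return False
        dev = x - self.mean
        flagged = (self.n > self.warmup and
                   dev * dev > self.threshold ** 2 * max(self.var, 1e-9))
        # Update the running mean and variance in place.
        self.mean += self.alpha * dev
        self.var = (1 - self.alpha) * (self.var + self.alpha * dev * dev)
        return flagged

d = EwmaDetector()
readings = [20.0] * 50 + [28.0]              # stable signal, then a spike
print([d.update(r) for r in readings][-1])   # -> True: the spike is flagged
```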
It is also critically important to develop automated, AI-based Data Quality Assurance methods that filter anomalies and verify information flows from civilian volunteers prior to their integration into the Digital Twin.
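One plausible shape for such a check, sketched under the assumption that each volunteer report can be compared against nearby calibrated sensors: a robust median/MAD outlier test rejects implausible readings before they reach the twin. The cutoff and scoring rule are assumptions, not a method from the cited literature.

```python
# Sketch of pre-ingestion Data Quality Assurance for citizen-science data:
# accept a volunteer reading only if it is plausible against trusted sensors.
from statistics import median

def accept_report(report_value: float, reference_values: list[float],
                  cutoff: float = 3.5) -> bool:
    """Robust (median/MAD) outlier test on a single citizen report."""
    med = median(reference_values)
    mad = median(abs(v - med) for v in reference_values) or 1e-9
    robust_z = 0.6745 * (report_value - med) / mad  # modified z-score
    return abs(robust_z) <= cutoff

trusted = [41.0, 43.5, 42.2, 40.8, 44.1]  # nearby calibrated PM2.5 sensors, µg/m³
print(accept_report(42.9, trusted))       # plausible   -> True
print(accept_report(120.0, trusted))      # implausible -> False
```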
Finally, future research should assess the potential of more accessible web-oriented 3D visualization technologies (WebXR) as an alternative to expensive VR headsets, which would enable mass public participation in environmental monitoring.
The conducted research confirmed the necessity of a fundamental structural transformation of existing environmental monitoring and natural resource management systems under conditions of increasing climatic instability and exponential growth of data volumes. A systematic analysis of the subject area allowed for justifying the effectiveness of the proposed three-level architecture of the cyber-physical system as a comprehensive response to modern technological and managerial challenges.
Conclusion
The study has conceptualized and substantiated the necessity of a fundamental structural transformation of monitoring tools in the context of global sustainable development challenges. The synthesis of the obtained results confirms that achieving digital infrastructure sustainability goals is impossible without a technological paradigm shift at the foundational level: the transition from centralized data processing to Edge Computing combined with Green Software Engineering principles ensures the necessary reduction of operational costs and minimization of the information systems' carbon footprint.
The scientific novelty of the work lies in the theoretical justification of the multi-level cyber-physical system architecture, which, unlike traditional discrete monitoring models, for the first time integrates energy-efficient computational algorithms, probabilistic dynamic modeling, and immersive analytics into a single closed control loop.
The theoretical significance of the study consists in the development of the methodological apparatus for sustainable development management. In particular, the proposed approach expands the understanding of the role of the "cognitive interface" as a critically important connecting link ensuring the interpretation of the mathematical abstractions of the digital twin and their transformation into managerial decisions under conditions of high uncertainty.
The practical significance of the work is determined by the possibility of the applied use of the proposed cyber-physical model for the transition from a reactive strategy of consequence remediation to proactive scenario-based risk management. The implementation of the developed principles allows for increasing the speed of interagency interaction through the use of VR/AR environments for collective data analysis, and also provides the technological basis for integrating Citizen Science mechanisms into the decision-making loop, which contributes to increasing the transparency and social responsibility of environmental control.
References
1. Naumann S., Dick M., Kern E., Johann T. The GREENSOFT Model: A reference model for green and sustainable software and its engineering. Sustainable Computing: Informatics and Systems. Vol. 1, No. 4. 2011.
2. Safety in drinking water supply: risk management under normal operating conditions (Information Sheet W 1001-B2). Bonn: wvgw-Verlag. 2015.
3. Gottwalt J. et al. Designing a web-based application for process-oriented risk management of drinking water catchments. Progress in IS. Cham: Springer. 2018.
4. Bungartz H.-J. et al. Advances and New Trends in Environmental Informatics: Managing Disruption, Big Data and Open Science. Cham: Springer. 2018.
5. Haetami A., Khan O. The impact of virtual reality on collaborative learning in higher education. International Journal of Educational Narratives. Vol. 2, No. 6. 2024.
6. Kern E. et al. Sustainable software products: Towards assessment criteria for resource and energy efficiency. Future Generation Computer Systems. Vol. 86. 2018.
7. Hölbling S. et al. Energy use and carbon emissions in high-performance computing: A university case study. Cleaner Environmental Systems. Art. 100332. 2025.
8. Khatua S. et al. A prescriptive route optimization model for industrial firms advancing the twin transition. Sustainable Futures. Vol. 10. 2025.
9. Sturm S., Fillinger F., Kiefer J. Risk management for catchment areas of drinking water reservoirs. DVGW Energie|Wasser-Praxis. No. 5. 2016.
10. Tao F. et al. Digital twin in industry: State-of-the-art. IEEE Transactions on Industrial Informatics. Vol. 15, No. 4. 2019.
11. Lehtola V.V. et al. Digital twin of a city: Review of technology serving city needs. International Journal of Applied Earth Observation and Geoinformation. Art. 102915. 2022.
12. Dui H., Cao T., Wang F. Digital twin-based resilience evaluation and intelligent strategies for urban water supply networks. Sustainable Cities and Structures. Vol. 4, No. 1. 2025.
13. Marriott K. et al. Immersive analytics: Time to reconsider the value of 3D for information visualisation. In: Immersive Analytics. Cham: Springer. 2018.
14. Slater M., Sanchez-Vives M.V. Enhancing our lives with immersive virtual reality. Frontiers in Robotics and AI. Vol. 3. 2016.
15. Fraisl D. et al. Citizen science in environmental and ecological sciences. Nature Reviews Methods Primers. Vol. 2, No. 1. 2022.
16. Reiners T., Wood L.C. (eds.) Gamification in Education and Business. Cham: Springer. 2015.
17. Park J.-I. et al. A digital twin tool for drinking water distribution systems using augmented reality and EPANET. Environmental Modelling & Software. 2025.
18. Alene G.H. et al. Visualization of geophysical flows in virtual reality: A framework. Environmental Modelling & Software. Vol. 177. 2024.
19. Sermet Y., Demir I. GeospatialVR: A web-based virtual reality framework for collaborative environmental simulations. Computers & Geosciences. Vol. 159. 2022. DOI: https://doi.org/10.1016/j.cageo.2021.105010



