It is assumed that since the beginning of time humans have had an innate desire for unity with nature, but as civilization has evolved this desire has become increasingly difficult to fulfill, especially in densely populated urban areas that focus on buildings and infrastructure rather than on maintaining the human connection with nature. The concept of biophilic design has emerged as a major trend in contemporary architectural design and is considered an alternative to sustainable architecture. Despite its popularity, however, no comprehensive effort has been made to outline the etymological roots and historical evolution of the concept or its integration into modern architecture. The purpose of this review is to provide a comprehensive synthesis of the existing literature on biophilia, from its etymological roots to its integration into modern architecture. The review takes a critical approach aimed at identifying and summarizing the relevant literature.
Keywords: biophilia, biophilic design, biophilic architecture, nature, architecture, environmental psychology, Fromm, sustainability
Information technologies are increasingly used in various fields, from document management to payment systems. One of the most popular and promising of these technologies is cryptocurrency. Because cryptocurrency systems must ensure the security and reliability of their data, most of them rely on blockchain and complex cryptographic protocols, such as zero-knowledge proof (ZKP) protocols. Verification is therefore an important aspect of achieving the security of these systems, since it can be used to assess a system's resistance to various attacks as well as its compliance with security requirements. This paper considers both the concept of verification itself and the methods for implementing it, and compares methods for identifying a proof suitable for zero-knowledge protocols. The conclusion is that an integrated approach to verification is needed: no single method can cover all potential vulnerabilities, so different verification methods must be applied at different stages of system design.
Keywords: cryptocurrency, blockchain, verification, formal method, static analysis, dynamic method, zero-knowledge proof protocol
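To make the verification target more concrete, the sketch below is a minimal toy example (with deliberately small, insecure parameters chosen for illustration, not drawn from the article) that checks the completeness property of a Schnorr-style identification round: an honest prover's transcript must always satisfy the verifier's equation. Property checks of this kind are what dynamic testing exercises on sampled inputs, whereas formal methods aim to establish them for all inputs.

```python
import secrets

# Toy subgroup parameters: p = 23, q = 11, and g = 2 generates the order-q subgroup.
p, q, g = 23, 11, 2

def schnorr_round(x: int) -> bool:
    """One honest Schnorr identification round for secret x; returns the verifier's verdict."""
    y = pow(g, x, p)                 # public key
    r = secrets.randbelow(q)         # prover's ephemeral secret
    t = pow(g, r, p)                 # commitment
    c = secrets.randbelow(q)         # verifier's random challenge
    s = (r + c * x) % q              # prover's response
    return pow(g, s, p) == (t * pow(y, c, p)) % p   # verifier's check: g^s == t * y^c

# Completeness check: an honest prover must convince the verifier in every run.
assert all(schnorr_round(x=secrets.randbelow(q - 1) + 1) for _ in range(1000))
```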
The article presents a comparative analysis of modern database management systems (PostgreSQL/PostGIS, Oracle Database, Microsoft SQL Server, and MongoDB) in the context of implementing distributed storage of geospatial information. The aim of the study is to identify the strengths and limitations of different platforms when working with heterogeneous geospatial data and to evaluate their applicability in distributed GIS solutions. The research covers three main types of data: vector, raster, and point cloud. A comprehensive set of experiments was conducted in a test environment close to real operating conditions, including functional testing, performance benchmarking, scalability analysis, and fault tolerance assessment.
The results demonstrated that PostgreSQL/PostGIS provides the most balanced solution, showing high scalability and stable performance across all data types, which makes it a versatile platform for building GIS applications. Oracle Database exhibited strong results when processing raster data and proved effective under heavy workloads in multi-node architectures, which is especially relevant for corporate environments. Microsoft SQL Server showed reliable performance on vector data, particularly in distributed scenarios, though requiring optimization for binary storage. MongoDB proved suitable for storing raster content and metadata through GridFS, but its scalability is limited compared to traditional relational DBMS.
In conclusion, PostgreSQL/PostGIS can be recommended as the optimal choice for projects that require universality and high efficiency in distributed storage of geospatial data, while Oracle and Microsoft SQL Server may be preferable in specialized enterprise solutions, and MongoDB can be applied in tasks where flexible metadata management is a priority.
Keywords: geographic information system, database, postgresql, postgis, oracle database, microsoft sql server, mongodb, vector, raster, point cloud, scalability, performance, fault tolerance
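As a small illustration of the kind of vector query exercised in such benchmarks, the following sketch assumes a hypothetical `roads` table with a `geom` column in EPSG:4326 and local connection credentials, and runs a PostGIS radius search through psycopg2.

```python
import psycopg2

# Hypothetical connection parameters and table layout.
conn = psycopg2.connect(host="localhost", dbname="gisdb", user="gis", password="gis")

query = """
    SELECT id, name
    FROM roads
    WHERE ST_DWithin(
        geom::geography,
        ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
        %s  -- search radius in metres
    );
"""

with conn, conn.cursor() as cur:
    cur.execute(query, (37.6173, 55.7558, 500.0))  # features within 500 m of a point
    for feature_id, name in cur.fetchall():
        print(feature_id, name)
```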
This article analyzes modern energy-efficient technologies used in the architecture and construction of "healthy" office buildings, examining their impact on reducing the carbon footprint and operating costs as well as such key aspects as the use of renewable energy sources, energy-efficient building materials, and smart control systems.
Keywords: construction, architecture, "healthy" office buildings, energy-efficient technologies, renewable energy sources, energy-efficient building materials, smart control systems
This paper proposes a novel model of computer network behavior that incorporates weighted multi-label dependencies to identify rare anomalous events. The model accounts for multi-label dependencies not previously encountered in the source data, enabling a "preemptive" assessment of their potential destructive impact on the network. An algorithm for calculating the potential damage from the realization of a multi-label dependency is presented. The proposed model is applicable for analyzing a broad spectrum of rare events in information security and for developing new methods and algorithms for information protection based on multi-label patterns. The approach allows for fine-tuning the parameters of multi-label dependency accounting within the model, depending on the specific goals and operating conditions of the computer network.
Keywords: multi-label classification, multi-label dependency, attribute space, computer attacks, information security, network traffic classification, attack detection, attribute informativeness, model, rare anomalous events, anomalous events
The article discusses modern approaches to the design and implementation of data processing architectures in intelligent transport systems (ITS), with a focus on ensuring technological sovereignty. Special attention is paid to integrating machine learning practices to automate the full lifecycle of machine learning models: from data preparation and streaming to real-time monitoring and model updating. Architectural solutions are analyzed that use distributed computing platforms such as Hadoop and Apache Spark, in-memory databases on Apache Ignite, and Kafka message brokers for reliable event delivery. The importance of infrastructure flexibility and scalability, support for the parallel operation of multiple models, and reliable access control, including the use of transport layer security protocols, is emphasized. Recommendations are given on organizing a logging and monitoring system for rapid response to changes and incidents. The presented solutions are focused on ensuring high fault tolerance, security, and compliance with the requirements of industrial operation, enabling efficient processing of large volumes of transport data and adaptation of ITS to import-independent conditions.
Keywords: data processing, intelligent transport systems, distributed processing, scalability, fault tolerance
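A minimal sketch of the streaming ingestion layer described above, assuming a hypothetical Kafka topic `its.telemetry` and broker address: events are read with Spark Structured Streaming and simply printed, standing in for the real parsing, scoring, and monitoring stages.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("its-telemetry-ingest").getOrCreate()

# Subscribe to a hypothetical telemetry topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka-broker:9092")
    .option("subscribe", "its.telemetry")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers raw bytes; cast the payload to string for downstream parsing and model scoring.
decoded = events.select(col("key").cast("string"), col("value").cast("string"))

query = decoded.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```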
Many major cities around the world are seeing a trend toward compact housing units, typically located in central areas and fully integrated with existing urban infrastructure. With urban density increasing and real estate prices rising, such projects are becoming increasingly relevant. Despite the general trend toward smaller spaces and similarities in size and location, each project offers unique architectural and social responses to the housing crisis, the constraints of limited space, and the challenges of modern urbanism.
Keywords: micro-housing, urbanization, architecture, city, infrastructure, mobility, sustainability, renovation, miniaturization, accessibility, autonomy, flexibility, project, space, concept
The article discusses the development of a method for protecting confidential images in instant messengers based on masking with orthogonal matrices. The system's vulnerability to brute-force attacks and account compromise is analyzed. The main focus is on developing an architecture for detecting abnormal activity and providing adaptive authentication. The article presents a system structure with independent security components that provide blocking of brute-force attacks and flexible session management. The interaction of the modules within a unified security system is described, including the distribution of functions between server and client components.
Keywords: information security, messenger, messaging, communications, instant messaging systems, security audits, brute-force attacks
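The article's exact masking scheme is not reproduced in the abstract; the sketch below only illustrates the general principle of masking a square grayscale image with a random orthogonal matrix, assuming that matrix plays the role of a shared secret, and recovering the image (up to floating-point error) with the transpose.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def random_orthogonal(n: int) -> np.ndarray:
    """Random orthogonal matrix obtained from the QR decomposition of a Gaussian matrix."""
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

n = 256
image = rng.random((n, n))          # stand-in for a grayscale image with values in [0, 1]
Q = random_orthogonal(n)            # plays the role of the shared secret key

masked = Q @ image @ Q.T            # transmitted form: visually scrambled
restored = Q.T @ masked @ Q         # receiver applies the inverse (transpose) transform

assert np.allclose(restored, image)  # orthogonality guarantees exact recovery
```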
This article is aimed at analyzing river passenger transport infrastructure systems in order to establish a theoretical foundation for designing a modern system of architectural facilities serving river transport passengers. Based on the analysis of cities with developed river transport systems, both in domestic and international contexts, the study identifies the key factors influencing the formation of routes and the placement of facilities serving river passenger transport. It also formulates the principles for developing river transport infrastructure facilities. The findings reveal the specific characteristics of forming a river passenger transport system and provide a solid basis for its future development.
Keywords: river transport, river passenger transport system, river transport berthing infrastructure, pier, landing stage, pontoon, river terminal, river passenger route, river transport network, passenger facility in river infrastructure
The aim of this study is to analyze methods for protecting radio channels from intentional interference by managing wireless channel resources, with an emphasis on identifying key challenges and directions for further research in this area. The primary method applied is the ontological approach of knowledge engineering. The work collects and systematizes the main approaches to counteracting jamming of communication channels and analyzes studies aimed at formalizing radio network problems for the purpose of modeling and analysis. The results made it possible to determine relevant development directions, identify existing gaps, formulate requirements for the model under development, and justify the choice of methods to be used in subsequent research.
Keywords: FHSS, interference, radio channel, radio communication, telecommunications, jamming, network, modeling, communication, mitigation, security
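As background for the channel-resource-management framing, the following toy sketch (an illustration, not a model from the surveyed literature) shows the basic frequency-hopping idea: transmitter and receiver derive the same pseudo-random hop sequence from a shared seed and can exclude channels known to be jammed.

```python
import random

def hop_sequence(seed, n_channels, n_hops, jammed=frozenset()):
    """Pseudo-random frequency-hopping sequence shared via a common seed; jammed channels are skipped."""
    rng = random.Random(seed)
    allowed = [ch for ch in range(n_channels) if ch not in jammed]
    return [rng.choice(allowed) for _ in range(n_hops)]

# Transmitter and receiver reproduce the same sequence from the shared seed.
tx = hop_sequence(seed=2024, n_channels=64, n_hops=10, jammed={5, 17})
rx = hop_sequence(seed=2024, n_channels=64, n_hops=10, jammed={5, 17})
assert tx == rx and not {5, 17} & set(tx)
print(tx)
```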
The article is devoted to the development of an innovative neural network decision support system for firefighting in conditions of limited visibility. A comprehensive approach based on the integration of data from multispectral sensors (lidar, ultrasonic phased array, temperature and hygrometric sensors) is presented. The architecture includes a hybrid network that combines three-dimensional convolutional layers and bidirectional LSTM units. To improve the quality of processing, a cross-modal attention mechanism is used to evaluate the physical nature and reliability of incoming signals. A Bayesian approach based on the Monte Carlo dropout method is used to account for forecast uncertainty. Adaptive routing algorithms allow for quick response to changing situations. This solution significantly improves the efficiency of firefighting operations and reduces the risk to personnel.
Keywords: mathematical model, intelligence, organizational model, gas and smoke protection service, neural networks, limited visibility, fire department, management, intelligent systems, decision support
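The hybrid 3D-convolutional/BiLSTM architecture is not reproduced here, but the uncertainty-estimation step is generic enough to sketch. With Monte Carlo dropout, dropout layers remain active at inference and the spread of repeated stochastic forward passes serves as the uncertainty estimate; the small fully connected model below is a deliberately simplified stand-in for the sensor-fusion network.

```python
import torch
import torch.nn as nn

# Simplified stand-in for the sensor-fusion network (the real model combines 3D conv and BiLSTM blocks).
model = nn.Sequential(
    nn.Linear(16, 64), nn.ReLU(), nn.Dropout(p=0.3),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.3),
    nn.Linear(64, 1),
)

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 50):
    """Monte Carlo dropout: keep dropout stochastic at inference and aggregate repeated passes."""
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()                      # re-enable dropout only
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)   # prediction and its uncertainty

x = torch.randn(8, 16)                     # batch of 8 fused sensor feature vectors
mean, uncertainty = mc_dropout_predict(model, x)
```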
The article discusses methods for automated determination of threshold values of parameters when assessing the critical state of technical systems. It explores the theoretical foundations and practical aspects of setting thresholds, including statistical, expert, and combined approaches. Particular attention is paid to mathematical models and algorithms for processing monitoring data. Various methods for determining threshold values are presented: computational, experimental, statistical, and expert. The features of applying adaptive thresholds, dynamic control, and machine learning systems are described. An analysis of existing approaches to determining critical conditions of equipment is conducted. Recommendations are developed for selecting optimal methods for determining thresholds, taking into account the specifics of technical systems. Comprehensive solutions are proposed that combine analytical and machine methods to improve the reliability and safety of computing complexes.
Keywords: automated control, threshold values, technical diagnostics, critical state, computing systems, machine learning, statistical analysis, expert systems, equipment monitoring, technical safety
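A minimal sketch of two of the approaches discussed, using a synthetic monitoring series: a static statistical threshold (mean plus three standard deviations over a training window) and an adaptive threshold computed over a rolling window.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
signal = pd.Series(70 + 3 * rng.standard_normal(500))   # e.g. a temperature channel, degrees C
signal.iloc[480:] += 20                                  # injected drift toward a critical state

# Static statistical threshold fitted on a "healthy" training window.
train = signal.iloc[:300]
static_threshold = train.mean() + 3 * train.std()

# Adaptive threshold: rolling statistics follow slow regime changes.
roll = signal.rolling(window=60, min_periods=30)
adaptive_threshold = roll.mean() + 3 * roll.std()

alarms_static = signal > static_threshold
alarms_adaptive = signal > adaptive_threshold
print(int(alarms_static.sum()), int(alarms_adaptive.sum()))
```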
This paper considers a method for predictive detection and prevention of failures in information systems using neural networks. The «1C:Enterprise» platform, which is widely used in the corporate environment, was chosen as an applied example. The urgency of the task stems from the need to improve the reliability of information systems and minimize downtime caused by technical failures. The proposed approach includes several stages: collecting and analyzing error logs, preprocessing the data, selecting the architecture of an artificial neural network, and verifying its quality. A comparative analysis shows that the proposed solution responds to potential failures faster than classical monitoring tools.
Keywords: 1C:Enterprise, corporate information systems (CIS), IT services analytics
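The article's network architecture and log schema are not given in the abstract; the sketch below, using a few synthetic labeled log messages, only shows the general shape of such a pipeline (vectorize error-log text, train a classifier, score incoming messages), with a simple baseline classifier standing in for the neural network.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Synthetic stand-ins for preprocessed log messages; label 1 = a failure followed shortly after.
logs = [
    "lock wait timeout exceeded on document posting",
    "background job completed successfully",
    "database connection pool exhausted",
    "session terminated by administrator",
    "deadlock detected while writing register totals",
    "scheduled exchange finished without errors",
]
labels = [1, 0, 1, 0, 1, 0]

# Vectorize log text and fit a baseline classifier (a neural network would slot in here instead).
pipeline = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
pipeline.fit(logs, labels)

print(pipeline.predict_proba(["connection pool exhausted during posting"])[0][1])
```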
This research investigates the development of expert systems (ES) based on large language models (LLMs) enhanced with augmented generation techniques. The study focuses on integrating LLMs into ES architectures to enhance decision-making processes. The growing influence of LLMs in AI has opened new possibilities for expert systems. Traditional ES require extensive development of knowledge bases and inference algorithms, while LLMs offer advanced dialogue capabilities and efficient data processing. However, their reliability in specialized domains remains a challenge. The research proposes an approach combining LLMs with augmented generation, where the model utilizes external knowledge bases for specialized responses. The ES architecture is based on LLM agents implementing production rules and uncertainty handling through confidence coefficients. A specialized prompt manages system-user interaction and knowledge processing. The architecture includes agents for situation analysis, knowledge management, and decision-making, implementing multi-step inference chains. Experimental validation using YandexGPT 5 Pro demonstrates the system’s capability to perform core ES functions: user interaction, rule application, and decision generation. Combining LLMs with structured knowledge representation enhances ES performance significantly. The findings contribute to creating more efficient ES by leveraging LLM capabilities with formalized knowledge management and decision-making algorithms.
Keywords: large language model, expert system, artificial intelligence, decision support, knowledge representation, prompt engineering, uncertainty handling, decision-making algorithms, knowledge management
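The abstract does not specify how the confidence coefficients are combined; one classical scheme they may resemble is MYCIN-style certainty factors, sketched below for merging the confidence of two rules that support the same conclusion.

```python
def combine_cf(cf1: float, cf2: float) -> float:
    """Combine two certainty factors in [-1, 1] for the same hypothesis (MYCIN-style)."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two rules independently support the same recommendation with moderate confidence.
print(combine_cf(0.6, 0.5))   # 0.8: combined evidence is stronger than either rule alone
```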
The relevance of this article stems from the need to develop lightweight and scalable solutions for decentralized systems (blockchain, IoT), where traditional cryptographic methods are inefficient or excessive. A theoretical-practical method for protecting unmanned transportation systems against Sybil attacks has been developed, based on a server robot’s analysis of each client robot’s unique directional electromagnetic signal power map signature. Experimental solutions for Sybil attack protection are demonstrated using two aerial servers deployed on quadcopters. The proposed keyless Sybil attack defense method utilizes WiFi signal parameter analysis (e.g., power scattering and variable antenna radiation patterns) to detect spoofed client robots. Experiments confirm that monitoring unique radio channel characteristics effectively limits signature forgery. This physical-layer approach is also applicable to detecting packet injection in robot Wi-Fi networks. The key advantages of the developed method include the elimination of cryptography, reducing computational overhead; the use of physical signal parameters as a "fingerprint" for legitimate devices; and the method's scalability to counter other threats, such as traffic injection.
Keywords: protection against Sybil attacks, unmanned vehicle systems, electromagnetic signal power map, WiFi signal, signature falsification, spoofing, synthetic aperture radar
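The measurement procedure is specific to the article's hardware, so the sketch below only illustrates a plausible decision step under assumed data: each client's signature is a vector of received-power values over a set of antenna directions, and a claimed identity is accepted only if its observed map correlates strongly with its enrolled signature while not matching any other identity too closely (which would indicate a single physical source behind several identities).

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two directional power maps (RSSI in dBm per direction)."""
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical enrolled signatures: received power over 36 antenna directions per client.
rng = np.random.default_rng(1)
enrolled = {"robot_a": rng.normal(-50, 6, 36), "robot_b": rng.normal(-55, 6, 36)}

MATCH_THRESHOLD = 0.9     # a claimed identity must reproduce its own signature
CLONE_THRESHOLD = 0.95    # two distinct identities must not share one physical source

def authenticate(claimed_id: str, observed_map: np.ndarray) -> bool:
    if similarity(observed_map, enrolled[claimed_id]) < MATCH_THRESHOLD:
        return False
    # Sybil indication: the observed map also matches a different enrolled identity too well.
    others = (similarity(observed_map, sig) for cid, sig in enrolled.items() if cid != claimed_id)
    return all(s < CLONE_THRESHOLD for s in others)

probe = enrolled["robot_a"] + rng.normal(0, 1, 36)     # a genuine re-measurement of robot_a
print(authenticate("robot_a", probe), authenticate("robot_b", probe))
```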
Methods of increasing the efficiency of data analysis based on topology and analytical geometry are becoming increasingly popular in modern information systems. However, because topological structures are highly complex, the main tasks of processing and storing information are solved using spatial geometry combined with modular arithmetic and the analytical specification of geometric structures, whose description underlies the development of new methods for solving optimization problems. The practical application of elliptic-curve cryptography, including in network protocols, relies on interpolation methods for approximating the graphs of functions, since accuracy may be lost when many sequential mathematical operations are performed. This problem stems from the computing architecture of modern devices. Because errors can accumulate, data approximation methods must be applied sequentially as calculations proceed.
Keywords: elliptic curve, information system, data analysis, discrete logarithm, point order, scalar, subexponential algorithm
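For readers less familiar with the underlying arithmetic, the sketch below implements textbook point addition and double-and-add scalar multiplication on a toy curve y^2 = x^3 + 2x + 2 over F_17; it illustrates the group operations that elliptic-curve protocols rely on, not the interpolation or approximation techniques discussed in the article.

```python
# Toy curve y^2 = x^3 + a*x + b over the prime field F_p; None represents the point at infinity.
a, b, p = 2, 2, 17

def ec_add(P, Q):
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return x3, (lam * (x1 - x3) - y1) % p

def ec_mul(k, P):
    """Double-and-add scalar multiplication k*P."""
    result, addend = None, P
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result

G = (5, 1)                                                 # a point on the curve
Q = ec_mul(7, G)
assert Q is not None and (Q[1] ** 2 - Q[0] ** 3 - a * Q[0] - b) % p == 0   # result stays on the curve
print(Q)
```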
The article discusses intelligent document processing algorithms as a key tool for quality control in the manufacturing of nanocomposites. Methods for preprocessing and analyzing technological documentation, including entity extraction, data classification, and anomaly detection, are analyzed and compared in terms of their applicability to ensuring the specified electrophysical properties of the final product. The algorithms were tested on data from the production of polymer nanocomposites based on an epoxy matrix with carbon nanotubes (CNTs) as a conductive filler, and the possibilities of using intelligent methods to predict defects, optimize synthesis parameters, and automate reporting were considered. The advantages and disadvantages of the applied approaches, as well as their effectiveness in various production quality management scenarios, are investigated. The developed algorithms are implemented as a microservice architecture compatible with industrial MES systems. The visualization interface makes it possible to track the current state of the process in real time, view the history of deviations and the decisions made, and simulate the consequences of parameter changes. The article will be useful to materials science specialists, process engineers, developers of automation systems, and researchers interested in the application of artificial intelligence in industry.
Keywords: machine learning, automatic document processing, artificial intelligence, nanocomposites, mathematical modeling, software package
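As an illustration of the anomaly-detection building block, the sketch below assumes a hypothetical table of per-batch synthesis parameters (CNT mass fraction, curing temperature, mixing time) extracted from technological documentation and flags atypical batches with an Isolation Forest.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# Hypothetical per-batch parameters parsed from process documentation:
# [CNT mass fraction %, curing temperature C, mixing time min]
normal_batches = np.column_stack([
    rng.normal(0.50, 0.05, 200),
    rng.normal(120.0, 3.0, 200),
    rng.normal(30.0, 2.0, 200),
])
suspect_batches = np.array([[0.50, 121.0, 30.5],     # typical batch
                            [0.95, 138.0, 18.0]])    # likely documentation or process anomaly

detector = IsolationForest(contamination=0.02, random_state=0).fit(normal_batches)
print(detector.predict(suspect_batches))   # 1 = normal, -1 = anomaly
```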
This paper addresses the challenges of automating enterprise-level business processes through the development of a specialized web application designed for task planning and working time accounting. The research focuses on solving problems related to organizational efficiency within the context of contemporary digital transformations. An innovative architectural design is proposed, leveraging advanced web technologies such as React for the frontend, Node.js with Express.js for the backend, and MySQL for reliable data storage. Particular emphasis is placed on utilizing the Effector library for efficient state management, resulting in significant performance improvements by reducing redundant UI renders. The developed solution offers robust features that enhance operational transparency, resource allocation optimization, and overall productivity enhancement across departments. Furthermore, economic justification demonstrates cost savings achieved through streamlined workflows and reduced administrative overhead. The practical implementation of this system has shown promising results, providing businesses with enhanced capabilities to manage tasks effectively while ensuring scalability and adaptability to future growth requirements.
Keywords: automation of business processes, task management, time management, web application, React, Node.js, Effector, information systems performance
In modern continuous integration and delivery processes, it is critically important not only to have automated tests but also to ensure their real effectiveness, reliability, and economic feasibility. This paper systematizes the key metrics for evaluating the quality of automated testing, with a special focus on the problem of unstable (flaky) tests. Two new indicators are introduced and justified: the unstable-test rate and continuous integration pipeline loss, which directly reflect the cost of maintaining the test infrastructure. The limitations of traditional metrics, in particular code coverage, are analyzed in detail, and the superiority of mutation testing as a more reliable indicator of a test suite's ability to detect defects is demonstrated. Using demonstration data from a real continuous integration pipeline, key dependencies were identified: an increase in code coverage does not guarantee an improvement in mutation testing results and does not lead to an increase in the number of detected defects; a high proportion of unstable tests correlates with significant losses of machine time and reduced confidence in test results; and reducing the time to detect and eliminate defects is achieved not only by increasing coverage, but also by reducing the proportion of unstable tests, improving system observability, and strengthening defect management discipline.
Keywords: quality metrics for automated testing, mutation testing, unstable tests, code coverage, empirical metric analysis, comparative analysis of testing metrics, optimization of testing processes, cost-effectiveness of automation, software quality management
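The article's exact definitions are not reproduced in the abstract; the sketch below shows one straightforward way such indicators could be operationalized on per-test CI run data: the unstable-test rate as the share of tests that both pass and fail on the same revision, and pipeline loss as the machine time spent on repeated executions of such tests.

```python
from collections import defaultdict

# Each record: (test_name, revision, passed, duration_minutes) — an assumed CI run export format.
runs = [
    ("test_login",   "abc123", True,  1.2),
    ("test_login",   "abc123", False, 1.3),   # same revision, different verdicts -> unstable
    ("test_login",   "abc123", True,  1.2),
    ("test_payment", "abc123", True,  2.0),
    ("test_export",  "abc123", False, 3.1),   # consistently failing -> a real defect, not flaky
    ("test_export",  "abc123", False, 3.0),
]

verdicts = defaultdict(set)
for name, rev, passed, _ in runs:
    verdicts[(name, rev)].add(passed)
unstable = {key for key, v in verdicts.items() if len(v) > 1}

all_tests = {name for name, *_ in runs}
unstable_rate = len({name for name, _ in unstable}) / len(all_tests)

# Pipeline loss: machine time spent on repeated executions of unstable tests.
pipeline_loss = sum(
    d for name, rev, _, d in runs if (name, rev) in unstable
) - sum(min(d for n, r, _, d in runs if (n, r) == key) for key in unstable)

print(f"unstable-test rate: {unstable_rate:.2f}, pipeline loss: {pipeline_loss:.1f} min")
```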
Problem statement. When modeling a complex technical system, parameter estimation is of primary importance. Solving this problem requires a methodology that eliminates errors and inaccuracies in obtaining numerical parameters. Goal. The article is devoted to a systematic analysis of a methodology for estimating the parameters of a complex technical system using the interval estimation method. Research method. A systematic analysis of methods for using interval estimates of numerical parameters is carried out, and the methods are decomposed and structured. Results. The expediency of describing the parameters of a complex technical system using the interval estimation method is shown, and an analysis of the use of various interval estimation models is presented. Practical significance. The analysis and construction of complex systems is considered as a practical application; the method of estimating the parameters of a complex technical system by interval estimation can be used as a practical guide.
Keywords: interval estimation, parameter estimation, numerical data, fuzzy data, complex technical systems
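A minimal sketch of the interval representation that such a methodology builds on, assuming the usual rules of interval arithmetic: a parameter known only up to bounds is carried through calculations as a pair of endpoints, so the result is guaranteed to bracket the true value.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other: "Interval") -> "Interval":
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

    def contains(self, x: float) -> bool:
        return self.lo <= x <= self.hi

# A parameter measured as 10 +/- 0.5 multiplied by a coefficient known to lie in [1.9, 2.1].
mass = Interval(9.5, 10.5)
coeff = Interval(1.9, 2.1)
print(mass * coeff)            # Interval(lo=18.05, hi=22.05) bounds the true product
```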
The article presents a basic model for information support of decision-making designed to enhance the efficiency and resilience of organizational systems within the information security units of the Russian Ministry of Emergency Situations (EMERCOM). The model constitutes an integrated, multi-level system based on the interaction of five interrelated mathematical models: data collection and preprocessing, automated classification, situation forecasting and analysis, decision option generation, and knowledge management. To enhance the model’s functionality and adaptability under conditions of uncertainty, the following formulas have been introduced: assessment of data processing system workload; automatic calibration of weighting coefficients based on forecast error minimization; calculation of information source credibility index; weighted event evaluation accounting for source credibility; decision explainability using the SHAP method; and knowledge utility metrics together with criteria for knowledge archiving. These formulas ensure objectivity, transparency, and robustness of management processes, enabling rapid response to cyber threats, minimizing the impact of human factors, and maintaining high-quality managerial decision-making.
Keywords: information support, decision-making, organizational system, basic model, explainability of decisions, knowledge management
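The formulas themselves are only named in the abstract; purely as an illustration of what a credibility-weighted event evaluation could look like, the sketch below aggregates severity reports from several sources, discounting each report by an assumed source credibility index.

```python
def weighted_event_score(reports: list[tuple[float, float]]) -> float:
    """Aggregate event severity reports as a credibility-weighted mean.

    Each report is (severity in [0, 1], source credibility index in [0, 1]).
    This is an illustrative convention, not the formula from the article.
    """
    total_weight = sum(cred for _, cred in reports)
    if total_weight == 0:
        return 0.0
    return sum(sev * cred for sev, cred in reports) / total_weight

# Three sources report the same incident with different credibility indices.
print(weighted_event_score([(0.9, 0.95), (0.7, 0.60), (0.2, 0.10)]))
```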
The paper provides a comparative analysis of the accuracy of determining aircraft coordinates using the classical correlation-extreme algorithm (CEA) and a machine learning method based on a fully convolutional neural network (FCN) operating on terrain maps. Two-dimensional correlated random functions are used as relief models. It is shown that the CEA is effective with small amounts of data, whereas the FCN demonstrates high noise immunity after training on representative samples. For both methods, the accuracy of coordinate determination depends on the size of the reference area, the number of reference templates, the entropy, and the correlation coefficient of the random relief.
Keywords: correlation-extreme algorithm, deep learning, convolutional neural network, aircraft guidance, digital terrain model, Fourier filtering, spatial correlation, noise immunity, algorithm comparison, autonomous navigation, hybrid systems, terrain entropy
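The core of a correlation-extreme algorithm can be sketched compactly: slide the sensed relief fragment over the reference map and take the position that maximizes the normalized cross-correlation. The example below uses a synthetic random map rather than the correlated random relief models of the article.

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def locate(reference: np.ndarray, sensed: np.ndarray):
    """Exhaustive search for the sensed patch position maximizing correlation."""
    h, w = sensed.shape
    scores = {
        (i, j): ncc(reference[i:i + h, j:j + w], sensed)
        for i in range(reference.shape[0] - h + 1)
        for j in range(reference.shape[1] - w + 1)
    }
    return max(scores, key=scores.get)

rng = np.random.default_rng(3)
reference = rng.standard_normal((64, 64))                    # synthetic terrain map
true_pos = (20, 35)
sensed = reference[20:36, 35:51] + 0.3 * rng.standard_normal((16, 16))   # noisy onboard measurement

print(locate(reference, sensed), "expected", true_pos)
```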
This article investigates the problem of structured data schema matching and aggregates the results from previous stages of the research. The systematization of results demonstrated that while the previously considered approaches show promising outcomes, their effectiveness is often insufficient for real-world application. One of the most effective methods was selected for further study. The Self-Organizing Map method was analyzed, which is based on a criterial analysis of the attribute composition of schemas, using an iterative approach to minimize the distance between points (in the current task, a point represents a schema attribute). An experiment on schema matching was conducted using five examples. The results revealed both the strengths and limitations of the method under investigation. It was found that the selected method exhibits insufficient robustness and reproducibility of results on diverse real-world datasets. The verification of the method confirmed the need for its further optimization. The conclusion outlines directions for future research in this field.
Keywords: data management, fusion schemes, machine learning, classification, clustering, experimental analysis, data metrics
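As a compact illustration of the investigated method under assumed inputs, the sketch below trains a tiny self-organizing map on numeric feature vectors of schema attributes (synthetic values standing in for the criterial attribute descriptions) and reports the map unit each attribute falls on; attributes from different schemas that land on the same or neighbouring units are match candidates.

```python
import numpy as np

def train_som(data: np.ndarray, grid: int = 4, iters: int = 2000, lr0: float = 0.5, sigma0: float = 1.5):
    """Train a small 2-D self-organizing map and return its weight grid."""
    rng = np.random.default_rng(0)
    weights = rng.random((grid, grid, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=-1)), (grid, grid))
        lr = lr0 * np.exp(-t / iters)                      # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)                # shrinking neighbourhood
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        influence = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * influence * (x - weights)          # pull BMU neighbourhood toward the sample
    return weights

def bmu_of(weights: np.ndarray, x: np.ndarray):
    return np.unravel_index(np.argmin(((weights - x) ** 2).sum(axis=-1)), weights.shape[:2])

# Synthetic attribute features (e.g. normalized name/type/length statistics) for two schemas.
schema_a = {"customer_id": [1.0, 0.1, 0.0], "order_date": [0.0, 0.9, 0.2]}
schema_b = {"client_id":   [0.95, 0.15, 0.05], "created_at": [0.05, 0.85, 0.25]}

data = np.array(list(schema_a.values()) + list(schema_b.values()))
weights = train_som(data)

for schema, attrs in (("A", schema_a), ("B", schema_b)):
    for name, vec in attrs.items():
        print(schema, name, "-> unit", bmu_of(weights, np.array(vec)))
# Attributes from different schemas mapped to the same or neighbouring units are match candidates.
```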