Information technologies are increasingly used in many fields, from document management to payment systems. One of the most popular and promising of these technologies is cryptocurrency. Because cryptocurrencies must ensure the security and reliability of data in the system, most of them rely on blockchain and complex cryptographic protocols such as zero-knowledge proof (ZKP) protocols. An important aspect of achieving the security of such systems is therefore verification, which makes it possible to assess a system's resistance to various attacks as well as its compliance with security requirements. This paper considers both the concept of verification itself and the methods for carrying it out. A comparison of verification methods suitable for zero-knowledge protocols is also presented. The paper concludes that an integrated approach to verification is needed, since no single method can cover all potential vulnerabilities; different verification methods should therefore be applied at different stages of system design.
Keywords: cryptocurrency, blockchain, verification, formal method, static analysis, dynamic method, zero-knowledge proof protocol
The article presents a comparative analysis of modern database management systems (PostgreSQL/PostGIS, Oracle Database, Microsoft SQL Server, and MongoDB) in the context of implementing a distributed storage of geospatial information. The aim of the study is to identify the strengths and limitations of different platforms when working with heterogeneous geospatial data and to evaluate their applicability in distributed GIS solutions. The research covers three main types of data: vector, raster, and point cloud. A comprehensive set of experiments was conducted in a test environment close to real operating conditions, including functional testing, performance benchmarking, scalability analysis, and fault tolerance assessment.
The results demonstrated that PostgreSQL/PostGIS provides the most balanced solution, showing high scalability and stable performance across all data types, which makes it a versatile platform for building GIS applications. Oracle Database exhibited strong results when processing raster data and proved effective under heavy workloads in multi-node architectures, which is especially relevant for corporate environments. Microsoft SQL Server showed reliable performance on vector data, particularly in distributed scenarios, though requiring optimization for binary storage. MongoDB proved suitable for storing raster content and metadata through GridFS, but its scalability is limited compared to traditional relational DBMS.
In conclusion, PostgreSQL/PostGIS can be recommended as the optimal choice for projects that require universality and high efficiency in distributed storage of geospatial data, while Oracle and Microsoft SQL Server may be preferable in specialized enterprise solutions, and MongoDB can be applied in tasks where flexible metadata management is a priority.
Keywords: geographic information system, database, PostgreSQL, PostGIS, Oracle Database, Microsoft SQL Server, MongoDB, vector, raster, point cloud, scalability, performance, fault tolerance
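As an illustration of the kind of vector-data query exercised in such benchmarks, the sketch below runs a radius search against a PostGIS table from Python. It is a minimal example under assumed names: the table roads, its geometry column geom, the SRID 4326, and the connection parameters are hypothetical and not taken from the study.

```python
# Minimal sketch: radius query against a hypothetical PostGIS table from Python.
# Table name, column names, SRID and credentials are assumptions, not from the article.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="gisdb", user="gis", password="gis")
with conn, conn.cursor() as cur:
    # Find road segments within 500 m of a point (lon, lat); ST_DWithin with
    # geography casts measures the distance in metres.
    cur.execute(
        """
        SELECT id, name
        FROM roads
        WHERE ST_DWithin(geom::geography,
                         ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                         %s)
        """,
        (30.31, 59.94, 500.0),
    )
    for row_id, name in cur.fetchall():
        print(row_id, name)
conn.close()
```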
This article aims to analyze modern energy-efficient technologies used in the architecture and construction of "healthy" office buildings, their impact on reducing the carbon footprint and operating costs, and key aspects such as the use of renewable energy sources, energy-efficient building materials, and smart control systems.
Keywords: construction, architecture, "healthy" office buildings, energy-efficient technologies, renewable energy sources, energy-efficient building materials, smart control systems
This paper proposes a novel model of computer network behavior that incorporates weighted multi-label dependencies to identify rare anomalous events. The model accounts for multi-label dependencies not previously encountered in the source data, enabling a "preemptive" assessment of their potential destructive impact on the network. An algorithm for calculating the potential damage from the realization of a multi-label dependency is presented. The proposed model is applicable for analyzing a broad spectrum of rare events in information security and for developing new methods and algorithms for information protection based on multi-label patterns. The approach allows for fine-tuning the parameters of multi-label dependency accounting within the model, depending on the specific goals and operating conditions of the computer network.
Keywords: multi-label classification, multi-label dependency, attribute space, computer attacks, information security, network traffic classification, attack detection, attribute informativeness, model, rare anomalous events, anomalous events
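As a purely illustrative sketch of how a potential-damage score for a previously unseen multi-label combination could be aggregated, the fragment below weights each label by an assumed per-label damage coefficient and scales the sum by a dependency weight. The scoring rule, the coefficients, and the function names are hypothetical and do not reproduce the algorithm proposed in the article.

```python
# Hypothetical illustration only: a simple weighted aggregation of per-label
# damage coefficients for a multi-label dependency. Not the article's algorithm.
from typing import Dict, Iterable

def potential_damage(labels: Iterable[str],
                     label_damage: Dict[str, float],
                     dependency_weight: float) -> float:
    """Score an unseen label combination by summing assumed per-label damage
    coefficients and scaling by the weight of the multi-label dependency."""
    base = sum(label_damage.get(label, 0.0) for label in labels)
    return dependency_weight * base

# Example with made-up labels and coefficients.
damage = {"port_scan": 0.2, "priv_escalation": 0.9, "data_exfiltration": 0.8}
print(potential_damage(["port_scan", "data_exfiltration"], damage, 0.6))
```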
The article discusses modern approaches to the design and implementation of data processing architectures in intelligent transport systems (ITS) with a focus on ensuring technological sovereignty. Special attention is paid to integrating machine learning engineering practices that automate the full lifecycle of machine learning models: from data preparation and streaming to real-time monitoring and updating of models. Architectural solutions are analyzed that use distributed computing platforms such as Hadoop and Apache Spark, an in-memory database built on Apache Ignite, and the Kafka message broker for reliable event delivery. The importance of infrastructure flexibility and scalability, support for the parallel operation of multiple models, and reliable access control is emphasized, including security issues and the use of transport layer security protocols. Recommendations are given on organizing a logging and monitoring system for rapid response to changes and incidents. The presented solutions are aimed at high fault tolerance, security, and compliance with the requirements of industrial operation, which allows large volumes of transport data to be processed efficiently and ITS to be adapted to conditions of import independence.
Keywords: data processing, intelligent transport systems, distributed processing, scalability, fault tolerance
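As a small sketch of the event-delivery layer described above, the snippet below publishes a telemetry message to Kafka from Python using the kafka-python client. The topic name, broker address, and message fields are assumptions made for the example, not details taken from the article.

```python
# Minimal sketch, assuming the kafka-python client and a hypothetical topic/broker.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A made-up ITS telemetry event: vehicle id, timestamp and speed.
event = {"vehicle_id": "bus-042", "ts": "2024-01-01T12:00:00Z", "speed_kmh": 47.5}
producer.send("its-telemetry", value=event)
producer.flush()  # make sure the event is actually delivered before exiting
```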
The article is devoted to the development of an innovative neural network decision support system for firefighting in conditions of limited visibility. A comprehensive approach based on the integration of data from multispectral sensors (lidar, ultrasonic phased array, temperature and hygrometric sensors) is presented. The architecture includes a hybrid network that combines three-dimensional convolutional layers with bidirectional LSTM units. To improve the quality of processing, a cross-modal attention mechanism is used to evaluate the physical nature and reliability of incoming signals. A Bayesian approach based on the Monte Carlo dropout method is used to account for forecast uncertainty. Adaptive routing algorithms allow for a quick response to changing situations. This solution significantly improves the efficiency of firefighting operations and reduces the risk to personnel.
Keywords: mathematical model, intelligence, organizational model, gas and smoke protection service, neural networks, limited visibility, fire department, management, intelligent systems, decision support
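The Monte Carlo dropout technique mentioned in the abstract above can be summarized in a few lines: dropout is kept active at inference time and the spread of repeated stochastic forward passes is used as an uncertainty estimate. The sketch below is a generic PyTorch illustration of that technique, not the article's actual model; the toy model, input, and number of samples are assumptions.

```python
# Generic Monte Carlo dropout sketch in PyTorch (not the article's model).
import torch
import torch.nn as nn

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 30):
    """Run repeated stochastic forward passes with dropout enabled and return
    the predictive mean and standard deviation as an uncertainty estimate."""
    model.eval()
    for module in model.modules():          # keep dropout active at inference
        if isinstance(module, nn.Dropout):
            module.train()
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

# Example with a toy network and random input.
toy = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Dropout(0.3), nn.Linear(32, 2))
mean, std = mc_dropout_predict(toy, torch.randn(4, 16))
print(mean.shape, std.shape)
```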
The article discusses methods for automated determination of threshold values of parameters when assessing the critical state of technical systems. It explores the theoretical foundations and practical aspects of setting thresholds, including statistical, expert, and combined approaches. Particular attention is paid to mathematical models and algorithms for processing monitoring data. Various methods for determining threshold values are presented: computational, experimental, statistical, and expert. The features of applying adaptive thresholds, dynamic control, and machine learning systems are described. An analysis of existing approaches to determining critical conditions of equipment is conducted. Recommendations are developed for selecting optimal methods for determining thresholds, taking into account the specifics of technical systems. Comprehensive solutions are proposed that combine analytical and machine methods to improve the reliability and safety of computing complexes.
Keywords: automated control, threshold values, technical diagnostics, critical state, computing systems, machine learning, statistical analysis, expert systems, equipment monitoring, technical safety
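As a compact illustration of the statistical and adaptive threshold approaches surveyed in the abstract above, the sketch below computes a fixed three-sigma threshold and a rolling-window adaptive threshold for a monitored parameter. The window size and sigma multiplier are arbitrary illustrative assumptions rather than recommendations from the article.

```python
# Illustrative sketch: fixed 3-sigma threshold vs. rolling adaptive threshold.
# Window size and multiplier are arbitrary assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(loc=60.0, scale=2.0, size=500)   # e.g. a temperature reading

# Fixed statistical threshold from a training segment.
train = signal[:200]
fixed_threshold = train.mean() + 3.0 * train.std()

# Adaptive threshold: mean + 3*std over a sliding window of recent samples.
window = 50
means = np.convolve(signal, np.ones(window) / window, mode="valid")
sq_means = np.convolve(signal ** 2, np.ones(window) / window, mode="valid")
stds = np.sqrt(np.maximum(sq_means - means ** 2, 0.0))
adaptive_threshold = means + 3.0 * stds

print("fixed:", round(fixed_threshold, 2),
      "adaptive (last):", round(adaptive_threshold[-1], 2))
```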
This paper considers a method for predictive detection and prevention of failures in information systems using neural networks. The «1C:Enterprise» platform, which is widely used in the corporate environment, was chosen as an applied example. The urgency of the task is due to the need to improve the reliability of information systems and minimize downtime associated with technical failures. The proposed approach includes several stages: collecting and analyzing error logs, preprocessing the data, selecting the architecture of an artificial neural network, and verifying its quality. A comparative analysis shows that the proposed solution responds to potential failures faster than classical monitoring tools.
Keywords: 1C:Enterprise, corporate information systems (CIS), IT services analytics
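A minimal sketch of the log-based prediction pipeline outlined in the abstract above is given below: error-log records are aggregated into fixed time windows of event counts, and a small neural network is trained to flag windows that precede a failure. The feature scheme, window size, and classifier configuration are illustrative assumptions, not the architecture evaluated in the article.

```python
# Illustrative sketch only: windowed error counts + a small neural classifier.
# Feature layout and hyperparameters are assumptions, not the article's model.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic data: each row = counts of 5 error categories in a 10-minute window,
# label = 1 if a failure occurred in the following window.
X = rng.poisson(lam=2.0, size=(1000, 5)).astype(float)
y = (X[:, 1] + X[:, 3] > 7).astype(int)      # made-up failure rule for the demo

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("hold-out accuracy:", round(clf.score(X_test, y_test), 3))
```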
This research investigates the development of expert systems (ES) based on large language models (LLMs) enhanced with retrieval-augmented generation techniques. The study focuses on integrating LLMs into ES architectures to enhance decision-making processes. The growing influence of LLMs in AI has opened new possibilities for expert systems. Traditional ES require extensive development of knowledge bases and inference algorithms, while LLMs offer advanced dialogue capabilities and efficient data processing. However, their reliability in specialized domains remains a challenge. The research proposes an approach combining LLMs with retrieval-augmented generation, where the model utilizes external knowledge bases for specialized responses. The ES architecture is based on LLM agents implementing production rules and uncertainty handling through confidence coefficients. A specialized prompt manages system-user interaction and knowledge processing. The architecture includes agents for situation analysis, knowledge management, and decision-making, implementing multi-step inference chains. Experimental validation using YandexGPT 5 Pro demonstrates the system's capability to perform core ES functions: user interaction, rule application, and decision generation. Combining LLMs with structured knowledge representation significantly enhances ES performance. The findings contribute to creating more efficient ES by leveraging LLM capabilities with formalized knowledge management and decision-making algorithms.
Keywords: large language model, expert system, artificial intelligence, decision support, knowledge representation, prompt engineering, uncertainty handling, decision-making algorithms, knowledge management
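The confidence coefficients mentioned in the abstract above resemble classical certainty factors from rule-based expert systems. As a hedged illustration, the sketch below implements the standard MYCIN-style combination rule for evidence from two rules; this is one classical way to handle such coefficients and is not claimed to be the exact scheme used in the article.

```python
# Classical MYCIN-style certainty-factor combination (an illustration; the
# article's own confidence-coefficient scheme may differ).
def combine_cf(cf1: float, cf2: float) -> float:
    """Combine two certainty factors in [-1, 1] supporting the same conclusion."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two rules supporting the same hypothesis with confidences 0.6 and 0.5.
print(combine_cf(0.6, 0.5))   # 0.8
```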
The relevance of this article stems from the need to develop lightweight and scalable solutions for decentralized systems (blockchain, IoT), where traditional cryptographic methods are inefficient or excessive. A theoretical-practical method for protecting unmanned transportation systems against Sybil attacks has been developed, based on a server robot’s analysis of each client robot’s unique directional electromagnetic signal power map signature. Experimental solutions for Sybil attack protection are demonstrated using two aerial servers deployed on quadcopters. The proposed keyless Sybil attack defense method utilizes WiFi signal parameter analysis (e.g., power scattering and variable antenna radiation patterns) to detect spoofed client robots. Experiments confirm that monitoring unique radio channel characteristics effectively limits signature forgery. This physical-layer approach is also applicable to detecting packet injection in robot Wi-Fi networks. The key advantages of the developed method include the elimination of cryptography, reducing computational overhead; the use of physical signal parameters as a "fingerprint" for legitimate devices; and the method's scalability to counter other threats, such as traffic injection.
Keywords: protection against Sybil attacks, unmanned vehicle systems, electromagnetic signal power map, WiFi signal, signature falsification, spoofing, synthetic aperture radar
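As a purely illustrative sketch of the physical-layer idea described in the abstract above, the fragment below compares a client's measured directional power map with its enrolled signature and also flags pairs of clients whose maps are suspiciously similar (a possible sign of one transmitter claiming several identities). The vector representation, distance metric, and thresholds are assumptions for the example, not the published method's parameters.

```python
# Hypothetical illustration: comparing directional signal power maps (dBm per
# bearing sector). Metric and thresholds are assumptions, not the article's values.
import numpy as np

def map_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two power-map vectors."""
    return float(np.linalg.norm(a - b))

enrolled = {"robot_A": np.array([-42.0, -55.0, -61.0, -48.0]),
            "robot_B": np.array([-70.0, -44.0, -52.0, -66.0])}

observed = {"robot_A": np.array([-43.0, -54.0, -60.0, -49.0]),
            "robot_B": np.array([-43.5, -54.5, -60.5, -48.5])}   # suspicious copy

AUTH_THRESHOLD = 5.0      # max deviation from the enrolled signature
SYBIL_THRESHOLD = 3.0     # two "different" clients should not look this alike

for name, obs in observed.items():
    ok = map_distance(obs, enrolled[name]) < AUTH_THRESHOLD
    print(name, "matches its enrolled signature:", ok)

names = list(observed)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if map_distance(observed[names[i]], observed[names[j]]) < SYBIL_THRESHOLD:
            print("possible Sybil pair:", names[i], names[j])
```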
The article discusses intelligent document processing algorithms as a key tool for quality control of the manufacturing process of nanocomposites. An analysis of methods for the preprocessing and analysis of technological documentation, including entity extraction, data classification, and anomaly detection, is carried out. The comparison is conducted in the context of the applicability of these algorithms to the task of ensuring the specified electrophysical properties of the final product. The algorithms were tested on data from the production of polymer nanocomposites based on an epoxy matrix with carbon nanotubes (CNTs) as a conductive filler, and the possibilities of using intelligent methods to predict defects, optimize synthesis parameters, and automate reporting were considered. The advantages and disadvantages of the applied approaches, as well as their effectiveness in various quality management scenarios in production, are investigated. The developed algorithms are implemented as a microservice architecture compatible with industrial MES platforms. The visualization interface allows users to track the current state of the process in real time, view the history of deviations and decisions made, and simulate the consequences of parameter changes. The article will be useful to specialists in the field of materials science, process engineers, developers of automation systems, and researchers interested in the application of artificial intelligence in industry.
Keywords: machine learning, automatic document processing, artificial intelligence, nanocomposites, mathematical modeling, software package
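One of the ingredients mentioned above, anomaly detection over process parameters extracted from technological documentation, can be illustrated with a short fragment. The sketch below fits an Isolation Forest to a table of synthesis parameters and flags outlying batches; the parameter names, data, and choice of Isolation Forest are illustrative assumptions rather than the algorithms reported in the article.

```python
# Illustrative sketch: flagging anomalous synthesis batches with Isolation Forest.
# Parameter names and data are made up; the article's own algorithms may differ.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)

# Columns: CNT mass fraction (%), curing temperature (°C), mixing time (min).
batches = rng.normal(loc=[1.5, 120.0, 30.0], scale=[0.1, 3.0, 2.0], size=(200, 3))
batches[10] = [2.6, 150.0, 12.0]          # an obviously deviating batch

model = IsolationForest(contamination=0.02, random_state=0).fit(batches)
labels = model.predict(batches)            # -1 marks anomalies
print("anomalous batch indices:", np.where(labels == -1)[0])
```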
This paper addresses the challenges of automating enterprise-level business processes through the development of a specialized web application designed for task planning and working time accounting. The research focuses on solving problems related to organizational efficiency within the context of contemporary digital transformations. An innovative architectural design is proposed, leveraging advanced web technologies such as React for the frontend, Node.js with Express.js for the backend, and MySQL for reliable data storage. Particular emphasis is placed on utilizing the Effector library for efficient state management, resulting in significant performance improvements by reducing redundant UI renders. The developed solution offers robust features that enhance operational transparency, resource allocation optimization, and overall productivity enhancement across departments. Furthermore, economic justification demonstrates cost savings achieved through streamlined workflows and reduced administrative overhead. The practical implementation of this system has shown promising results, providing businesses with enhanced capabilities to manage tasks effectively while ensuring scalability and adaptability to future growth requirements.
Keywords: automation of business processes, task management, time management, web application, React, Node.js, Effector, information systems performance
In the context of modern continuous integration and delivery processes, it is critically important not only to have automated tests but also to ensure their real effectiveness, reliability, and economic feasibility. In this paper, the key metrics for evaluating the quality of automated testing are systematized, with a special focus on the problem of unstable (flaky) tests. New indicators are introduced and justified: the unstable test rate and continuous integration pipeline losses, which directly reflect the cost of maintaining the test infrastructure. The limitations of traditional metrics, in particular code coverage, are analyzed in detail, and the superiority of mutation testing as a more reliable indicator of a test suite's ability to detect defects is demonstrated. Using demonstration data from a real continuous integration pipeline, the following key dependencies were identified: an increase in code coverage does not guarantee an improvement in the mutation score and does not lead to an increase in the number of detected defects; a high proportion of unstable tests correlates with significant losses of machine time and reduced confidence in test results; and the time to detect and eliminate defects is reduced not only by increasing coverage, but also by lowering the proportion of unstable tests, improving system observability, and strengthening defect management discipline.
Keywords: quality metrics for automated testing, mutation testing, unstable tests, code coverage, empirical metric analysis, comparative analysis of testing metrics, optimization of testing processes, cost-effectiveness of automation, software quality management
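The indicators discussed above reduce to simple ratios over pipeline statistics. The sketch below shows one plausible way to compute an unstable test rate, the machine time lost to reruns caused by unstable tests, and a mutation score; the exact definitions in the article may differ, so the formulas here are illustrative assumptions.

```python
# Illustrative metric definitions (assumptions; the paper's exact formulas may differ).

def unstable_test_rate(unstable_tests: int, total_tests: int) -> float:
    """Share of tests that intermittently fail without code changes."""
    return unstable_tests / total_tests

def pipeline_loss_minutes(reruns: int, avg_run_minutes: float) -> float:
    """Machine time spent re-running the pipeline because of unstable tests."""
    return reruns * avg_run_minutes

def mutation_score(killed_mutants: int, total_mutants: int) -> float:
    """Fraction of injected mutants detected (killed) by the test suite."""
    return killed_mutants / total_mutants

print(unstable_test_rate(12, 800))        # 0.015
print(pipeline_loss_minutes(37, 14.0))    # 518.0 machine-minutes
print(mutation_score(430, 520))           # ~0.827
```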
Problem statement. When modeling a complex technical system, the estimation of its parameters is of primary importance. Solving this problem requires a methodology that eliminates errors and inaccuracies in obtaining numerical parameters. Goal. The article is devoted to a systematic analysis of a methodology for estimating the parameters of a complex technical system using interval estimation. Research method. A systematic analysis of methods for using interval estimates of numerical parameters is carried out; the methods are decomposed and structured. Results. The expediency of describing the parameters of a complex technical system using interval estimation is shown, and the use of various interval estimation models is analyzed. Practical significance. The approach is applicable to the analysis and design of complex systems, and the described method of estimating the parameters of a complex technical system by interval estimation can be used as a practical guide.
Keywords: interval estimation, parameter estimation, numerical data, fuzzy data, complex technical systems
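For reference, interval estimates of parameters are typically manipulated using the standard rules of interval arithmetic; the formulas below are the textbook definitions, given here as background since the abstract itself does not list specific relations:

[a] + [b] = [\underline{a} + \underline{b},\ \overline{a} + \overline{b}]

[a] - [b] = [\underline{a} - \overline{b},\ \overline{a} - \underline{b}]

[a] \cdot [b] = [\min S,\ \max S], \quad S = \{\underline{a}\,\underline{b},\ \underline{a}\,\overline{b},\ \overline{a}\,\underline{b},\ \overline{a}\,\overline{b}\}

where [a] = [\underline{a}, \overline{a}] and [b] = [\underline{b}, \overline{b}] are interval estimates of two numerical parameters.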
Polyurethane foams (PUFs) constitute a major class of polymeric materials, widely appreciated for their excellent mechanical strength, chemical resistance, and physical versatility. They are used in a wide variety of applications, such as insulation, cushioning, coatings, and structural parts. Traditionally, PUFs are prepared through polyaddition reactions involving polyols, diisocyanates, and water, where the in-situ generated CO₂ in the reaction mixture serves as the blowing agent. However, there are significant concerns with the use of isocyanates, as they are toxic, classified as respiratory sensitizers, and contribute to environmental pollution. These issues have directed both researchers and industry experts to search for safer and more sustainable alternative feedstocks.
The polyaddition reaction between cyclic carbonates (CCs) and polyfunctional amines has been one promising alternative. The reaction leads to the formation of non-isocyanate polyurethanes (NIPUs), specifically polyhydroxyurethane foams (PHUFs). Foaming is achieved by using external chemical blowing agents or through self-blowing reactions, where gases are generated directly in the system. The generated foam cells – the structures that give foams their unique properties – depend largely on the gas-forming reactions.
This review focuses on the different blowing agents used in NIPUF synthesis, such as poly(methylhydrogensiloxane) (PHMS) and liquid fluorohydrocarbons. It also looks at recent advances in self-blowing techniques, which eliminate the need for external agents and make the process more sustainable. Special emphasis is placed on NIPUFs derived from renewable feedstocks, as these align with the global trend towards green chemistry and circular materials. The review provides an overview of both externally blown and self-blown biobased NIPUFs, detailing their synthesis, performance, and potential industrial applications.
Keywords: biobased polyurethane, blowing agent, non-isocyanate polyurethane, polymeric foams, polyurethane foams, self-blowing
The paper presents the results of a study of the condition of the wooden structures of the attic floor of the St. Petersburg Theological Academy of the Russian Orthodox Church. The main defects and damage of the rafter system elements are identified. Conclusions are given on the physical and mechanical characteristics of a beam more than 200 years old, and recommendations are made on measures to maintain the operability of the structures.
Keywords: load-bearing capacity of wooden structures, biodestructors, strength reduction, rafter system
An invention in the field of construction is proposed that can be used for trenchless laying of oil, gas, and other pipelines under natural and artificial obstacles and under linear and extended structures (highways, railroads, etc.), including those located in monolithic rocky soils, with the simultaneous formation of a casing and the development and arrangement of a pre-drilled well.
Keywords: pilot well, diamond wire rope, foundation pit, encasement, working body, circular sector
This article presents a study of biosorption wastewater treatment. Water treatment sludge, a waste product of thermal power plants, is used for this purpose. The introduction of the sludge allows for more efficient biological wastewater treatment, as evidenced by reduced COD, BOD5, ammonia nitrogen, and phosphate values. Improved sedimentation properties of the activated sludge and a beneficial effect of the sludge on the detoxification process are demonstrated.
Keywords: biosorption, energy waste, activated sludge, wastewater treatment, quality indicators, sedimentation properties, detoxification
The article discusses the surface-active properties of casein and its potential for use as a bioorganic additive in repair and restoration compounds (RRC). An analysis of published sources confirming the amphiphilic nature of casein, its ability to stabilize interfacial systems, and its formation of stable micelles is carried out. The results of an experimental study of the dependence of the surface tension of aqueous casein solutions on concentration, carried out by the pendant drop method on a DSA-100 tensiometer, are presented. It is shown that casein effectively reduces the surface tension of an aqueous solution, especially at concentrations of up to 4–5%, which confirms its active adsorption at the phase boundary. Based on the obtained surface tension isotherm, wettability is predicted using the Young equation. The data obtained emphasize the prospects of using casein as a natural surfactant for creating environmentally friendly and authentic restoration materials that meet the requirements of construction and restoration.
Keywords: casein, surfactants, repair and restoration compounds, surface tension, wettability
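For reference, the Young equation mentioned in the abstract above relates the equilibrium contact angle to the three interfacial tensions (standard notation, not specific to the article):

\gamma_{SV} = \gamma_{SL} + \gamma_{LV}\cos\theta

where \gamma_{SV}, \gamma_{SL} and \gamma_{LV} are the solid–vapour, solid–liquid and liquid–vapour interfacial tensions and \theta is the equilibrium contact angle. Rearranged as \cos\theta = (\gamma_{SV} - \gamma_{SL})/\gamma_{LV}, it shows that, for \gamma_{SV} > \gamma_{SL}, lowering the liquid's surface tension (for example, by casein adsorption) increases \cos\theta and thus improves the predicted wettability.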
Reinforced concrete structures (RCS) operating under the natural conditions of the Far North are subjected to alternating freezing and thawing. The impact of freezing–thawing cycles (FTC) leads to the degradation not only of the strength but also of the deformation properties (DP) of concrete. In the current design standards for RCS, the DP of concrete and reinforcement are specified as average statistical values. This study investigates the influence of the variability of concrete’s deformation properties on the reliability of the load-bearing capacity of flexural reinforced concrete elements before and after exposure to FTC. It was shown that considering the variability of concrete’s deformation characteristics at reinforcement ratios up to 1% under alternating temperature conditions has practically no effect on the load-bearing capacity, while at reinforcement ratios close to the limiting values it leads to its reduction. In addition, recommendations were provided for the design of flexural reinforced concrete elements under alternating temperature conditions.
Keywords: freeze-thaw cycle, statistical regularities of resistance, flexure, reinforced concrete, ultimate deformation of concrete, assurance
The relevance of the research is driven by the critical state of a significant portion of the historical building stock in Russia, where interfloor slabs are the most vulnerable elements. The problem lies in the need to replace them with structures that meet modern standards for load-bearing capacity and fire safety, without increasing the load on the weakened foundations and walls.
Keywords: reconstruction, cultural heritage site, monolithic floor slab, void former, lightweight structure, load-bearing capacity, work procedure statement
In the current regulatory documents on the design of reinforced concrete structures, a number of conditional assumptions and limitations are adopted, taking into account the specific nature of the resistance of structural elements and simplifying the calculation. One of these assumptions is the assignment of the deformation properties of concrete as average statistical values, which, along with strength characteristics, determine the stress–strain diagrams of the material.
The influence of concrete freezing–thawing cycles (FTC) leads to the degradation of its deformation and strength properties. The failure mode of a flexural reinforced concrete element (plastic or brittle) depends on the strength and deformation characteristics of the concrete and reinforcement, as well as on the reinforcement ratio.
This study examines the statistical patterns of the ultimate reinforcement ratio and the limiting relative height of the compressed concrete zone under alternating freezing–thawing conditions. The analysis of the statistical regularities of the parameters used and their functional relationships confirms their significant variability and, as a consequence, the possible fluctuations (reduction) of the design reliability level of flexural reinforced concrete structures.
Modeling the variability of the strength of flexural reinforced concrete elements under FTC using statistically representative data on the kinetics of the physical and structural parameters of concrete confirms a sharp decrease in reliability and indicates the need for additional targeted research in this field.
Keywords: freezing–thawing cycles, statistical patterns of resistance, reinforced concrete, ultimate concrete strains, reliability of the limiting relative height of the compressed concrete zone, reliability of the ultimate reinforcement ratio
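As background for the limiting relative height of the compressed concrete zone discussed above, reinforced concrete design codes commonly relate it to the ultimate compressive strain of concrete and the elastic strain of the reinforcement; a typical code-type formulation (given here as an assumption, since the abstract does not reproduce the article's own expressions) is

\xi = \frac{x}{h_0} \le \xi_R, \qquad \xi_R = \frac{0.8}{1 + \varepsilon_{s,el}/\varepsilon_{b2}}, \qquad \varepsilon_{s,el} = \frac{R_s}{E_s},

where x is the depth of the compressed zone, h_0 the effective depth of the section, \varepsilon_{b2} the ultimate compressive strain of concrete, and R_s, E_s the design strength and modulus of elasticity of the reinforcement. Degradation of \varepsilon_{b2} under freezing–thawing cycles directly lowers \xi_R and, with it, the ultimate reinforcement ratio.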