Citations count: 6
Reference:
Gibadullin R.F., Viktorov I.V. —
Ambiguous Results when Using Parallel Class Methods within the .NET Framework
// Software systems and computational methods.
– 2023. – № 2.
– P. 1 - 14.
DOI: 10.7256/2454-0714.2023.2.39801 EDN: UGEGOO URL: https://en.nbpublish.com/library_read_article.php?id=39801
Abstract:
Parallel programming is a way of writing programs so that they can run concurrently on multiple processors or cores. This allows a program to process large amounts of data or perform complex calculations in far less time than would be possible on a single processor. The advantages of parallel programming include increased performance, load sharing, the ability to process large volumes of data, improved responsiveness, and increased reliability. In general, parallel programming offers many benefits that can improve the performance and reliability of software systems, especially as computational tasks and data volumes grow in complexity. However, it also brings its own difficulties, related to synchronization management, data races, and other aspects that demand additional attention and experience from the programmer. When testing parallel programs, ambiguous results are possible. For example, this can happen when we optimize the addition of float- or double-type values by means of the For or ForEach methods of the Parallel class. Such behavior casts doubt on the thread safety of the written code; that conclusion, however, may be incorrect and premature. The article reveals a possible reason for the ambiguity of the results produced by a parallel program and offers a concise solution to the problem.
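The ambiguity described in this abstract has a simple arithmetical root: floating-point addition is not associative, so a parallel reduction (such as Parallel.For combining per-thread partial sums) may legitimately add the same values in a different order and obtain a different result. A minimal Python illustration of the underlying effect (not the article's C# code):

```python
# Floating-point addition is not associative: combining the same values
# in a different order, as a parallel reduction may do, changes the result.
values = [1e16, 1.0, -1e16]

# Sequential left-to-right order: 1.0 is absorbed by the huge 1e16 term,
# because 1e16 + 1.0 rounds back to 1e16 in double precision.
left_to_right = (values[0] + values[1]) + values[2]

# A different combination order, as two workers' partial sums might yield:
# the huge terms cancel first, so 1.0 survives.
reordered = (values[0] + values[2]) + values[1]

print(left_to_right)  # 0.0
print(reordered)      # 1.0
```

Neither order is "wrong": both are correctly rounded sums, which is why such a program can be thread-safe and still non-deterministic.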
Citations count: 4
Reference:
Viktorov I.V., Gibadullin R.F. —
Syntax Tree Development for Automated Serial-to-Parallel Code Translator for Multicore Processors
// Software systems and computational methods.
– 2023. – № 1.
– P. 13 - 25.
DOI: 10.7256/2454-0714.2023.1.38483 EDN: ANMSZI URL: https://en.nbpublish.com/library_read_article.php?id=38483
Abstract:
The emergence of multicore architectures has greatly stimulated the field of parallel computing. However, developing a parallel program, or manually parallelizing inherited sequential code, is time-consuming work, and the programmer must have good command of parallel programming methods. This fact determines the relevance of the subject of the research: the development of a serial-to-parallel code translator. The article reviews existing solutions in the chosen research direction and considers their advantages and disadvantages. A principle for building a syntax tree based on the JSON format (a text-based data-interchange format derived from JavaScript) is proposed, and an example of constructing a syntax tree on this principle is considered. The result of the work is an approach to building a software platform for translating sequential code into parallel code. The distinctive feature of the developed platform is its web service, which potentially allows the translator to be extended to other programming languages. Interaction with the programming environment is realized by means of REST requests (HTTP requests designed to call remote procedures). The developed software platform consists of three modules: the request-processing module, which provides interaction with external systems through REST requests; the tree-building module, which forms a syntax tree from the source program code; and the code-conversion module, which produces parallel program code from the syntax tree.
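The JSON-based syntax tree described above can be pictured with a small sketch. The node types and field names below are illustrative assumptions, not the authors' actual schema:

```python
import json

# Hypothetical JSON syntax tree for the loop:
#     for (i = 0; i < n; i++) total += a[i];
# Node names ("For", "Assign", ...) are invented for illustration.
tree = {
    "type": "For",
    "init": {"type": "Assign", "target": "i", "value": 0},
    "cond": {"type": "Less", "left": "i", "right": "n"},
    "step": {"type": "Increment", "target": "i"},
    "body": [
        {
            "type": "AddAssign",
            "target": "total",
            "value": {"type": "Index", "array": "a", "index": "i"},
        }
    ],
}

# Serialized, the tree becomes a plain text payload that the tree-building
# module could return to the code-conversion module over a REST request.
payload = json.dumps(tree, indent=2)
print(payload)
```

A code-conversion module would walk such a tree and, for loop nodes it proves independent, emit a parallel equivalent instead of the sequential loop.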
Citations count: 4
Reference:
Drobotun E.B. —
Method for estimating the cost of the life cycle of systems for protecting against computer attacks
// Software systems and computational methods.
– 2018. – № 2.
– P. 17 - 27.
DOI: 10.7256/2454-0714.2018.2.23086 URL: https://en.nbpublish.com/library_read_article.php?id=23086
Abstract:
The article deals with the economic aspects of building systems of protection against computer attacks for information-computing and automated systems of various purposes. An objective assessment of the life-cycle cost of such protection systems is one of the most important factors determining the strategy for choosing a rational option for building them. The subject of the study is the economic aspects of choosing options for building systems of protection against computer attacks, as well as minimizing the financial costs of their creation and operation. The object is the system of protection against computer attacks. The methodology of this study is based on an integrated approach to assessing the life-cycle cost of protection systems as the costs reduced to the calculation year, including the share of the cost of the protection system itself, the costs of its implementation, the costs of operation over its period of use, and the costs of its disposal at the end of its service life. The scientific novelty of the work lies in a practical methodology that makes it possible to evaluate all the components of one-time and recurring costs included in the life-cycle cost of systems of protection against computer attacks. The proposed technique makes it possible to estimate the life-cycle cost of several alternative options for building a system of protection against computer attacks and to choose among the options acceptable in cost.
Citations count: 4
Reference:
Dushkin R. —
Intellectualization of the management of technical systems as part of a functional approach
// Software systems and computational methods.
– 2019. – № 2.
– P. 43 - 57.
DOI: 10.7256/2454-0714.2019.2.29192 URL: https://en.nbpublish.com/library_read_article.php?id=29192
Abstract:
The article discusses certain issues of intellectualizing the control of technical systems within a functional approach to building intelligent control systems for various objects and processes, based on systems engineering and complex engineering. Intellectualization of control makes it possible to obtain simultaneously the benefits of different paradigms for describing processes of various natures, and to exhibit emergent new properties of the general approach that increase the controllability and operating efficiency of technical control objects (and, in general, of practically arbitrary control objects of a technical nature). Applying the functional approach, combined with complex engineering, to the intelligent management of such objects as transport, buildings and energy makes it possible to raise their operation to a higher level of service availability, sustainability and environmental friendliness, and to support the comprehensive development not only of the control object itself but also of the hierarchy of its super-systems: municipality, region, state. The scientific novelty of the proposed approach lies in a new application of the mathematical apparatus of general set theory and category theory for organizing a distributed computing system in the field of intelligent buildings and the management of their internal environment. The article can become the basis for novelty of a higher order in the transition from a systemic to an integrated approach. Furthermore, a systematic approach is used together with a simplified cybernetic "streaming" scheme of the functioning of an intelligent building.
Citations count: 3
Reference:
Damdinova T.T., Nikiforova A.P., Prudova L.Y., Bubeev I.T. —
The use of digital image processing methods to determine the moisture-binding capacity of meat and fish products
// Software systems and computational methods.
– 2019. – № 3.
– P. 20 - 29.
DOI: 10.7256/2454-0714.2019.3.30646 URL: https://en.nbpublish.com/library_read_article.php?id=30646
Abstract:
The article presents the results of determining the moisture-binding capacity and plasticity of food products. Many indicators depend on the ability of meat and fish to bind moisture, including juiciness, tenderness, yield, loss during heat treatment, and appearance. The objects of research were Baikal omul, fresh and salted, and thawed beef. The moisture-binding capacity and plasticity of the objects of study were evaluated by the pressing method. The paper presents calculations performed both by the traditional method and by digital processing of color images. Digital image processing was performed with a program developed by the authors; the article provides the figures and tables obtained during image processing. The undoubted advantage of the processing program over the traditional method is a significant reduction in image-processing time and the ability to process a large amount of data quickly. Provided the necessary shooting conditions are created, the digital image processing method for determining the moisture-binding capacity and plasticity of food products can be successfully used in laboratory research for determining the quality of meat and fish products.
Citations count: 3
Reference:
Tikhanychev O.V. —
User interfaces in automated systems: development issues
// Software systems and computational methods.
– 2019. – № 2.
– P. 11 - 22.
DOI: 10.7256/2454-0714.2019.2.28443 URL: https://en.nbpublish.com/library_read_article.php?id=28443
Abstract:
The subject of research is the software development process for automated control systems. The object of study is the development of user interfaces for control-automation software. A generally recognized, promising direction for increasing the efficiency of organizational and technical systems is the automation of their management, which increases the efficiency and validity of the decisions made. The software of any automated system has a significant impact on its effectiveness, and this applies primarily to application software. Application development entails certain difficulties, including those associated with creating user interfaces. Analysis of development practice reveals a number of problems in this area, arising because the problem sits at the junction of several scientific disciplines: control theory, ergonomics, technical aesthetics and psychology. The study uses the general scientific methods of analysis and synthesis. The author analyzes the factors affecting the efficiency of user interface development and, on the basis of this analysis, synthesizes proposals for solving the problem through standardization, unification and prototyping tools. The analysis showed that, under the conditions of developing application software for automated control systems, the last of the listed approaches, namely the use of specialized prototyping systems, is the most effective. It is proposed to amend the normative documentation governing the development of automated control systems so that the prototyping of user interfaces becomes a mandatory stage in the process of creating such systems.
Citations count: 3
Reference:
Tikhanychev O.V. —
On the quality indicators of automated control systems software
// Software systems and computational methods.
– 2020. – № 2.
– P. 22 - 36.
DOI: 10.7256/2454-0714.2020.2.28814 URL: https://en.nbpublish.com/library_read_article.php?id=28814
Abstract:
The subject of the research is the process of developing software for automated control systems. The object of the research is the quality control system of this process. Regulatory documents establish a list of the main characteristics for assessing program quality which, as practice has shown, does not fully serve its purpose: it provides not quality control but verification that programs comply with the customer's requirements as formulated in the terms of reference. One reason for this lies in the impossibility of using exclusively quantitative indicators to evaluate the quality of systems that include both technical means and a human. An attempt to draw on world practice, for example the relatively successful quality models of the ISO/IEC 25000:2014 series of standards, has not yet been implemented: the model itself is permitted by regulatory documents (GOST R ISO/IEC 25010-2015), but the quality indicators described in it have not been adopted. Piecemeal improvements to existing methods do not solve the problem systematically. The article uses the general scientific methods of analysis and synthesis. Based on an analysis of existing approaches to assessing software development quality, proposals for improving this process are synthesized. The article formulates a scientific and practical problem and offers an approach to its solution, based on refining existing quality-assessment methods around the model described in GOST R ISO/IEC 25010, taking into account the real needs of users, interpreted as reducing the likelihood of errors of the first and second kind arising from the use of the software. Solving the formulated problem will increase the overall efficiency of automated control through the use of quantitative and qualitative assessments of the software being developed.
Citations count: 2
Reference:
Lobanov A.A., Mordvinov V.A., Murakov M.V., Raev V.K. —
Construction of a model of a multifunctional airborne guidance and landing system for a spacecraft
// Software systems and computational methods.
– 2018. – № 2.
– P. 36 - 50.
DOI: 10.7256/2454-0714.2018.2.26217 URL: https://en.nbpublish.com/library_read_article.php?id=26217
Abstract:
The paper formulates the basic requirements for the on-board complex of a spacecraft intended for guidance and landing on small bodies of the solar system. The main tasks of descent and landing vehicles are braking and approaching the surface of the celestial body, landing, operating on the surface and, possibly, taking off from it to deliver the return vehicle to Earth. To meet the high requirements for the accuracy and reliability of the on-board guidance and landing system, a topical solution is proposed. Using the traditional approach to the modeling of processes and systems, a functional model of the on-board guidance and landing system was created in IDEF0 notation. In the course of building the functional model, the main processes performed by the complex during descent from orbit and landing are described. As a result of the work, a description of the procedures performed by the multifunctional on-board guidance and landing system of the spacecraft has been obtained. An applied functional model of the "to-be" level was constructed, based on an integrated approach. The proposed integrated approach is focused on the sharing of data from all on-board devices, both primary and backup sources of information. This approach makes it possible to increase the accuracy and reliability of the landing procedure.
Citations count: 2
Reference:
Reshetnikova E.S., Savelyeva I.A., Svistunova E.A. —
Geometric modeling and development of custom libraries in the design of engineering facilities
// Software systems and computational methods.
– 2020. – № 1.
– P. 1 - 7.
DOI: 10.7256/2454-0714.2020.1.32292 URL: https://en.nbpublish.com/library_read_article.php?id=32292
Abstract:
The subject of research is the process of designing a conveyor belt. The authors consider parameterization in the geometric modeling of equipment parts and assemblies, together with the creation of custom libraries in Compass 3D, as a means of reducing the labor intensity and improving the quality of the design process. The preliminary design is the stage of developing design documentation that aims to determine the fundamental design solutions giving a general idea of the device, its operating principles and the dimensions of the product. It is advisable to develop a preliminary design before the stage of developing a technical project and creating design documentation. Today, modern computer-aided design (CAD) systems are used at all stages of work on a project; they not only accelerate the design process but also make it possible to demonstrate the finished project to the customer at the stage of making technical decisions. This allows timely changes in accordance with the customer's requirements and high-quality preparation of the project for implementation. The scope of and time for further stages of work depend on when the preliminary design is presented; therefore, parameterized three-dimensional modeling in CAD is an effective way to design engineering objects. Parameterization of 3D models makes it possible to obtain a set of typical product designs from a single model by changing the values of its variables, which significantly reduces the time spent on the project.
Citations count: 2
Reference:
Guzii A.G., Kukushkin Y.A., Lushkin A.M. —
Computer technology of pilot functional reliability prognostic evaluation
// Software systems and computational methods.
– 2018. – № 2.
– P. 84 - 93.
DOI: 10.7256/2454-0714.2018.2.22425 URL: https://en.nbpublish.com/library_read_article.php?id=22425
Abstract:
The subject of research is mathematical software for the prognostic evaluation of a pilot's functional reliability. The object of research is the functional reliability of the pilot's professional activity. The authors consider in detail the automated assessment of the risk of an aviation event caused by flight parameters exceeding operational limitations, where risk assessment is understood as a probabilistic measure of the occurrence of an aviation event of a fixed degree of severity due to exceeding the operational limitations of the aircraft; in flight, such an event (depending on the severity of its consequences) is classified as an aviation event subject to investigation. The research methodology is based on the systems approach and combines methods of probability theory, mathematical statistics, aviation cybernetics, and the psychophysiology of flight work. The main result of the study is a software-implemented technology for predicting the functional reliability of the pilot, which makes it possible to carry out an individual a priori risk assessment of an aviation event (incident) for the "crew" group of causative factors at the most critical stages of flight (takeoff and landing), using the accumulated statistics of aviation events caused by flight parameters exceeding operational limitations. This is important for proactive management of the safety level in an airline. The novelty of the research is that the technology for predictive estimation of the pilot's functional reliability is developed on the basis of the concept of acceptable accident risk.
Citations count: 2
Reference:
Zakharov A.A., Tuzhilkin A.Y. —
Segmentation of satellite images based on super pixels and sections on graphs
// Software systems and computational methods.
– 2018. – № 1.
– P. 7 - 17.
DOI: 10.7256/2454-0714.2018.1.25629 URL: https://en.nbpublish.com/library_read_article.php?id=25629
Abstract:
The study is devoted to algorithms for the segmentation of satellite images for various technical-vision systems. For image segmentation the authors use cuts on graphs. Preliminary segmentation is performed on the basis of a minimum spanning tree to improve performance. When describing the properties of superpixels, information about the height and color of the regions is taken into account. The height of the regions is calculated from stereo images; the color of segments is calculated on the basis of color invariants. Each superpixel is assigned, according to its characteristics, to areas of buildings, grass cover, trees and shrubs, shaded areas, etc. The image is represented as an undirected weighted graph whose nodes are segments of the image. The weights of the vertices of the graph are numbers that determine membership of a certain class. To divide regions into clusters, the method of cuts on graphs is used. The novelty of the study is an algorithm for segmenting satellite imagery based on superpixels and graphs. The segmentation time of the developed algorithm is several times lower than that of the plain graph-cut method. The developed algorithm is used to extract buildings from images. The developed algorithm is compared with existing approaches to building extraction, and its advantages are shown. The authors give examples of the operation of the algorithm and describe the results of the research.
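The preliminary-segmentation step, grouping nodes via a minimum spanning tree, can be sketched in miniature. This toy example is an illustration of MST-based grouping under invented weights, not the authors' algorithm, which also uses height from stereo pairs and color invariants:

```python
# Toy graph-based segmentation: run Kruskal's MST construction but skip
# edges heavier than a threshold; the surviving connected components are
# the "segments". Nodes 0..5 form two tight clusters {0,1,2} and {3,4,5}
# joined by one expensive edge.
edges = [
    (1, 0, 1), (1, 1, 2), (1, 3, 4), (1, 4, 5),
    (2, 0, 2), (2, 3, 5),
    (9, 2, 3),  # weak (dissimilar) link between the two clusters
]

parent = list(range(6))  # union-find forest

def find(x):
    # Find the component root, with path compression.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

THRESHOLD = 5  # cut edges heavier than this

for w, u, v in sorted(edges):  # Kruskal order: lightest edges first
    if w <= THRESHOLD:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv  # merge the two components

segments = {}
for node in range(6):
    segments.setdefault(find(node), []).append(node)

print(sorted(sorted(s) for s in segments.values()))  # [[0, 1, 2], [3, 4, 5]]
```

In the full algorithm the edge weights would come from superpixel similarity (color and height), and the resulting components would then be classified into buildings, vegetation, shadow, and so on.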
Citations count: 2
Reference:
Ignatenko A.M., Makarova I.L., Kopyrin A.S. —
Methods for preparing data for the analysis of poorly structured time series
// Software systems and computational methods.
– 2019. – № 4.
– P. 87 - 94.
DOI: 10.7256/2454-0714.2019.4.31797 URL: https://en.nbpublish.com/library_read_article.php?id=31797
Abstract:
The aim of the study is the preparation of poorly structured source data for analysis, the analysis itself, and the study of the influence of data "pollution" on the results of regression analysis. The task of structuring data and preparing them for qualitative analysis is unique to each specific set of source data and cannot be solved by a general algorithm; it will always have its own peculiarities. The problems that may cause difficulties when working with poorly structured data (analysis, processing, search) are considered. Examples are given of poorly structured data and of the structured data used in preparing data for analysis. Algorithms for preparing weakly structured data for analysis are considered and described. Cleaning and analysis procedures were carried out on the data set, and four regression models were constructed and compared. As a result, the following conclusions were formulated. Excluding various kinds of suspicious observations from the analysis can drastically reduce the size of the population and lead to an unreasonable decrease in variation; at the same time, such an approach is completely unacceptable if, as a result, important objects of observation are excluded from the analysis and the integrity of the population is violated. The quality of the constructed model may deteriorate in the presence of abnormal values, but may also improve because of them.
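The conclusion about abnormal values can be seen in a tiny ordinary-least-squares example (the numbers are illustrative, not the article's data): a single gross error noticeably shifts the fitted slope.

```python
def ols_slope(xs, ys):
    # Ordinary least-squares slope: Sxy / Sxx.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Clean observations lying exactly on y = 2x.
x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]
slope_clean = ols_slope(x, y)

# The same data "polluted" by one gross error (the true value would be ~12).
x_dirty = x + [6]
y_dirty = y + [60]
slope_dirty = ols_slope(x_dirty, y_dirty)

print(slope_clean)  # 2.0
print(slope_dirty)  # about 8.86: one outlier drags the slope far off
```

Whether to drop that sixth observation is exactly the judgment call the abstract describes: removal restores the slope, but only if the point really is an error rather than a rare, important observation.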
Citations count: 2
Reference:
Mikheev I.V., Vishtak O.V., Kondratov D.V. —
System of quantitative characteristics of software quality assessment
// Software systems and computational methods.
– 2018. – № 2.
– P. 28 - 35.
DOI: 10.7256/2454-0714.2018.2.25981 URL: https://en.nbpublish.com/library_read_article.php?id=25981
Abstract:
The subject of the study is the process of teaching programming. Information technologies are closely integrated with various spheres of human activity. Their peculiarities include the rapid pace of technological development in this field and the need to modernize information and technical means already in operation. Existing standards and other normative documents apply only to real-world products, which the programs developed by students in the learning process are not; such documents and standards therefore cannot be used directly to assess students' level of knowledge. Before applying them, it is necessary to analyze the most significant of them and adapt them to the specifics of the learning process. In the course of the research, quantitative characteristics, i.e. metrics, were singled out that reveal the student's real command of the technologies. Using the obtained quantitative characteristics as an integral system, teachers can obtain an objective assessment of the program developed by a student, and such an assessment fully meets the requirements of a graded rating of students' performance. As a result of the analysis of this area of research, the metrics of the "metric characteristics" group were singled out and described: the program exit code, the total program execution time, the maximum amount of physical memory used, the maximum number of memory pages used, the maximum amount of virtual memory used, and the processor time used (total processor time). These can serve as the basis for a software product that tests student programs using a dynamic approach.
Citations count: 2
Reference:
Dolzhenko A.I., Shpolyanskaya I.Y. —
Fuzzy model and algorithm for assessing the quality of web services integrated into the service oriented architecture of an information system.
// Software systems and computational methods.
– 2017. – № 2.
– P. 22 - 31.
DOI: 10.7256/2454-0714.2017.2.23098 URL: https://en.nbpublish.com/library_read_article.php?id=23098
Abstract:
The selection of appropriate web services is an important step in developing a service-oriented architecture (SOA) for an information system. When selecting a web service, quality-of-service (QoS) criteria are commonly used; they characterize the non-functional properties of candidate web services. However, the functional characteristics of a service, which are difficult to quantify, and the factors affecting the effectiveness of life-cycle management of composite architectures are also important. The article discusses a comprehensive approach to the evaluation and selection of web services in SOA development on the basis of factors related to non-functional and functional requirements, taking into account the economic efficiency of implementing and using service-oriented applications. The choice of service is performed via fuzzy models for an integrated assessment of web-service quality. The developed model and algorithm for evaluating web-service quality, and their software implementation, make it possible to implement decision-support procedures for substantiating and choosing an appropriate structure for an SOA-based information system under incomplete information.
Citations count: 2
Reference:
Gibadullin R.F. —
Thread-safe Control Calls in Enriched Client Applications
// Software systems and computational methods.
– 2022. – № 4.
– P. 1 - 19.
DOI: 10.7256/2454-0714.2022.4.39029 EDN: IAXOMA URL: https://en.nbpublish.com/library_read_article.php?id=39029
Abstract:
When the first version of the .NET Framework was released, a pattern emerged in rich client applications centered on message-processing loops, where a built-in queue was used to pass units of execution from worker threads. A generalized solution, ISynchronizeInvoke, was then developed, in which a source thread could queue a delegate to a destination thread and, optionally, wait for that delegate to complete. After asynchronous page support was introduced into the ASP.NET architecture, the ISynchronizeInvoke pattern no longer worked, because asynchronous ASP.NET pages are not tied to a single thread. This motivated an even more general solution, SynchronizationContext, which is the subject of the research. Using practical examples, the article shows how to update UI elements from worker threads without breaking the thread safety of the user application. The proposed solutions are: using the BeginInvoke or Invoke methods to place a delegate into the UI thread's message queue; capturing the UI thread's synchronization context via the Current property of the SynchronizationContext class; and using the deprecated BackgroundWorker class, which captures the UI thread's synchronization context implicitly. The peculiarities of implementing the abstract SynchronizationContext class on the ASP.NET platform are also noted. Practical recommendations on the use of the marshalling mechanism are formulated using the example of developing a multi-client chat with a centralized server.
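The marshalling idea (worker threads never touch UI state directly; they post delegates to a queue that only the UI thread drains) can be sketched outside .NET. The queue-and-loop below is a Python analogy to SynchronizationContext.Post, not the .NET API itself:

```python
import queue
import threading

# Delegates posted here are executed only by the "UI" thread, mimicking
# how SynchronizationContext.Post marshals work onto the UI message loop.
ui_queue: "queue.Queue" = queue.Queue()
ui_state = []  # stands in for a UI control; only the UI thread may mutate it

def post(callback):
    """Marshal a callable onto the UI thread (analogue of Post)."""
    ui_queue.put(callback)

def worker(n):
    # Compute on the worker thread, then marshal only the UI update.
    result = n * n
    post(lambda: ui_state.append(result))

threads = [threading.Thread(target=worker, args=(n,)) for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# "UI thread" message loop: drain the queue and invoke each delegate.
while not ui_queue.empty():
    ui_queue.get()()

print(sorted(ui_state))  # [0, 1, 4, 9]
```

Because every mutation of ui_state happens on one thread, no lock around the "UI" state is needed; this is precisely the property the marshalling pattern buys.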
Citations count: 2
Reference:
Tikhanychev O.V. —
Agile technologies in software development of decision support systems
// Software systems and computational methods.
– 2018. – № 2.
– P. 51 - 59.
DOI: 10.7256/2454-0714.2018.2.23743 URL: https://en.nbpublish.com/library_read_article.php?id=23743
Abstract:
The subject of the study is the process of developing software for automated control systems; the object of research is methodologies for organizing software development. Automation of management, first of all automation organized on the principle of decision support systems, is a universally recognized, promising way to increase the efficiency of complex technical systems. The basis of the effectiveness of any automated system is its software, and this applies primarily to application software. The development of such programs entails certain difficulties, primarily organizational ones. A generalized analysis has shown that world practice offers a rather wide range of methods for organizing program development. With respect to the algorithms used, these methods can be divided into two large groups: "rigid" and "flexible". Each approach is effective under certain working conditions. The article analyzes the factors influencing the effectiveness of a particular methodology and synthesizes proposals on the appropriateness of using different methodologies under different conditions of the development process. The analysis showed that for the development of automated decision support systems the use of "flexible" (agile) approaches is most effective. The article considers the features of Scrum as a typical implementation of "flexible" methodologies, and formulates conclusions about the expediency of applying it in the development of application software for automated decision support systems.
Citations count: 2
Reference:
Glushenko S.A. —
Analysis of software for implementing fuzzy expert systems
// Software systems and computational methods.
– 2017. – № 4.
– P. 77 - 88.
DOI: 10.7256/2454-0714.2017.4.24251 URL: https://en.nbpublish.com/library_read_article.php?id=24251
Abstract:
The research focuses on enterprises and organizations in various industries engaged in project-oriented business. The subject of the study is the decision-making processes present in the implementation of various projects. The effectiveness of decisions can be increased through the use of expert systems; such expert systems should be based on modern methods of processing information under uncertainty. The author suggests using expert systems based on the methods and models of fuzzy logic. Particular attention is paid to the functional requirements that a fuzzy expert system must meet. The author examines in detail the existing software for implementing fuzzy expert systems and, to identify the software that best meets the requirements, applies Professor G.N. Khubaev's method of analyzing complex systems by the criterion of functional completeness. The analysis established that existing software solutions fall short of the functional requirements in many respects, so the development of a new, effective tool is a topical task. The analysis also made it possible to identify software tools with similar sets of functions, and to estimate the degree of similarity of the systems and their degree of correspondence to a "reference" model of the information system that takes the user's requirements into account.
Citations count: 2
Reference:
Kiryanov D.A. —
Hybrid categorical expert system for use in content aggregation
// Software systems and computational methods.
– 2021. – № 4.
– P. 1 - 22.
DOI: 10.7256/2454-0714.2021.4.37019 URL: https://en.nbpublish.com/library_read_article.php?id=37019
Read the article
Abstract:
The subject of this research is the development of the architecture of an expert system for distributed content aggregation system, the main purpose of which is the categorization of aggregated data.
The author examines the advantages and disadvantages of expert systems, the toolset for developing them, the classification of expert systems, and the application of expert systems to data categorization. Special attention is given to the architecture of the proposed expert system, which consists of a spam filter, a component that determines the main category for each type of processed content, and components that determine subcategories, one of which is based on domain rules while the other uses machine learning methods and complements the first.
The conclusion is made that an expert system can be effectively applied for the solution of the problems of categorization of data in the content aggregation systems. The author establishes that hybrid solutions, which combine an approach based on the use of knowledge base and rules with the implementation of neural networks allow reducing the cost of the expert system.
The novelty of this research lies in the proposed architecture of the system, which is easily extensible and adaptable to workloads by scaling existing modules or adding new ones.
Citations count: 1
Reference:
Shchemelinin D. —
Methods of creating a distributed computing system for a software infocommunication switch
// Software systems and computational methods.
– 2019. – № 1.
– P. 91 - 97.
DOI: 10.7256/2454-0714.2019.1.28782 URL: https://en.nbpublish.com/library_read_article.php?id=28782
Read the article
Abstract:
The subject of the research is new principles for building globally distributed computing systems for creating a software infocommunication switch that provides telephony, facsimile, short and multimedia messaging services over packet data channels, as well as storage of user files. The object of the research is the globally distributed computing system of RingCentral (USA), which has been built since the company's founding in 1999 and is still evolving technologically. The author reveals in detail the main criteria for testing the hardware environment of the implemented computer infrastructure during the transition to the distributed computing model. The creation of a globally distributed computing architecture for an infocommunication software switch was a decisive factor in the development of computer systems providing universal infocommunication services. The exchange of the obtained scientific knowledge at specialized conferences and seminars showed that the constructed model of infocommunication services is, from a technological point of view, the most modern of the proposed architectures.
Citations count: 1
Reference:
Dushkin R., Andronov M.G. —
An intelligent algorithm for creating control actions on engineering systems of intelligent buildings
// Software systems and computational methods.
– 2020. – № 2.
– P. 69 - 83.
DOI: 10.7256/2454-0714.2020.2.31041 URL: https://en.nbpublish.com/library_read_article.php?id=31041
Read the article
Abstract:
The article describes an algorithm for generating control actions on various engineering systems of an intelligent building, individually or in combination, within the framework of intelligent control of the parameters of the building's internal environment. The algorithm is intelligent in that it can operate autonomously and adapt to the monitored and controlled parameters of the internal environment. The article gives a brief description of the algorithm and a mathematical model for selecting and applying control actions. The research method is a set-theoretic approach to modeling control processes, with BPMN notation used to represent the algorithms. The novelty lies in the use of a functional approach to develop the intelligent algorithm, as well as the use of distributed computing and computing on terminal devices within the hybrid paradigm of artificial intelligence. The relevance of the presented model stems from the need to bring the life-cycle management of buildings and structures into the Industry 4.0 paradigm in order to increase their degree of intelligence. The article will be of interest to scientists and engineers working in the automation of technological and production processes. The present work is theoretical.
Citations count: 1
Reference:
Osipov M.Y. —
On the question of the specifics of the formulation and use of the Turing test for the ChatGPT
// Software systems and computational methods.
– 2023. – № 4.
– P. 1 - 16.
DOI: 10.7256/2454-0714.2023.4.68680 EDN: TCQVHG URL: https://en.nbpublish.com/library_read_article.php?id=68680
Read the article
Abstract:
The subject of the research is the features and regularities of the functioning of systems based on ChatGPT technologies, knowledge of which makes it possible to formulate appropriate modifications of the Turing test, as well as the features and regularities of formulating and using the Turing test for such systems. The purpose of the study is to identify these features and patterns. The research method was a social experiment: in the course of studying a ChatGPT-based system, certain questions were asked and answers received, the analysis of which allowed conclusions about the peculiarities of the "thinking" of such systems. The study found the following. Unlike human thinking, which is based on facts, the "thinking" of ChatGPT-based systems in some cases is not grounded in facts of reality; the user is often given deliberately false information about real facts and circumstances.
In contrast to human thinking, which is usually systemic in nature, the "thinking" of ChatGPT-based systems is disorderly and fragmentary. Such systems cannot admit their mistakes, and attempts to force them to critically reconsider their answers lead to malfunctions. The article also presents a Turing test developed by the author for ChatGPT, which made it possible to identify the features of the "thinking" of systems based on ChatGPT technologies.
Citations count: 1
Reference:
Yanchus V.E., Borevich E.V., Avdeeva A.A. —
The use of eye tracking technologies in studying the perception of graphic information
// Software systems and computational methods.
– 2021. – № 1.
– P. 53 - 62.
DOI: 10.7256/2454-0714.2021.1.33378 URL: https://en.nbpublish.com/library_read_article.php?id=33378
Read the article
Abstract:
Scientific research involving eye tracking technologies is currently of particular relevance. A substantial area of knowledge in which eye tracking can be applied is the study of the perception of visual information. One trend in such research is examining how the compositional construction of a visual image affects its perception by the viewer. The object of this research is the elements of graphic composition (stylistic and color solutions); the subject is the methods and algorithms for digital processing of graphic material. The article employs computational methods of color correction, eye tracking technology, expert assessment, and statistical processing of the obtained experimental data. The scientific novelty of this work consists in developing a methodology for preparing and conducting computational experiments on how the stylistic solution of a graphic composition affects its perception by the viewer. The technique developed by the authors has proven itself in experiments studying the effect of the color solution of a frame on its perception. The study revealed the statistical significance of the impact of the stylistic solution of a graphic composition on the viewer's perception, and statistically confirmed the hypothesis that abstract graphic compositions are difficult for viewers to perceive. The results can be used in constructing complex graphic images for which speed of information perception is required.
Citations count: 1
Reference:
Trub I. —
Probabilistic model of hierarchical database indexes
// Software systems and computational methods.
– 2017. – № 4.
– P. 15 - 31.
DOI: 10.7256/2454-0714.2017.4.24437 URL: https://en.nbpublish.com/library_read_article.php?id=24437
Read the article
Abstract:
The subject of the study is the concept of hierarchical bitmap indexes proposed by the author: to improve the performance of queries with a time filter, indexes are maintained not only for values of the basic unit of time but also for arbitrary larger multiple units. The object of the study is the construction of an analytic probability model of such indexes for the particular case of an exponentially distributed random stream of record insertions into a database. The author focuses on calculating the discrete distribution of the number of indexes involved in processing a query. The methodology of the study comprises probability theory, combinatorial methods, measure theory, and computational experiment. In addition, it is shown that recent concepts from the theory of cellular automata, such as the Zaitsev neighborhood, can be used to study the features of the proposed model. The main results are as follows: an original, intuitive concept of building indexes is introduced; new, meaningful optimization problems for selecting a hierarchical index system are formulated; and a mathematical model is constructed and verified that allows one to estimate the efficiency of a chosen hierarchy of indexes. It is shown that in the limiting case the model naturally tends to a set of fractal nature, in particular one of the varieties of Cantor dust, for which a formula for its Hausdorff-Besicovitch dimension is derived in terms of the parameters of the initial problem.
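To make the idea concrete: if indexes are kept for aligned blocks of, say, hours, days, and weeks, the number of index entries a range query touches can be counted greedily. This is a minimal sketch of the concept under illustrative assumptions (the unit sizes and the greedy cover are not taken from the article's model):

```python
def indexes_used(a, b, units):
    """Count index entries needed to cover the half-open query range
    [a, b) when an index exists for every aligned block of each unit
    size. `units` are block sizes in base time units, each dividing the
    next (e.g. 1 hour, 24 = day, 168 = week). Greedy rule: at each
    position take the largest aligned block that still fits in [pos, b)."""
    count = 0
    pos = a
    while pos < b:
        step = 1
        for u in units:
            if pos % u == 0 and pos + u <= b:
                step = u
        count += 1
        pos += step
    return count

# Covering 50 hours starting at hour 20 with hour/day/week indexes:
print(indexes_used(20, 70, [1, 24, 168]))  # → 27
```

The hierarchy pays off on aligned ranges: the same query over a whole aligned day, `indexes_used(0, 24, [1, 24, 168])`, touches a single index entry instead of 24.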
Citations count: 1
Reference:
Borevich E.V., Meshcheriakov S.V., Shchemelinin D.A., Yanchus V.E. —
Methods and algorithms for the experimental study of graphical models of color solutions
// Software systems and computational methods.
– 2018. – № 4.
– P. 144 - 153.
DOI: 10.7256/2454-0714.2018.4.27695 URL: https://en.nbpublish.com/library_read_article.php?id=27695
Read the article
Abstract:
The article presents new methods for experimentally studying the influence of color on the viewer's perception of a movie frame, together with software algorithms for processing the obtained statistical data. The object of scientific research is filmed and edited film material, including individual graphic images of video frames. The subject of the research is graphic models of color solutions, digital color correction methods, and digital cinema and video processing algorithms. A specific feature of the computational experiment is that it is conducted online using Internet resources. The described method develops existing methods for the experimental study of human oculomotor activity using eye-tracking technology. The proposed methods and algorithms for experimentally studying how color solutions are perceived make it possible to objectively measure statistical data and analyze human visual activity according to various criteria. The development of an Internet application allowed many more subjects to be involved in the computational experiment and increased the reliability of the statistical results.
Citations count: 1
Reference:
Dubanov A.A. —
Modeling of trajectory of the pursuer in space using the method of constant-bearing approach
// Software systems and computational methods.
– 2021. – № 2.
– P. 1 - 10.
DOI: 10.7256/2454-0714.2021.2.36014 URL: https://en.nbpublish.com/library_read_article.php?id=36014
Read the article
Abstract:
This article examines a model of the pursuit task in which the pursuer, while moving in space, adheres to the strategy of constant-bearing approach. The speed moduli of the pursuer and the target are constant. For definiteness of the model, the target moves uniformly and in a straight line, since the test program is written from the materials of the article. The velocity vectors of the target and the pursuer at the beginning of the pursuit are directed arbitrarily. The iterative process consists of three parts: calculating the pursuer's trajectory in space, calculating the pursuer's trajectory in a plane, and calculating the transition of the trajectory from space to the plane. All parts of the iterative process must satisfy the conditions specified in the task; an important condition is that the minimum radius of curvature of the trajectory must not fall below a certain set value. The scientific novelty of the geometric model consists in the possibility of regulating the time of reaching the target by changing the length of the pursuer's trajectory and the orientation of the plane of pursuit. The next position of the pursuer in space is calculated as the intersection point of a sphere, a cone, and the plane of constant-bearing approach. The plane of constant-bearing approach is perpendicular to the plane of pursuit; in the model under review, the plane of pursuit is determined by the target's velocity vector and the line connecting the pursuer and the target (the sight line). The radius of the sphere equals the pursuer's step over one interval of the time discretization of the iterative process, and the opening angle of the cone is the angle by which the pursuer's velocity vector can turn. The mathematical model presented in the article may be of interest to developers of unmanned aerial vehicles.
Citations count: 1
Reference:
Bondarenko M.A., Drynkin V.N., Nabokov S.A., Pavlov Y.V. —
Adaptive algorithm for selecting informative channels in on-board multispectral video systems
// Software systems and computational methods.
– 2017. – № 1.
– P. 46 - 52.
DOI: 10.7256/2454-0714.2017.1.21952 URL: https://en.nbpublish.com/library_read_article.php?id=21952
Read the article
Abstract:
The article reviews the algorithmic support of multispectral video systems in the part concerned with automatic selection of the most informative channels: video sensors of different spectral ranges that form images of different informativeness depending on weather conditions. Such systems can increase the situational awareness of aircraft pilots or operators of unmanned aerial vehicles in difficult weather conditions. The use of these algorithms is considered using the example of airborne enhanced vision systems, where the pilot must be given a single combined image of the external situation because of the limited space of the information-control field of the cockpit and the increased demands of decision-making. The development and research were carried out using a base of video sequences obtained in real flight experiments with a prototype three-channel video system, as well as expert judgment on the consistency of the algorithms' results with visual perception. The study further develops algorithms for analyzing images in terms of their informativeness. The article proposes a new adaptive algorithm for automatically selecting individual video channels for subsequent fusion based on a no-reference informativeness metric. The proposed automatic ranking of video channels by informativeness proved consistent with the human operator's subjective perception of useful information.
Citations count: 1
Reference:
Lyapustin A., Kolesnikova S., Mel'nik D. —
Security Model Based on a Multi-Agent System
// Software systems and computational methods.
– 2018. – № 3.
– P. 81 - 90.
DOI: 10.7256/2454-0714.2018.3.26575 URL: https://en.nbpublish.com/library_read_article.php?id=26575
Read the article
Abstract:
The article is devoted to the urgent problem of ensuring the security of heterogeneous information platforms using a multi-agent threat detection system. The object of the study is a multi-agent platform. The authors pay special attention to such aspects as the security of multi-agent platforms, the management of threat detection agents, the interconnection between different threat detection agents, and vulnerabilities in multi-agent platforms. Trends in the development of new distributed security models are considered. The paper presents a multi-agent architecture for security services, as well as a general scenario for deploying security in multi-agent threat detection systems. The multi-agent approach yields a security model that keeps control with the data sender by exploiting the mobility and extensibility of agents. The resulting multi-agent protection model for heterogeneous information platforms offers an improved approach to online communications, providing flexible mechanisms for comprehensive protection that can satisfy various security requirements and give the sender better control over the data.
Citations count: 1
Reference:
Demichev M.S., Gaipov K.E. —
Search algorithm for loopless routes
// Software systems and computational methods.
– 2020. – № 4.
– P. 10 - 25.
DOI: 10.7256/2454-0714.2020.4.33605 URL: https://en.nbpublish.com/library_read_article.php?id=33605
Read the article
Abstract:
The subject of this research is an algorithm for finding loopless routes from the transmitter to the recipient of network traffic under a known network topology. In designing a data transmission network, one of the primary problems is organizing the routing of network traffic, because heavy traffic often causes bottlenecks in the form of overloaded communication nodes, which reduces the speed of data transmission. This article provides an algorithm that searches for loopless routes from transmitter to recipient; the result is the set of loopless routes for the specified network topology. The article also provides the program code of the algorithm, written in C#, and the results of test runs on specified topologies. The algorithm was developed by experimental and theoretical methods on the basis of existing route-search algorithms, such as Floyd's and Dijkstra's algorithms, and of static and dynamic routing mechanisms, such as RIP, OSPF, and EIGRP. The novelty of this work consists in the developed search algorithm for loopless routes under a given network topology and in the comparison of the obtained results with other methods of forming phase variables. The algorithm generates a list of all loopless routes between a pair of interacting nodes within the indicated network topology.
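The core of such an algorithm, enumerating every loopless (simple) route between two nodes, can be sketched with a depth-first search that never revisits a node already on the current path. This is a minimal Python illustration of the idea, not the authors' C# implementation, and the four-node topology is invented for the example:

```python
def all_loopless_routes(adj, src, dst):
    """Enumerate every loopless (simple) route from src to dst in a
    network given as an adjacency dict {node: [neighbors]}. The DFS
    skips any neighbor already on the current path, so no route can
    contain a loop."""
    routes = []
    def dfs(node, path):
        if node == dst:
            routes.append(path[:])
            return
        for nxt in adj.get(node, []):
            if nxt not in path:          # loop check
                path.append(nxt)
                dfs(nxt, path)
                path.pop()
    dfs(src, [src])
    return routes

# Four-node topology with two parallel branches:
adj = {'A': ['B', 'C'], 'B': ['D', 'C'], 'C': ['D'], 'D': []}
print(all_loopless_routes(adj, 'A', 'D'))
# → [['A', 'B', 'D'], ['A', 'B', 'C', 'D'], ['A', 'C', 'D']]
```

Unlike Dijkstra's algorithm, which returns one shortest route, this enumeration yields the full route set from which a routing scheme can then choose.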
Citations count: 1
Reference:
Martyshenko S.N. —
Automating the formation of databases based on the results of questionnaires
// Software systems and computational methods.
– 2017. – № 4.
– P. 7 - 14.
DOI: 10.7256/2454-0714.2017.4.22887 URL: https://en.nbpublish.com/library_read_article.php?id=22887
Read the article
Abstract:
The research is devoted to computer technologies for processing personal data. Particular attention is paid to the preparatory stage of data processing, which precedes the meaningful analysis of the data and largely determines the quality of the results of the whole work. The preparatory stage is very labor-intensive and time-consuming, so automating the work of the participants in information collection makes it possible to increase the efficiency of the entire system for analyzing personal data. The purpose of this study is to develop a software tool that combines data collected in various ways into a single database. The methodology of system analysis was used to examine various schemes for organizing the collection of information and transferring it to computer media, together with practical experience in using various Internet services for organizing online surveys. The computer technology is universal, supporting a wide range of schemes and methods of organizing questionnaires. Its advantage is ease of use and availability to the wide range of users who work in the Excel environment. The effectiveness of the developed software was confirmed in practical work on the socio-economic problems of the region.
Citations count: 1
Reference:
Bubeev I., Dubanov A.A., Ayusheev T.V., Motoshkin P.V. —
Building models of the movement of objects in the pursuit problem and solving it in the "MathCAD" computational mathematics system
// Software systems and computational methods.
– 2019. – № 1.
– P. 1 - 11.
DOI: 10.7256/2454-0714.2019.1.28454 URL: https://en.nbpublish.com/library_read_article.php?id=28454
Read the article
Abstract:
This article describes the developed models of the behavior of the objects in the pursuit problem: the pursuer and the pursued. The purpose of the research is the development of algorithms for autonomous robotic systems. In the proposed behavior models, local dynamic coordinate systems are introduced, formed by the direction of motion of the objects. At each time step, an object must decide in which direction to move based on an analysis of the coordinates of the other object. Programs implementing the proposed behavior models have been written in the computer mathematics system MathCAD and can be found on the author's website. Because an object moving in space cannot instantly change its direction of motion, "inertness" is modeled in our problems by limiting the angular velocity of rotation. The programs produce animated images of the movement of the objects; links to them are given in the text of the article.
Citations count: 1
Reference:
Buldaev A.A., Naykhanova L.V., Evdokimova I.S. —
Model of decision support system in educational process of a university on the basis of learning analytics
// Software systems and computational methods.
– 2020. – № 4.
– P. 42 - 52.
DOI: 10.7256/2454-0714.2020.4.34286 URL: https://en.nbpublish.com/library_read_article.php?id=34286
Read the article
Abstract:
In recent decades, the potential of analytics and data mining, the methodologies that extract valuable information from big data, has transformed many fields of scientific research; analytics has become a trend. In education these methodologies are called learning analytics (LA) and educational data mining (EDM). Recently, the use of learning analytics has proliferated due to four main factors: a significant increase in data volume, improved data formats, advances in computer science, and the greater sophistication of available analytical tools. This article describes the construction of a model of a decision support system (DSS) for a university based on educational data acquired from the digital information and educational environment. The subject of the research is the development of a DSS using learning analytics methods. The article provides a conceptual model of the decision-making system in the educational process, as well as a conceptual model of one of the DSS components, the forecasting subsystem. The peculiarity of the forecasting subsystem model is the application of learning analytics methods to university data sets that contain the output of the digital information and educational environment and include characteristics of student activity. The main results of the research are the examined and selected methods of clustering and classification (KNN), whose testing demonstrated acceptable results. Among the clustering methods examined, the k-prototypes method showed the best results. The conclusion is made about the favorable potential of applying learning analytics methods in Russian universities.
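The KNN classification mentioned among the selected methods is simple enough to sketch in a few lines. The toy student-activity features below (logins per week, average score) are illustrative assumptions, not the article's data set:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points (squared Euclidean distance). `train` is a list of
    (feature_tuple, label) pairs."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), label)
        for x, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical student records: (logins per week, avg score) -> outcome
train = [((10, 80), 'pass'), ((12, 75), 'pass'),
         ((2, 40), 'fail'), ((1, 35), 'fail'), ((9, 70), 'pass')]
print(knn_predict(train, (8, 72), k=3))  # → pass
```

In a real forecasting subsystem the features would be normalized first, since KNN's distance is sensitive to the scale of each attribute.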
Citations count: 1
Reference:
Gradov O.V., Aleksandrov P.L., Gradova M.A. —
Study of mineral samples relevant for desert locations using software correlation spectral analysis of scanning electron microscopy registers: from 2D Fourier spectra to online analysis of statistics of integral spatial characteristics
// Software systems and computational methods.
– 2019. – № 4.
– P. 125 - 171.
DOI: 10.7256/2454-0714.2019.4.31379 URL: https://en.nbpublish.com/library_read_article.php?id=31379
Read the article
Abstract:
The purpose of this article is to demonstrate the possibility of identifying minerals characteristic of desert regions under expeditionary conditions and in small laboratories that lack the means for energy-dispersive microanalysis (EDXMA) or wave-dispersive spectroscopy (WDXRS) mapping, by using individual points and regions of interest (ROI) on a sample as the identification technique within the correlation spectral image analysis software QAVIS. It is shown that this technique allows the identification of individual resource-valuable minerals using the integral frequency and integral spatial characteristics, as well as the Fourier spectrum itself and the correlogram between samples. Registration was performed on a JEOL JSM system digitized by P.L. Aleksandrov (IBCh RAS). The measurements were carried out using the QAVIS software developed at the Laboratory for the Analysis of Oceanological Information (developers A.A. Goncharova and V.K. Fischenko), Department of Information Technology, V.I. Il'ichev Pacific Oceanological Institute, FEB RAS. Efficient video processing is achieved by using FFTW, one of the fastest discrete Fourier transform libraries, and by the careful optimization of the QAVIS code by its authors (the program works with the computer's video memory, which allows it to process all frames of the video stream while it is being viewed on screen). Thus, it has been shown that software correlation-spectral analysis can serve, in work with desert-relevant minerals, not only for additive analysis of minerals by scanning microscopy methods, but also for distinguishing the submicrostructures of these formations and obtaining histograms of the statistical distributions of their descriptors.
Citations count: 1
Reference:
Borevich E.V. —
The Eye-Tracking Study of the Film Frame Composition Influence on the Visual Perception
// Software systems and computational methods.
– 2023. – № 1.
– P. 51 - 60.
DOI: 10.7256/2454-0714.2023.1.39634 EDN: IWYBNX URL: https://en.nbpublish.com/library_read_article.php?id=39634
Read the article
Abstract:
The research is aimed at studying the elements that affect the visual perception of a film frame in order to develop methodological recommendations for harmonizing the film frame. The object of research is the film frame; the subject is the technology of film frame processing. The purpose of this work is to obtain experimental data on the film-frame viewing pattern and to identify statistical regularities that confirm or refute the formulated hypothesis; the goal is an experimental study of the influence of composition on the parameters of the viewing pattern. The influence of the ratio of the areas of the centers of interest to the background on the parameters of the viewing pattern for the stimulus material is investigated. As a result, a methodology has been developed for conducting experimental studies of human perception of visual information using an eye-tracking software and hardware complex. The data on the object-size factor show that when objects occupy a small area of the frame, the observer needs more time to examine the frame, as is also the case when objects occupy most of the frame (more than 40%). In the first case, the small size of the objects makes it harder for the observer to find them in the frame; in the second, it takes time to identify the objects, since their large size makes them tend to be perceived as background.
Citations count: 1
Reference:
Panchuk K.L., Lyubchinov E.V. —
Cyclographic modeling of solutions of geometric optics problems on the plane
// Software systems and computational methods.
– 2018. – № 4.
– P. 134 - 143.
DOI: 10.7256/2454-0714.2018.4.25745 URL: https://en.nbpublish.com/library_read_article.php?id=25745
Read the article
Abstract:
The subject of the research is the optical transformations of pairs of basic geometric objects on a plane that model various sources of radiation. In the general case, solving problems of geometric optics on a plane involves transforming one beam of rays into another, for example converting the rays of a point source into a system of parallel rays. Such tasks require a relatively simple method that is based on the laws of geometrical optics and yields reflecting lines of the geometry corresponding to the given initial data. In this work, obtaining a reflecting line for different combinations of central, parallel, and scattered straight-line beam transformations is based on the cyclographic mapping method. The method rests on the optical properties of the cyclographic model of a spatial curve and makes it possible to obtain reflecting curves of various shapes under optical transformations of straight-line beams. Using this method in building a source-receiver system makes it possible to select a receiver (or source) from a variety of receivers (sources) sharing the same reflector line. The study showed that the cyclographic mapping method makes it relatively easy to obtain reflecting lines under optical transformations of various straight-line beams, while the analytical algorithm yields parametric equations of the reflecting line. The results of the work can be used in the design of various optical systems in the antenna, laser, and lighting engineering industries.
Citations count: 1
Reference:
Pekunov V.V. —
Refined calculation of droplet distributions in modeling atmospheric multiphase media
// Software systems and computational methods.
– 2019. – № 4.
– P. 95 - 104.
DOI: 10.7256/2454-0714.2019.4.30707 URL: https://en.nbpublish.com/library_read_article.php?id=30707
Abstract:
In this article the author considers the problem of increasing the accuracy of the search for adequate droplet distributions in the numerical simulation of multiphase media that include a droplet phase. The problem is especially relevant when calculating distributions with discontinuities, which occur during intercellular transfer of droplets that have their own velocity, as well as in the presence of sharp releases of droplets, for example of a technogenic nature. Tasks of this kind are often encountered when calculating the formation and spread of pollutants in the air, in particular when modeling acid rain. The problem of constructing distributions is considered using the methods of computational mathematics (interpolation theory), taking into account the physical laws of conservation of droplet mass and number. Elements of the method of moments (Hill's method) and the sectional approach to modeling the droplet phase are used. A new approach is proposed for modeling droplet distributions by piecewise spline interpolation over the density and concentration of droplet components, relying also on preliminary piecewise-linear distributions. The results were compared with data obtained by direct modeling of many droplets, as well as with data obtained using exclusively piecewise-linear distributions. The proposed approach is shown to be more accurate than the original method using only piecewise-linear distributions, and its calculation speed is rather high in comparison with the Lagrangian approach.
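The sectional construction described in the abstract can be illustrated with a minimal sketch (hypothetical grid and densities, not the author's implementation): a piecewise-linear droplet size distribution is refined onto a finer grid, and trapezoidal integration checks that droplet number is conserved exactly while the mass moment agrees to within quadrature error.

```python
import numpy as np

def trap(y, x):
    """Trapezoidal integral of sampled values y over grid x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Coarse droplet-radius grid (micrometres) and a hypothetical
# piecewise-linear number density n(r) with a sharp peak.
r_coarse = np.linspace(0.0, 100.0, 11)
n_coarse = np.array([0., 2., 8., 20., 9., 4., 2., 1., 0.5, 0.2, 0.])

# Refine onto a grid that contains the coarse nodes; linear
# interpolation then reproduces the distribution exactly.
r_fine = np.linspace(0.0, 100.0, 101)
n_fine = np.interp(r_fine, r_coarse, n_coarse)

# Zeroth moment: total droplet number (conserved exactly, since the
# fine grid contains every coarse node).
num_coarse = trap(n_coarse, r_coarse)
num_fine = trap(n_fine, r_fine)

# Third moment is proportional to total droplet mass; it agrees
# between grids to within trapezoidal quadrature error.
mass_coarse = trap(r_coarse**3 * n_coarse, r_coarse)
mass_fine = trap(r_fine**3 * n_fine, r_fine)

print(num_coarse, num_fine)
print(mass_coarse, mass_fine)
```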
Citations count: 1
Reference:
Simavoryan S.Z., Simonyan A.R., Popov G.A., Ulitina E.I. —
The procedure of intrusions detection in information security systems based on the use of neural networks
// Software systems and computational methods.
– 2020. – № 3.
– P. 1 - 9.
DOI: 10.7256/2454-0714.2020.3.33734 URL: https://en.nbpublish.com/library_read_article.php?id=33734
Abstract:
The subject of the research is the problem of identifying and countering intrusions (attacks) in information security systems (ISS) based on the system-conceptual approach developed within the framework of the RFBR-funded project No. 19-01-00383. The object of the research is neural networks and information security systems (ISS) of automated data processing systems (ADPS). The authors proceed from the basic conceptual requirements for intrusion detection systems: adaptability, learnability and manageability. The developed intrusion detection procedure considers both internal and external threats. It consists of two subsystems: a subsystem for detecting possible intrusions, which includes subsystems for predicting, controlling and managing access and for analyzing and detecting the recurrence of intrusions, and a subsystem for countering intrusions, which includes subsystems for blocking/destroying protected resources, assessing losses associated with intrusions, and eliminating the consequences of an intrusion. Methodological studies on the development of intrusion detection procedures are carried out using artificial intelligence methods, system analysis, and the theory of neural systems in the field of information security. Research in this work is carried out on the basis of the achievements of the system-conceptual approach to information security in ADPS. The main result obtained in this work is a block diagram (algorithm) of an adaptive intrusion detection procedure, which contains protection means and mechanisms built by analogy with the neural systems used in security systems. The developed general structure of the intrusion detection and counteraction system makes it possible to systematically interconnect the subsystems for detecting possible intrusions and countering intrusions at the conceptual level.
Citations count: 1
Reference:
Korotin A.S., Popov E.V. —
Processing of digital terrain models for improving the reliability of the analysis of water basin morphometry
// Software systems and computational methods.
– 2018. – № 2.
– P. 67 - 83.
DOI: 10.7256/2454-0714.2018.2.26383 URL: https://en.nbpublish.com/library_read_article.php?id=26383
Abstract:
The object of the study is open digital terrain models available on the Internet. The subject of the study is the procedure for eliminating errors in digital terrain models, aimed at increasing the reliability of calculations of the morphometric characteristics of water-body basins. The article is devoted to improving methods for processing and modifying geoinformation features by processing digital elevation models. The approaches outlined in the paper are aimed at increasing the reliability of calculating the main morphometric characteristics of the relief by eliminating errors in the original data. Cartographic works are usually used for morphometric analysis, and the relief forms defined from them contain subjective errors. These errors can then affect the results of the analysis, since the quantitative characteristics depend on where and how the boundary of a form passes. Given that the initial open data contain a number of elevation distortions that make them unsuitable for qualitative morphometric analysis at the level of individual watersheds, the paper considers ways to adjust their geometric characteristics by eliminating the influence of tree vegetation while preserving the relief structure using Lagrange coefficients. The results of a particular morphometric analysis obtained with corrected relief models are compared with results obtained from other data. To increase reliability and automate the processing of a digital model of a river-basin relief represented as a regular grid, the analysis should be carried out for individual watersheds with an area of 0.6 to 0.8% of the total area of the basin.
Citations count: 1
Reference:
Sharipov R.R., Yusupov B.Z. —
The research of electrical parameters of threshold detectors
// Software systems and computational methods.
– 2023. – № 3.
– P. 29 - 47.
DOI: 10.7256/2454-0714.2023.3.43682 EDN: ZSVLGS URL: https://en.nbpublish.com/library_read_article.php?id=43682
Abstract:
This research work provides an in-depth analysis of the fire alarm system, considered as a security tool for a variety of facilities, from industrial buildings to residential premises. Two key subsystems serve as the basis for the study: the fire alarm system, which is designed to detect and report the occurrence of fire, and the intrusion alarm system, whose task is to detect attempts at illegal intrusion. For each of these subsystems, their functions and components are investigated, and their mechanism of action, principles of operation, and possible implementation options are described, depending on the specific conditions and security requirements. Emphasis is placed on three types of automatic fire alarm systems: threshold, addressable and addressable-analog, each of which has its own features, advantages and disadvantages. The article goes beyond theoretical analysis and presents the results of a practical study of the three main types of fire alarm systems: wired, wireless and addressable. The study is based on a specially designed training stand that makes it possible to simulate the operation of fire alarm systems in conditions as close to reality as possible. The article demonstrates the connection schemes of the detectors and explains their states in the "normal" and "alarm" modes. The currents and voltages of the alarm loops were measured in the different modes; diagrams of the dependencies of these parameters are presented, and the alarm threshold levels are measured. The research presented in the paper makes a valuable contribution to the study and optimization of security and fire alarm systems, providing meaningful data for the development and testing of these systems. The paper can be useful for fire and security professionals and those interested in improving the performance of these systems.
Citations count: 1
Reference:
Pozolotin V.E., Sultanova E.A. —
Application of data transformation algorithms in time series analysis for elimination of outliers
// Software systems and computational methods.
– 2019. – № 2.
– P. 33 - 42.
DOI: 10.7256/2454-0714.2019.2.28279 URL: https://en.nbpublish.com/library_read_article.php?id=28279
Abstract:
The subject of the research is data-conversion algorithms for eliminating outliers in time series. The author considers data-conversion algorithms based on the arithmetic mean and the median, as well as combined smoothing methods such as 4253H and 3RSSH. The author considers such aspects of the topic as the change in the statistical characteristics of a time series when transformations are applied, and also pays attention to the visual presentation of data and the change in the behavior of a series when outliers are introduced into it. Both theoretical and empirical research methods were used: related work and software systems addressing these issues were studied, and a series of experiments was conducted. Computational experiments on smoothing time series were carried out both without outliers and with outliers introduced. The results of time-series processing are compared. A software tool is proposed that allows the use of various smoothing filters. The software tool has been tested on input data with various characteristics.
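The median-based smoothing discussed above can be sketched in a few lines (a simplified stand-in, not the author's software tool): a running median suppresses an isolated outlier that a moving average would smear, and a Hanning pass with weights 1/4, 1/2, 1/4, the final step of compound smoothers such as 4253H, softens what remains.

```python
import statistics

def running_median(x, w=3):
    """Running median of window w; endpoints are kept as-is."""
    half = w // 2
    y = list(x)
    for i in range(half, len(x) - half):
        y[i] = statistics.median(x[i - half:i + half + 1])
    return y

def hanning(x):
    """Weighted moving average with weights 1/4, 1/2, 1/4."""
    y = list(x)
    for i in range(1, len(x) - 1):
        y[i] = 0.25 * x[i - 1] + 0.5 * x[i] + 0.25 * x[i + 1]
    return y

series = [1.0, 1.2, 0.9, 9.0, 1.1, 1.0, 1.3, 1.1]  # 9.0 is an outlier
smooth = hanning(running_median(series))
print(smooth)  # the spike at 9.0 is gone
```

A mean-based filter of the same width would instead spread the outlier over its neighbours, which is exactly why median steps come first in 4253H-style smoothers.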
Citations count: 1
Reference:
Kopyrin A.S., Makarova I.L. —
Algorithm for preprocessing and unification of time series based on machine learning for data structuring
// Software systems and computational methods.
– 2020. – № 3.
– P. 40 - 50.
DOI: 10.7256/2454-0714.2020.3.33958 URL: https://en.nbpublish.com/library_read_article.php?id=33958
Abstract:
The subject of the research is the process of collecting and preliminarily preparing data from heterogeneous sources. Economic information is heterogeneous and semi-structured or unstructured in nature. Due to the heterogeneity of the primary documents, as well as the human factor, the initial statistical data may contain a large amount of noise, as well as records whose automatic processing may be very difficult. This makes preprocessing of dynamic input data an important precondition for discovering meaningful patterns and domain knowledge, and makes the research topic relevant. Data preprocessing is a series of unique tasks that have led to the emergence of various algorithms and heuristic methods for solving preprocessing tasks such as merging and cleanup and the identification of variables. In this work, a preprocessing algorithm is formulated that makes it possible to bring together into a single database, and structure, time-series information from different sources. The key modification of the preprocessing method proposed by the authors is the technology of automated data integration. The proposed technology involves the combined use of methods for constructing fuzzy time series and machine lexical comparison on a thesaurus network, as well as the use of a universal database built using the MIVAR concept. The preprocessing algorithm forms a single data model with the ability to transform the periodicity and semantics of the data set and to integrate data that can come from various sources into a single information bank.
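The periodicity-transformation step can be illustrated with a small stdlib-only sketch (hypothetical sources and field layout; the authors' algorithm additionally performs lexical thesaurus matching and fuzzy-series construction): daily observations from two sources are brought to a common monthly grain and merged into one table.

```python
from collections import defaultdict
from datetime import date

def to_monthly_mean(records):
    """Aggregate (date, value) pairs to {(year, month): mean value}."""
    sums, counts = defaultdict(float), defaultdict(int)
    for d, v in records:
        key = (d.year, d.month)
        sums[key] += v
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Two hypothetical sources with different observation dates.
source_a = [(date(2020, 1, 1), 10.0), (date(2020, 1, 15), 14.0),
            (date(2020, 2, 3), 20.0)]
source_b = [(date(2020, 1, 20), 100.0), (date(2020, 2, 10), 120.0)]

a, b = to_monthly_mean(source_a), to_monthly_mean(source_b)
merged = {k: (a.get(k), b.get(k)) for k in sorted(set(a) | set(b))}
print(merged)  # {(2020, 1): (12.0, 100.0), (2020, 2): (20.0, 120.0)}
```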
Citations count: 1
Reference:
Dagaev D.V. —
Restrictive language semantics in the Multioberon system
// Software systems and computational methods.
– 2023. – № 1.
– P. 26 - 41.
DOI: 10.7256/2454-0714.2023.1.36217 EDN: IWIODR URL: https://en.nbpublish.com/library_read_article.php?id=36217
Abstract:
Oberon-based languages and systems demonstrate in their implementation a minimalist approach to achieving reliability, significantly different from most software systems, which seek to maximize the number of supported functions. The requirements for critical Category A systems for nuclear power plants prohibit even more programming practices. To meet the Category A requirement of a stable number of iterations, the use of conditional loop operators is prohibited. To ensure ergodicity, dynamic memory and recursion are prohibited. A buffer-overflow vulnerability is closed by prohibiting the system operations module SYSTEM. Restrictions can also be set to address the fragile base class problem, type-change operations, and the use of nested procedures. It is noted that the transition to the Oberon-07 dialect mainly concerned additional restrictions and fits well into the framework of restrictive semantics. Instead of separate languages and dialects for each set of requirements, the author proposes the approach of restrictive semantics, in which one language with a system of restrictions is used. A single RESTRICT statement has been introduced into the language as a declaration of the restrictions on a given module. The Multioberon compiler is implemented with one frontend, including the system of restrictions, and several replaceable backends. The syntactic analysis of the compiler is demonstrated with examples. A strategy for scaling the compiler depending on the system requirements is shown. The novelty of the restrictive semantics approach is the achievement of the set of minimum necessary properties that meet the requirements for the system. The "from limitations" approach is an advantage for system developers, because it declares the really necessary properties of the system, linked to the requirements.
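The idea of per-module restriction declarations can be mimicked outside Oberon; the toy sketch below uses Python's ast module (an analogy only, not Multioberon's RESTRICT syntax) to reject source code containing a conditional loop, echoing the Category A ban described above.

```python
import ast

def check_restrictions(source, forbid=("While",)):
    """Return a list of (node name, line) violations found in source."""
    tree = ast.parse(source)
    return [(type(n).__name__, n.lineno)
            for n in ast.walk(tree)
            if type(n).__name__ in forbid]

# A bounded for-loop passes; a conditional while-loop is flagged.
safe = "for i in range(10):\n    total = i"
risky = "while flag:\n    poll()"

print(check_restrictions(safe))    # []
print(check_restrictions(risky))   # [('While', 1)]
```

The same walk could flag other forbidden constructs (recursion, nested function definitions) by extending the `forbid` set, which mirrors how one frontend can serve several restriction profiles.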
Citations count: 1
Reference:
Lyapustin A. —
Security of the Multi-Agent Platform
// Software systems and computational methods.
– 2017. – № 3.
– P. 16 - 24.
DOI: 10.7256/2454-0714.2017.3.23311 URL: https://en.nbpublish.com/library_read_article.php?id=23311
Abstract:
The research is devoted to the topical issue of ensuring the security of heterogeneous information platforms with multi-agent threat detection systems. The object of the research is the multi-agent platform. The author pays special attention to such aspects of the topic as the security of multi-agent platforms, managing threat detection agents, interaction between different threat detection agents, and the vulnerability of multi-agent platforms. The author also analyses tendencies in the development of new distributed security structures. In his research Lyapustin offers a multi-agent structure that can be used by a security service, as well as a general scenario for deploying security policy in threat detection multi-agent systems. In terms of theory, the results of the research demonstrate the need to extend the scope of the multi-agent approach and integrate it with the intelligent analysis of information systems development and operation. The result of the research is the concept of a secure multi-agent platform that can be used in a threat detection multi-agent system.
Citations count: 1
Reference:
Vyatkin S.I. —
Raycasting of three-dimensional textures and functionally defined surfaces using graphics accelerators
// Software systems and computational methods.
– 2019. – № 2.
– P. 23 - 32.
DOI: 10.7256/2454-0714.2019.2.28666 URL: https://en.nbpublish.com/library_read_article.php?id=28666
Abstract:
The object of study is a volumetric rendering method based on three-dimensional textures and functionally defined surfaces, operating interactively on graphics accelerators. A hierarchical approach to representing textures in memory and a method for managing large arrays of voxels are proposed. The hierarchical structure provides a compact texture description that exploits data homogeneity and the importance of information to reduce the required memory and speed up computation. The method is based on effective texture management, in which texture memory is assigned according to the significance of a region and the content of the voxel data. The combination of the data, interpolation, and data importance determines the selected set of tree nodes. These nodes determine how the volume should be laid out and represented in texture memory. The research method is based on analytical geometry in space, differential geometry and vector algebra, interpolation theory and matrix theory, relying on mathematical modeling and the theory of computing systems. The main conclusions of the study are the ability to visualize a large number of volumes, functionally defined objects, and complex translucent volumes, including volume intersections in constructive solid modeling. Rendering several volumes at the same time is a more complex problem than rendering one volume, because it requires intersection and blending operations. Functionally defined surfaces are well suited for embedding external objects in volumes. Models of medical instruments and the combination of surfaces with volumetric data are necessary for virtual computer surgery.
Citations count: 1
Reference:
Volushkova V.L. —
Integration of heterogeneous data in corporate information systems
// Software systems and computational methods.
– 2019. – № 1.
– P. 81 - 90.
DOI: 10.7256/2454-0714.2019.1.28768 URL: https://en.nbpublish.com/library_read_article.php?id=28768
Abstract:
The object of the research is ways to store master data in corporate information systems. Building systems for integrating heterogeneous data is one way to solve the problem of master data management. The paper discusses a system for storing structured data in various databases. Such systems are called heterogeneous systems. Heterogeneous systems usually arise when nodes that already operate their own database systems are eventually integrated into a distributed system. The aim of the work is to create a storage system for heterogeneous data in databases of various types. To build the system, the “divergent development” methodology is used. An approach to improving the efficiency of managing heterogeneous data in corporate information systems based on the “divergent development” programming paradigm is proposed. Within this paradigm, a domain-specific query language for a heterogeneous database has been developed. The effectiveness of the data integration system can be judged by the test results given in the article.
Citations count: 1
Reference:
Shchemelinin D. —
Methods of managing configuration parameters, software artifacts and state metrics of computing components in globally distributed cloud information complexes
// Software systems and computational methods.
– 2019. – № 1.
– P. 98 - 106.
DOI: 10.7256/2454-0714.2019.1.29757 URL: https://en.nbpublish.com/library_read_article.php?id=29757
Abstract:
The subject of the research presented in this publication is a logical model and a computing infrastructure, including the specialized software used to build a management database, to determine, record and verify the versions of deployed software and the configurations of all computing elements, and to describe the relationships between these elements in globally distributed cloud information systems. The object of the study is the globally distributed cloud production infrastructure of RingCentral (USA), with its numerous service equipment and large data flows. The author examines in detail the main aspects of organizing the effective maintenance of a cloud information environment consisting of tens of thousands of virtual servers. Special attention is paid to the developed methods of integrating disparate computing systems that contain reliable data about the information environment but do not intersect with one another. The developed methods for building a configuration management database in information systems were presented in reports and presentations at international scientific seminars and conferences, where the scientific novelty and effectiveness of the proposed methods for servicing a globally distributed cloud information processing complex were noted. The use of the developed methods at RingCentral has reduced the current costs of maintaining the globally distributed information complex as a whole by thirty percent.
Citations count: 1
Reference:
Garmaev B.Z., Boronoev V.V. —
The selection of continuous wavelet transform basis for finding extrema of biomedical signals
// Software systems and computational methods.
– 2018. – № 1.
– P. 45 - 54.
DOI: 10.7256/2454-0714.2018.1.23239 URL: https://en.nbpublish.com/library_read_article.php?id=23239
Abstract:
The authors consider the problem of choosing a wavelet for use in a continuous wavelet transform. The main advantage of wavelet analysis lies in the possibility of choosing a basis from a large number of wavelets. The choice of the analyzing wavelet is usually determined by what information needs to be extracted from the signal under study. Each wavelet has characteristic features in both time and frequency space, so different wavelets can more fully reveal and emphasize particular properties of the analyzed signal. The choice of the analyzing wavelet function for the basis of the wavelet transform is one of the questions whose successful solution determines the success of wavelet analysis in the problem being solved. Sidestepping this question deters researchers new to the field from using wavelet analysis or significantly narrows its field of application. The choice of the wavelet function is especially important for the continuous wavelet transform, where the result of the transformation is a three-dimensional continuous wavelet spectrum. This makes the result difficult to analyze, and analysis is often limited to visual inspection of the projection of the wavelet spectrum onto the scale-time plane. It also complicates the choice of the wavelet function, since changing the wavelet produces numerous changes in the projection of the wavelet spectrum that sometimes cannot be analyzed. The purpose of this work is to show a method for substantiating the choice of the analyzing wavelet function for use in the continuous wavelet transform, using the example of localizing the extrema of a digital signal. The work uses the continuous wavelet transform and considers wavelet coefficients at different scales, analyzing the changes not in the wavelet spectrum as a whole but in its individual parts.
The proposed technique provides an algorithm for analyzing continuous wavelet spectra with different wavelet functions in order to evaluate their suitability for searching for extrema. An important point of the technique is the transition from visual analysis of three-dimensional wavelet spectra to quantitative analysis of two-dimensional wavelet coefficients at different scales. This transition shows how wavelet analysis works inside three-dimensional wavelet spectra (which are analyzed primarily visually) and automates signal analysis. It also makes it possible to numerically estimate the accuracy of finding extrema with a particular wavelet. As a result, the article shows that the Haar wavelet is the most accurate in the problem of searching for signal extrema by means of continuous wavelet analysis. This method of choosing a basis can be used in problems where an acceptable quantitative estimate of the accuracy of the continuous wavelet transform is possible, allowing three-dimensional wavelet spectra to be analyzed not only qualitatively (visually) but also quantitatively.
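The Haar-based extremum search can be sketched numerically (a synthetic sine signal and un-normalised coefficients at a single scale; the article works with full wavelet spectra of real biomedical signals): the Haar response at a point is the mean of the window after it minus the mean of the window before it, and a sign change of that response localises an extremum.

```python
import numpy as np

def haar_coeffs(x, scale):
    """Un-normalised Haar response at a fixed scale: mean of the
    window after point i minus mean of the window before it."""
    c = np.zeros(len(x))
    for i in range(scale, len(x) - scale):
        c[i] = x[i + 1:i + scale + 1].mean() - x[i - scale:i].mean()
    return c

# Synthetic test signal with known extrema: sin has a maximum at
# t = pi/2 and a minimum at t = 3*pi/2.
t = np.linspace(0, 2 * np.pi, 200)
x = np.sin(t)

c = haar_coeffs(x, scale=5)
# A sign change (+ to -) localises a maximum, (- to +) a minimum.
crossings = np.where(np.sign(c[:-1]) * np.sign(c[1:]) < 0)[0]
print(t[crossings])  # close to [pi/2, 3*pi/2]
```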
Citations count: 1
Reference:
Pekunov V.V. —
Predicting channels in parallel programming: possible applications in mathematical modeling of processes in continuous media
// Software systems and computational methods.
– 2019. – № 3.
– P. 37 - 48.
DOI: 10.7256/2454-0714.2019.3.30393 URL: https://en.nbpublish.com/library_read_article.php?id=30393
Abstract:
In this paper, the author considers the application of prediction in classical and parallel programming of mathematical modeling problems based on the numerical integration of partial differential equations. Prediction can be used to replace fragments of a mathematical model with simpler approximate relations, to anticipate values received during parallel computation, and to predict the execution time of fragments of a parallel program when deciding whether to use a sequential or parallel algorithm and when balancing the load of the individual processors of a computing system. To formalize the types of predictors and determine the mathematical relationships that allow them to be calculated, the approaches and methods of computational mathematics (the theory of interpolation and extrapolation) are used. To determine the software implementations of the predictors, approaches characteristic of parallel programming engineering are used. The results are verified experimentally. For the first time, a new type of technological means of prediction for use in parallel programming is proposed: predicting-solving channels. Two types of channels are proposed: autoregressive point channels and linear (explicit or implicit) collective-solving channels. The mathematical aspects of prediction in such channels are described, and the basic programming tools are briefly presented. It is shown that combining channels with data prediction tools simplifies the programming of a number of algorithms related to numerical modeling and allows, in particular, hidden transitions from explicit difference schemes to implicit ones, which are more stable in computation, as well as from sequential algorithms to parallel ones. Using the example of the numerical integration of the non-stationary heat equation, it is shown that adequate use of the channels in some cases speeds up the calculation on multi-core computing systems.
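The point-prediction idea can be illustrated with a toy "channel" (a hypothetical interface sketched here, not the paper's actual API; the paper's channels also cover collective explicit/implicit solving): a reader either receives a value that has already arrived or a linear extrapolation from recent history.

```python
import numpy as np

class PredictingChannel:
    """Toy point-predicting channel: returns real data when available,
    otherwise a linear extrapolation from the last `depth` values."""
    def __init__(self, depth=4):
        self.depth = depth
        self.history = []

    def send(self, value):
        self.history.append(value)

    def receive(self, step):
        if step < len(self.history):      # value has already arrived
            return self.history[step]
        h = self.history[-self.depth:]    # extrapolate from the tail
        xs = np.arange(len(h))
        slope, intercept = np.polyfit(xs, h, 1)
        ahead = step - len(self.history) + len(h)
        return slope * ahead + intercept

ch = PredictingChannel()
for v in [1.0, 2.0, 3.0, 4.0]:  # a linear ramp: prediction is exact
    ch.send(v)
print(ch.receive(4))  # predicted, not yet sent: approximately 5.0
```

In a parallel setting such a channel lets a consumer thread proceed with a predicted value instead of blocking, which is the mechanism the abstract credits with speeding up explicit-scheme computations.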
Citations count: 1
Reference:
Dobrynin A.S., Gudkov M.Y., Koynov R.S. —
A precedent approach to incident management in automated process control systems
// Software systems and computational methods.
– 2020. – № 2.
– P. 45 - 52.
DOI: 10.7256/2454-0714.2020.2.31040 URL: https://en.nbpublish.com/library_read_article.php?id=31040
Abstract:
The continuous development of automated control systems for industrial facilities leads to the emergence of more advanced and complex control algorithms. A natural consequence of the development of control systems (CS) is the use of more complex technical means: sensors, controllers, SCADA and MES systems. Ultimately, saturating systems with additional software and hardware reduces manageability in general, since software needs to be updated and equipment often fails and needs replacement. Thus, approaches aimed at creating separate, autonomously functioning subsystems are becoming a thing of the past. Integrated, multi-level joint management of the entire infrastructure of the process control system is needed, from the technological facility to the technical infrastructure closely tied to it. The article discusses the construction of top-level control subsystems for the process control system, when the software and hardware of the process control system itself must be controlled directly. Simulation and computer modeling were used as research methods, which made it possible to evaluate the effectiveness of the proposed approaches and management methods. The research results were also verified through the pilot implementation of an automated incident management system based on the proposed approaches in the management of a technologically complex object.
The novelty of the research lies in the proposed approach to incident management in automated process control systems, which makes it possible to improve the quality of management, reduce management costs, and in some cases predict the occurrence of new incidents and take measures to prevent them. The studies have shown the feasibility of using the proposed approach to control complex non-stationary automation systems.
Citations count: 1
Reference:
Koronkov S.O. —
Methodology of automated study of the workload of a helicopter pilot
// Software systems and computational methods.
– 2022. – № 4.
– P. 63 - 74.
DOI: 10.7256/2454-0714.2022.4.36459 EDN: MIVFFZ URL: https://en.nbpublish.com/library_read_article.php?id=36459
Abstract:
The subject of the study is the contradiction between the need to study the workload of helicopter pilots and the lack of methods, regulated by normative and technical documents, for carrying out such studies during the testing of new aircraft. The purpose of the study was to make it possible to objectify the pilot's workload during testing of modernized and newly created helicopter models, as well as during flight simulator training. The author examines in detail such aspects of the topic as the development of a pilot workload research program; equipping a helicopter with a set of technical means for studying attention reserves; determining the pilot's workload; determining the integral indicator of the pilot's workload; and drawing up a conclusion based on the results of the workload study. The main conclusion of the theoretical and experimental research is that the developed methodology for studying the workload of a helicopter pilot, based on determining the reserves of his attention during professional activities, adequately determines the workload both during helicopter tests on semi-natural modeling complexes and during flight simulator training. Verification and studies of the effectiveness of the developed solutions have shown that objectifying the pilot's workload during helicopter testing makes it possible to substantiate industry recommendations on refining and improving the layout of helicopter cabins, to rationalize the flight part of the test program, and to intensify the professional training of flight personnel.
Citations count: 1
Reference:
Kiryanov D.A. —
Research of the methods of creating content aggregation systems
// Software systems and computational methods.
– 2022. – № 1.
– P. 9 - 31.
DOI: 10.7256/2454-0714.2022.1.37341 URL: https://en.nbpublish.com/library_read_article.php?id=37341
Abstract:
The subject of this research is the key methods for creating the architecture of information aggregators, methods for increasing the scalability and effectiveness of such systems, and methods for reducing the delay between the publication of new content by the source and the appearance of its copy in the information aggregator. In this research, a content aggregator means a distributed high-load information system that automatically collects information from various sources, processes it and displays it on a special website or mobile application. Particular attention is given to the basic principles of content aggregation: the key stages of aggregation and criteria for data sampling, automation of aggregation processes, content copying strategies, and content aggregation approaches. The author's contribution consists in a detailed description of web crawling and fuzzy duplicate detection systems. The main research result is a high-level architecture for a content aggregation system. Recommendations are given on the selection of architectural styles and of specialized software for building distributed database management systems and message brokers. The presented architecture aims to provide high availability, scalability under high query volumes, and big-data performance. To increase the performance of the proposed system, various caching methods, load balancers, and message queues should be actively used. For the storage of the content aggregation system, replication and partitioning must be used to improve availability, latency, and scalability. In terms of architectural styles, microservice architecture, event-driven architecture, and service-based architecture are the most suitable approaches for such a system.
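Fuzzy duplicate detection, mentioned in the abstract, is commonly based on comparing shingle sets; the sketch below uses word 3-shingles and Jaccard similarity (a standard textbook scheme, not necessarily the author's exact one) to flag near-duplicate items that an aggregator would collapse into a single entry.

```python
def shingles(text, k=3):
    """Set of k-word shingles of a text, case-folded."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

original = "city council approves new budget for road repairs this year"
copied   = "city council approves new budget for road repairs this week"
other    = "local team wins the championship after dramatic final match"

print(jaccard(original, copied) > 0.5)  # True  -- near-duplicates
print(jaccard(original, other) > 0.5)   # False -- unrelated stories
```

At aggregator scale the pairwise comparison is usually replaced by MinHash or SimHash sketches so that candidate pairs can be found without comparing every document to every other.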
Citations count: 1
Reference:
Nuriev M.G., Belashova E.S., Barabash K.A. —
Markdown File Converter to LaTeX Document
// Software systems and computational methods.
– 2023. – № 1.
– P. 1 - 12.
DOI: 10.7256/2454-0714.2023.1.39547 EDN: SNAYLQ URL: https://en.nbpublish.com/library_read_article.php?id=39547
Abstract:
Common text editors such as Microsoft Word, Notepad++ and others are cumbersome. Despite their enormous functionality, they do not eliminate the risk of incorrectly converting the document, for example, when opening the same Word files on older or, conversely, newer versions of Microsoft Word. The way out is the use of markup languages, which allow you to mark up text blocks in order to present them in the desired style. Currently, very popular are LaTeX (a set of macro-extensions of the TeX typesetting system) and Markdown (a lightweight markup language, designed to denote formatting in plain text). So the question of converting a Markdown document into a LaTeX document is relevant. There are various tools to convert Markdown files to LaTeX document, such as Pandoc library, Markdown.lua, Lunamark and others. But most of them have redundant steps to generate the output document. This paper highlights a solution method by integrating a Markdown file into a LaTeX document, which will potentially reduce the output document generation time unlike existing solutions. The developed Markdown to LaTeX document converter will automatically generate the output document and reduce the possibility of errors when manually converting text from Markdown format to LaTeX format.