Reference:
Ponachugin A.V.
Software reliability definition in the modern information system structure
// Cybernetics and programming.
2019. № 2.
P. 65-72.
DOI: 10.25136/2644-5522.2019.2.20341 URL: https://en.nbpublish.com/library_read_article.php?id=20341
Abstract:
The subject of the research is the use of modern approaches aimed at improving the reliability, security, and quality of information system software. The object of the research is a model for determining software reliability. The author examines in detail such topics as: the functional approach to studying software reliability, in which the reliability of a software component is derived from the aggregate reliability of its individual function blocks; and the systemic approach to studying software reliability within the structure of an information system. Particular attention is paid to comparing existing models and methods for determining reliability. The proposed method of estimating software reliability is based on partitioning the software into function blocks and is intended to guarantee the quality and reliability of the final result of software development. Key findings of the study: the systemic approach makes it possible to identify the interconnections between the constituent elements of a program that define modern ways of improving software reliability. The paper proposes adapting the major principles of the systemic approach to improving software reliability and demonstrates the useful effect of their introduction. The novelty of the research lies in the joint use of the functional and systemic approaches to identify ways of increasing quality and reliability that take into account both the characteristics of the structural elements of the program and their interaction.
Keywords:
improving reliability, software reliability, reliability assessment, functional unit, reliability of management systems, reliability of software, information system self-testing, software, systems approach, functional approach
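A minimal illustration of the functional approach described in the abstract: if a program is decomposed into function blocks that must all operate correctly, one common composition rule (assumed here, since the abstract does not give the formula) is to take the aggregate reliability of a series system as the product of the per-block reliabilities. The block names and probability values below are invented for the example and are not taken from the article.

    #include <iostream>
    #include <map>
    #include <string>

    // Sketch of the functional approach: the program is modeled as function
    // blocks executed in series, so the overall reliability is the product
    // of per-block reliabilities. All names and values are hypothetical.
    int main() {
        const std::map<std::string, double> block_reliability = {
            {"input_validation", 0.999},
            {"business_logic",   0.995},
            {"data_access",      0.990},
            {"reporting",        0.998},
        };

        double system_reliability = 1.0;
        for (const auto& [block, r] : block_reliability) {
            system_reliability *= r;  // series composition: all blocks must work
        }

        std::cout << "Aggregate software reliability: "
                  << system_reliability << '\n';  // ~0.982 for these values
        return 0;
    }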
Reference:
Revnivykh A.V., Velizhanin A.S.
The study of the disassembled representation of executable files generated by different compilers. Example of buffer overflow vulnerability.
// Cybernetics and programming.
2019. № 1.
P. 1-17.
DOI: 10.25136/2644-5522.2019.1.28238 URL: https://en.nbpublish.com/library_read_article.php?id=28238
Abstract:
The subject of the study is the potential buffer overflow vulnerability in software that uses the strcpy function of the standard C/C++ library, together with approaches and methods for finding such vulnerabilities. The object of the study is the machine code produced by compilers when a program is built in various modes. The purpose of the study is to analyze features of the machine code generated by various compilers for Windows and Linux in Debug and Release modes and, on that basis, to review the buffer overflow vulnerability. Research methods. The paper reviews and develops methods for constructing algorithms that search for buffer overflow vulnerabilities and examines the characteristics of this vulnerability at the machine code level. This is done using the Visual C++, Intel C++, and g++ compilers, as well as the WinDBG and GDB debuggers. Key findings. Building a program in different modes produces differences in the executable code generated from the very same high-level source code, and these differences manifest themselves as differences in program behavior. When researching software for vulnerabilities, it is important to analyze the machine code in order to identify hidden patterns. The novelty of the study lies in identifying the differences in machine code obtained after building the same high-level code and in identifying compiler stamps left when the program is built in different modes. A special contribution of the authors is the development of methods for constructing algorithms for searching for buffer overflow vulnerabilities.
Keywords:
Debug mode, Compiler stamps, Buffer overflow, Disassembling, Code analysis, Vulnerabilities, Information security, Release mode, Algorithm construction methods, WinDBG debugger
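To make the vulnerability discussed above concrete, here is a minimal C++ program of the kind the article studies: strcpy copies attacker-controlled input into a fixed-size stack buffer without any bounds check. The buffer size and input handling are illustrative; compiling this source in Debug and Release modes (for example with Visual C++ or g++) yields the differing machine code the authors compare.

    #include <cstdio>
    #include <cstring>

    // Classic buffer overflow pattern: strcpy() performs no bounds checking,
    // so input longer than the destination buffer overwrites adjacent stack
    // memory (saved registers, return address), enabling control-flow hijack.
    void vulnerable_copy(const char* input) {
        char buffer[16];           // fixed-size stack buffer (size is illustrative)
        strcpy(buffer, input);     // UNSAFE: no length check
        printf("copied: %s\n", buffer);
    }

    int main(int argc, char** argv) {
        if (argc > 1) {
            vulnerable_copy(argv[1]);  // any argument over 15 chars overflows
        }
        return 0;
    }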
Reference:
Golosovskii M.S.
Algorithms for automated determination of links between elements of a software development project
// Cybernetics and programming.
2017. № 6.
P. 38-49.
DOI: 10.25136/2644-5522.2017.6.19616 URL: https://en.nbpublish.com/library_read_article.php?id=19616
Abstract:
The subject of the study is determining the links between elements of software development projects, implemented through requirements tracing based on data from source code version control systems. Many well-known tracing techniques depend on the programming language, which limits their use in projects developed in multiple programming languages. The research goal was therefore to form a set of algorithms that build relationships between the entities (artifacts) of the software development process on the basis of the source code and analyze these connections; the algorithms should be independent of the programming language and easy to implement. The research methodology combines methods of system analysis, software engineering, software development, reliability theory, computer science, and mathematical qualimetry. The main results of the study are algorithms for the automated determination of relations between the elements of a software development project, making it possible to perform impact analysis tasks. The high computational complexity of the developed algorithms can be reduced by forming the global connectivity matrix gradually as the project progresses. The accuracy of the developed algorithms can be improved if the unit of coupling is taken to be not the file but the function or class method.
Keywords:
software application architecture, program code version, program code, version control system, requirements tracing, program requirements, software development, software engineering, architecture integrity control, links between project elements
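The abstract does not reproduce the algorithms themselves; the sketch below shows one language-independent way to realize the idea: elements (here, files) modified in the same commit are treated as linked, and the global connectivity matrix is accumulated incrementally, one commit at a time, matching the suggestion to form it gradually as the project progresses. The commit representation and weighting scheme are assumptions for illustration.

    #include <algorithm>
    #include <iostream>
    #include <map>
    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical sketch of language-independent link detection: two project
    // elements are considered linked if they change in the same commit. The
    // connectivity matrix grows incrementally, avoiding recomputation over
    // the whole history.
    using Commit = std::vector<std::string>;  // files touched by one commit

    class ConnectivityMatrix {
        std::map<std::pair<std::string, std::string>, int> weight_;
    public:
        void add_commit(const Commit& files) {
            for (size_t i = 0; i < files.size(); ++i)
                for (size_t j = i + 1; j < files.size(); ++j)
                    ++weight_[{std::min(files[i], files[j]),
                               std::max(files[i], files[j])}];
        }
        // Impact analysis: which elements co-changed with `file`?
        std::vector<std::string> impacted_by(const std::string& file) const {
            std::vector<std::string> result;
            for (const auto& [pair, w] : weight_) {
                if (pair.first == file)  result.push_back(pair.second);
                if (pair.second == file) result.push_back(pair.first);
            }
            return result;
        }
    };

    int main() {
        ConnectivityMatrix m;
        m.add_commit({"parser.c", "parser.h", "tests/parser_test.py"});
        m.add_commit({"parser.c", "lexer.c"});
        for (const auto& f : m.impacted_by("parser.c"))
            std::cout << "parser.c is linked to " << f << '\n';
        return 0;
    }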
Reference:
Rannev E.V., Myasnikov V.I.
Noise analysis of the NMR relaxometer receiving channel
// Cybernetics and programming.
2014. № 6.
P. 1-6.
DOI: 10.7256/2306-4196.2014.6.13302 URL: https://en.nbpublish.com/library_read_article.php?id=13302
Abstract:
Traditionally, in pulsed NMR spectroscopy a signal is registered by mixing it with a reference frequency close to the Larmor frequency of the nucleus, followed by digitization and further processing. In modern NMR apparatus the receiving channel consists of a preselector, a quadrature detector, a normalizing amplifier, a low-pass filter, and an analog-to-digital converter. The disadvantage of such a receiver is the length of the analog section, since each module adds extra noise. Thermal noise arises from fluctuations of electrons in conductors at a given temperature; these fluctuations have spectral components that fall in the same frequency band as the useful signal. This article analyzes the nature of the noise in the receiving channel of a nuclear magnetic resonance relaxometer and evaluates its characteristics. The authors suggest replacing the analog part of the receiving channel with a digital quadrature detector and theoretically calculate the benefit from excluding several noise sources.
Keywords:
nuclear magnetic resonance, relaxometer, quadrature detector, signal, noise, digital receiver, envelope detector, noise factor, interference, receiving channel
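A digital quadrature detector of the kind the authors propose can be sketched as follows: the digitized signal is multiplied by cosine and sine of the reference frequency to obtain the in-phase (I) and quadrature (Q) baseband components, which are then low-pass filtered. All numeric parameters and the simple moving-average filter below are illustrative assumptions, not values from the article.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Sketch of a digital quadrature detector: mix the sampled signal with
    // cos/sin of the reference frequency (numeric oscillator) to obtain the
    // I and Q components, then low-pass filter. Parameter values are invented.
    int main() {
        const double pi    = std::acos(-1.0);
        const double fs    = 1.0e6;    // sample rate, Hz (assumed)
        const double f_sig = 100.5e3;  // signal near the Larmor frequency
        const double f_ref = 100.0e3;  // reference frequency
        const int    n     = 4096;

        std::vector<double> I(n), Q(n);
        for (int k = 0; k < n; ++k) {
            double t = k / fs;
            double s = std::cos(2 * pi * f_sig * t);    // digitized NMR signal
            I[k] = s *  std::cos(2 * pi * f_ref * t);   // mix down: I branch
            Q[k] = s * -std::sin(2 * pi * f_ref * t);   // mix down: Q branch
        }

        // Crude low-pass filter (moving average) keeps the 500 Hz difference
        // frequency and rejects the 200.5 kHz sum component.
        const int w = 64;
        double i_avg = 0, q_avg = 0;
        for (int k = 0; k < w; ++k) { i_avg += I[k]; q_avg += Q[k]; }
        printf("I=%f Q=%f (first window of the baseband signal)\n",
               i_avg / w, q_avg / w);
        return 0;
    }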
Reference:
Kuchinskaya-Parovaya I.I.
Component design of neural networks to process knowledge bases
// Cybernetics and programming.
2013. № 1.
P. 9-15.
DOI: 10.7256/2306-4196.2013.1.8308 URL: https://en.nbpublish.com/library_read_article.php?id=8308
Abstract:
The article describes the main steps of a methodology for the component design of neural networks used to process knowledge bases represented by semantic networks. The technique is based on a unified neural network model and a component-based approach to working with neural networks; its key element is a library of compatible neural network components. The development of such a design technique, combining the unified model with the component approach, is one possible solution to the problems of building and applying neural networks in this setting. It is concluded that the proposed component design methodology will simplify the design and development of neural networks, lower the qualification requirements for the developer (the end user), and solve the problem of integrating neural networks with other methods of representing and processing information in the development of intelligent systems.
Keywords:
neuroinformatics, neural network techniques, integration, NN components, data processing, knowledge base, neural networks, design, hybrid systems, neural network library
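The abstract describes the library of compatible neural network components only at a high level; the sketch below shows one shape such a component interface could take, with a single shared data type ensuring that any components from the library can be chained without glue code. All class and type names are hypothetical, not taken from the article.

    #include <iostream>
    #include <memory>
    #include <vector>

    // Hypothetical sketch of a component-based neural network library:
    // every component consumes and produces the same unified data type,
    // so library components compose freely. Names are illustrative only.
    using Tensor = std::vector<double>;

    struct Component {                    // unified component interface
        virtual Tensor process(const Tensor& in) const = 0;
        virtual ~Component() = default;
    };

    struct Scale : Component {            // example library component
        double factor;
        explicit Scale(double f) : factor(f) {}
        Tensor process(const Tensor& in) const override {
            Tensor out(in);
            for (double& v : out) v *= factor;
            return out;
        }
    };

    struct Pipeline : Component {         // composition of components
        std::vector<std::unique_ptr<Component>> stages;
        Tensor process(const Tensor& in) const override {
            Tensor x = in;
            for (const auto& s : stages) x = s->process(x);
            return x;
        }
    };

    int main() {
        Pipeline p;
        p.stages.push_back(std::make_unique<Scale>(2.0));
        p.stages.push_back(std::make_unique<Scale>(0.5));
        Tensor out = p.process({1.0, 2.0, 3.0});  // identity overall
        for (double v : out) std::cout << v << ' ';
        std::cout << '\n';
        return 0;
    }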