Reference:
Chausov E.V., Molchanov A.S.
Mathematical means and software for digital image processing when evaluating the linear resolution of aerial photographic systems
// Cybernetics and programming.
2020. № 1.
P. 42-52.
DOI: 10.25136/2644-5522.2020.1.32974 URL: https://en.nbpublish.com/library_read_article.php?id=32974
Abstract:
The authors consider the evaluation of digital aerial photographic systems based on the modulation transfer function. Special attention is paid to the synthesis of mathematical and software support for digital image processing when evaluating the linear ground resolution of aerial photographic systems. The mathematical support is based on an original method for evaluating linear ground resolution from the modulation transfer function, taking into account the results of a flight experiment. The software is implemented in an automated image processing complex for digital aerial photographic systems, which computes the modulation transfer function automatically from flight experiment data. The research methodology combines methods of systems analysis, digital image processing, probability theory, mathematical modeling, and software engineering. The experimental modulation transfer function was determined with the automated image processing complex from images obtained during flight tests of a digital camera flown on the Orlan-10 unmanned aerial vehicle. The main conclusion is that the developed automated digital image processing complex is operational and can be used to estimate linear ground resolution during flight tests of aerial photographic systems for aerial reconnaissance and surveillance.
Keywords:
image interpretation, unmanned aerial vehicle, flight tests, digital aerial photography system, bar image, linear terrain resolution, modulation transfer function, digital image processing, automated image processing, automation of aerial reconnaissance
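The paper's own bar-target method is not reproduced in the abstract; purely to illustrate what an automated MTF computation looks like, here is a minimal sketch of the generic textbook edge-based route (the synthetic edge profile, window, and normalization are assumptions of this sketch, not details from the paper):

```python
import numpy as np

def mtf_from_edge_profile(esf: np.ndarray) -> np.ndarray:
    """Estimate an MTF from an edge spread function (ESF) sampled
    across a high-contrast edge: ESF -> LSF (derivative) -> |FFT|,
    normalized so that MTF(0) = 1. A generic textbook pipeline."""
    lsf = np.gradient(esf.astype(float))   # line spread function
    lsf *= np.hanning(lsf.size)            # taper to reduce spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# toy usage: a synthetic blurred step edge stands in for real image data
x = np.linspace(-5.0, 5.0, 256)
esf = 1.0 / (1.0 + np.exp(-x / 0.8))
print(mtf_from_edge_profile(esf)[:5])      # MTF at the lowest frequencies
```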
Reference:
Abramova O.F.
Visualization of the web-system user behavior pattern
// Cybernetics and programming.
2019. № 3.
P. 43-52.
DOI: 10.25136/2644-5522.2019.3.23017 URL: https://en.nbpublish.com/library_read_article.php?id=23017
Abstract:
Usability evaluation of web information systems is relevant and useful both for developers and for customers. One of the most important ways of evaluating the performance of a web information system is assessing the so-called behavioral factors. Almost all methods currently in use not only require significant financial and time investment, but also depend on the number and expertise of specially recruited focus-group participants. This is a laborious process that does not always yield the expected result in the form of a clear evaluation and recommendations for improving the web system's conversion. Maximal visualization of the assessment results therefore significantly increases informativeness and reduces the effort of evaluating the system under study. One example is visualizing the behavior pattern of a web system user. Displaying the list of visited pages, together with the pages from which the user arrived at the website, makes it possible to assess several important indicators, such as view depth and referral sources, and to model and evaluate the pattern of user behavior in the system. Moreover, the clarity and simplicity of the resulting scheme allow it to be used both by professionals and by owners of web systems. The article describes methods for identifying usability problems of web systems and explains how the program identifies and visualizes the user behavior pattern as a graph.
Keywords:
behavior graph, Web-system, efficiency mark, pattern of user behavior, visualization, behavioral factors, graph, usability, algorithm, software implementation
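As a toy illustration of the behavior-graph idea described above (the session log format and metric names below are hypothetical, not taken from the article):

```python
from collections import Counter

# hypothetical session log: (referrer, page) pairs in visit order
session = [("google.com", "/home"), ("/home", "/catalog"),
           ("/catalog", "/item/42"), ("/item/42", "/catalog")]

edges = Counter(session)                      # weighted edges of the behavior graph
depth = len({page for _, page in session})    # view depth: distinct pages visited
sources = {src for src, _ in session if not src.startswith("/")}  # external referrers

print("view depth:", depth)
print("traffic sources:", sources)
for (src, dst), n in edges.items():           # the pattern as an edge list
    print(f"{src} -> {dst} (x{n})")
```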
Reference:
Stepanov P.P.
Application of group control and machine learning algorithms on the example of the "Battlecode" game
// Cybernetics and programming.
2019. № 1.
P. 75-82.
DOI: 10.25136/2644-5522.2019.1.23527 URL: https://en.nbpublish.com/library_read_article.php?id=23527
Abstract:
The subject of the research is group control of autonomous agents in a dynamic multi-agent system and self-learning of the control model. The author examines group interaction using the most effective group control algorithms, such as SWARM, the ant algorithm, the bee algorithm, the firefly algorithm, and the fish school movement algorithm, as well as the training of an artificial neural network through reinforcement learning. Various optimal path-finding algorithms are also compared. The comparison was carried out in the "Battlecode" game environment, which dynamically generates a new map for each round, ensuring a fair comparison of the algorithms considered. The author uses statistical methods of data analysis, selection and analysis of qualitative features, forecasting methods, modeling, and classification. The author shows that Q-learning becomes more effective when the tabular representation of the Q-function is replaced with a neural network. The work demonstrates the effectiveness of the bee algorithm for area exploration and patrolling. At the same time, the A* path-finding algorithm proves much more flexible and efficient than Dijkstra's algorithm.
Keywords:
multiagent system, ant algorithm, bee algorithm, game artificial intelligence, reinforcement learning, neural network, group management, Battlecode, modeling, agent
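The article's Battlecode implementation is not shown in the abstract; the following minimal grid sketch only illustrates the A* technique the abstract compares with Dijkstra's algorithm (grid, heuristic, and cost model are assumptions of this sketch):

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid of 0 (free) / 1 (wall) cells,
    with an admissible Manhattan-distance heuristic."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, None)]   # (f, g, node, parent)
    came_from, g_best = {}, {start: 0}
    while open_heap:
        _, g, cur, parent = heapq.heappop(open_heap)
        if cur in came_from:                   # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                        # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < g_best.get(nxt, float("inf"))):
                g_best[nxt] = g + 1
                heapq.heappush(open_heap, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None                                # goal unreachable

grid = [[0, 0, 0, 1],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(a_star(grid, (0, 0), (2, 3)))
```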
Reference:
Poriadin A., Oparin K.
Non-parametric model of learning for a system of diagnostics of psycho-physiological qualities
// Cybernetics and programming.
2016. № 2.
P. 13-19.
DOI: 10.7256/2306-4196.2016.2.18155 URL: https://en.nbpublish.com/library_read_article.php?id=18155
Abstract:
The article studies decision support systems in the field of diagnostics of a person's psycho-physiological qualities. The subject of the research is the use of neural networks in developing tests for evaluating a person's psycho-physiological state. The authors examine the possibility of using neural networks for this purpose, drawing on results obtained by other researchers who applied neural networks to medical diagnostic problems, such as the diagnosis of myocardial infarction or the recognition of emotions from psycho-physiological parameters. The authors used mathematical modeling methods, including probability theory, mathematical statistics, artificial intelligence, and methods of forecasting and decision-making. The study shows that a neural network is an effective tool for studying such stochastic systems as the human organism. Using neural networks in psycho-physiological diagnostic systems improves the accuracy of diagnosis by uncovering hidden relationships between different human subsystems. The suitability of neural networks for processing psycho-physiological test results was confirmed using a generalized description of a neural network and examples of its input and output vectors for processing results of the reaction-to-a-moving-object test.
Keywords:
diagnostics, psycho-physiological qualities of man, psycho-physiological tests, decision support, neural network, neurophysiology, reaction to moving object, tapping test, LVQ neural network, non-parametric model of learning
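The keywords mention an LVQ neural network; as a generic illustration of the technique (the toy data, prototype count, and learning rate are assumptions, not the authors' setup), a minimal LVQ1 training step looks like this:

```python
import numpy as np

def lvq1_step(protos, labels, x, y, lr=0.05):
    """One LVQ1 update: pull the nearest prototype toward sample x
    if its class matches y, otherwise push it away."""
    i = np.argmin(np.linalg.norm(protos - x, axis=1))
    sign = 1.0 if labels[i] == y else -1.0
    protos[i] += sign * lr * (x - protos[i])
    return i

# toy data: 2-D test scores, two diagnostic classes (0 / 1)
rng = np.random.default_rng(0)
protos = rng.normal(size=(4, 2))          # two prototypes per class
labels = np.array([0, 0, 1, 1])
for _ in range(200):
    y = rng.integers(0, 2)
    x = rng.normal(loc=3.0 * y, size=2)   # class-dependent cluster
    lvq1_step(protos, labels, x, y)
print(protos)                             # prototypes drift toward their classes
```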
Reference:
Malashkevich V.B., Malashkevich I.A.
Elements of algebra of triplexes in idempotent bases
// Cybernetics and programming.
2016. № 1.
P. 1-228.
DOI: 10.7256/2306-4196.2016.1.17583 URL: https://en.nbpublish.com/library_read_article.php?id=17583
Abstract:
The subject of study is the algebra of ternary (three-dimensional) hypercomplex numbers (triplexes). Since the time of Hamilton (1843), algebras of hypercomplex numbers have attracted the attention of researchers. The largest number of papers in this area is devoted to quaternion algebra and bicomplex numbers, as well as their applications to various problems of science and technology. The algebra of ternary hypercomplex numbers is less studied, yet it is undoubtedly promising for problems involving the processing of point objects and fields in three-dimensional Euclidean space. The main objective of the article is to construct an idempotent basis of the algebra of ternary hypercomplex numbers. Idempotent bases are typical of commutative multiplicative algebras without division. Such bases provide a simple way to define and study mathematical constructions over hypercomplex numbers, as well as a significant increase in computational efficiency. The paper presents all possible unit vectors of potential idempotent bases of triplex numbers. The authors highlight two idempotent bases that provide a non-redundant representation of triplexes. The main attention is given to the study of one of these bases, with complex unit vectors. The paper shows that an idempotent triplex basis allows arithmetic operations and functions of a triplex argument to be defined in terms of the well-studied algebras of real and complex numbers. At the same time, this basis provides high computational efficiency in evaluating these operations and functions.
Keywords:
hypercomplex numbers, commutative hypercomplex algebra, zero divisor, idempotent basis, triplex, Algebra without division, triplex algebra, conjugation, triplex function of the argument, triplex ring
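As background, a sketch of the standard idempotent decomposition of triplexes, consistent with the abstract's claim that operations reduce to real and complex arithmetic (the paper's own notation and choice of basis may differ):

```latex
% Triplexes t = a + bj + cj^2 with j^3 = 1 form the algebra
% \mathbb{R}[j]/(j^3 - 1) \cong \mathbb{R} \oplus \mathbb{C}.
% Primitive idempotents:
e_1 = \tfrac{1}{3}\,(1 + j + j^2), \qquad e_2 = 1 - e_1,
\qquad e_1^2 = e_1, \quad e_2^2 = e_2, \quad e_1 e_2 = 0.
% Components of t in the idempotent basis (\omega = e^{2\pi i/3}):
t \;\longleftrightarrow\; (\alpha, z), \qquad
\alpha = a + b + c \in \mathbb{R}, \qquad
z = a + b\omega + c\omega^2 \in \mathbb{C}.
% Multiplication and functions of a triplex argument act componentwise,
% which is the source of the computational efficiency:
st \;\longleftrightarrow\; (\alpha_s\,\alpha_t,\; z_s z_t), \qquad
f(t) \;\longleftrightarrow\; (f(\alpha),\, f(z)).
```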
Reference:
Milovanov M.M.
Software implementation of a workbench for testing trading algorithms based on the project approach
// Cybernetics and programming.
2016. № 1.
P. 229-235.
DOI: 10.7256/2306-4196.2016.1.17855 URL: https://en.nbpublish.com/library_read_article.php?id=17855
Abstract:
The article describes an approach to the development and implementation of a software system for designing, testing, and deploying trading algorithms. The author shows methods for obtaining and transmitting data between the environment and the trading terminal in both directions. The article reviews similar software and highlights the main advantages of the proposed design approach. The author applies prototype-based programming to the software implementation; observation is used as the research method. The object of research is the algorithm, and the subject of the study is the data set used to analyze it. The main novelty of the proposed approach is the use of prototype-based programming, in place of a conventional object-oriented model, for algorithm design and implementation. The article suggests a scheme for exchanging data with third-party applications through the native dynamic libraries of the terminal, and concludes with the operating algorithm of the application.
Keywords:
data analysis, software, lua, programming, development, stock market, testing, optimization, .NET, C#
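No code from the workbench is given in the abstract; the following toy loop only illustrates the kind of test cycle such a workbench drives, with the strategy interface and price data entirely hypothetical:

```python
# hypothetical minimal backtest loop: the terminal streams price bars to a
# strategy object, which returns a target position for the next bar
def run_backtest(bars, strategy):
    position, pnl = 0, 0.0
    for prev, cur in zip(bars, bars[1:]):
        pnl += position * (cur - prev)      # mark-to-market on the new bar
        position = strategy(cur)            # strategy decides target position
    return pnl

bars = [100.0, 101.5, 101.0, 102.3, 103.0]
momentum = lambda price: 1 if price > 101 else -1   # toy trading rule
print(run_backtest(bars, momentum))
```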
Reference:
Perminova M.Y.
The analysis of a partitions-based algorithm of polynomial decomposition
// Cybernetics and programming.
2015. № 6.
P. 21-34.
DOI: 10.7256/2306-4196.2015.6.17169 URL: https://en.nbpublish.com/library_read_article.php?id=17169
Abstract:
The research focuses on generating functions, an effective tool for solving various mathematical problems in combinatorics, probability theory, mathematical physics, analysis of algorithms, and other fields. The subject of research is one class of generating functions: polynomials. Special attention is paid to the problem of polynomial decomposition, which admits a number of solutions. The author proposes a new polynomial decomposition algorithm based on partitions, gives a brief description of it, and provides an example of its use. The study determines the computational complexity of the algorithm, which comprises the time complexity of generating partitions, producing a monomial, and solving the resulting equation. The time complexity of the partition-based polynomial decomposition algorithm is calculated on the basis of results obtained by D. Knuth and data from the On-Line Encyclopedia of Integer Sequences. The original polynomial decomposition algorithm is also given, and its time complexity is shown to be O(n^2). The author compares the described algorithm with its analogues; the analysis shows that most decomposition algorithms have polynomial computational complexity of O(n^2). Experimental curves of the computational complexity of the partition-based algorithm and of known algorithms are presented.
Keywords:
decomposition of polynomials, generation of partitions, algorithm, computational complexity of the algorithm, generating functions, polynomial, computer algebra systems, composition, monomial, solution of equations
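Since the core step of the algorithm is partition generation, here is a minimal recursive partition generator illustrating the standard technique (this is not the author's code):

```python
def partitions(n, max_part=None):
    """Yield all partitions of n as non-increasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for k in range(min(n, max_part), 0, -1):   # largest part first
        for rest in partitions(n - k, k):      # remaining parts are <= k
            yield (k,) + rest

# the partitions of 4 would index candidate exponent patterns
# in a partition-based decomposition search
print(list(partitions(4)))
# [(4,), (3, 1), (2, 2), (2, 1, 1), (1, 1, 1, 1)]
```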
Reference:
Borodin A.V., Azarova A.N.
Methods of classification and dimensionality reduction in case of visualization of performance metrics
// Cybernetics and programming.
2015. № 4.
P. 1-35.
DOI: 10.7256/2306-4196.2015.4.15271 URL: https://en.nbpublish.com/library_read_article.php?id=15271
Abstract:
The paper deals with a methodology for assessing the technical efficiency of network infrastructure. Much attention is given to methods of visualizing performance metrics by comparing the evaluated sample with a set of alternative solutions under stochastic behavior of the external environment. The proposed method for visualizing the time-response characteristics of access to Internet resources was developed specifically to demonstrate the advantages offered by the concept of the "cognitive Internet". Unlike numerical efficiency characteristics, the proposed visualization method allows the state of all access channels, compared with the optimal channel over a given integration time slot, to be taken in at a glance. At the same time, the method does not preclude the joint use of consistent numerical efficiency characteristics, and its scope is not restricted to the applications mentioned. Methods of multivariate statistical analysis (discriminant function analysis and principal component analysis) form the basis of the algorithm for visualizing the time-response characteristics of access to Internet resources. The main result of the research is the algorithm and software for visualizing performance metrics of infrastructure solutions providing access to Internet resources. The novelty of the research lies not only in the novelty of the subject domain (cognitive Internet technology) but also in the form in which the results are presented: a projection of the hodograph of time-response characteristics of access onto the most informative plane.
Keywords:
technical efficiency, performance metric, dimensionality reduction, principal component analysis, cluster analysis, visualization, characteristic vector, characteristic value, cognitive internet, discriminant function analysis
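As a minimal sketch of the projection step the abstract describes, i.e., mapping channel metrics onto the plane of the first two principal components (the data shape and names are assumptions of this sketch):

```python
import numpy as np

def project_to_informative_plane(X):
    """Project rows of X (channels x metrics) onto the plane spanned by
    the first two principal components -- the 'most informative plane'."""
    Xc = X - X.mean(axis=0)                        # center the metrics
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:2].T                           # 2-D coordinates per channel

# hypothetical access-time metrics: 5 channels, 4 time slots each
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 4))
print(project_to_informative_plane(X))
```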
Reference:
Milovanov M.M.
Using Windows PowerShell scripts to manage Microsoft SQL Server backups in an application for the Department of Social Security
// Cybernetics and programming.
2015. № 3.
P. 7-10.
DOI: 10.7256/2306-4196.2015.3.15410 URL: https://en.nbpublish.com/library_read_article.php?id=15410
Abstract:
Preserving accumulated data is of great importance today. The development of modern information technologies and database-backed systems raises the question of storing and backing up large amounts of data, and high requirements for data recovery speed imply correct organization of data storage and backups. With this in mind, the author shares his experience of setting up backups from the command line using Windows PowerShell scripts. The article describes the mechanisms and results of the applied technique; the research method is observation of IT processes. The article reviews the use of the modern Windows PowerShell shell in place of the outdated command line, presents a short overview of the main commands used in writing a PowerShell script, and gives examples of using the developed script for backing up and storing databases. The technique has proven reliable over long-term use, based on tests on different platforms. The method improves IT efficiency, the reliability of information processes, and the use of employee time.
Keywords:
big data, data archiving, database, powershell, script, windows, backup, software, IT-process, algorithm
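The paper's script itself is written in PowerShell and is not reproduced in the abstract; the Python sketch below only illustrates the same idea of a scripted command-line backup, issuing a standard T-SQL BACKUP DATABASE statement through the sqlcmd utility (the database, server, and path names are hypothetical):

```python
import subprocess
from datetime import datetime

# hypothetical names; the paper's actual script is a PowerShell equivalent
db, server = "SocialSecurityDB", "localhost"
stamp = datetime.now().strftime("%Y%m%d_%H%M%S")

# timestamped backup file so older copies are retained
tsql = f"BACKUP DATABASE [{db}] TO DISK = N'C:\\backups\\{db}_{stamp}.bak'"
subprocess.run(["sqlcmd", "-S", server, "-Q", tsql], check=True)
```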
Reference:
Borodin A.V., Biryukov E.S.
The practical implementation of some algorithms related to the problem of number composing
// Cybernetics and programming.
2015. № 1.
P. 27-45.
DOI: 10.7256/2306-4196.2015.1.13734 URL: https://en.nbpublish.com/library_read_article.php?id=13734
Abstract:
Among the combinatorial algorithms of additive number theory, algorithms for listing the compositions of natural numbers occupy a special place. On the one hand, they are ideologically among the simplest algorithms of that theory; on the other hand, they play a large role in all applications connected with the polynomial (multinomial) theorem. In recent years, due to the rapid development of the general theory of risk, the ideas underlying the polynomial theorem have been applied to problems of risk measurement in homogeneous systems of high dimensionality. Solving these problems requires the mass listing of compositions of fixed length and the calculation of the number of such compositions for sufficiently large values of both the number and the composition length. Under these conditions, efficient implementation of these algorithms becomes the most pressing task. The article is devoted to the synthesis of efficient algorithms for listing compositions of fixed length and calculating the number of such compositions. As a methodological basis, the authors use facts of set theory, approaches of the theory of algorithm complexity, and some basic results of number theory. The authors propose new efficient implementations of two algorithms: an algorithm for listing all compositions of fixed length based on the multiset representation of number partitions, and an algorithm for counting compositions of a given kind implemented without resorting to high-precision machine arithmetic. The article presents not only complexity estimates for the proposed algorithms but also the results of numerical experiments demonstrating the effectiveness of their implementation in the VBA programming language.
Keywords:
number composition, number expansion, partition of the number, polynomial theorem, multiset, complexity of the algorithm, risk, risk theory, risk measurement, total cost of ownership
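For illustration, a minimal enumerator of compositions of n into k positive parts together with their count C(n-1, k-1); this is the textbook "cut points" construction, not the authors' multiset-based algorithm:

```python
from math import comb
from itertools import combinations

def compositions(n, k):
    """Yield all compositions of n into k positive parts by choosing
    k-1 cut points among the n-1 gaps (hence C(n-1, k-1) of them)."""
    for cuts in combinations(range(1, n), k - 1):
        bounds = (0,) + cuts + (n,)
        yield tuple(bounds[i + 1] - bounds[i] for i in range(k))

print(list(compositions(5, 3)))
print(comb(5 - 1, 3 - 1), "compositions in total")
```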
Reference:
Urazaeva T.A.
Application package “MultiMIR”: architecture and application
// Cybernetics and programming.
2014. № 5.
P. 34-61.
DOI: 10.7256/2306-4196.2014.5.12962 URL: https://en.nbpublish.com/library_read_article.php?id=12962
Abstract:
Evaluating the risks of system development is an urgent task for a variety of disciplines, such as economics and sociology, technology and ecology, as well as for systems studied at the intersection of disciplines. Often the parameters of such systems are discrete and the set of possible states is bounded. The application package “MultiMIR” was designed to evaluate development risks in such systems. An important difference between “MultiMIR” and other applications is that it achieves polynomial computational complexity for some classes of systems, while most analogues offer only exponential complexity. The article describes the purpose of the package, the main ideas underlying its algorithms, and its architecture, and gives an overview of ways to use it. The conceptual basis of the theory behind the algorithms implemented in “MultiMIR” is the probability-theoretic approach. As the specific mathematical apparatus, the author chose the formalism of multiset theory, which, in the author's opinion, has the richest expressive possibilities for the subject area described. The first version of the package was developed in the VBA subsystem of Microsoft Office; this choice was dictated by the features and preferences of the package's primary audience, banking and financial analysts. Using “MultiMIR” made it possible, for the first time, to compute exactly such non-linear risk measures as expected utility, distorted probability measures, and Value at Risk for medium and large homogeneous portfolios of term financial instruments without resorting to time-consuming analytical methods. Unlike the Monte Carlo method traditionally used for this purpose, the approach based on the described package yields an exact solution using a comparable amount of CPU resources. The “MultiMIR” package can also be used to verify the reliability of results obtained by Monte Carlo methods, which are considered classical in financial risk management.
Keywords:
system, visual modeling, risk, risk process, measure of risk, Value at Risk, VaR, homogeneous portfolio, term financial instruments, Monte Carlo method
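As a toy illustration of exact (non-Monte-Carlo) risk evaluation in the problem class the package targets, here is a much simpler setting than its multiset formalism: a homogeneous portfolio of identical instruments under an independence assumption, with all parameters hypothetical:

```python
from math import comb

def homogeneous_portfolio_var(n, p, loss_per_default, alpha=0.99):
    """Exact Value at Risk of a portfolio of n identical independent
    instruments, each defaulting with probability p: the number of
    defaults is Binomial(n, p), so no simulation is needed."""
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    cum = 0.0
    for k, q in enumerate(pmf):
        cum += q
        if cum >= alpha:              # smallest loss x with P(L <= x) >= alpha
            return k * loss_per_default
    return n * loss_per_default

print(homogeneous_portfolio_var(n=1000, p=0.02, loss_per_default=1.0))
```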
Reference:
Ponomarev D.
Software load sharing system for information systems
// Cybernetics and programming.
2013. № 5.
P. 29-36.
DOI: 10.7256/2306-4196.2013.5.9762 URL: https://en.nbpublish.com/library_read_article.php?id=9762
Abstract:
The article presents the results of developing software for calculating load distribution in information systems using the tensor methodology. Applying tensor models allows the task to be solved for a wide range of information networks. For the purpose of applying tensor analysis to the problem of traffic distribution in an information network, a software system implementing the stages of network analysis was developed. The author notes that the mathematical apparatus used is well formalized, so the tensor methodology can be implemented in a software system with available software tools. As an example of the developed software, the author presents a study of traffic distribution in a network. In conclusion, it is stated that further analysis of the load intensity distribution requires determining the type of the information distribution system (with losses or with waiting) and the required number of lines for a given loss level.
Keywords:
software, load distribution, information systems, tensor methodology, information networks, structure of the primitive network, matrix components, transformation matrix, research, data analysis
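The article's tensor machinery is not reproduced here; as a sketch of the concluding step the abstract mentions, sizing the number of lines for a given loss level in a system with losses, using the classical Erlang B recursion:

```python
def erlang_b(traffic, lines):
    """Erlang B blocking probability via the standard stable recursion."""
    b = 1.0
    for m in range(1, lines + 1):
        b = traffic * b / (m + traffic * b)
    return b

def lines_needed(traffic, max_loss):
    """Smallest number of lines keeping blocking below max_loss."""
    m = 1
    while erlang_b(traffic, m) > max_loss:
        m += 1
    return m

print(lines_needed(traffic=10.0, max_loss=0.01))   # 10 Erlangs, 1% loss target
```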
Reference:
Fukin I.A.
Cloud management system for communication between educational institutions and employers
// Cybernetics and programming.
2012. № 2.
P. 25-37.
DOI: 10.7256/2306-4196.2012.2.13896 URL: https://en.nbpublish.com/library_read_article.php?id=13896
Abstract:
Creating conditions for an effective process of interaction between the subjects of an educational cluster in the current institutional environment requires forming a common information space for the participants. The need for complex algorithms to solve the problems of information support for this interaction stems from the complexity of controlling the learning process: the quality of management can be assessed, and curricula, load distribution, and class schedules adjusted, only after a certain learning cycle has been completed, and there is no unified information environment for this segment of the labor market. The article reviews the problem of interaction between enterprises and educational institutions in personnel training. As a tool for solving it, the author suggests a subsystem for managing the interaction of stakeholders in the education and labor markets within cloud-based educational process management systems. The proposed solution is presented as a software module.
Keywords:
software unit, asynchronous control, quality control, distributed computing, cloud computing, cloud, control system, education, feedback, e-learning