Reference:
Revnivykh A.V., Velizhanin A.S.
The method of automated research of the structure of disassembled representation of software code with a buffer overflow vulnerability using the matrix approach
// Cybernetics and programming.
2018. № 6.
P. 11-30.
DOI: 10.25136/2644-5522.2018.6.28288 URL: https://en.nbpublish.com/library_read_article.php?id=28288
Abstract:
The subject of the research is optimization algorithms for automated dependency search in disassembled code. The object of the research is dependent code blocks on the x64 architecture of Intel processors, and listings obtained by reverse engineering software built by compilers with different settings on Windows and Linux. Purpose of the study. The purpose of the study is to consider the possibility of using mathematical matrices to build a machine code map, to review problems that arise in automatic analysis, and to search for the paths of information flows. Research methods. The Visual C++ compiler was used. We consider an architecture in which information can be transferred in the following ways: register-memory, memory-register, and register-register. For the analysis, the chosen method forms, for each considered path to the potentially dangerous code block under investigation, the list of functions called before that block is reached. Methods for implementing the matrix approach are described and developed. Novelty and key findings. The mathematical matrix method can be used to build a machine code map. However, determining the reachability paths of individual code blocks may require a significant amount of resources. In addition, the machine code may have been processed by packers and obfuscators, which introduces additional complexity. A number of potentially dangerous functions of the standard library of the C/C++ programming language were identified.
Keywords:
Mathematical Matrix Method, Buffer overflow, Disassembling, Code analysis, Vulnerabilities, Information security, Code compilers, Code Packers, Code obfuscators, Functions List
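Although the entry above gives no source code, the core of the matrix approach it describes can be sketched as follows: basic blocks of a disassembled listing become vertices of a control-flow graph, and reachability of a potentially dangerous block is obtained by transitive closure of a boolean adjacency matrix. The block names and edges below are hypothetical.

```python
# Minimal sketch of the matrix approach: reachability of a dangerous block
# via transitive closure of the control-flow adjacency matrix.
import numpy as np

blocks = ["entry", "parse_input", "copy_buffer", "exit"]   # hypothetical basic blocks
edges = [(0, 1), (1, 2), (1, 3), (2, 3)]                   # jumps/calls between them

n = len(blocks)
A = np.zeros((n, n), dtype=bool)
for src, dst in edges:
    A[src, dst] = True

# Transitive closure: R[i, j] is True iff block j is reachable from block i.
R = A.copy()
for _ in range(n):
    R = R | ((R.astype(int) @ A.astype(int)) > 0)

dangerous = blocks.index("copy_buffer")
print("paths to the dangerous block exist from:",
      [blocks[i] for i in range(n) if R[i, dangerous]])
```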
Reference:
Baltaev R.K., Lunegov I.V.
Steganographic method of embedding information using a noise-like sequence and preserving the statistical model of images
// Cybernetics and programming.
2018. № 5.
P. 76-83.
DOI: 10.25136/2644-5522.2018.5.27634 URL: https://en.nbpublish.com/library_read_article.php?id=27634
Abstract:
The subject of the research is a steganographic method of embedding information in digital images. Steganography is capable of hiding not only the content of information but also the very fact of its existence. The paper considers one of the most important problems in the development of steganographic methods: the secrecy of the transfer of protected information. Secrecy means not only visual or auditory indistinguishability of a digital media resource from a media resource with embedded information, but also statistical indistinguishability. Special attention is paid to preserving the spatial statistical dependence between image pixels. The methodological basis of the research is the methods of mathematical statistics and image processing theory, as well as image distortion metrics. The novelty of the research lies in the development of a new method of embedding information in static images. The authors consider in detail the problem of applying the autoregressive moving average (ARMA) process to represent the statistical dependence of image pixels. It is shown that the proposed method makes it possible to embed information into digital images without significant distortion.
Keywords:
information embedding algorithm, digital images, CIEDE2000, hidden communication, information security, image processing, ARMA, steganography, autoregression process, image distortion
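As an illustration of the statistical model the method relies on (not the authors' actual embedding algorithm), the sketch below fits a first-order autoregressive model to one image row by least squares; the residuals are the innovation component that embedding must not visibly distort. The pixel values are hypothetical, and the paper's ARMA model is more elaborate.

```python
# Toy AR(1) fit x[n] = a*x[n-1] + e[n] over a single pixel row.
import numpy as np

row = np.array([120, 122, 125, 124, 128, 131, 130, 133], dtype=float)  # hypothetical row
x_prev, x_curr = row[:-1], row[1:]

a = (x_prev @ x_curr) / (x_prev @ x_prev)   # least-squares AR(1) coefficient
residuals = x_curr - a * x_prev             # innovations the embedding must preserve

print(f"AR(1) coefficient: {a:.3f}")
print("prediction residuals:", np.round(residuals, 2))
```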
Reference:
Fayskhanov I.F.
Authentication of users with a stable keyboard handwriting in free text selection
// Cybernetics and programming.
2018. № 3.
P. 72-86.
DOI: 10.25136/2644-5522.2018.3.25044 URL: https://en.nbpublish.com/library_read_article.php?id=25044
Abstract:
The subject of the research is the dynamic process of user authentication by keyboard handwriting with free text selection. This process is a regular friend-or-foe check: the user entering text is under continuous monitoring by the system, and if the identification characteristics do not match, the system refuses to continue working. Free text selection is understood as follows: the user types text according to his current tasks, while the system analyzes this work, extracts features, learns, and, if the characteristics do not match, stops access. The research methods used in this work are theoretical, consisting of analysis, search, and calculations, and empirical, consisting of experiment, comparison, and study. The novelty of the paper is as follows. To date, the most popular authentication method is the password. However, the password is gradually being displaced by biometric means of authentication; for example, many smartphones are now equipped with a fingerprint scanner. Nevertheless, keyboard-handwriting authentication has its advantages: a fingerprint scanning system risks failing to recognize an injured finger, methods of attacking fingerprint scanners already exist, and, most importantly, the proposed keyboard handwriting system monitors input continuously, which makes it possible both to stop an attacker at the authentication phase and to detect him if, for example, he has gained access to the system by fraudulent means.
Keywords:
normal distribution, analysis, recognition, dynamic authentication, keyboard handwriting, biometry, authentication, information security, statistics, experiments
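A minimal sketch of such a continuous check, assuming, as the keywords suggest, that a user's inter-key intervals follow a normal distribution; all timings and the threshold are hypothetical, and the paper's actual feature set is richer.

```python
# Continuous keystroke check: reject a session whose mean inter-key interval
# deviates too far (in standard-error units) from the enrolled profile.
import numpy as np

enrolled = np.array([0.21, 0.19, 0.23, 0.20, 0.22, 0.18, 0.24])  # seconds, training data
mu, sigma = enrolled.mean(), enrolled.std(ddof=1)

def still_same_user(intervals, z_limit=3.0):
    z = abs(np.mean(intervals) - mu) / (sigma / np.sqrt(len(intervals)))
    return z < z_limit

print(still_same_user([0.20, 0.22, 0.21]))  # True: consistent with the profile
print(still_same_user([0.45, 0.50, 0.48]))  # False: access would be stopped
```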
Reference:
Bashmakov D.A.
Adaptive Prediction of Pixels in Gradient Areas to Raise Steganalysis Accuracy of Static Digital Images
// Cybernetics and programming.
2018. № 2.
P. 83-93.
DOI: 10.25136/2644-5522.2018.2.25514 URL: https://en.nbpublish.com/library_read_article.php?id=25514
Abstract:
In his research Bashmakov analyzes the accuracy of background area selection in static digital images by the histogram method as part of steganalysis performed with the Weighted Stego Image and WSPAM methods. He examines how the practical accuracy of steganalysis of static digital images by the Weighted Stego Image and WSPAM methods depends on the kind of prediction model used in gradient regions of an image, in the context of countering data transmission channels that embed into the least significant bit of the spatial domain of static digital images with a significant proportion of homogeneous background. The author analyzes the Weighted Stego steganalysis algorithm and its WSPAM modification. To evaluate the analysis efficiency, the author used the BOWS2 collection. To evaluate the efficiency of homogeneous background selection, images from a wide range of sources were used. The information is embedded by changing the least significant bits of images in the spatial domain with a payload of 3-5%. The efficiency of the methods is determined from the true-positive, true-negative, false-positive and false-negative values of image classification. The author demonstrates the low accuracy of homogeneous background selection by the histogram method, suggests selecting the homogeneous background with a segmentation neural network, and proves its efficiency. He also offers an improved model of pixel prediction in gradient areas of an image that achieves the highest steganalysis accuracy. The results of the research can be used to create systems of passive countermeasures against steganographic data transmission channels based on the Weighted Stego algorithm.
Keywords:
steganalytic algorithm, steganographic embedding, steganalysis method accuracy, image spatial domain, statistical steganalysis, passive resistance, least significant bit, binary classification, steganalysis, steganography
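For orientation, the sketch below shows the weighted stego-image (WS) payload estimator in a commonly cited textbook form: the estimate correlates each pixel's deviation from its LSB-flipped value with its deviation from a predicted value. The naive two-neighbor predictor and uniform weights here are placeholders; the paper's contribution is precisely a better predictor for gradient areas.

```python
# WS payload estimate (textbook form): 2 * sum(w * (s - s_flipped) * (s - pred)).
import numpy as np

def ws_payload_estimate(img):
    img = img.astype(float)
    pred = (img[1:-1, :-2] + img[1:-1, 2:]) / 2.0     # naive horizontal predictor
    s = img[1:-1, 1:-1]
    s_flipped = s + 1 - 2 * (s % 2)                   # pixel with its LSB flipped
    w = np.full_like(s, 1.0 / s.size)                 # uniform weights, for illustration
    return 2.0 * np.sum(w * (s - s_flipped) * (s - pred))

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64))
print(f"estimated payload on a random cover: {ws_payload_estimate(cover):.4f}")
```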
Reference:
Sivachev A.V.
Increasing the efficiency of steganoanalysis in the area of discrete wavelet image transformation by analyzing the parameters of the frequency domain of the image
// Cybernetics and programming.
2018. № 2.
P. 29-37.
DOI: 10.25136/2644-5522.2018.2.25564 URL: https://en.nbpublish.com/library_read_article.php?id=25564
Abstract:
The object of the study is methods of steganalysis in the discrete wavelet transform domain of an image. The author investigates how the fact of embedding in the discrete wavelet transform domain influences the values of coefficients of the discrete cosine transform and discrete sine transform domains of the image, in order to improve the efficiency of detecting embedding into the discrete wavelet transform domain. The influence of embedding in the discrete wavelet transform domain on certain coefficients of the discrete cosine transform and discrete sine transform domains is shown, and the author proposes using these coefficients to improve the quality of training of the support vector machine. Research method: to assess the effectiveness of the proposed steganalysis method, the efficiency of image classification with the proposed coefficients is compared against other popular steganalysis methods for the wavelet decomposition domain. As the steganographic influence, changes to the least significant bits of the discrete wavelet transform coefficients are used. The main result of the study is showing that certain coefficients of the discrete cosine transform and discrete sine transform domains can be used for steganalysis in the discrete wavelet transform domain. Based on the results, an original steganalysis method is proposed that increases the efficiency of steganalysis for the LH and HL regions of the discrete wavelet transform of the image. The obtained results can be used in the development of steganalysis systems to provide effective detection of embedding into the discrete wavelet transform domain of an image.
Keywords:
support vector machine, machine learning, discrete sine transform, discrete cosine transform, discrete wavelet transform, frequency domain, steganalysis, steganography, binary classification, wavelet domain
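The classification setup described above can be sketched roughly as follows: selected DCT and DST coefficients of an image serve as features for a support vector machine deciding cover vs. stego. The low-frequency coefficient blocks, the toy spatial LSB embedding, and the synthetic data below are stand-ins for the specific coefficients and DWT-domain embedding studied in the paper.

```python
# Schematic cover/stego classification on DCT and DST coefficient features.
import numpy as np
from scipy.fft import dctn, dstn
from sklearn.svm import SVC

def features(img):
    c = dctn(img.astype(float), norm="ortho")
    s = dstn(img.astype(float), norm="ortho")
    return np.concatenate([c[:4, :4].ravel(), s[:4, :4].ravel()])

rng = np.random.default_rng(1)
covers = [rng.integers(0, 256, (32, 32)) for _ in range(40)]
# Toy LSB flipping in the spatial domain (the paper embeds into DWT coefficients).
stegos = [c ^ rng.integers(0, 2, (32, 32)) for c in covers]

X = np.array([features(i) for i in covers + stegos])
y = np.array([0] * 40 + [1] * 40)            # 0 = cover, 1 = stego

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```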
Reference:
Dikii D.I., Grishentsev A.Y., Savchenko-Novopavlovskaya S.L., Nechaeva N.V., Eliseeva V.V., Artemeva V.D.
Development of neural network module for user authentication based on handwriting dynamics
// Cybernetics and programming.
2018. № 1.
P. 55-63.
DOI: 10.25136/2644-5522.2018.1.19801 URL: https://en.nbpublish.com/library_read_article.php?id=19801
Abstract:
The article is devoted to the development and investigation of the structure of a neural network module that is part of a user authentication system for various information systems and analyzes the parameters of handwriting dynamics. An algorithm for training the neural network module is also considered. The main task the neural network module should solve is the implementation of a binary classifier over input feature vectors such as the Cartesian coordinates of the handwriting sample along the abscissa and ordinate axes, as well as time cuts that describe the writing speed of the sample. An experiment was performed in which handwriting samples of different volumes were fed to the input of the considered module structures in order to determine the most stable one. A mathematical model of the neural network module and a genetic algorithm for training it are described. The article also provides an overview of the neural network module structures used in other handwriting-dynamics authentication software. The choice of the module structure is substantiated by the results of the experiment. The neural network module is implemented in the Java programming language.
Keywords:
binary classifier, machine learning, perceptron, genetic algorithm, artificial neural network, handwriting dynamics, authentication, biometrics, signature, password
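A minimal sketch in the spirit of the module described above: a one-layer perceptron acting as a binary classifier over handwriting feature vectors, trained by a simple genetic algorithm. The network size, GA parameters, and synthetic data are all illustrative, not the authors' configuration.

```python
# Perceptron binary classifier trained by a toy genetic algorithm.
import numpy as np

rng = np.random.default_rng(42)
n_features = 6                                   # e.g. sampled x, y, and time cuts

genuine = rng.normal(0.0, 1.0, (30, n_features))  # synthetic stand-in for real samples
forged = rng.normal(1.5, 1.0, (30, n_features))
X = np.vstack([genuine, forged])
y = np.array([1] * 30 + [0] * 30)

def predict(w, X):
    return (X @ w[:-1] + w[-1] > 0).astype(int)  # perceptron with bias term

def fitness(w):
    return np.mean(predict(w, X) == y)           # classification accuracy

population = rng.normal(0, 1, (50, n_features + 1))
for generation in range(100):
    scores = np.array([fitness(w) for w in population])
    parents = population[np.argsort(scores)[-10:]]               # keep the 10 fittest
    children = parents[rng.integers(0, 10, 40)] + rng.normal(0, 0.1, (40, n_features + 1))
    population = np.vstack([parents, children])                  # elitism + mutation

best = max(population, key=fitness)
print("accuracy of best individual:", fitness(best))
```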
Reference:
Bashmakov D.A., Prokhozhev N.N., Mikhailichenko O.V., Sivachev A.V.
Application of neighborhood matrices of pixels to improve the accuracy of steganoanalysis of fixed digital images with a homogeneous background
// Cybernetics and programming.
2018. № 1.
P. 64-72.
DOI: 10.25136/2644-5522.2018.1.24919 URL: https://en.nbpublish.com/library_read_article.php?id=24919
Abstract:
The article considers the accuracy of steganalysis by the Weighted Stego algorithm in passive countermeasures against data transmission channels that embed into the least significant bit of the spatial domain of static digital images with the RGB color model. The dependence of the accuracy of Weighted Stego steganalysis on the fraction of homogeneous background in the analyzed image is studied. The drop in the accuracy of pixel prediction in background areas of the image is investigated using the prediction model proposed by the authors of the original Weighted Stego algorithm. The basis of the steganalysis algorithm is a model that predicts the pixel values of the analyzed image from adjacent pixels. To assess the effectiveness of the analysis, the BOWS2 collection was used. Information is embedded by changing the least significant bits of the image in the spatial domain with a payload of 3-5%. The effectiveness of the methods is determined from the obtained true-positive, true-negative, false-positive and false-negative values of image classification. A fall in the accuracy of Weighted Stego steganalysis with an increasing fraction of homogeneous background in the analyzed image is demonstrated. A method for improving the pixel prediction model at the basis of Weighted Stego is proposed, which compensates for the drop in accuracy as the fraction of homogeneous background grows. The results of the work are useful to information security specialists in the tasks of detecting and countering hidden data channels, and can be used in the development of steganalysis systems based on the Weighted Stego algorithm.
Keywords:
steganalysis method accuracy, image spatial domain, statistical steganalysis, passive countermeasure, least significant bit, binary classification, steganalysis, steganography, steganographic embedding, steganalysis algorithm
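To make the failure mode concrete: a neighbor-average predictor degenerates on a homogeneous background, where all neighbors are equal and prediction residuals carry almost no signal. The variance-based background mask below is a simplified stand-in for the authors' neighborhood-matrix refinement; the image and threshold are synthetic.

```python
# Four-neighbor pixel prediction with a variance-based homogeneous-background mask.
import numpy as np

def predict_pixels(img, var_threshold=2.0):
    img = img.astype(float)
    up, down = img[:-2, 1:-1], img[2:, 1:-1]
    left, right = img[1:-1, :-2], img[1:-1, 2:]
    neighbors = np.stack([up, down, left, right])
    pred = neighbors.mean(axis=0)
    flat = neighbors.var(axis=0) < var_threshold    # homogeneous background mask
    return pred, flat

rng = np.random.default_rng(3)
img = np.full((32, 32), 200)                        # homogeneous background...
img[8:24, 8:24] = rng.integers(0, 256, (16, 16))    # ...with a textured object
pred, flat = predict_pixels(img)
print(f"fraction of pixels treated as background: {flat.mean():.2f}")
```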
Reference:
Komarova A.V., Korobeynikov A.G., Menshchikov A.A., Klyaus T.K., Negol's A.V., Sergeeva A.A.
Theoretical possibilities for combining various mathematical primitives within an electronic digital signature scheme.
// Cybernetics and programming.
2017. № 3.
P. 80-92.
DOI: 10.25136/2644-5522.2017.3.23364 URL: https://en.nbpublish.com/library_read_article.php?id=23364
Abstract:
The study is devoted to the algorithms and protocols of electronic digital signatures, which provide the key properties of information: its integrity, authenticity, and availability. The article highlights the problems of modern cryptography and a possible way to solve them by creating an electronic digital signature that can withstand a quantum computer. The article concerns various mathematical primitives that, when used together, can increase the stability of existing cryptosystems. This area of research is new and promising for the development of domestic cryptography. The theoretical methods of research used in this article include the theory of computational complexity; the theory of rings, fields, and lattices; and algorithmic aspects of lattice theory and their application in cryptography, in particular the complexity of solving systems of linear Diophantine equations, the complexity of finding the shortest nonzero lattice vector and the lattice vector closest to a given vector, and the known approximate algorithms for these problems. The experimental methods of research include statistical calculations and data analysis in the Matlab environment, constructing elliptic curves in Mathcad, and creating software implementations of the signature generation algorithm in Python using precompiled modules from the NumPy library. The following results are planned for the future: 1. A methodology for constructing electronic digital signature schemes based on two independent computationally hard problems. 2. A polynomially complex electronic digital signature scheme based on fundamentally different mathematical primitives. 3. An estimate of the size of safe parameters of the developed EDS protocols. 4. A theoretical model of the growth of computation time with the length of an electronic digital signature key.
Keywords:
information security, elliptic curve, lattice theory, cryptosystem, post-quantum cryptography, the shortest vector problem, discrete logarithm, Pollard's algorithm, information privacy, digital signature
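For reference, the two lattice problems the abstract names can be stated in standard notation (this is textbook material, not the authors' formulation): for a lattice generated by a basis B, the shortest vector problem (SVP) and the closest vector problem (CVP) are

```latex
\[
  \mathrm{SVP}: \quad \lambda_1(\mathcal{L}) \;=\; \min_{\mathbf{v} \in \mathcal{L} \setminus \{\mathbf{0}\}} \lVert \mathbf{v} \rVert,
\]
\[
  \mathrm{CVP}: \quad \operatorname{dist}(\mathbf{t}, \mathcal{L}) \;=\; \min_{\mathbf{v} \in \mathcal{L}} \lVert \mathbf{t} - \mathbf{v} \rVert,
  \qquad \mathcal{L} \;=\; \{\, B\mathbf{x} : \mathbf{x} \in \mathbb{Z}^n \,\}.
\]
```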
Reference:
Prokhozhev N.N., Sivachev A.V., Mikhailichenko O.V., Bashmakov D.A.
Improving the precision of steganalysis in the DWT sphere by using the interrelation between the spheres of one-dimensional and two-dimensional developments.
// Cybernetics and programming.
2017. № 2.
P. 78-87.
DOI: 10.7256/2306-4196.2017.2.22412 URL: https://en.nbpublish.com/library_read_article.php?id=22412
Abstract:
The article contains studies aimed at improving the precision of steganalysis in the DWT domain of digital images. The authors analyze the causes of inaccuracy of modern steganalysis methods based on support vector machines and offer directions for improving the training quality. To improve the training quality of the support vector machine, the authors study the interrelation between the domains of the one-dimensional and two-dimensional DWT and the influence of changes in the coefficients of the high-frequency subbands of the two-dimensional DWT on the coefficients of the one-dimensional DWT. The steganographic influence consists in changing the value of the least significant bits of the DWT coefficients. Based on the study results, the authors develop an original method that provides greater precision in detecting information embedded in the high-frequency subbands of the two-dimensional DWT of an image. To prove the precision of the original method, the authors compare it with several modern steganalysis methods. Experimental results of the comparative study show that the original method provides greater precision (generally 10-15% higher than the other evaluated methods) when detecting steganographic influence in the high-frequency HL and LH subbands of the two-dimensional DWT, and matches the precision of the other modern methods evaluated in this article in the high-frequency HH subband.
Keywords:
DWT, steganalysis, support vector machine, data hiding, steganogram, efficiency of steganalysis, machine learning, passive attack, binary classification, steganography
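The interrelation the method exploits can be observed directly with PyWavelets: perturbing a high-frequency subband of the 2D DWT (a stand-in here for LSB embedding into its coefficients) measurably shifts the detail coefficients of the row-wise 1D DWT of the reconstructed image. The wavelet choice and the +/-1 perturbation are illustrative.

```python
# How a 2D-DWT subband perturbation shows up in 1D-DWT detail coefficients.
import numpy as np
import pywt

rng = np.random.default_rng(7)
img = rng.integers(0, 256, (64, 64)).astype(float)

# 2D DWT, then perturb one detail subband (toy stand-in for LSB embedding).
cA, (cH, cV, cD) = pywt.dwt2(img, "haar")
stego = pywt.idwt2((cA, (cH + rng.choice([-1.0, 1.0], size=cH.shape), cV, cD)), "haar")

# Row-wise 1D DWT of cover and stego: the detail coefficients shift.
_, d_cover = pywt.dwt(img, "haar")
_, d_stego = pywt.dwt(stego, "haar")
print("mean |change| in 1D detail coefficients:", np.mean(np.abs(d_stego - d_cover)))
```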
Reference:
Mironov S.V.
Game-theoretic approach to testing compilers for the presence of undeclared capabilities of implementation mechanisms
// Cybernetics and programming.
2017. № 1.
P. 119-127.
DOI: 10.7256/2306-4196.2017.1.20351 URL: https://en.nbpublish.com/library_read_article.php?id=20351
Abstract:
The subject of the research is the mathematical support of software certification procedures for information security requirements, taking into account time, regulatory, and design constraints. An essential requirement of such procedures is the availability of the source code of the software under test, which is quite critical for developers as a potential channel of intellectual property leakage. To overcome this drawback, a technique is proposed for testing compilers for the absence of mechanisms that introduce undeclared capabilities at the software compilation stage. The research methodology combines methods of software engineering, object-oriented programming, systems analysis, and reliability theory. The main conclusion of the study is the following: by forming an optimal set of tests using the mathematical apparatus of game theory, compiling those tests, and comparing the control flow and data graphs obtained from the compiler output with those built from the source texts of the tests, one can conclude whether the tested compiler contains mechanisms for introducing undeclared capabilities into the compiled software.
Keywords:
information security, software engineering, software compilation, introduction of undeclared capabilities, compiler testing, software certification, software, software security, program analysis, certification testing of programs
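The game-theoretic step can be sketched under the usual zero-sum formulation (the abstract does not spell out its model, so the payoff matrix here is hypothetical): rows are candidate tests, columns are ways a compiler might hide an injection mechanism, entries are detection probabilities, and linear programming yields the tester's optimal mixed strategy.

```python
# Optimal mixed strategy over tests for a zero-sum detection game, via LP.
import numpy as np
from scipy.optimize import linprog

A = np.array([[0.9, 0.2, 0.4],      # detection probability of test i
              [0.3, 0.8, 0.5],      # against hiding strategy j (hypothetical)
              [0.5, 0.6, 0.7]])
m, n = A.shape

# Variables: x (probability of running each test) and v (game value).
# Maximize v subject to (A^T x)_j >= v for all j, sum(x) = 1, x >= 0.
c = np.zeros(m + 1); c[-1] = -1.0                    # minimize -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])            # v - (A^T x)_j <= 0
b_ub = np.zeros(n)
A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds)
print("test mix:", np.round(res.x[:-1], 3), "guaranteed detection:", round(res.x[-1], 3))
```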
Reference:
Borodin A.V.
The feasibility study on implementation of technology of support of integrity and authenticity of information on paper carrier in case of aloof document handling
// Cybernetics and programming.
2017. № 1.
P. 30-47.
DOI: 10.7256/2306-4196.2017.1.22192 URL: https://en.nbpublish.com/library_read_article.php?id=22192
Abstract:
The object of the research, in a broad sense, is the document flow system of a commercial enterprise that renders services to the public and uses the Internet as its main channel of communication with clients. At the same time, to support the validity of agreements between the enterprise and its clients, traditional "paper" document flow is used, based on delivery of documents on a solid carrier via a mail service. The subject of the research is the process of detached ("aloof") document handling on the client side, under conditions where the enterprise, as a party to the transaction, has no opportunity to control this process. Special attention is paid to substantiating the economic feasibility of implementing the proposed process of detached document handling. The methodological basis of the research is the systems approach and, in particular, the authors' techniques of ontological analysis. Based on the analysis of an ontological domain model, a specific technical solution for securing the process of detached document handling is proposed, and an event model of this process is synthesized. This model is studied using the approaches of the algebraic theory of risk. The scientific novelty of the research consists in a unique combination of technical solutions to the stated problem; a preliminary market analysis showed the absence of similar solutions in the practice of the interested companies. The main conclusion of the research is the possibility and feasibility of using detached document handling technologies as a transition stage toward fully electronic document management between a commercial enterprise and its counterparties of arbitrary nature.
Keywords:
total cost of ownership, Petri net, security policy, conceptual model, threat model, document, data integrity, digital signature, legal recognition, QR-code
Reference:
Gorbunova E.S.
Dynamic Authentication of Users in Learning Management Systems
// Cybernetics and programming.
2016. № 4.
P. 65-72.
DOI: 10.7256/2306-4196.2016.4.19517 URL: https://en.nbpublish.com/library_read_article.php?id=19517
Abstract:
The object of the research is the mechanism of dynamic authentication via keystroke dynamics. The author examines reinforced authentication of users in Learning Management Systems, as e-learning gradually occupies its niche in the education environment. The purpose of the research is to develop and verify a system of dynamic authentication. Special attention is paid to analyzing biometric authentication, developing the architecture of the required system and an algorithm for classifying users based on learning the classifier's parameters, and testing the results. The author analyzes the methods and algorithms used in the field of dynamic authentication and offers an alternative solution to the problem. The main results of the research include the architecture of the user authentication mechanism in Learning Management Systems and a description of the algorithm for dividing users into two classes. In accordance with the obtained requirements for the system, the author implemented the aforesaid mechanism in practice and tested it. The test shows that the desired result was obtained for the type I and type II errors. The mechanism of dynamic keystroke authentication can be used not only in Learning Management Systems but also in other systems with a similar intruder model.
Keywords:
parametric classifier, biometrics, Learning Management System, strong authentication, biometric authentication, keystroke dynamics, security, dynamic authentication, confidentiality, behavioral authentication
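The type I and type II errors mentioned above are conventionally the false rejection rate (FRR) and false acceptance rate (FAR) of a threshold classifier; the sketch below computes both on synthetic score distributions standing in for the classifier's output.

```python
# FRR (type I) and FAR (type II) of a threshold decision over similarity scores.
import numpy as np

rng = np.random.default_rng(5)
genuine_scores = rng.normal(0.8, 0.1, 1000)   # legitimate user's sessions (synthetic)
impostor_scores = rng.normal(0.4, 0.1, 1000)  # other users' sessions (synthetic)

threshold = 0.6
frr = np.mean(genuine_scores < threshold)     # legitimate user rejected
far = np.mean(impostor_scores >= threshold)   # impostor accepted
print(f"FRR = {frr:.3f}, FAR = {far:.3f}")
```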
Reference:
Piskova A.V., Korobeinikov A.G.
Features of applying lattice theory in digital signature schemes
// Cybernetics and programming.
2016. № 2.
P. 8-12.
DOI: 10.7256/2306-4196.2016.2.17970 URL: https://en.nbpublish.com/library_read_article.php?id=17970
Abstract:
The subject of the study is digital signature schemes, an important element in building secure systems and used in most real-world security protocols. The reliability of existing electronic digital signature schemes can be severely lowered by advances in classical cryptanalysis or progress in the development of quantum computers. A potential alternative approach is to construct schemes based on the complexity of certain lattice problems, which are believed to be intractable for quantum computers. Due to significant scientific advances in recent years, schemes based on lattice theory are already used in practice and are a very viable alternative to number-theoretic cryptography. The study is based on the methods of lattice theory. This choice is dictated by the absence of polynomial-time algorithms for finding the shortest vector or the nearest vector of a lattice. The main conclusion of the paper is that the main direction of future development for digital signature schemes based on lattice theory is their optimization and the implementation of the Fiat-Shamir model in them. For example, the BLISS scheme has shown high performance and can therefore be integrated into portable systems and devices.
Keywords:
digital signature, RSA, post-quantum cryptography, BLISS scheme, cryptography, Fiat-Shamir transformation, lattice theory, Abelian group, Euclidean space, identification scheme
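To make the Fiat-Shamir transformation concrete: an interactive identification scheme (commit, challenge, response) becomes a signature scheme by deriving the challenge from a hash of the commitment and the message. The toy Schnorr-style group below (p = 23, q = 11, g = 4) is deliberately insecure and serves only to show the pattern; BLISS applies the same transform over lattice mathematics.

```python
# Toy Fiat-Shamir signature from a Schnorr-style identification scheme.
import hashlib
import secrets

p, q, g = 23, 11, 4          # toy group: g has order q modulo p; NOT secure
x = secrets.randbelow(q)     # secret key
y = pow(g, x, p)             # public key

def H(r, msg):
    return int.from_bytes(hashlib.sha256(f"{r}|{msg}".encode()).digest(), "big") % q

def sign(msg):
    k = secrets.randbelow(q)
    r = pow(g, k, p)         # commitment
    e = H(r, msg)            # challenge derived by hashing (the Fiat-Shamir step)
    s = (k + e * x) % q      # response
    return r, s

def verify(msg, sig):
    r, s = sig
    return pow(g, s, p) == (r * pow(y, H(r, msg), p)) % p

sig = sign("hello")
print(verify("hello", sig), verify("tampered", sig))   # True False
```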
Reference:
Menshchikov A.A., Gatchin Y.
Detection methods for web resources automated data collection
// Cybernetics and programming.
2015. № 5.
P. 136-157.
DOI: 10.7256/2306-4196.2015.5.16589 URL: https://en.nbpublish.com/library_read_article.php?id=16589
Abstract:
The article deals with the problem of automated data collection from web resources. The authors present a classification of detection methods that takes modern approaches into account, analyze existing methods for detecting and countering web robots, and study the possibilities and limitations of combining these methods. To date, there is no open system of web robot detection suitable for use in real conditions, so the development of an integrated system that combines a variety of methods, techniques, and approaches is an urgent task. To solve this problem the authors developed a software product, a prototype of such a detection system, and tested it on real data. The theoretical significance of this study is in developing this direction in the domestic segment: building a web robot detection system based on the latest methods and on improving global best practices. The applied significance is in creating a basis for the development of in-demand and promising software.
Keywords:
web-robots, information gathering, parsing, web robot detection, web security, information security, information protection, intrusion detection, intrusion prevention, weblogs analysis
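A simplified sketch of the log-based detection such surveys classify as syntactic: sessions are scored on a few features such as request rate, asset-request ratio, and robots.txt access. The session structure, features, and thresholds are illustrative; real detectors combine many more signals.

```python
# Heuristic web-robot scoring over per-session log features.
from dataclasses import dataclass

@dataclass
class Session:
    requests: int
    duration_sec: float
    fetched_robots_txt: bool
    image_css_ratio: float        # share of requests for images/CSS/JS

def looks_like_robot(s: Session) -> bool:
    rate = s.requests / max(s.duration_sec, 1.0)
    score = 0
    score += rate > 2.0               # inhumanly fast paging
    score += s.fetched_robots_txt     # browsers normally never request it
    score += s.image_css_ratio < 0.1  # crawlers often skip page assets
    return score >= 2

print(looks_like_robot(Session(300, 60.0, True, 0.02)))   # True
print(looks_like_robot(Session(25, 600.0, False, 0.7)))   # False
```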
Reference:
Mironov S.V., Kulikov G.V.
Technologies of security control for automated systems on the basis of structural and behavioral software testing
// Cybernetics and programming.
2015. № 5.
P. 158-172.
DOI: 10.7256/2306-4196.2015.5.16934 URL: https://en.nbpublish.com/library_read_article.php?id=16934
Abstract:
The subject of the study is the basic methods and principles of testing software systems used for safety evaluation and control of automated systems. The study provides recommendations on methods of software testing against the most common threats to such security subsystems as firewalls, auditing, access control, integrity monitoring, passwords, and encryption. The authors consider the possibility that a product could contain the following vulnerabilities: buffer overflows, format string errors, and race conditions. The research methods include methods of programming theory, reliability theory, software engineering, error-correcting coding, information security, and systems analysis. The main conclusion of the study is that software testing is a powerful tool for detecting both errors in software and security vulnerabilities. Modern methods of behavioral testing make it possible to identify vulnerabilities without the software's source code and can be used successfully on the Russian market, where obtaining source code for testing purposes is almost impossible.
Keywords:
structural testing, software engineering, program testing method, security subsystem, software vulnerabilities, behavioral testing, testing programs, information security, safety of the automated system, program security threats
Reference:
Galanina N.A., Ivanova N.N.
Analysis of the effectiveness of synthesis of computing devices for non-positional digital signal processing
// Cybernetics and programming.
2015. № 3.
P. 1-6.
DOI: 10.7256/2306-4196.2015.3.15354 URL: https://en.nbpublish.com/library_read_article.php?id=15354
Abstract:
The article examines methods, algorithms, and computing devices for encoding, digital filtering, and spectral analysis of signals. The subject of the study is methods of synthesis and analysis of devices for digital filtering and spectral analysis of signals in the residue number system. The article presents an efficiency analysis of the synthesis of computing devices for non-positional digital signal processing in the residue number system. The authors show the results of a comparative performance evaluation of computing devices for digital filtering and spectral analysis, and propose a method for increasing the speed of digital devices in the residue number system. The research is based on the apparatus of mathematical analysis, mathematical logic, the theory of algorithms, the theory of algebraic integers, automata theory, the theory of the discrete Fourier transform and its fast variants, probability theory, mathematical methods, and simulation. The study presents ways of implementing digital signal processing algorithms in the residue number system on modern signal processors, taking into account the peculiarities of the residue number system. Implementing digital devices on digital signal processors intended for data processing in non-positional number systems, including the residue number system, is a promising line of development for digital signal processing devices.
Keywords:
fast Fourier transform, residue number system module, speed, digital signal processor, residue number system, digital signal processing, hardware expenses, spectrum analysis, service simulating test, nonpositional notation
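The non-positional arithmetic underlying these devices can be sketched briefly: with pairwise coprime moduli, addition and multiplication proceed carry-free and independently in each channel, and the result is recovered via the Chinese remainder theorem. The moduli below are an illustrative choice.

```python
# Residue number system arithmetic with CRT reconstruction.
from math import prod

moduli = (3, 5, 7)                          # pairwise coprime; illustrative choice

def to_rns(x):
    return tuple(x % m for m in moduli)

def from_rns(residues):
    M = prod(moduli)
    total = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)    # CRT reconstruction
    return total % M

a, b = to_rns(17), to_rns(23)
product = tuple((x * y) % m for x, y, m in zip(a, b, moduli))
print(from_rns(product), "==", (17 * 23) % prod(moduli))   # 17*23 = 391 = 76 (mod 105)
```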
Reference:
Sidel'nikov O.V.
Comparison of computational complexity of classification algorithms for recognizing the signs of cyber attacks
// Cybernetics and programming.
2014. № 6.
P. 7-16.
DOI: 10.7256/2306-4196.2014.6.13306 URL: https://en.nbpublish.com/library_read_article.php?id=13306
Abstract:
The article presents a comparison of the computational complexity of two logical classification algorithms: a sequential search (brute force) algorithm and an inductive state prediction algorithm. The logic algorithms are implemented in Matlab. To compare the computational complexity of the classification algorithms, the author uses Zakrevskiy's technique. The classification problem is one of the main problems in detecting threats of cyber attacks in an information system. Information about the signs of cyber attacks can be received from various sources (sensors) of the software and hardware of the information system, for example antivirus tools, RAM dumps, hard drive logs, user logon information, etc. Each of these sources contains information that can be used to determine the presence of an attack on the system. The article reviews the problem of logical classification of existing data using the two algorithms. The use of the adapted method of inductive state prediction made it possible to reduce the amount of computation, obtaining an average gain of K ≈ 9.3 and thereby reducing the time needed to detect computer attacks.
Keywords:
logical classification algorithm, inductive algorithm, software, algorithm, threat, security, computational complexity, brute force, states prediction, Matlab
Reference:
Khaliullin A.I.
Implementation of the electronic workflow in the activities of law enforcement organizations of the Commonwealth of Independent States
// Cybernetics and programming.
2013. № 6.
P. 12-16.
DOI: 10.7256/2306-4196.2013.6.10279 URL: https://en.nbpublish.com/library_read_article.php?id=10279
Abstract:
The article reviews the current state and prospects of the development of electronic workflow systems in the activities of the law enforcement organizations of the Commonwealth of Independent States, using the example of a multiservice network for the secure exchange of criminalistic data between the Ministries of Internal Affairs (police forces) of the CIS. Information technologies provide rapid exchange of information between law enforcement agencies of the CIS. The multiservice network is a hardware-software complex that includes software and an automated workplace for the specialist. The efficiency of the multiservice network is also defined by the amount of criminalistic data in the database, which is most actively filled by the Ministries of Internal Affairs of Russia, Belarus, and Tajikistan. The author suggests ways of improving the multiservice network with regard to maintaining the legal significance of procedural documents transmitted within the network through the use of electronic signatures.
Keywords:
multiservice network, relevant criminalistic information, law enforcement agencies of the CIS, informatization, investigation of crimes, secure data exchange, cooperation of law enforcement organizations, electronic workflow, criminalistic record, procedural documents
Reference:
Galanina N.A., Ivanova N.N., Pesoshin V.A.
Ways of implementing digital signal encoders using residues in the residue number system
// Cybernetics and programming.
2013. № 1.
P. 21-36.
DOI: 10.7256/2306-4196.2013.1.8311 URL: https://en.nbpublish.com/library_read_article.php?id=8311
Abstract:
The article presents an analytical review of ways to implement encoders of input signal residues in the residue number system and justifies the selection of their optimal structures. The authors evaluate the hardware and time costs of the considered circuit solutions. The purpose of the study is to consider all possible options for encoding input signals into residual classes on a modern element base, taking full advantage of the residue number system; to evaluate the hardware and time complexity of these options; and to select and justify the best solution by the stated criteria. The hardware cost of logic-based encoders is expressed as the number of two-input logic elements, and that of EPROM-based encoders as the information capacity in bits. The hardware cost of encoders built on logic circuits depends on how many parts the input sequence is divided into. It is concluded that logic-based encoders can be further simplified and, as a consequence, hardware expenses reduced.
Keywords:
the element base, integrated circuit, scrambler, research, residual classes, digital signal, coding, bits, microcircuitry, hardware expenses
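The table-lookup encoder structure evaluated in the article can be sketched as follows: the input word is split into parts, each part's residue is read from a precomputed table (modeling an EPROM), and the partial residues are summed modulo m. The word width, part width, and modulus below are illustrative.

```python
# Table-lookup residue encoder: split the input word, look up each part's
# residue in a precomputed table, and combine modulo M.
PART_BITS = 4
M = 7                                        # RNS modulus for this channel

def make_tables(total_bits):
    """Residue of (part_value << shift) for each part position and value."""
    n_parts = total_bits // PART_BITS
    return [[(value << (PART_BITS * i)) % M for value in range(1 << PART_BITS)]
            for i in range(n_parts)]

TABLES = make_tables(16)

def encode(x):
    """Residue of a 16-bit input, computed from 4-bit part lookups."""
    acc = 0
    for i, table in enumerate(TABLES):
        part = (x >> (PART_BITS * i)) & ((1 << PART_BITS) - 1)
        acc = (acc + table[part]) % M
    return acc

print(encode(50000), 50000 % M)    # both print the same residue
```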
Reference:
Korobeinikov A.G., Kutuzov I.M., Kolesnikov P.Y.
Analysis methods of obfuscation
// Cybernetics and programming.
2012. № 1.
P. 31-37.
DOI: 10.7256/2306-4196.2012.1.13858 URL: https://en.nbpublish.com/library_read_article.php?id=13858
Abstract:
Modern computer technology makes a variety of information security tasks relevant. For example, steganographic methods are used to protect copyright in images, while code obfuscation techniques are used to solve the problem of proving (or disproving) authorship of code. Obfuscation (from Latin obfuscare, 'to darken, obscure', and English obfuscate, 'to make non-obvious, confusing') is the process of transforming the source code or executable code of a program into a form that preserves its functionality but complicates analysis, understanding of its algorithms, and modification during decompilation. Currently, there are special programs called obfuscators that perform obfuscation in different ways. The article discusses obfuscation techniques from the most basic ones to sophisticated polymorphic generators and obfuscators that perform mathematical transformations of the code, as well as the relationship between obfuscation, the efficiency of program code execution, and program size. The authors describe the further development of obfuscation techniques.
Keywords:
optimization, obfuscator, recognition, methods of obfuscation, concealment of information, information protection, copyright, steganography, obfuscation, decompilation
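One technique such surveys cover, the opaque predicate, is easy to show in miniature: a condition whose outcome is fixed and known to the obfuscator but not obvious to an analyzer, letting dead code inflate the control-flow graph. The example below is a generic textbook illustration, not taken from the article.

```python
# Opaque predicate: n*n + n = n(n+1) is always even for integer n, so the
# branch always takes the same path; the else-branch is dead code inserted
# purely to confuse analysis.
def secret_sum(a, b):
    n = a ^ b                        # arbitrary integer derived from the inputs
    if (n * n + n) % 2 == 0:         # opaque predicate: always true
        return a + b                 # the real computation
    else:
        return a - b                 # dead code, never executed

print(secret_sum(5, 7))              # always 12
```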