Programming languages
Reference:
Smorkalov A.Yu., Kirsanov A.N.
Bots behavior programming for virtual reality
// Software systems and computational methods.
2014. № 2.
P. 149-159.
URL: https://en.nbpublish.com/library_read_article.php?id=65258
Abstract:
Nowadays the field of application of virtual worlds in education grows steadily.
Training systems, simulators, role-playing and serious games are the most successful solutions
for education in virtual environments. An important part of the above-mentioned approaches
to education is the use of pedagogical agents (bots), which participate in the learning process
and help students to complete educational assignments. The vAcademia virtual world supports
implementations of active learning forms through the vJS programming language; however,
programming and using bots was previously unavailable. The article reviews a bot management
system that allows each user of vAcademia to place and configure bots, as well as to define their
behavior via an extended vJS language. Programming of bots’ behavior is based on the
object-oriented approach, auto-synchronized functions, specification of sequences of asynchronous
actions, and organization of bot-user interaction through voiced dialogs with predefined
multiple-choice answers. The interaction with preprogrammed bots can be saved in the form
of a 3D recording for further viewing, which is highly important in the sphere of education.
Keywords:
virtual worlds, virtual environments, learning tools, virtuality, bots, programming languages, scripts, built-in programming languages, synchronization, avatars
Forms and methods of information security administration
Reference:
Korobeinikov A.G., Pirozhnikova O.I.
Model of mathematical calculations of the probability of unauthorized physical
penetration to information assets
// Software systems and computational methods.
2014. № 2.
P. 160-165.
URL: https://en.nbpublish.com/library_read_article.php?id=65259
Abstract:
According to the current state standards, “security of information assets” combines
four types of protection: physical, technical, legal and cryptographic. This implies that it is a
complex concept. Furthermore, in accordance with regulatory documents, protective actions
for providing information security are subdivided into organizational and technical measures.
Technical protective actions are directed at such functions as threat restriction, deterrence,
prevention, detection, notification of various events at the informatization facility, monitoring
the state of information assets, error correction, asset recovery, etc. Analysis of the current
state of methods and means for alarm systems, which are the most important component
of a complex information security system, showed that such systems need to be constantly
improved to meet the constantly rising requirements for the protection of modern
informatization objects. Hence it follows that the development of mathematical models for
calculating the probability of unauthorized physical penetration to information assets, forming
the integrated system of information security, is an urgent task. To solve the presented problem
the article uses methods of information protection, graph theory and probability theory. The
presented results were obtained using the Maple computer algebra system. The scientific
novelty lies in the methods based on graph theory and in the mathematical model for
calculating the probability of unauthorized physical penetration to information assets. The
model itself is built in three stages on the basis of specific source data from the estimation of
the probability of detecting an unauthorized physical penetration to information assets by an
alarm system.
Keywords:
neograph, acyclic graph, unauthorized physical penetration, technical protection measures, protection of information assets, digraph, adjacency matrix, weight matrix, Dijkstra algorithm, composition of probabilities
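The article itself does not publish its Maple code, but the combination of a weighted graph, the Dijkstra algorithm, and composition of probabilities suggests a model like the following sketch. It is purely illustrative: the graph structure, node names, and the convention that each edge carries the probability of traversing it undetected are assumptions, not the authors' actual model.

```python
import heapq
import math

def most_probable_penetration_path(graph, start, target):
    """Dijkstra on edge weights w = -log(p), where p is the assumed probability
    that an intruder traverses an edge undetected. Minimizing the sum of -log(p)
    maximizes the product (composition) of probabilities along the path."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, p in graph.get(u, []):
            nd = d + (-math.log(p))
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the path and the composed probability of undetected penetration
    path = [target]
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    return path, math.exp(-dist[target])

# Hypothetical facility: adjacency lists of (next zone, probability undetected)
facility = {
    "outside": [("yard", 0.9)],
    "yard": [("server_room", 0.5), ("office", 0.8)],
    "office": [("server_room", 0.7)],
}
path, prob = most_probable_penetration_path(facility, "outside", "server_room")
```

Here the indirect route through the office (0.9 × 0.8 × 0.7 = 0.504) beats the direct one (0.9 × 0.5 = 0.45), which is the kind of weakest-path result such a model is meant to surface.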
Quality aspects and improving the margin of reliability of software systems
Reference:
Boikov S.A.
Expert evaluation of functional completeness of automated information systems for
public institutions
// Software systems and computational methods.
2014. № 2.
P. 166-173.
URL: https://en.nbpublish.com/library_read_article.php?id=65260
Abstract:
The article studies methods of determining the functional completeness of automated
information systems implemented in state social institutions. The author defines a list of
automated functions directly affecting the efficiency of the institution. The article reviews
the use of a technique based on the Delphi method in expert evaluation. The author considers
features of the above-mentioned technique as well as Spearman’s rank correlation coefficient
for the convergence of expert evaluations. The application of the method is demonstrated on
the example of eight different software products for automating the provision of services by
public social institutions. For the first time the article proposes the use of a Delphi-based
technique for expert evaluation with Spearman’s rank correlation coefficient for the
convergence of expert evaluations in assessing the quality and functional completeness of
automated information systems implemented in state social institutions. The analysis described
in the article shows that application of this technique significantly increases the objectivity
of the evaluation through the use of feedback, analysis of the results of previous stages, and
taking them into account when estimating the significance of expert opinions.
Keywords:
functional completeness, expert evaluation, information systems, Delphi method, Spearman coefficient, methods of peer review, convergence of expert evaluations, pair correlation coefficient, coefficient of concordance, Kendall coefficient
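Spearman's rank correlation coefficient, which the article uses to check the convergence of expert evaluations, can be sketched as follows. The function names and the tiny example rankings are illustrative only; the abstract does not disclose the authors' actual data.

```python
def ranks(values):
    """Assign ranks 1..n, averaging ranks over ties (standard Spearman convention)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average position of the tied block, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Two experts ranking the same systems identically give rho = 1; fully opposed rankings give rho = -1. In a Delphi procedure, rounds would be repeated until the pairwise rho values exceed an agreed convergence threshold.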
Knowledge Base, Intelligent Systems, Expert Systems, Decision Support Systems
Reference:
Borodin A.V.
Architecture of information system of decision support in personnel management for
retail subsystem of commercial bank
// Software systems and computational methods.
2014. № 2.
P. 174-190.
URL: https://en.nbpublish.com/library_read_article.php?id=65261
Abstract:
Despite the reduction in corporate market volumes, for many branches of
commercial banks the center of attraction for front-office subdivisions increasingly
shifts towards the retail segment. However, fierce competition in retail forces banks to take
the unpopular measures of labor intensification and personnel optimization. Under
such conditions traditional ways of decision making typical for HR do not work well. New
approaches for retrieving data for decision making must be found; new tools for objective
analysis of the situation and for developing optimal solutions are needed. The article is devoted
to a description of a practical solution to the above-mentioned problem. In other words, the
subject of the study presented in this paper is the HR process in the retail subsystem of a
commercial bank. The research on the process of decision making in the HR subsystem of a
commercial bank was carried out in terms of systems analysis. The author developed a visual
model representing processes typical for the subsystem and revealed the sources of information
for decision making. Next, using the built model, based on simulation techniques and methods
of numerical optimization, a technology for automated preparation of recommendations on
personnel policy for the retail subsystem of credit institutions was developed. The approach
suggested in this paper differs fundamentally from its analogues in the breadth of coverage
of available sources of information and in the methods of extracting knowledge from databases.
The suggested approach for the first time implements simulation modeling of operational risks
for the retail subsystem of a commercial bank, allowing the calculation of the complete
probability space of outcomes. Thus, higher accuracy and stability of calculations are provided
compared to Monte Carlo methods at comparable computational cost.
Keywords:
computational complexity, simulation modeling, Petri nets, risk, decision making support, personnel management, commercial bank, class of complexity, optimization, Nelder-Mead method
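The claimed advantage over Monte Carlo, namely computing the complete probability space of outcomes rather than sampling it, can be illustrated with a toy risk model. This is not the article's Petri-net machinery; it only shows the exact-enumeration idea for a handful of independent risk events with assumed probabilities and losses.

```python
from itertools import product

def exact_loss_distribution(risks):
    """risks: list of (probability, loss) for independent operational-risk events.
    Enumerates all 2**n outcomes and returns the exact probability of each
    total-loss value, i.e. the complete probability space, with no sampling error."""
    dist = {}
    for outcome in product([0, 1], repeat=len(risks)):
        p, loss = 1.0, 0.0
        for happened, (pr, l) in zip(outcome, risks):
            p *= pr if happened else (1.0 - pr)
            loss += l if happened else 0.0
        dist[loss] = dist.get(loss, 0.0) + p
    return dist

# Hypothetical risks: (probability of occurrence, loss if it occurs)
dist = exact_loss_distribution([(0.1, 100.0), (0.2, 50.0)])
```

Exhaustive enumeration is exponential in the number of events, which is why it pays off only while the outcome space stays small; within that range it replaces a Monte Carlo estimate with an exact distribution at comparable cost, as the abstract argues.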
Knowledge Base, Intelligent Systems, Expert Systems, Decision Support Systems
Reference:
Galochkin V.I.
Enumeration of cost-constraint based decision trees on AND-OR tree
// Software systems and computational methods.
2014. № 2.
P. 191-196.
URL: https://en.nbpublish.com/library_read_article.php?id=65262
Abstract:
the article reviews AND-OR trees with defined cost of arcs or vertices, widely
used in artificial intelligence systems. The author describes branch and bound algorithm
allowing enumerating all decision trees with cost less or equaling to defined constant value.
The complexity of producing another decision tree is O(N), where N is the number of vertices of the AND-OR tree. The article shows a way to use stack for information organization, reducing
the memory consumption to O(N) without changing the previous complexity estimate. The
article presents software realization of the described algorithm, proving the theoretical
evaluation of complexity and amount of the required memory in tests. The effectiveness of
search is increased by introduction of the concept of a minimal AND-OR tree cost-constraint
subset, which ensures the existence of valid decision trees while descending the decision tree.
The decision subtrees are not listed separately but organized in blocs of AND-OR subtress in
which all options are possible.
Keywords:
artificial intelligence, algorithm, enumeration, AND-OR graph, AND-OR tree, decision tree, AND-OR tree version, cost, cost constraint
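The enumeration task can be sketched with a naive recursive generator: a decision tree keeps all children of an AND node, picks exactly one child of an OR node, and is pruned as soon as accumulated cost exceeds the budget. This is only the problem statement in code; it lacks the article's O(N)-per-tree bound, stack organization, and minimal-subset pruning, and the tuple encoding of nodes is my own convention.

```python
def decision_trees(node, budget):
    """Yield (cost, subtree) for every decision tree of the AND-OR tree rooted
    at `node` whose total cost does not exceed `budget`.
    A node is ('AND'|'OR', cost, children) or ('LEAF', cost)."""
    kind, cost = node[0], node[1]
    children = node[2] if len(node) > 2 else []
    if cost > budget:
        return  # branch and bound: this subtree cannot fit the budget
    if kind == 'LEAF':
        yield cost, node
        return
    if kind == 'OR':
        # an OR node contributes exactly one chosen child to a decision tree
        for child in children:
            for c, sub in decision_trees(child, budget - cost):
                yield cost + c, ('OR', cost, [sub])
        return
    # an AND node contributes one decision subtree from *each* child
    def combine(idx, spent, chosen):
        if idx == len(children):
            yield cost + spent, ('AND', cost, list(chosen))
            return
        for c, sub in decision_trees(children[idx], budget - cost - spent):
            yield from combine(idx + 1, spent + c, chosen + [sub])
    yield from combine(0, 0, [])

# Hypothetical tree: choose either an AND branch (total cost 6) or a leaf (cost 4)
tree = ('OR', 0, [('AND', 1, [('LEAF', 2), ('LEAF', 3)]), ('LEAF', 4)])
```

With a budget of 5 only the cost-4 leaf survives; raising the budget to 6 admits the AND branch as well, which is exactly the cost-constrained enumeration behavior the article's algorithm performs far more efficiently.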
Systems analysis, search, analysis and information filtering
Reference:
Batura T.V.
Techniques of determining author’s text style and their software implementation
// Software systems and computational methods.
2014. № 2.
P. 197-216.
URL: https://en.nbpublish.com/library_read_article.php?id=65263
Abstract:
the article presents a review of formal methods of text attribution. The problem of
determining the authorship of texts is present in different field and is important for philologists,
literary critics, historians, lawyers. In solving the problem of text attribution the main interest
and the main complexity is in the analysis of syntactic, lexical/idiomatic and stylistic levels
of text. In a sense, a narrower task is in the text sentiment-analysis (defining the tone of the
text). Techniques for solving the task can be useful for identifying authorship of the text.
Unfortunately, expert analysis of author’s style is complex and time consuming. It’s desirable to
find new approaches, allowing at least partially automate experts’ work. Therefore the article
pays special attention exactly to the formal methods of author’s identification and software
implementation of such methods. Currently, algorithms of data compression, methods of
mathematical statistics, probability theory, neural networks algorithms and cluster analysis
algorithms are applied for text attribution. The article describes the most popular software
systems for author’s style identification for Russian language. Author attempts to make a
comparative analysis, identify features and drawbacks of the reviews approaches. Among the
problems hindering researches in text attribution there are a problem of selecting linguostylistic
parameters of the text and a problem of selecting sample texts. The author states
that there is a need in further researches, aimed at finding new or improving existing methods
of texts attribution, at finding new characteristics allowing to clearly separate author’s style,
including cases of short texts and small number of sample texts.
Keywords:
text attribution, defining authorship, formal text parameters, author’s style, text classification, machine learning, statistical analysis, computer linguistics, identification of author’s style, analysis of textual information
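A minimal statistical-attribution sketch, in the spirit of the methods the survey covers: represent each text by its function-word frequencies (a common choice of formal, topic-independent parameters) and attribute a disputed text to the author with the most similar profile. The word list, similarity measure, and sample texts are all illustrative assumptions; real systems use far richer feature sets.

```python
import re
from collections import Counter

# A tiny, assumed inventory of English function words; real stylometric
# systems use dozens to hundreds of such features.
FUNCTION_WORDS = ["the", "of", "and", "to", "in", "that", "is", "was"]

def style_vector(text):
    """Relative frequency of each function word in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def attribute(disputed, samples):
    """samples: {author: sample text}. Return the author whose function-word
    profile is closest (by cosine similarity) to the disputed text."""
    v = style_vector(disputed)
    return max(samples, key=lambda a: cosine(v, style_vector(samples[a])))
```

The survey's point about short texts and few samples shows up immediately in such a sketch: with tiny inputs the frequency estimates are noisy, which is precisely why the choice of linguo-stylistic parameters remains an open problem.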
Mathematical models and computer simulation experiment
Reference:
Egoshin A.V., Motorov M.N.
Coordinate calculations in GPS and GLONASS navigation systems based on the
measuring of time of satellite signals arrival
// Software systems and computational methods.
2014. № 2.
P. 217-227.
URL: https://en.nbpublish.com/library_read_article.php?id=65264
Abstract:
There are two radio navigation systems in use: NAVSTAR (GPS) and GLONASS.
Both systems use the same approach: determination of the distance from satellites to the
object. Distance measurement is performed by measuring the propagation time of the satellite
signal to the object. For that purpose the receiver generates a pseudorandom code at the
exact moment when the satellite transmits the signal. Upon receiving the signal, the receiver
calculates the propagation time as the difference between the time of the pseudorandom code
generation and the time of signal reception. This raises the need to synchronize the clocks
of the satellite and the receiver. Due to hardware limitations, not all receivers can be
synchronized with the satellite’s clock. The method presented in the research is based on the
analysis of radio navigation systems, their principles of functioning and existing techniques
for determining an object’s coordinates. The article proposes a new way of determining the
coordinates based on measuring the time difference of signal arrivals from different satellites.
Because of this there is no need for strict synchronization of the receiver clock and the
satellite clock, for detecting the moment of signal transmission from satellite to receiver, or
for the use of corrective systems.
Keywords:
GPS, delay, trilateration, coordinate determination, satellite navigation, GLONASS, corrective systems, radio navigation, WAAS, range difference method
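The range-difference idea can be sketched numerically: arrival-time differences between satellite pairs fix range differences (hyperbolas), and the receiver position is wherever those hyperbolas intersect. The 2D geometry, satellite coordinates, and brute-force grid search below are illustrative simplifications of my own, not the authors' method; a real receiver works in 3D with an iterative least-squares solver.

```python
C = 299_792_458.0  # speed of light, m/s

def tdoa_residual(pos, sats, tdoas):
    """Sum of squared mismatches between measured range differences
    (c * time-difference-of-arrival relative to satellite 0) and the range
    differences implied by candidate position `pos`. Note that any common
    receiver clock offset cancels out of the differences."""
    def dist(p, s):
        return sum((a - b) ** 2 for a, b in zip(p, s)) ** 0.5
    d0 = dist(pos, sats[0])
    return sum((C * t - (dist(pos, s) - d0)) ** 2
               for s, t in zip(sats[1:], tdoas))

def locate(sats, tdoas, grid, step):
    """Brute-force grid search for the position minimizing the TDOA residual."""
    best, best_r = None, float("inf")
    x = grid[0][0]
    while x <= grid[0][1]:
        y = grid[1][0]
        while y <= grid[1][1]:
            r = tdoa_residual((x, y), sats, tdoas)
            if r < best_r:
                best, best_r = (x, y), r
            y += step
        x += step
    return best
```

Because only differences of arrival times enter the residual, no absolute receiver clock is required, which is the core of the range-difference method the article builds on.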
Software for innovative information technologies
Reference:
Bakhrushin V.E.
Software implementation of non-linear statistics relationships analysis methods in the
R system
// Software systems and computational methods.
2014. № 2.
P. 228-238.
URL: https://en.nbpublish.com/library_read_article.php?id=65265
Abstract:
existing software for data statistical analysis (SPSS, Statistica etc.) usually offer
for defining correlations just methods applicable for finding linear relationships in numerical
data, along with some relation indicators for rank, qualitative and mixed data. However actual
relation between quantitative data is often nonlinear. This leads to the fact that present
means do not allow identifying such relations, which can lead to false conclusions about the
absence of correlation. An universal indicator of present statistical correlation between two
rows of numerical data is sample coefficient of determination. There are two approaches to
calculated that coefficient: first is based on the approximation of some unknown function with
piecewise constant function, second is based on the smoothing available data. The article
proposes software realization for both methods in R system. The advantage of this system is in the availability of a large number of specialized library functions for statistical analysis, as
well as in writing programs for non-standard tasks. Testing of the developed application on
model examples proved their correctness allowing the use for solving practical problems in
nonlinear correlation analysis.
Keywords:
nonlinear relationship, coefficient of determination, software, R programming Language, data smoothing, correlation ratio, Pearson correlation coefficient, testing, data grouping, piecewise constant function
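The first of the two approaches, a piecewise-constant approximation of the unknown regression function, can be sketched as the classical correlation ratio (eta squared): group x into bins, approximate the function by the group means of y, and measure the share of variance explained. The article's implementation is in R; this Python sketch with equal-width bins is my own illustrative rendering of the same idea.

```python
def correlation_ratio(x, y, bins):
    """Sample coefficient of determination via a piecewise-constant fit:
    partition x into `bins` equal-width groups, approximate the regression
    function by the group means of y, and return the explained variance share."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / bins or 1.0
    groups = [[] for _ in range(bins)]
    for xi, yi in zip(x, y):
        k = min(int((xi - lo) / width), bins - 1)  # clamp the right edge
        groups[k].append(yi)
    mean_y = sum(y) / len(y)
    ss_between = sum(len(g) * (sum(g) / len(g) - mean_y) ** 2
                     for g in groups if g)
    ss_total = sum((yi - mean_y) ** 2 for yi in y)
    return ss_between / ss_total
```

On data following y = x² with x symmetric around zero, the Pearson coefficient is near zero while this grouped coefficient is large, which is exactly the kind of nonlinear relation the abstract says linear-only tools miss.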
Software for innovative information technologies
Reference:
Giniyatullin V.M., Arslanov I.G., Bogdanova P.D., Gabitov R.N., Salikhova M.A.
Methods of implementation of ternary logic functions
// Software systems and computational methods.
2014. № 2.
P. 239-254.
URL: https://en.nbpublish.com/library_read_article.php?id=65266
Abstract:
the study uses initial data in form of truth table of three-dimensional function of
binary, ternary and mixed logics. Calculation of values of the functions is done by their geometric
interpretations, disjunctive / conjunctive normal forms, incompletely connected artificial neural
networks and perceptrons with hidden layer. The article in detail reviews the intermediate
computation results for all methods mention above. The authors study properties of functions
of mixed logic: binary-ternary and 3-2 logics in one-, two- and three-dimensions. The article
presents mutually equivalent implementations of logic functions in the form of disjunctive
normal form and in the form of incompletely connected artificial neural network. The authors
performed replacement of continuous activation function with ternary threshold function. The
study includes building disjunctive normal forms, direct synthesis of incompletely connected
artificial neural network weights matrix. The perceptron is trained with Back Propagation
algorithm. Some conclusions are formed based on the laws of mathematical induction. The
article shows that: 1. minimization of the quantity of neurons in perceptron’s hidden layer
implicitly leads to the usage of many-valued logics; 2. some functions of binary-ternary logics
may be used to build disjunctive forms; 3. a bijective way to convert disjunctive normal form
into the form of incompletely connected artificial neural network and vice versa exists; 4. in
one-dimensional 3-2 logic there are only eight functions and all of them are listed; 5. proposed
structure of incompletely connected artificial neural network may implement any function of
ternary logic in any dimensionality.
Keywords:
XOR problem, perceptron, separating hyperplane, activation function, perfect disjunctive form, binary-ternary logic, 3-2 logic, ternary logic, neural network training, Back Propagation algorithm
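Two of the abstract's claims are easy to make concrete: a ternary threshold function replacing a continuous activation, and the count of one-dimensional 3-2 logic functions (maps from a ternary argument to a binary value, of which there are exactly 2³ = 8). The specific thresholds below are assumed for illustration; the article does not state them.

```python
from itertools import product

def ternary_threshold(s, low=-0.5, high=0.5):
    """A ternary threshold activation: maps a neuron's weighted sum to one of
    three output levels. The thresholds -0.5 and 0.5 are assumed values."""
    if s < low:
        return 0
    if s > high:
        return 2
    return 1

# One-dimensional 3-2 logic: each function maps {0, 1, 2} -> {0, 1}.
# Enumerating all output triples lists every such function -- eight in total,
# matching conclusion 4 of the abstract.
FUNCTIONS_3_2 = [dict(zip((0, 1, 2), outputs))
                 for outputs in product((0, 1), repeat=3)]
```

The same enumeration idea scales to the other mixed logics the authors study, though the function counts grow quickly with dimension and valuedness.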