Information security risk assessment is a crucial component of industrial management that aids in identifying, quantifying, and evaluating risks against risk-acceptance criteria and organizationally relevant objectives. Owing to its capacity to combine several parameters into an overall risk, the traditional fuzzy-rule-based risk assessment technique has been used in numerous industries. The technique has a drawback, however, when applied in situations with several parameters to be evaluated, each expressed by a different set of linguistic terms. In this paper, a risk prediction model based on fuzzy set theory and an artificial neural network (ANN) that can solve this issue is presented. An algorithm is also developed that converts the risk-related factors and the overall risk level from a fuzzy attribute to a crisp-valued attribute. The system was trained on twelve samples, with 70%, 15%, and 15% of the dataset used for training, testing, and validation, respectively. In addition, a stepwise regression model has been designed, and its results are compared with those of the ANN. In terms of overall efficiency, the ANN model (R² = 0.99981, RMSE = 0.00288, MSE = 0.00001) performed better, though both models are satisfactory. It is concluded that a risk-predicting ANN model can produce accurate results as long as the training data accounts for all conceivable conditions.
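The reported goodness-of-fit figures follow from standard definitions; as a minimal sketch (the observed and predicted risk levels below are hypothetical, not the paper's data), MSE, RMSE and R² of a model can be computed as:

```python
def fit_metrics(y_true, y_pred):
    """Compute MSE, RMSE and R^2 for a set of predictions."""
    n = len(y_true)
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
    rmse = mse ** 0.5
    mean = sum(y_true) / n
    ss_tot = sum((t - mean) ** 2 for t in y_true)   # total variance
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    r2 = 1 - ss_res / ss_tot
    return mse, rmse, r2

# Hypothetical observed vs. predicted overall risk levels
observed = [0.20, 0.45, 0.70, 0.90]
predicted = [0.21, 0.44, 0.71, 0.89]
mse, rmse, r2 = fit_metrics(observed, predicted)
```

R² close to 1 together with a small RMSE, as in the paper's ANN results, indicates that the predictions track the observed risk levels almost exactly.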
Modern methods of process planning in conveyor systems with fixed-size buffers between processing devices allow optimizing schedules for single tasks, or for fixed task packages with a limited number of packages and devices. The mathematical models of single-task (task-package) execution used by these methods cannot be applied to optimizing package composition and execution schedules in systems with an arbitrary number of packages and devices. At the same time, mathematical models of task-package execution in conveyor systems with limited-size buffers between devices are the basis for developing methods to optimize package composition and the schedules by which packages are processed on the devices of conveyor systems. In this regard, the article develops mathematical models of multi-stage processes of executing an arbitrary number of task packages in conveyor systems with limited-size intermediate buffers, for two and three devices as well as for an arbitrary number of devices. These models make it possible to determine the start times of task-package execution on the devices of conveyor systems, taking into account the limited size of the intermediate buffers, as well as the duration of the intervals during which these resources are used and the efficiency of their use over time. An algorithm has also been developed for mathematical modeling of task-package execution in conveyor systems with limited-size intermediate buffers; it calculates the time characteristics of these processes from a given order of operations on task packages on the devices of conveyor systems.
An application has been developed that implements the synthesized mathematical models of task-package execution in conveyor systems with limited-size intermediate buffers, together with the corresponding modeling method. Extensive testing of the application has shown that the obtained mathematical models and the modeling method adequately describe the course of multi-stage task-package execution in conveyor systems across different values of the process parameters.
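The start times determined by such models follow simple recurrences; a minimal sketch for the two-device case with an intermediate buffer of capacity b >= 1 (the processing times are illustrative, and the blocking rule used here is one common convention, not necessarily the authors' exact model):

```python
def flow_shop_with_buffer(p1, p2, b):
    """Start/finish times for packages on two devices M1 -> buffer(b) -> M2.

    A finished package blocks M1 until a buffer slot frees up, i.e.
    until package i-b has started on M2. Requires b >= 1.
    """
    n = len(p1)
    S1, D1, S2, C2 = [0] * n, [0] * n, [0] * n, [0] * n
    for i in range(n):
        S1[i] = D1[i - 1] if i > 0 else 0      # M1 free once previous package departs
        finish1 = S1[i] + p1[i]
        # departure from M1: wait for a free buffer slot
        D1[i] = max(finish1, S2[i - b]) if i - b >= 0 else finish1
        S2[i] = max(D1[i], C2[i - 1] if i > 0 else 0)
        C2[i] = S2[i] + p2[i]
    return S1, S2, C2

# Three packages; the slow first job on M2 makes the buffer fill up
s1, s2, c2 = flow_shop_with_buffer([2, 2, 2], [5, 1, 1], b=1)
```

With these numbers the third package finishes on M1 at time 6 but cannot leave it until time 7, when the buffer slot is released, which is exactly the kind of blocking effect the models above capture.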
A new approach is presented for applying methods of the theory of semi-Markov processes to the applied problem of assessing the functional stability of elements of an information infrastructure operating under multiple computer attacks. The task of assessing functional stability is reduced to finding the survivability function of the element under study and determining its extreme values. The relevance of the study is substantiated by the observation that quantitative methods for studying the stability of technical systems, which rest on reliability theory, cannot always be used to assess survivability. The concepts of "stability" and "computer attack" are clarified, and verbal and formal statements of the research tasks are formulated. The novelty of the results lies in applying well-known methods to a practically significant problem in a new formulation that accounts for limits on the resource allocated to maintaining the survivability of the element under study, with arbitrary distribution laws adopted for the random times of computer-attack implementation and of functional-element recovery. Recommendations are given on forming the initial data and on the content of the main modeling stages, together with a test case demonstrating the performance of the model. The results of the test simulation are presented as graphs of the survivability function. The resulting application can be used in practice to construct a survivability function for up to three computer attacks, as well as a tool for evaluating the reliability of analogous statistical models. This limitation is explained by the progressive growth of the analytical model's dimension and the corresponding loss of meaningful interpretability.
The purpose of the study is to select the optimal conditions for collecting non-coordinate information about a spacecraft with a space-based optical-electronic means at the times when the objects pass the vicinity of the points of minimum distance between their orbits. A quantitative indicator is proposed that characterizes the possibility of obtaining non-coordinate information about space objects with the required level of quality. The arguments of the function characterizing the indicator are the distance between the spacecraft; their relative speed; the phase angle of illumination of the spacecraft by the Sun relative to the optical-electronic means; and the length of the time interval during which both objects are in the vicinity of the point of minimum distance between their orbits. The value of the indicator is computed by solving three particular research problems. The first is to find the neighborhoods that include the minimum distances between the orbits of the controlled spacecraft and the optical-electronic means; a fast algorithm for calculating the minimum distance between orbits is used, and the drift of the found neighborhoods is taken into account over a time interval of up to 60 hours. The second is to estimate the motion characteristics and optical-visibility conditions of the controlled spacecraft in the vicinity of the points of minimum distance between the orbits; this problem is solved using the SGP4 library for forecasting the motion of space objects. The third is to justify and calculate an index characterizing the possibility of obtaining an optical image of the spacecraft under the given visibility conditions; for this, the developed system of fuzzy inference rules and the Mamdani algorithm are used. The presented method is implemented as a program.
In the course of a computational experiment, the possibility of obtaining non-coordinate information on low-orbit and geostationary space objects was assessed. The proposed indicator improves the efficiency of the procedure for collecting non-coordinate information about space objects: the most informative monitoring alternatives are chosen from the available set of possible observations within a given planning interval.
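The third task above can be illustrated with a minimal Mamdani-style inference sketch: two hypothetical rules map range and solar phase angle to an observability quality in [0, 1] via min-activation and centroid defuzzification. The membership functions and rules here are illustrative assumptions, not the authors' rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani_quality(dist_km, phase_deg):
    """Two illustrative rules:
       R1: IF distance is small AND phase angle is small THEN quality is high
       R2: IF distance is large OR  phase angle is large THEN quality is low
    """
    small_d = tri(dist_km, -1, 0, 500)        # left shoulder on distance
    large_d = tri(dist_km, 200, 1000, 1001)   # right shoulder
    small_p = tri(phase_deg, -1, 0, 90)
    large_p = tri(phase_deg, 45, 180, 181)
    w_high = min(small_d, small_p)            # rule activation levels
    w_low = max(large_d, large_p)
    # centroid defuzzification over a discretized output universe [0, 1]
    num = den = 0.0
    for i in range(101):
        x = i / 100
        mu = max(min(w_high, tri(x, 0.0, 1.0, 1.01)),   # "quality high" set
                 min(w_low, tri(x, -0.01, 0.0, 1.0)))   # "quality low" set
        num += x * mu
        den += mu
    return num / den if den else 0.5

q_good = mamdani_quality(100, 20)   # close pass, favourable illumination
q_poor = mamdani_quality(900, 150)  # distant pass, unfavourable phase angle
```

The resulting scalar plays the role of the proposed indicator: higher values mark more informative observation opportunities.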
Modern enterprises apply network technologies to their automated industrial control systems. Along with the advantages of this approach, the risk of network attacks on automated control systems increases significantly. Hence there is an urgent need for automated monitoring means capable of detecting unauthorized access and responding to it adequately. The enterprise security system should take into account the interaction of components and provide for self-renewal throughout the entire life cycle.
Partial models of the functioning of an enterprise's automated control systems under information threats are proposed, taking into account the state parameters of the enterprise at its different levels, the realization of network threats, countermeasures, and other factors. For each model it is possible to form the state space of a part of the enterprise and, on the basis of a series of tests, to determine the state-transition parameters, which enables representing the model as a marked graph. The sequences of states possess the properties of semi-Markov processes, so the semi-Markov apparatus is applicable. Probabilities of state transitions can be computed by numerically solving the corresponding system of integral equations with the Lagrange-Stieltjes technique.
The application of the semi-Markov apparatus to detecting unauthorized activity during data transfer under a network-scanning attack confirmed the validity of these methods. In addition, its application yields a set of security-assurance measures to be undertaken. With the state-transition probabilities obtained, an integral security indicator can be developed, contributing to enhanced enterprise performance.
Emergency situations that pose risks to human life and health impose elevated requirements on the completeness and accuracy of information about the current ground environment. Modern robotic systems include sensors that operate on different physical principles, which increases the amount of information entering the control system. The computing resources and technical capabilities of robotic systems are limited in range and in the probability of detecting newly appearing objects. When the performance of the on-board computer system is insufficient and the ground environment is highly uncertain, individual robotic systems cannot perform without combining information across the robotic group to produce a single view of the ground environment. Fusing information from a group of robotic systems takes place in real time and in a non-deterministic environment.
The problem of identifying attribute vectors that relate to a single object, and of evaluating the effectiveness of the obtained solutions, can be solved with the known formulas of statistical hypothesis testing and probability theory only under a normal distribution law with a known mathematical expectation of the attribute vector and a known correlation matrix. In practice, these conditions are usually not met. Problems also arise when methods of nonparametric statistics are applied under an unknown probability distribution law.
A new method of identifying attribute vectors is proposed that does not rely on a statistical approach and therefore does not require knowledge of the type of the distribution law or the values of its parameters. The method is based on combining cluster analysis with fuzzy logic and is relatively simple compared with the basic methods of multidimensional nonparametric statistics.
The results of modeling the information processes are presented, the advantages of the proposed method are shown, and comparative values for the number of false recognitions are given. Recommendations are provided for constructing fuzzy inference rules when creating the knowledge base of an expert system.
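The idea of combining cluster analysis with fuzzy logic can be sketched as follows: the distance between an attribute vector and a cluster centroid is mapped by a membership function to a degree of "same object", and a vector joins a cluster when that degree exceeds a threshold. The metric, membership shape and threshold below are illustrative assumptions, not the paper's parameters:

```python
def euclid(u, v):
    """Euclidean distance between two attribute vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def same_object_degree(u, v, d0=1.0):
    """Fuzzy membership 'u and v describe the same object':
    1 at zero distance, decaying with the distance scale d0."""
    d = euclid(u, v)
    return 1.0 / (1.0 + (d / d0) ** 2)

def merge_tracks(vectors, threshold=0.5, d0=1.0):
    """Greedy single-pass clustering: attach a vector to the first
    cluster whose centroid it matches with degree >= threshold."""
    clusters = []  # each cluster is a list of vectors
    for v in vectors:
        for c in clusters:
            centroid = [sum(x) / len(c) for x in zip(*c)]
            if same_object_degree(v, centroid, d0) >= threshold:
                c.append(v)
                break
        else:
            clusters.append([v])
    return clusters

# Four observations from two sensors: two pairs describing two objects
obs = [(0.0, 0.0), (0.1, -0.1), (5.0, 5.0), (5.2, 4.9)]
groups = merge_tracks(obs)
```

No distributional assumptions enter anywhere, which is the point of the non-statistical approach described above.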
Traditional approaches to assessing the effectiveness of information security, based on comparing the possibilities of realizing threats with and without protection measures, do not allow analysis of the dynamics by which security measures suppress the threat-implementation process. The paper proposes a new indicator of the effectiveness of electronic-document protection, aimed at assessing the possibility of security measures outpacing the threat-implementation process in electronic document management systems, using the probability-time characteristics of both the application of protection measures and the implementation of threats to electronic documents. Mathematical models were developed with the Petri-Markov network apparatus, and analytical relationships were obtained for calculating the proposed indicator using the examples of the "traffic tunneling" threat (placing intruder packets inside trusted user packets) and unauthorized access (network attacks) to electronic documents, as well as the threat of malicious-program intrusion through a "blind IP spoofing" attack (network-address spoofing). Examples of calculating the proposed indicator are given, with graphs of its dependence on the probability that the intrusion detection system detects network attacks and on the probability that the anti-virus protection system detects malware. Quantitative dependencies are obtained for the effectiveness of electronic-document protection achieved by outpacing the threat-realization process, both on the probability of detecting an intrusion or a malicious program, and on the ratio between the time the protection system spends on detecting an attempted threat and taking measures to curb it, and the threat-implementation time.
The models make it possible not only to evaluate the effectiveness of measures protecting electronic documents from threats of destruction, copying, unauthorized modification, and the like, but also to quantify the requirements for the response time of adaptive security systems to detected actions aimed at violating the security of electronic documents, depending on the probability-time characteristics of the threat-realization processes, and to identify weaknesses in protection systems related to the dynamics of threat realization and the reaction of the defense systems to such threats.
The research aims to develop a technique for automated detection of information-system assets and comparative assessment of their criticality for further security analysis of the target infrastructure. The assets are all information and technology objects of the target infrastructure. The size, heterogeneity, complexity of interconnections, distribution, and constant modification of modern information systems complicate this task. Automated and adaptive determination of information and technology assets and of the connections between them, based on identifying the static and dynamic objects of an initially uncertain infrastructure, is a rather challenging problem. The paper proposes a dynamic model of connections between objects of the target infrastructure and a technique for building it based on the event-correlation approach. The developed technique rests on statistical analysis of empirical data on system events. It allows determining the main object types of the analysed infrastructure, their characteristics, and their hierarchy. The hierarchy is constructed from the frequency of object use and, as a result, represents the objects' relative criticality for system operation. For these goals, indexes are introduced that determine whether properties belong to the same type and whether properties are used jointly, as well as dynamic indexes that characterize the variability of properties relative to each other. The resulting model is used for an initial comparative assessment of the criticality of system objects. The paper describes the input data, the developed models, and the proposed technique for asset detection and criticality comparison. Experiments are provided that demonstrate the technique on the example of analyzing security logs of the Windows operating system.
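The frequency-based part of the criticality ranking can be sketched as follows: objects extracted from an event log are ranked by how often they appear, and the counts are normalised to give a relative criticality. The log records are synthetic, and the real technique additionally uses co-occurrence and variability indexes that this sketch omits:

```python
from collections import Counter

def rank_assets(events):
    """Rank objects by frequency of use in an event stream.

    events: iterable of (event_id, object_name) pairs.
    Returns objects sorted from most to least used, each with a
    relative criticality normalised to the most frequent object.
    """
    usage = Counter(obj for _, obj in events)
    top = usage.most_common(1)[0][1]
    return [(obj, n / top) for obj, n in usage.most_common()]

# Synthetic security-log extract (event id, object observed)
log = [(1, "lsass.exe"), (2, "svchost.exe"), (3, "lsass.exe"),
       (4, "lsass.exe"), (5, "winlogon.exe"), (6, "svchost.exe")]
ranking = rank_assets(log)
```

The most frequently used object receives relative criticality 1.0 and serves as the reference point for the comparative assessment.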
The most important task of modern robotics is the development of robots that perform work in potentially dangerous fields posing risks to human health. Despite the active development of artificial-intelligence technologies, current robotic systems cannot fully replace a human in solving complex problems in a dynamic environment.
Robots that implement copying-type control, the so-called virtual presence of the operator, are the most promising for use in the near future. The principle of copying control is based on capturing the motion of the remote operator and forming control signals for the robot's drives. Either a tracking system or a system based on motion planning can be used to control the drives. Tracking systems are simpler, but systems based on motion planning achieve smoother motion and less wear of the controlled object's parts. To implement planning-based control, an artificial delay between the movements of the operator and of the controlled object is introduced so that the necessary data can be collected.
The aim of the research is to reduce the delay that appears when the drives of an anthropomorphic manipulator are controlled, via the solution of the inverse dynamic problem, under real-time copying-type control. For motion-path planning it is proposed to use forecast values of the manipulator's generalized coordinates. From the measured values of the generalized coordinates of the operator's hand, time series are formed and their prediction is performed. The predicted values of the generalized coordinates are used in planning the anthropomorphic manipulator's trajectory and in solving the inverse dynamic problem. Prediction is based on linear regression with relatively low computational complexity, an important criterion for real-time operation. The developed mathematical apparatus, based on the prediction parameters and the maximum permissible accelerations of the manipulator drives, yields a theoretical bound on the error of planning the operator's hand trajectory with the proposed approach for specific tasks. Software simulation in the Matlab environment confirms the adequacy of the maximum theoretical prediction error and the promise of the proposed approach for practical testing.
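The prediction step can be sketched as an ordinary least-squares line fitted to the last few samples of one generalized coordinate and extrapolated one control period ahead. The window length, sample period and joint-angle values below are illustrative, not the paper's settings:

```python
def forecast_next(samples, dt=1.0):
    """Fit q(t) = a*t + b to the window by least squares and
    extrapolate one sample period ahead."""
    n = len(samples)
    ts = [i * dt for i in range(n)]
    t_mean = sum(ts) / n
    q_mean = sum(samples) / n
    a = (sum((t - t_mean) * (q - q_mean) for t, q in zip(ts, samples))
         / sum((t - t_mean) ** 2 for t in ts))       # slope
    b = q_mean - a * t_mean                          # intercept
    return a * n * dt + b    # predicted value at the next sample time

# Joint angle (rad) of the operator's hand over five control periods
window = [0.10, 0.12, 0.14, 0.16, 0.18]
q_pred = forecast_next(window)
```

The closed-form fit keeps the per-step cost linear in the window length, which is what makes this predictor suitable for the real-time mode discussed above.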
Mathematical models of the Earth system and its components represent one of the most powerful and effective instruments for exploring the Earth system's behaviour in the past and present, and for predicting its future state under external influence. These models are critically reliant on a large number of various observations (in situ and remotely sensed), since the prediction accuracy is determined by, amongst other things, the accuracy of the initial state of the system in question, which, in turn, is defined by observational data provided by many different instrument types. The development of an observing network is very costly; hence estimating the effectiveness of the existing observation network, and designing a prospective one, is very important. The objectives of this paper are (1) to present the adjoint-based approach that allows us to estimate the impact of various observations on the accuracy of prediction of the Earth system and its components, and (2) to illustrate the application of this approach to two coupled low-order chaotic dynamical systems and to the ACCESS (Australian Community Climate and Earth System Simulator) global model used operationally in the Australian Bureau of Meteorology. The results of numerical experiments show that the adjoint-based method makes it possible to rank observations by their degree of importance and to estimate the influence of targeted observations on the quality of predictions.
Nowadays, systems that integrate real physical processes with virtual computational processes, the cyber-physical systems (CPS), are used in many areas of industry and critical national infrastructure, such as manufacturing, medicine, traffic management and security, automotive engineering, industrial process control, energy saving, ecological management, industrial robots, technical infrastructure management, distributed robotic systems, protection target systems, nanotechnology, and biological systems technology. With such wide use, the level of IT and cyber risks increases drastically, and successful attacks against CPS would lead to unmanageable and unimaginable consequences. Thus, the need for a well-designed CPS risk assessment system is clear: such a system can provide an overall view of CPS security status and support efficient allocation of safeguard resources. The nature of CPS differs from IT mainly in the requirement for real-time operation, so traditional risk assessment methods for IT systems must be adapted for CPS. The design of a domain-specific language based on the Unified Modeling Language, described in this paper, achieves synergy between the UML modelling technique widely used in the IT industry and domain-specific risk-management extensions. As a novelty for UML modelling, especially for simulation purposes, the presented DSL is enriched with a set of stochastic attributes of the modelled activities. Such stochastic attributes can be used in a further implementation of discrete-event system simulators.
The paper offers an approach for assessing the cyber-resilience of computer networks based on analytical simulation of computer attacks using the stochastic-network conversion method. The concept of cyber-resilience of computer networks is justified, and the mathematical foundations of its assessment, which allow cyber-resilience indices to be calculated by analytical expressions, are considered. The serviceability coefficient with respect to cyber-resilience is proposed as the key such indicator. The approach assumes the creation of analytical models of cyber-attacks, built with the stochastic-network conversion method. The simulation results are the time distribution function and the average time to implement a cyber-attack; these estimates are then used to obtain the cyber-resilience indices. Experimental results of the analytical simulation are given, showing that the offered approach has rather high accuracy and stability of the obtained solutions.
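As a toy illustration of how analytical expressions yield attack-time estimates, consider a serial chain of attack stages where each stage succeeds with some probability and is otherwise retried; the expected completion time then has a simple closed form. This is a drastic simplification of the stochastic-network conversion method, with entirely hypothetical stage parameters:

```python
def expected_attack_time(stages):
    """Expected completion time of a serial chain of attack stages.

    stages: list of (mean_stage_time, success_probability).
    A stage with mean time t and success probability p is retried on
    failure, so it takes 1/p attempts and t/p time units on average;
    the serial chain sums these contributions.
    """
    return sum(t / p for t, p in stages)

# Hypothetical three-stage attack: reconnaissance, exploitation, escalation
chain = [(5.0, 0.9), (12.0, 0.6), (8.0, 0.8)]
t_mean = expected_attack_time(chain)
```

An average attack time of this kind is exactly the quantity that feeds into the cyber-resilience indices described above.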
A method for assessing the semantic similarity of documents is offered, based on latent semantic analysis, the dynamics of change of the singular values of a term-document matrix, and automatic determination of the range of rank values. The assessment of semantic similarity is considered in relation to identifying duplication and contradictions in databases and data storages. A short review is provided of the approaches used for assessing semantic similarity of documents and identifying duplication and contradictions in databases. Results of numerical examples are given for assessing semantic dependencies between document terms for the purpose of identifying duplication and contradictions in databases and data storages; the degree of correspondence between the compared documents is calculated as the resulting characteristic. Comparative estimates are given of the accuracy of calculating the degree of correspondence λ of documents with the main methods: the cosine proximity measure, the vector model, the Spearman rank correlation coefficient, and the statistical measure tf-idf (term frequency, inverse document frequency). It is shown that the offered method of latent semantic analysis with automatic detection of the range of rank values eliminates the dependence of the results of latent semantic analysis on the chosen rank.
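As a point of reference for the baseline methods compared above, the cosine proximity measure over term-frequency vectors can be sketched as follows (the toy documents are illustrative, not the paper's corpus):

```python
from collections import Counter

def cosine(doc_a, doc_b):
    """Cosine proximity of two documents over term-frequency vectors."""
    ta, tb = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    terms = set(ta) | set(tb)
    dot = sum(ta[t] * tb[t] for t in terms)
    na = sum(v * v for v in ta.values()) ** 0.5
    nb = sum(v * v for v in tb.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

d1 = "database stores duplicate records"
d2 = "duplicate records in the database"
d3 = "orbital mechanics of satellites"
sim_dup = cosine(d1, d2)       # near-duplicates share most terms
sim_unrelated = cosine(d1, d3) # no shared terms at all
```

Latent semantic analysis goes further than this surface measure by comparing documents in a low-rank singular-value subspace, which is where the rank-selection problem addressed above arises.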
This research is part of work on creating a small-size robotic research vessel for complex environmental monitoring of the coastal waters of the Black Sea, as well as of lakes and rivers, with its base in Sevastopol. On the basis of the developed analytical nonlinear model of a catamaran-type vessel with two rudder propellers, which accounts for its mass-dimensional and traction characteristics, initial and simplified methods are developed for controlling the angular speed and heading of the ship during "strong" maneuvers, using the method of final states. "Strong" maneuvers in navigation are maneuvers with large turning angles of the rudder propellers, for which linearized vessel models are inapplicable. The developed control laws take the form of simple structures that depend explicitly on the setpoint, the length of the vessel, the thrust of the rudder propellers, and the vessel's moment of inertia, as well as on known nonlinear functions of the current state: the speeds along the longitudinal and transverse axes of the vessel, the angular speed, and the turning angle of the rudder propellers. In the simplified method, speed measurements are not used.
The article deals with a new approach to text classification that takes into account the existence of different types of classification features (binary, nominal, ordinal, and interval).
A special feature of the approach is a phased classification process, which makes it possible to avoid reducing the different types of classification features to a single range. The author describes a computational experiment using texts from the Russian National Corpus and suggests a set of classification features for classifying Russian texts by the age of their intended readers. The text documents in the sample are divided into two categories, for adults and for children, according to expert judgment.
The problem of estimating the vulnerability of speech information of a confidential nature is currently topical. However, when means of acoustic protection are used, i.e. under conditions of strong noise, the existing instrumental-computational methods give greater accuracy than the extremely labor-intensive articulation methods.
In the paper we study a method of estimating the security of voice data based on the Pearson correlation coefficient. This coefficient has poor sensitivity to the spectral properties of acoustic signals. Therefore, an approach is suggested for defining the security indicator of voice data based on the mathematical apparatus of the coherence function of the source and noisy signals.
We propose to split the entire speech frequency range of the coherence function into separate octaves, to calculate the expectation of the coherence-function components within each octave, and, on the basis of a convolution function, to obtain an expression for calculating the speech vulnerability index.
The proposed algorithm for determining the vulnerability index of voice data allows improving the assessment accuracy.
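The octave-band averaging step can be sketched as follows: coherence samples over the speech band are grouped into octaves, each octave is averaged, and the averages are folded with per-octave weights into a single index. The octave edges, weights and coherence samples below are illustrative assumptions, not measured data or the paper's coefficients:

```python
OCTAVE_EDGES = [250, 500, 1000, 2000, 4000, 8000]  # Hz, speech-range octaves
# Hypothetical contribution of each octave to speech intelligibility
FORMANT_WEIGHTS = [0.10, 0.20, 0.30, 0.25, 0.15]

def octave_means(coherence):
    """Average the coherence samples (freq_hz, value) within each octave."""
    means = []
    for lo, hi in zip(OCTAVE_EDGES, OCTAVE_EDGES[1:]):
        band = [c for f, c in coherence if lo <= f < hi]
        means.append(sum(band) / len(band) if band else 0.0)
    return means

def vulnerability_index(coherence):
    """Fold the octave-averaged coherence with the per-octave weights."""
    return sum(w * m for w, m in zip(FORMANT_WEIGHTS, octave_means(coherence)))

# Synthetic coherence of the source vs. noisy signal at a few frequencies
samples = [(300, 0.9), (700, 0.8), (1500, 0.6), (3000, 0.4), (6000, 0.2)]
idx = vulnerability_index(samples)
```

A higher index corresponds to higher residual coherence between the source and the intercepted noisy signal, i.e. greater vulnerability of the speech information.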
In the paper, a dynamic model of social tension formation, presented in the form of nonlinear differential equations, is proposed. A study of the developed model is conducted, which made it possible to examine the dynamic features of the process of social tension formation and to estimate the efficiency of managing the level of social tension.
The basic dependences used in the method for assessing the security of speech information against leakage through technical channels are re-evaluated. The setup and results of an experiment for determining the formant distribution for ordinary and forced speech are described. Using these distributions, the contributions of frequency bands to overall speech intelligibility can be determined. A dependence of word intelligibility on formant intelligibility for the case of forced speech is obtained. The paper also presents an experiment to determine the amplitude structure of speech, from whose results conclusions are drawn about the test-signal levels sufficient for assessing the security of speech information.
This paper briefly covers the need for numerical evaluation of the effectiveness of Information Security Management Systems (ISMS) in accordance with the requirements of two or more different standardization systems, such as the ISO/IEC 27001 series of standards and the Information Security Providing System STO Gazprom 4.2 series (ISPS). The problem is important for minimizing IT-security risks and ensuring the stability of information processes in information systems. The paper describes methodological difficulties in reconciling the requirements of the ISO/IEC and ISPS standardization systems that must be considered when assessing ISMS effectiveness. Formulas are proposed for calculating ISMS effectiveness, and practical examples (cases) are discussed that explain the calculation for specific situations. These results can be used to create models and methods for ISMS audits and for monitoring IT-security facilities under ISMS and/or ISPS Gazprom.
The paper discusses the principles of evaluating the effectiveness of a malefactor in critical infrastructure. An "operating complex" model of the security-breach process is presented. The uncertainties in modeling the malefactor's process, and ways to overcome them, are investigated. A mathematical model of the aggregate effectiveness of the malefactor is developed, which removes some limitations of existing probabilistic models of random phenomena in the field of information security. The model is called a stochastic super-indicator and is intended for the study of conflict situations in critical infrastructure.
A technique is considered for formalizing fuzzy predicates together with crisp logical variables for the specification of fuzzy logic-dynamic situations and crisp logical commands. The technique is based on representing crisp and fuzzy logic variables by means of membership functions and on the use of fuzzy inference rules. Only those forms of presentation of fuzzy logic functions are used that are also suitable for presenting crisp logic functions. Examples show the possibility of using the considered technique for the computer implementation of hybrid processes.
The article describes models and methods for automating the design of the operating processes of man-machine systems, based on the functional-structural theory of man-machine systems and the generalized structural method of Prof. A.I. Gubinsky. The basic concepts and definitions of the functional-structural theory are described, along with an algorithm for generating series-parallel connections of operations under additional constraints, an algorithm for generating alternative man-machine system processes based on the coincidence of operation objectives, and an algorithm for generating parametric alternatives based on a template. The concepts and definitions needed for the algorithm that generates process fragments with mandatory combinations of operations are given. It is proposed to use a matrix of binding combinations of operations, in which the nonzero elements of a row denote the only meaningful combinations of standard ways of performing the corresponding functional units in the alternatives. The concept of a composition and the concept of incompatible pairs are introduced, on the basis of which the performed functions are distributed among compositions. The integration of optimization models of man-machine system processes with simulation is described, since the functional-structural theory is applicable only to processes without aftereffect and in the absence of dependent operations. It is suggested to remove this limitation by integrating the design technology for man-machine system processes based on the functional-structural theory with simulation of those process sections that do not meet the above requirements of the functional-structural theory.
The realization of medical risks leads to adverse effects that harm the patient's health and result in the irrational use of human and economic resources and in economic losses. Within the framework of system risk analysis, medical risk is connected with uncertainty arising from the crucial impact of the human factor on the medical system. This raises the problem of medical risk assessment and decision making at different stages of constructing patient health care support systems. The paper provides a state-of-the-art analysis of the use of Bayesian belief networks for medical risk assessment and decision support under uncertainty, in particular within the risk management of health care organizations and insurance risk assessment.
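As a minimal illustration of the kind of Bayesian belief network inference this survey reviews, the sketch below computes a posterior by full enumeration over a tiny hypothetical network; the structure and all probabilities are invented for the example, not taken from the paper.

```python
# Network: HumanError -> AdverseEvent <- EquipmentFault (illustrative only)
P_error = 0.1            # P(human error); assumed value
P_fault = 0.05           # P(equipment fault); assumed value
# P(adverse event | human error, equipment fault)
P_adverse = {
    (True, True): 0.9,
    (True, False): 0.4,
    (False, True): 0.3,
    (False, False): 0.01,
}

def p(cond, prob):
    """Probability of a Boolean variable taking the value `cond`."""
    return prob if cond else 1.0 - prob

def posterior_error_given_adverse():
    """P(HumanError | AdverseEvent) by full enumeration of the joint."""
    num = 0.0   # joint probability mass with HumanError = True
    den = 0.0   # marginal probability of AdverseEvent
    for e in (True, False):
        for f in (True, False):
            joint = p(e, P_error) * p(f, P_fault) * P_adverse[(e, f)]
            den += joint
            if e:
                num += joint
    return num / den
```

Observing an adverse event raises the belief in a human error from the prior 0.1 to roughly 0.66 under these assumed numbers, which is the diagnostic-reasoning pattern exploited in risk management applications.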
Information systems (IS) face a significant number of critical threats caused by the emergence of new attack vectors, as well as by deficiencies in risk management. Accordingly, the problem of assessing the information security competence of IT service providers is of particular interest. The paper proposes the wording of an "IT-Security Paradox", which makes it possible to consider the most important (critical) IT security threats, and proposes an approach based on the implementation of modern risk-based standards, especially international ISO standards. The proposed concept of assessing the information security level of IT service providers for industrial facilities consists of two basic principles and several extensions that take into account the specific IT security requirements of a particular IS and provide an opportunity for qualitative or quantitative assessment as part of routine inspections (audits).
Descriptions and results are presented of a series of computing experiments devoted to the analysis of the lag effect in chaotic processes. The article continues the research reported in [1]. The essential difference from that work is the refusal to segment the region of change of the studied process, which allows the lag-analysis system for chaotic dynamics to be tuned more flexibly. The earlier conclusions about the existence of a lag effect in the smoothed dynamics are confirmed. The possibility of building an effective control strategy on the basis of these conclusions requires additional research into the dynamic properties of the trend.
In this paper a new justification of the maximum entropy production principle (MEPP) is proposed. The dynamics of systems with a continuous distribution of parameters are reviewed on the basis of the speed-gradient extreme principle and on the condition that the system follows the principle of maximum entropy. A set of equations describing the dynamics of the probability distribution function (pdf) is derived. It is shown that for pdfs with compact support the limit pdf is unique and can be obtained from Jaynes's MaxEnt principle. The asymptotic convergence is proved. The constraints imposed are the mass conservation law and the energy conservation law.
The paper considers the problem of constructing a level description of classes whose objects are characterized by the properties of their elements and the relations between them. The recognition and analysis problems for such objects are NP-hard, but if the class descriptions contain sufficiently short and frequently occurring sub-formulas, then it is possible to build a level description of classes that essentially decreases the exponent in the upper bounds on the number of steps of an algorithm solving the problem. Usually the extraction of these sub-formulas is left to the investigator's discretion. The paper proposes an approach to their automatic extraction.
The fundamental problem of the existence of an inertia effect in quasi-chaotic processes is considered on the basis of a series of computing experiments. Long intervals of observation of currency quotations in the electronic Forex market are used as the data polygon. Segmentation of the observed dynamics is used to support visualization. It is established that the hypothesis of the existence of an inertia effect is confirmed only for the smoothed process.
The possibility of using a transition approach based on transformation rules for the specification and computer implementation of continuous processes is considered. The examples show techniques for converting initial process specifications into specifications given as sets of transformation rules. Starting specifications in the form of a physical model, a block diagram of dynamic links, and ordinary differential equations are considered. These examples demonstrate the simplicity, clarity and versatility of the approach. Some problems of implementing processes specified by rules are briefly discussed. The resulting process model implementations are evaluated using analytical methods and compared with numerical solutions found using Matlab and MathCad.
The redundant variables method for checking and correcting computing processes in real time, which is necessary for increasing their reliability, is considered. The questions of the equivalence of the initial and extended systems, of improved noise immunity, and of correction ahead are considered. The redundant variables method is compared with other known methods of control, diagnosis and correction in computer systems.
This article reviews the stages of a study assessing the intelligibility and quality of speech, conducted jointly by the Oncology Institute of RAMS and Tomsk State University of Control Systems and Radioelectronics. The software for gathering research material, a database for storing the collected material, the current state of the database and further research plans are considered.
When creating modern management systems, including integrated management systems (IMS), a range of security aspects of the organization's core business processes should be addressed. The priority of security areas, especially information security (IS), is increasing due to the growing competitive environment, the emergence of new threats and the considerable complexity of risk management procedures. For an IMS, the problem of obtaining a security assessment is highly relevant: it allows the current and/or prognostic aspects of the risks inherent in the organization to be evaluated, an effective information security management system (ISMS) to be designed, and reasonably efficient security measures to be implemented. The paper proposes some approaches to creating models for IMS security assessment in accordance with the requirements of both ISO/IEC 27001:2005 and ISO 22301:2012. Given the relative newness of these standards in practical application to the research problem, the proposed approaches can be useful in planning an ISMS, in the security assessment of an existing IMS and, in particular, in solving a practical problem: IT security audits in organizations.
The definition of a logic-dynamic situation used in specifying the operation of hybrid dynamic systems, including hybrid control systems, is presented. The influence of random effects on the behavior of systems is discussed. Methods of estimating the probabilities of occurrence of situations based on estimates of the statistical characteristics of random processes are considered. It is shown how to use these probability estimates for control and decision support. The results of experimental research on the methods under consideration are given.
The article deals with reliability assessment methods for systems with three-state elements. It is shown that further development of conventional logic-and-probabilistic methods (LPM) eliminates such LPM deficiencies as the analysis of only two element states and the assumption of their independence. Two models of the impact of element failures on system reliability are shown, which demonstrate the principles of reliability assessment using the ARBITER software; this software incorporates the capabilities of the algebra of disjoint event groups and is based on the general logic-and-probabilistic method (GLPM). Transformations of logic and probabilistic functions into disjoint-event form are described in the annexes.
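The idea of three-state reliability analysis can be illustrated by direct enumeration on the classic two-element parallel example, where a short-circuit failure of either branch fails the whole system. The state probabilities below are assumed for the example and do not come from the article.

```python
from itertools import product

# Element states: 'w' working, 'o' open-circuit failure, 's' short-circuit failure.
P = {'w': 0.9, 'o': 0.07, 's': 0.03}   # assumed per-element state probabilities

def parallel_reliability():
    """Two independent elements in parallel (classic diode example):
    a short in either branch fails the system; otherwise at least one
    working branch suffices. Reliability by enumerating all 9 states."""
    r = 0.0
    for s1, s2 in product(P, repeat=2):
        system_ok = 's' not in (s1, s2) and 'w' in (s1, s2)
        if system_ok:
            r += P[s1] * P[s2]
    return r
```

Note that a two-state model would miss the distinction: an "open" failure is tolerated by the parallel structure while a "short" is not, which is exactly why three-state methods are needed.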
The use of a formalism based on transition rules, originally intended for the specification of deterministic processes, for the implementation of random processes and the estimation of their characteristics is considered. A short description of the formalism and of methods of its use for the simulation of dynamical systems in the presence of random impacts is presented. Methods of realizing random processes with specified statistical properties and methods of estimating the numerical characteristics and correlation functions of random ergodic processes are discussed. Examples of random process implementations and the results of estimating their characteristics and correlation functions are given.
The article applies the mathematical apparatus of fuzzy logic to the evaluation of the ecological and economic security of enterprises and to the assessment of the risks arising in the course of their economic activity. Threats to ecological and economic safety are zoned, and a method is developed for assessing the effectiveness of recommendations on the complex of measures necessary to prevent damage and minimize losses.
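A minimal sketch of a fuzzy risk estimator in the spirit of the article, assuming invented triangular membership terms, an invented four-rule base and crisp output levels (a zero-order Sugeno scheme, not necessarily the authors' rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def risk_score(threat, loss):
    """Zero-order Sugeno-style fuzzy risk estimate on a 0..100 scale.
    Inputs threat and loss are rated on 0..10; all terms, rules and
    output levels below are illustrative assumptions."""
    low_t, high_t = tri(threat, -1, 0, 5.5), tri(threat, 4.5, 10, 11)
    low_l, high_l = tri(loss, -1, 0, 5.5), tri(loss, 4.5, 10, 11)
    rules = [  # (rule strength = min of antecedents, crisp output level)
        (min(low_t,  low_l),  10.0),   # low threat,  low loss  -> low risk
        (min(high_t, low_l),  60.0),   # high threat, low loss  -> medium risk
        (min(low_t,  high_l), 60.0),   # low threat,  high loss -> medium risk
        (min(high_t, high_l), 90.0),   # high threat, high loss -> high risk
    ]
    w = sum(s for s, _ in rules)
    return sum(s * v for s, v in rules) / w if w else 0.0
```

The weighted average of the fired output levels plays the role of defuzzification, combining several partially matched rules into one crisp risk value.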
With the intensification of human activity and the natural disasters and major accidents of recent years, the effective use of the results of ground-space monitoring and its integration with the processes of national economic management is becoming an important strategic factor in accelerating the socio-economic development of any region of the world. This article discusses the possibility of combining modern social technologies with the process of ground-space monitoring of natural and technological objects, as well as improving the efficiency and social importance of this process by involving public representatives in the dissemination and use of the monitoring data.
A review of existing approaches to the qualification assessment of visualization systems for flight simulators is presented in the paper. A set of criteria for assessing the quality of cloud visualization, as well as procedures for their use, is reported.
The redundant variables method for checking and correcting computing processes in real time, which is necessary for increasing their reliability, is considered. Systems with a reproducible function are synthesized and analyzed. The results of simulation for different methods of integration are considered. It is shown that the use of the redundant variables method permits the accuracy of calculations to be increased.
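The general idea of redundancy-based checking can be sketched on a toy example: an extra relation among the computed variables (here the invariant x² + y² of a rotation system) is monitored during integration, and its drift flags accumulated numerical error. All numbers are illustrative, and this is only an invariant-monitoring sketch in the spirit of the method, not the authors' exact construction.

```python
def integrate_with_check(T=10.0, h=1e-3, tol=1e-3):
    """Forward-Euler integration of x' = -y, y' = x starting at (1, 0).
    The exact solution keeps x^2 + y^2 == 1, so this redundant relation
    serves as an online check: once its drift exceeds `tol`, the error
    is flagged and integration stops. Returns (time reached, flagged)."""
    x, y = 1.0, 0.0
    t, flagged = 0.0, False
    while t < T:
        x, y = x - h * y, y + h * x   # Euler step (old values on the right)
        t += h
        if abs(x * x + y * y - 1.0) > tol:
            flagged = True            # accumulated error detected
            break
    return t, flagged
```

For forward Euler the radius grows by the factor (1 + h²) per step, so the check fires after roughly 1000 steps here; a corrective scheme would project the state back onto the invariant instead of merely flagging.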
The paper considers recent research in the area of security metrics. A classification of the known metrics is suggested. A multilevel approach to security assessment is suggested, based on attack graphs and service dependency graphs. The approach allows different aspects of system security to be evaluated, taking into account its topology, operation mode, historical data about incidents and other information.
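One common attack-graph metric of the kind such multilevel approaches build on is the success probability of the most likely attack path, which can be computed with Dijkstra's algorithm over -log(p) edge costs. The graph topology and probabilities below are invented for illustration.

```python
import math
import heapq

# Hypothetical attack graph: edge weight = probability that the attacker
# succeeds with that exploit step (illustrative values, not from the paper).
GRAPH = {
    'internet':  {'web_srv': 0.8, 'vpn': 0.3},
    'web_srv':   {'app_srv': 0.6},
    'vpn':       {'app_srv': 0.9},
    'app_srv':   {'database': 0.5},
    'database':  {},
}

def worst_case_compromise_prob(src, dst):
    """Probability of the most likely attack path from src to dst.
    Maximizing a product of probabilities equals minimizing the sum of
    -log(p), so ordinary Dijkstra applies."""
    pq, best = [(0.0, src)], {}
    while pq:
        cost, node = heapq.heappop(pq)
        if node in best:
            continue
        best[node] = cost
        for nxt, p in GRAPH[node].items():
            if nxt not in best:
                heapq.heappush(pq, (cost - math.log(p), nxt))
    return math.exp(-best[dst]) if dst in best else 0.0
```

Here the web-server route (0.8 · 0.6 · 0.5 = 0.24) dominates the VPN route, so a defender prioritizing by this metric would harden the web server first.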
The realization of economic risks leads to adverse effects that result in economic losses for the enterprise. This raises the problem of assessing the different types of economic risks associated with enterprise activities and of constructing decision-making systems both at the enterprise level and at different levels of enterprise performance. The paper provides a state-of-the-art analysis of the use of Bayesian belief networks for economic risk assessment and decision support under uncertainty within the framework of enterprise risk management. The areas of operational risk management and project risk management are singled out.
An analysis of recent publications on approaches to the design, implementation and maintenance of systems for personal data (PD) protection shows consistently high interest in this critical problem from the viewpoint of ensuring information security. The models proposed to date, based on both international and Russian standards, indicate deep knowledge of all aspects of PD protection, but at the same time raise new questions whose effective solutions have yet to be synthesized and tested in practice. The present article provides some approaches to creating models of PD security assessment in accordance with the requirements of the standard GOST R ISO/IEC 27001:2005. Taking into account the relative newness of this standard as applied to the research problem, the proposed approaches can be useful in planning PD security systems, in the security assessment of designed IT security for PD and, in particular, in solving a practical problem: IT security audits in organizations.
The paper shows the urgency of the problem of assessing the level of information protection against unauthorized access in computer networks. Its purpose is to develop a method of assessing this protection level on the basis of a security graph. The developed method increases the efficiency of information security management in computer networks by using a complex security metric and a security graph that reflects the real structure of the computer network and its information security system.
The approach is based on the known situation-event formalism for the specification of interacting hybrid processes. A short description of the formalism is set out. Some advantages of its use for the computer implementation of dynamic systems are pointed out, and its abilities for process coordination are discussed. Hybrid process coordination methods based on it, and the peculiarities of their usage, are reviewed. Examples of automatic coordination systems that illustrate the use of some of the methods are given.
The article considers new approaches to the estimation of hearing quality with the use of mobile computers. A new classification of hearing research methods is suggested, which makes it possible to identify opportunities for developing control-system elements as part of such computers. Models for studying the human acoustic analyzer that allow classical hearing research techniques to be used in existing control systems of mobile computers are considered. A modified technique of audiological research is presented, namely the basic tuning-fork tests (the experiments of Rinne, Weber, Gelle and Federici) with varying timing and qualifying characteristics (up to 50%).
The problem of the interrelation between changes in currency and stock markets is considered. The purpose of the research is to establish the basic possibility of reconstructing and forecasting the evolution of one market's state from the quotation trajectories of another market. Features of changes in the correlation characteristics depending on the size of the observation window used for their estimation are established.
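The dependence of correlation estimates on the observation window can be reproduced with a simple sliding-window Pearson correlation; this is a generic sketch, with window sizes and data left to the analyst.

```python
import numpy as np

def rolling_corr(x, y, window):
    """Pearson correlation of two equal-length series over a sliding
    window; returns one value per window position."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    out = np.empty(len(x) - window + 1)
    for i in range(len(out)):
        xs, ys = x[i:i + window], y[i:i + window]
        out[i] = np.corrcoef(xs, ys)[0, 1]
    return out
```

Re-running the estimate for several window sizes and comparing the resulting correlation trajectories is exactly the kind of experiment the abstract describes: short windows give volatile estimates, long windows smooth out regime changes.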
The problem of modeling trading assets quoted on currency markets, stock markets, etc. is considered. A historical review of the evolution of quotation process models is presented. A critical analysis of the basic approaches to this problem is carried out. Numerical studies specifying the structure of the price formation process are presented.
An effective algorithm for synthesizing the minimal join graph set (of self-managing cliques) from a given maximal knowledge pattern set exists, together with two improvements, each of which has been realized in a separate algorithm; however, no algorithm implements both improvements. The goal of the work is to design a new algorithm that realizes all known improvements of the basic algorithm and is thus the most effective one. Such an algorithm is designed, and the correctness of its work is proven.
An effective algorithm for synthesizing the minimal join graph set (of self-managing cliques) from a given maximal knowledge pattern set exists, but it can be improved by engaging results from the theory of algebraic Bayesian network global structure that is under development. The goal of the work is to improve the known algorithm by refining the construction of the vertex sets that the cliques include: instead of a full search over all pairs of a vertex and a clique, a particular search over each clique's descendants is suggested. A new algorithm of self-managing cliques implementing the suggested improvement is designed.
An effective algorithm for synthesizing the minimal join graph set (of self-managing cliques) from a given maximal knowledge pattern set is known, but it can be improved by engaging the developed theory of algebraic Bayesian network global structure. The goal of the work is to improve the known algorithm by refining the construction of possessions (connection components of strong narrowings), the key objects for the set synthesis: they are built not by direct search but by analyzing the intersections of the corresponding cliques' children. A new algorithm implementing the suggested improvements is designed.
Various methods of aggregated indices are widely used in the mathematical apparatus of control and monitoring systems aimed at detecting typical situations and potential dangers (in the terminology of the JDL information fusion model). Danger detection (identification, recognition) is not an end in itself in the information fusion process; at its peak this process develops a decision on preventing the detected dangers. This paper presents a methodological apparatus intended for choosing an appropriate version of a prevention strategy for the detected danger, based on a certain interpretation of the aggregated indices method within analytical planning methods.
The problem of creating methodological and technological support for building structured knowledge about models of the functioning of technical systems and about decision rules for controlling their states is considered. A solution concept is offered, grounded in the construction of system-wide prototypes of such models and rules for the class "technical systems".
This paper is devoted to the problem of situation assessment and prediction in applications that require a flexible means of changing scenario behavior depending on the progress of the system and the current state of the environment in real time. The paper reviews and analyzes the advantages and disadvantages of existing workflow languages and shows that traditional specification languages, which represent primarily reactive behavior, lack the necessary expressive power and cannot cope with this task. The features of specifying, assessing and predicting situations are shown for the control system of a fragment of a rocket fuel launching complex. The paper proposes a new language intended to describe scenario knowledge, to assess the current state of scenario execution, to predict its development, and to select a continuation option depending on the progress of states and the state of the environment. A description of the basic elements, graphical notation and operational semantics of the language is presented. The capabilities of the developed language are demonstrated by describing a model for diagnosing unexpected situations in a fueling process fragment; examples of process specifications developed in terms of the scenario language are given for this application.
The article is devoted to the selection of currency tool groups and bias correction techniques for improving group analysis technology.
A universal skeleton functional simulator of the processes of managing technical systems (TS) is proposed, with a detailed treatment of the functions forming the decision-making process. The simulator is presented in the form of IDEF0 graphs. With its use, the functional composition of the TS simulators utilized in the process of making managerial decisions is determined. The principles of classifying abnormal situations are determined, and the reasons for their appearance are detected. The capabilities and prospects of the practical use of such a simulator are determined.
The article is devoted to Data Mining (DM) algorithms, which form the basis of a new type of automatic control of multivariate dynamic processes. DM methods combine immediate control decisions with deep numerical analysis of retrospective data. New conceptual principles of analytic control allow DM to become a separate part of information technology.
The functional composition of simulators of the processes of managing the states of technical systems is determined at the initial stage of simulation and is usually established empirically. Mistakes made in its determination restrict the capabilities of simulating and automating these processes. The paper considers an approach to determining the functional composition of these processes with the use of their SADT models and proposes corresponding methodical recommendations. The use of these models and recommendations permits a goal-directed search for the information required to determine the functional composition of the processes under study and thereby substantially simplifies the solution of this problem.
An approach to the development of executable specifications for collections of hybrid processes is considered. The approach is based on presenting the discrete state transition function with the help of situation transformation rules. Possible structures of the rules are discussed from the viewpoint of their expressiveness and their fitness for creating reliable specifications and high-performance executing procedures. The properties of the rules are shown by examples of simple hybrid process specifications.
Rule-based executable specifications of different interacting dynamic processes are discussed. Possible structures of process state transition rules are presented, and the peculiarities of programming simple hybrid processes based on these rules are considered.
Research on the multiphase method and a robust fuzzy-logic control algorithm based on a shape memory model for assembly robots is presented, with reference to the task of coordinate measurement for autonomous assembly robots and space manipulators. Also presented are the software and instrumentation of the optical television system used with a six-coordinate assigning glove for controlling the robot in real time, the results of experimental research on the dynamic parameters and accuracy estimates of the optical television system, and experiments on the use of the teaching-by-showing method for assembly manipulators. This work was supported by INTAS Project No. 96-049, "Assembly robotics, robust force-torque control, optimal planning based on fuzzy logic".
A variant of evolutionary spectrum analysis with a relatively clear frequency-domain interpretation is proposed. The procedure is intended to supply detailed diagnostic information on fault localization in the mechanism under consideration.
Methods and means of organizing programs in a multiprocessor environment with dynamic architecture are considered. An object-oriented method is suggested for programming self-transforming network-represented programs that reflect mainly the structure of the problem being solved rather than the properties of the computational environment. Means of supporting this approach at the hardware level and at the level of the operating system, as well as a method of graphical parallel program design in the network representation, are discussed.
This article proposes algorithms for planning and controlling the movement of a mobile robot in a two-dimensional stationary environment with obstacles. The task is to reduce the length of the planned path, take into account the dynamic constraints of the robot and obtain a smooth trajectory. To take the dynamic constraints of the mobile robot into account, virtual obstacles are added to the map to cover the infeasible sectors of movement. This way of accounting for dynamic constraints allows the use of map-oriented methods without increasing their complexity. An improved version of the rapidly exploring random tree algorithm (multi-parent nodes RRT, MPN-RRT) is proposed as the global planning algorithm. Several parent nodes decrease the length of the planned path compared with the original single-parent version of RRT. The shortest path on the constructed graph is found using the ant colony optimization algorithm. It is shown that the use of two parent nodes reduces the average path length for an urban environment with low building density. To solve the problem of the slow convergence of algorithms based on random search and path smoothing, the RRT algorithm is supplemented with a local optimization algorithm: RRT searches for a global path, which is then smoothed and optimized by an iterative local algorithm. The lower-level control algorithms developed in the article automatically decrease the robot's velocity when approaching obstacles or turning. The overall efficiency of the developed algorithms is demonstrated by numerical simulation over a large number of experiments.
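For orientation, a minimal single-parent RRT in a planar world with circular obstacles can be sketched as below; the paper's MPN-RRT additionally keeps several parent nodes per vertex, searches the resulting graph with ant colony optimization and post-optimizes the path, all of which this sketch omits. The world bounds, step size and goal bias are illustrative choices.

```python
import math
import random

def collision_free(p, q, obstacles, step=0.05):
    """Check the segment p-q against circular obstacles (cx, cy, r)
    by dense sampling along the segment."""
    n = max(1, int(math.dist(p, q) / step))
    for i in range(n + 1):
        t = i / n
        pt = (p[0] + t * (q[0] - p[0]), p[1] + t * (q[1] - p[1]))
        if any(math.dist(pt, (cx, cy)) <= r for cx, cy, r in obstacles):
            return False
    return True

def rrt(start, goal, obstacles, step=0.5, iters=4000, seed=1):
    """Minimal single-parent RRT in the square [0, 10]^2 with 10% goal
    bias. Returns a start-to-goal polyline, or None on failure."""
    rng = random.Random(seed)
    parent = {start: None}
    for _ in range(iters):
        sample = goal if rng.random() < 0.1 else \
            (rng.uniform(0, 10), rng.uniform(0, 10))
        near = min(parent, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        if d == 0:
            continue
        t = min(1.0, step / d)   # steer at most `step` toward the sample
        new = (near[0] + t * (sample[0] - near[0]),
               near[1] + t * (sample[1] - near[1]))
        if not collision_free(near, new, obstacles):
            continue
        parent[new] = near
        if math.dist(new, goal) <= step and collision_free(new, goal, obstacles):
            parent[goal] = new
            path, node = [], goal          # walk parents back to the start
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
    return None
```

Keeping several parents per node, as in MPN-RRT, turns the tree into a graph on which shorter routes than the single tree branch can be extracted.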
Sensitivity theory combines the principles and methods for exploring the influence of parameter variations on the properties of control systems. It was formed in the late 1950s and early 1960s as an independent scientific field, with significant contributions from E. N. Rozenwasser and R. M. Yusupov. The paper describes general aspects of the historical development of sensitivity theory, the authors' contribution to its formation within control theory and the general theory of dynamic systems, and the scientific and organizational activity carried out in the country with the authors' participation to promote the theory.
Different approaches to decomposition of measured upward radiation into the component reflected by the water object and the noise component, generated by the reflection from atmospheric layers are considered. Advantages and disadvantages of the considered approaches are analyzed from the point of view of their application for the retrieval of surface water quality from the remote sensing observations.
Technological scheme of surface water remote sensing in visible and near infrared spectral regions under natural illumination is considered. The technology is based on the numerical solution of radiative transfer equation in the atmosphere and water media and includes the formation of special data bases for the acceleration of water state retrieval procedure.
The Deep Data Diver system uses a new technology of association rule search based on modified tools of linear algebra, a data self-organization procedure and an informational structural resonance effect. The unique characteristics of the system make it possible to search data for highly accurate associations between the items of the initial transaction set and a given item. These sets form a basket with a high support level and long itemsets. The article provides a general overview of the Deep Data Diver system and gives comparative results for a specific market basket analysis task.
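Classic market-basket association mining, against which systems like Deep Data Diver are compared, reduces to computing support and confidence over transactions. A brute-force sketch follows (this is the textbook baseline, not the Deep Data Diver algorithm itself; the example transactions are invented):

```python
from itertools import combinations

def association_rules(transactions, min_support=0.4, min_conf=0.7):
    """Enumerate single-consequent rules A -> {b} whose itemset support
    and rule confidence meet the thresholds. Exponential in the number
    of distinct items, so suitable only for tiny illustrative data."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(itemset <= t for t in transactions) / n

    rules = []
    for size in range(2, len(items) + 1):
        for combo in combinations(items, size):
            s = support(set(combo))
            if s < min_support:
                continue
            for b in combo:
                a = set(combo) - {b}
                conf = s / support(a)
                if conf >= min_conf:
                    rules.append((frozenset(a), b, round(s, 2), round(conf, 2)))
    return rules
```

Real miners (Apriori, FP-growth, or the linear-algebraic approach the article describes) exist precisely to avoid this exhaustive enumeration while finding the same high-support, high-confidence rules.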
An index and a method of its estimation are proposed for an efficiency evaluation of man-machine systems (MMS). The notion of efficiency is introduced as a complex property of MMS functioning consisting of its productivity, resource demand and operational efficiency taken in the aggregate. The efficiency index bears a probabilistic meaning. For its evaluation, a system of graph and functional models of MMS is proposed. An example of MMS efficiency estimation is given.
Development of an effective strategy for reducing air pollution on the regional scale requires fast estimation of both the existing pollution level and that resulting from different possible sets of measures affecting pollutant emissions from industrial sources. In order to accelerate the necessary calculations, the Eulerian-Lagrangian scheme for regional air pollution modelling and the corresponding computer code were modified for multiprocessor systems. The code was tested on three systems with local memory and different software environments. The results demonstrated the effectiveness of high-performance computing in the field of environmental management.
New information technologies based on fractal dynamics analysis of short electroencephalogram fragments, and the results of a computer experiment on the automatic classification of pathology types, are briefly described. The results of a correlation analysis of the suggested informative fractal characteristics against clinical, neurophysiological and integral neuropsychological indexes of brain function in the investigated patients are also discussed.
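One widely used fractal characteristic of short signal fragments is the Hurst exponent; a rescaled-range (R/S) sketch is given below. The article's exact EEG features are not specified here, so this is only a generic illustration with assumed window sizes.

```python
import numpy as np

def hurst_rs(x):
    """Hurst exponent estimate via the rescaled-range (R/S) method:
    the slope of log(R/S) versus log(window size). Window sizes are
    an illustrative choice for series of a few hundred samples."""
    x = np.asarray(x, float)
    sizes = [8, 16, 32, 64]
    rs = []
    for w in sizes:
        vals = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())     # cumulative deviation
            r, s = dev.max() - dev.min(), seg.std()
            if s > 0:
                vals.append(r / s)                # rescaled range of the window
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

Uncorrelated noise yields an estimate near 0.5, while persistent (trending) signals push it toward 1, which is the kind of contrast a classifier of pathology types can exploit.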
A computationally efficient algorithmic solution to the problem of optimal nonlinear filtering of information impact estimates in a generalized stochastic model of information warfare is developed in the article. The solution is applicable in the presence of heterogeneous rules for measuring the parameters of the information warfare model, on the basis of which a pair of systems of stochastic differential equations is formed. The information impact estimate in the optimal nonlinear filtering model is obtained by the maximum likelihood criterion from the evolution of the a posteriori conditional probability density function over a given observation interval. Using the probability addition theorem, the a posteriori conditional probability density function at a given time is found as the probability of the sum of two joint events whose density functions are established from the numerical solution of the corresponding robust Duncan-Mortensen-Zakai equations. For the first event, the first system of stochastic differential equations is taken as the state equation and the second as the observation equation; for the second event, the roles are reversed. The robust Duncan-Mortensen-Zakai equation is solved with the Galerkin spectral method by sampling the observation interval into subintervals and reducing the initial problem to a recurrent numerical study of a sequence of subproblems using the so-called Yau-Yau algorithm, which estimates the probability measure from the solution of the forward Kolmogorov equation with subsequent correction by observation.
To highlight the features of the algorithmic implementation, an algorithm for optimal nonlinear filtering of information impact estimates in the generalized stochastic model of information warfare is presented, with the listing of the implementing function given in pseudocode. To establish the advantages of the compiled algorithmic solution, a series of computational experiments on large test samples was carried out. The information impact estimate obtained by the proposed algorithm is compared with solutions determined: 1) by average sample values from the observation models; 2) by an ensemble extended Kalman filter; 3) by a filtering algorithm involving numerical study of the Duncan-Mortensen-Zakai equation. From this a posteriori study, quantitative indicators are derived that establish the gain of the compiled algorithm and the limits of its applicability.
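As a point of reference for the filters being compared, a scalar extended Kalman filter for a logistic "information impact" model can be sketched as follows; the model, its parameters and the noise levels are illustrative assumptions, not the paper's generalized stochastic model.

```python
import numpy as np

def ekf_logistic(ys, dt=0.1, a=0.8, K=100.0, q=0.05, r=4.0, x0=5.0, p0=1.0):
    """Scalar extended Kalman filter for x' = a*x*(1 - x/K) (logistic
    spread of an information impact, an assumed stand-in model) with
    direct noisy observations y = x + v, Var(v) = r. Returns the
    filtered estimates, one per observation."""
    x, p = x0, p0
    est = []
    for y in ys:
        # Predict: Euler step of the logistic drift; F is its Jacobian.
        F = 1.0 + dt * a * (1.0 - 2.0 * x / K)
        x = x + dt * a * x * (1.0 - x / K)
        p = F * p * F + q
        # Update with the scalar measurement (H = 1).
        k = p / (p + r)
        x = x + k * (y - x)
        p = (1.0 - k) * p
        est.append(x)
    return np.array(est)
```

Even this simple filter noticeably beats the raw observations when the drift model is accurate, which is why the paper must demonstrate a further gain of its Duncan-Mortensen-Zakai-based scheme over the (ensemble) EKF baseline.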
In the field of recruitment and human resources management, the problem arises of automating the assessment of human capital characteristics, taking into account, among other things, the personality characteristics of the employee. The article is devoted to identifying the characteristics that contribute most to indicators of employee effectiveness, using self-reported data on professional skills and answers to statement-type questions about various psychological aspects of personality. The general structure of survey tools based on employee self-reports is proposed, together with a formalization of the proposed data analysis methods. Cluster analysis was used to identify groups with similar professional skills. Special psychometric scales based on the statement-type questions are selected and analyzed via the item response theory approach, giving estimates of a latent variable that reflects personal characteristics. At the final stage of the study, the relationship between the estimated factors (identified clusters and estimated latent variables) and an indicator of employee effectiveness, namely the fact of holding a managerial position, was assessed. The proposed approach constitutes a pilot study design that identifies the human capital characteristics (professional skills and personality traits) contributing most to the performance indicators of an employee or organization, and aims to reduce labor costs at subsequent stages of a more detailed and targeted study. Its possibilities are demonstrated with data collected among state civil servants in Russia.
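The cluster-analysis stage can be illustrated with a plain two-cluster k-means on synthetic skill vectors (deterministic farthest-point initialization; the survey data itself is not reproduced here, and the article does not specify which clustering variant was used):

```python
import numpy as np

def kmeans2(X, iters=50):
    """Two-cluster k-means (Lloyd's algorithm) for grouping respondents
    by skill vectors. Initialization: first point and the point farthest
    from it, which makes the run deterministic."""
    X = np.asarray(X, float)
    c0 = X[0]
    c1 = X[np.argmax(((X - c0) ** 2).sum(1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.stack([X[labels == j].mean(0) for j in (0, 1)])
    return labels, centers
```

The resulting cluster labels become a categorical factor whose association with the effectiveness indicator (e.g. holding a managerial position) can then be tested.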
The connectivity of autonomous vehicles introduces new attack surfaces and hence a demand for sophisticated cybersecurity management. It is therefore important that in-vehicle network monitoring be able to accurately detect intrusive behavior and analyze cyberattacks from vehicle data and vehicle logs in a privacy-friendly manner. For this purpose, we describe and evaluate a method that utilizes characteristic functions and compare it with an approach based on artificial neural networks. Visual analysis of the respective event streams complements the evaluation. Although the characteristic-functions method is an order of magnitude faster, the accuracy of its results is at least comparable to that of the artificial neural network. This makes the method an interesting option for implementation in in-vehicle embedded systems. An important aspect of using these analysis methods within a cybersecurity framework is the explainability of the detection results.
The Bézier curve is a parametric polynomial that can be used to produce good piecewise interpolation methods with advantages over other piecewise polynomials. It is therefore crucial to construct Bézier curves that are smooth and able to increase the accuracy of the solutions. Most known strategies for determining the inner control points of piecewise Bézier curves achieve only partial smoothness, satisfying first-order continuity. Some approaches allow the construction of interpolating polynomials that are smooth along the approximating curve, but they still cannot control the locations of the inner control points. Partial smoothness and uncontrolled locations of the inner control points may affect the accuracy with which the curve approximates the dataset. To improve the smoothness and accuracy of the previous strategies, a new piecewise cubic Bézier polynomial with second-order continuity C2 is proposed in this study to estimate missing values. The proposed method employs a geometric construction to find the inner control points for each adjacent subinterval of the given dataset. Not only does the proposed method preserve stability and smoothness; the error analysis of the numerical results also indicates that the resulting interpolating polynomial is more accurate than those produced by existing methods.
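To see how C2 continuity constrains the inner control points, here is a minimal sketch of the classical C2 cubic spline written in piecewise Bézier form on a uniform grid (a textbook construction, not the geometric construction proposed in the study; the function name and grid assumptions are mine):

```python
def c2_bezier_controls(ys):
    """Natural C2 cubic spline through (i, ys[i]) on a uniform grid (h = 1),
    returned as [p0, p1, p2, p3] Bezier control values for each segment."""
    n = len(ys) - 1  # number of segments
    # C2 continuity at interior knots gives the tridiagonal system
    #   M[i-1] + 4 M[i] + M[i+1] = 6 (y[i-1] - 2 y[i] + y[i+1])
    # for the second derivatives M, with natural ends M[0] = M[n] = 0.
    M = [0.0] * (n + 1)
    if n > 1:
        b = [4.0] * (n - 1)  # main diagonal (sub/super diagonals are 1)
        d = [6.0 * (ys[i - 1] - 2.0 * ys[i] + ys[i + 1]) for i in range(1, n)]
        # Thomas algorithm: forward elimination, then back substitution.
        for i in range(1, n - 1):
            w = 1.0 / b[i - 1]
            b[i] -= w
            d[i] -= w * d[i - 1]
        M[n - 1] = d[-1] / b[-1]
        for i in range(n - 2, 0, -1):
            M[i] = (d[i - 1] - M[i + 1]) / b[i - 1]
    segments = []
    for i in range(n):
        # Endpoint slopes of the spline determine the two inner control points.
        s_left = (ys[i + 1] - ys[i]) - (2.0 * M[i] + M[i + 1]) / 6.0
        s_right = (ys[i + 1] - ys[i]) + (M[i] + 2.0 * M[i + 1]) / 6.0
        segments.append([ys[i], ys[i] + s_left / 3.0,
                         ys[i + 1] - s_right / 3.0, ys[i + 1]])
    return segments
```

Because each segment's second derivative at a join equals the shared value M[i], first- and second-derivative continuity hold exactly at every inner knot, which is the property the paper's geometric construction also targets.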
The paper argues that of primary importance in solving the classification problem are finding the conditions for partitioning the general population into classes, determining the quality of such a partition, and verifying the classifier model. We consider a mathematical model of a non-randomized classifier of features obtained without supervision, where the number of classes is not set a priori; only its upper bound is given. The mathematical model is stated as a minimax conditional extremum problem: the search for the matrix of object-to-class membership and for the representative (reference) elements within each class. The development of the feature classifier is based on the synthesis of a two-dimensional probability density in the classes-objects coordinate space. Using generalized functions, the probabilistic problem of minimizing the Bayesian risk is reduced to a deterministic problem on a set of non-randomized classifiers. At the same time, specially introduced constraints fix the non-randomized decision rules and embed the integer nonlinear programming problem into a general continuous nonlinear problem. The dispersion curve of an isotropic sample is necessary for correct synthesis of the classifier, and the total intra-class and inter-class variance should be used to characterize the quality of classification. The classification problem can be interpreted as a particular problem of catastrophe theory. Under conditions of limited initial data, a minimax functional reflecting the quality of classification for a quadratic loss function was found. The developed mathematical model belongs to the class of integer nonlinear programming problems and is reduced, using polynomial constraints, to a general problem of continuous nonlinear programming. The necessary conditions for partitioning into classes are found.
These conditions can be used as sufficient when testing the hypothesis about the existence of classes.
This article presents the results of applying a method for obtaining the formant components of vowel phonemes to a corpus of professional reading in Russian. The paper reviews existing directions in the development of methods for obtaining formant characteristics of vowels in different languages, as well as the extent to which formant patterns are used in speech technologies and natural language processing. On the corpus of professional reading CORPRES, formant data were obtained for 351929 realizations of vowel phonemes from 8 speakers. The data are grouped according to the symbols of the real transcription performed by phoneticians during corpus segmentation. Formant planes show the distribution of vowel allophones for all speakers according to the first two formants. The variability of formant characteristics for pre-tonic and post-tonic allophones in the corpus is presented for one male speaker. The article also presents results demonstrating the difference between the rounded unstressed /i/ and /a/, which are perceived by both naive listeners and expert phoneticians as /u/. Recordings of one male speaker reading specially selected sentences that accounted for various linguistic factors were used as experimental material. Analysis of the formant components of these vowels showed that their first formant values are close to those of the stressed vowel /u/ for this speaker, and their closure corresponds to that of /u/. The second formant values in the vowels [u] that were expected to be realized as [i] and [a] differ: they are more advanced in comparison with /u/.
An analysis of well-known methods for ensuring IT security is presented, and methods for evaluating the security of IT components and Cloud services in general are considered.
An attempt is made to analyze cloud services not from the commercial position of a popular marketing product, but from the position of system analysis. The previously introduced procedure for evaluating IT components is not stable, since the end user has no 100% guarantee of access to all IT components, let alone to a remote and uncontrolled Cloud service. A number of reviews point to increased efforts to create a secure network architecture and the ability to continuously monitor deviations from established business goals. In contrast to the Zero Trust and Zero Trust eXtended models, in which additional security functions are superimposed on existing IT components, it is proposed to consider the set of IT components as a new entity, an Information Processing System. This will make it possible to move to formal processes for assessing the degree of compliance with the criteria of standards for both existing and prospective IT components while ensuring the security of Cloud services.
A new evaluation method is proposed, based on the previously developed hybrid methodology that uses formal procedures built on two systems of criteria: assessment of the degree of compliance of management systems (based on the ISO/IEC 27001 series) and assessment of functional safety requirements (based on the IEC 61508 and ISO/IEC 15408 series). The method provides reproducible and objective assessments of the security risks of Cloud-based IT components that can be presented to an independent group of evaluators for verification. The results obtained can be applied in independent assessments, including those of critical information infrastructure objects.
The paper considers methods for comparing objects' images represented by sets of points using computational topology. Algorithms for constructing sets of real-valued barcodes for comparing objects' images are proposed. Determining the barcodes of object shapes makes it possible to study both continuous and discrete structures, which is useful in computational topology. A distinctive feature of the proposed comparison methods, versus the methods of algebraic topology, is that they obtain more information about an object's shape. An important application area of real-valued barcodes is the study of invariants of big data. The proposed method combines barcode construction with embedded non-geometric information (color, time of formation, pen pressure) represented as functions on simplicial complexes: barcodes are extended with functions from simplexes in order to represent heterogeneous information. The proposed structure of extended barcodes increases the effectiveness of persistent homology methods in image comparison and pattern recognition. A modification of the Wasserstein method for finding the distance between images is proposed that incorporates non-geometric information through the differences between the functions on the corresponding simplexes of the source and terminal images. Although the geometric characteristics of an object can change under diffeomorphic deformations, the proposed algorithms for forming extended image barcodes are invariant to rotation and translation. A method is also considered for determining the distance between sets of points representing curves, taking into account the orientation of the curves' segments. The article is intended for a reader familiar with the basic concepts of algebraic and computational topology, the theory of Lie groups, and diffeomorphic transformations.
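The starting point of the modified distance described above, the standard Wasserstein matching between two barcodes (persistence diagrams), can be sketched as follows; this is the unmodified geometric version by brute force over matchings, not the paper's extension with non-geometric functions, and the function names are assumptions:

```python
from itertools import permutations

def wasserstein_1(diag_a, diag_b):
    """1-Wasserstein distance between two persistence diagrams, given as
    lists of (birth, death) pairs, with the L-infinity ground metric.
    Unmatched points are sent to their projections on the diagonal."""
    def dist(p, q):
        return max(abs(p[0] - q[0]), abs(p[1] - q[1]))
    def to_diag(p):
        return (p[1] - p[0]) / 2.0  # L-inf distance to the diagonal
    # Augment each diagram with "diagonal slots" for the other's points,
    # so every matching is a perfect matching on equal-size sets.
    a = [(p, False) for p in diag_a] + [(None, True)] * len(diag_b)
    b = [(q, False) for q in diag_b] + [(None, True)] * len(diag_a)
    def cost(x, y):
        (p, p_diag), (q, q_diag) = x, y
        if p_diag and q_diag:
            return 0.0
        if p_diag:
            return to_diag(q)
        if q_diag:
            return to_diag(p)
        return dist(p, q)
    return min(sum(cost(x, y) for x, y in zip(a, perm))
               for perm in permutations(b))
```

The paper's extension would add a term to `cost` penalizing differences between the non-geometric functions attached to the matched simplexes; the brute-force search is only practical for small diagrams and would be replaced by an assignment solver in practice.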
Determining the nucleotide sequence of DNA or RNA containing from several hundred to hundreds of millions of monomer units makes it possible to obtain detailed information about the genomes of humans, animals and plants. Deciphering the structure of nucleic acids was mastered quite a long time ago, but the initial decoding methods were low-throughput, inefficient and expensive. Methods for decoding the nucleotide sequences of nucleic acids are usually called sequencing methods, and the instruments designed to implement them are called sequencers.
Next-generation sequencing (NGS) and massively parallel sequencing are related terms describing high-throughput DNA sequencing technology in which the entire human genome can be sequenced within a day or two. The previous technology used to decipher the human genome required more than ten years to obtain final results.
A hardware-software complex (HSC) for deciphering the nucleic acid (NA) sequences of pathogenic microorganisms by the NGS method is being developed at the Institute for Analytical Instrumentation of the Russian Academy of Sciences.
The software included in the HSC plays an essential role in solving genome deciphering problems. The purpose of this article is to show the need to create algorithms for the HSC software that process the signals obtained in the course of genetic analysis, and also to demonstrate the capabilities of these algorithms.
The paper discusses the main signal processing problems and methods for solving them, including: automatic and semi-automatic focusing; background correction; detection of cluster images and estimation of the coordinates of their positions; creation of templates of NA molecule clusters on the surface of the reaction cell; correction of the influence of neighboring optical channels on signal intensities; and assessment of the reliability of the results of genetic analysis.
Both a timely and adequate response to computer security incidents and the organization's losses from computer attacks depend on the accuracy of situation recognition during cybersecurity monitoring. The paper is devoted to enhancing attack models in the form of attack graphs for cybersecurity monitoring tasks. A number of important issues related to the application of attack graphs, and their solutions, are considered: inaccuracies in the definition of the pre- and post-conditions of attack actions; the processing of attack graph cycles so that Bayesian inference can be applied to attack graph analysis; the mapping of security incidents onto an attack graph; and automatic countermeasure selection in case of a high security risk level. The paper demonstrates a software prototype of a security monitoring system component that was implemented earlier and modified to incorporate the suggested enhancements. The results of experiments are described, and the influence of the modifications on cybersecurity monitoring results is shown in a case study.
The detection of anomalies in the movement of employees is an important task of the cyber-physical security of enterprises, including critical infrastructures. The paper presents a technique for analyzing the routes of an organization's employees based on a combination of data mining and interactive visualization. It includes two stages: detection of groups of employees with similar behavior, and anomaly discovery. Self-organizing Kohonen maps are used to group employees on the basis of their behavior. To present spatiotemporal patterns, the authors developed a special visualization model named BandView. To detect anomalies, the authors present a rating mechanism that assesses the spatiotemporal attributes of movement. The anomalies are visualized using heatmaps that allow an analyst to spot the place and time of possibly suspicious activity. The technique is tested against the data set provided within the VAST MiniChallenge-2 contest, which contains logs from access control sensors describing employees' movement within an organization's building.
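The first stage described above, grouping employees by behavioral similarity with a self-organizing Kohonen map, can be sketched in miniature as follows (a generic one-dimensional SOM on toy feature vectors, not the authors' implementation or data; function names and the deterministic initialization are assumptions):

```python
import math

def train_som(data, n_units=2, epochs=300):
    """Minimal one-dimensional Kohonen map: units lie on a line; for each
    sample the winning unit and its line-neighbors are pulled toward it,
    with learning rate and neighborhood radius decaying over the epochs."""
    dim = len(data[0])
    # Deterministic initialization: spread units over the data sequence.
    idx = [round(i * (len(data) - 1) / (n_units - 1)) for i in range(n_units)]
    units = [list(data[i]) for i in idx]
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr = 0.5 * frac                                  # decaying learning rate
        radius = max(1e-6, (n_units / 2.0) * frac)       # shrinking neighborhood
        for p in data:
            winner = min(range(n_units),
                         key=lambda i: sum((units[i][k] - p[k]) ** 2
                                           for k in range(dim)))
            for i in range(n_units):
                h = math.exp(-((i - winner) ** 2) / (2.0 * radius ** 2))
                for k in range(dim):
                    units[i][k] += lr * h * (p[k] - units[i][k])
    return units

def best_unit(units, p):
    """Index of the unit closest to feature vector p."""
    return min(range(len(units)),
               key=lambda i: sum((units[i][k] - p[k]) ** 2
                                 for k in range(len(p))))
```

After training, employees whose movement feature vectors map to the same best-matching unit form a behavioral group, and the anomaly-rating stage can then score each employee against their group's pattern.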
The use of radio-channel fire alarm systems is currently growing. One of their main disadvantages is the vulnerability of the communication channel to data compromise. Based on their analysis and a fuzzy logic unit, the authors identified the most tamper-proof wireless fire alarm technologies and gave a quantitative evaluation of their security level. The technologies most secure against complex threats (eavesdropping, substitution, interception, jamming) are those based on spread-spectrum signals (using random sequences and ultra-wideband signals), while cryptographic techniques (imitation protection devices) are the least secure. This shows the necessity of further research aimed at improving the security of radio fire alarms. Systems protecting the radio channel with spread-spectrum technologies (based on random sequences and ultra-wideband signals) are seen as a promising development.
A nonlinear multivariable multi-agent model of knowledge accumulation in a scientific school as a result of the self-organization of the scientific information exchange process is first proposed in the paper. Three groups of agents (researchers) are distinguished as knowledge carriers interacting nonlinearly with each other and monitoring their activity in knowledge accumulation. Two modes of scientific information exchange between agents are considered: a free mode (in the form of discussion) and a business mode (in the form of joint project implementation). The proposed models have an abstract, generalizing character. They represent a system of nonlinear differential equations describing the processes of knowledge accumulation resulting from scientific information exchange, in compliance with the existing structural relations between the agents of the scientific school. The aim of this paper is to disclose the complex mechanism of knowledge accumulation in the scientific school through various forms of active interaction of its agents, representing the unity of "man-computer-knowledge base".
This paper examines the existing contradictions between the requirements for the performance characteristics of advanced space systems and the traditional technologies of navigation-ballistic support. We formalize the task of systematically analyzing the status of the navigation-ballistic support of space systems and the ways of modernizing it on the basis of using, in the control loop, precision data from systems for the high-accuracy determination of ephemeris and time corrections. We show the need to apply these data during the stages of creation, testing and operation not only of space systems but also of advanced weapons systems that use radio navigation equipment in their control systems. The necessity of a standard of mathematical precision in organizing the technological cycles of navigation-ballistic support of advanced space systems is substantiated.
The article discusses approaches to the long-term forecasting of quantitative and qualitative indices of the security subsystem of information and telecommunication systems. The possibility of using these approaches to analyze the protection of systems against unauthorized access is assessed.
The problem of interrelations between the processes of currency quotation changes in the electronic Forex market is discussed. The existence of rather stable correlative relationships is regarded as an ordering factor over the chaotic temporal sequences reflecting the dynamics of currency quotations. The results of the research can be considered a functional platform for regression estimates of the current cost of a currency, and can therefore be used to create effective market indicators.
The problem of multiregression estimation of currency cost is considered. The proposed approach is based on an adaptive choice of regressors formed by the group of currency pairs most correlated with the estimated asset. Under the chaotic dynamics of currency quotations, the degree of correlation between currency pairs changes over time, which leads to the problem of adaptive estimation with a variable structure of the group of regressors. The method of evolutionary modeling is used to assess the potential gain achieved when the corresponding control strategy is used.
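The core of the adaptive regressor choice described above, re-selecting on each window the candidate series most correlated with the estimated asset and refitting the regression, can be sketched minimally as follows (a single-regressor illustration under assumed data shapes, not the authors' multiregression or evolutionary-modeling procedure; function names are assumptions):

```python
def pearson(xs, ys):
    """Pearson correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)  # assumes neither series is constant

def adaptive_regression(target, candidates):
    """Pick the candidate series most correlated (in absolute value) with
    the target on the current window, then fit a least-squares line."""
    name, xs = max(candidates.items(),
                   key=lambda kv: abs(pearson(kv[1], target)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(target) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, target))
             / sum((x - mx) ** 2 for x in xs))
    return name, slope, my - slope * mx
```

Re-running this selection on a sliding window gives the variable regressor structure the abstract refers to: as correlations drift, a different currency pair may be chosen for the next window.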
This article considers approaches to the remote security analysis of information systems. A model of the remote security analysis process based on decision-making theory is proposed. Existing methods for solving the partially observable Markov decision process problem are reviewed.
In this paper we discuss the problem of monitoring the ice situation in the Arctic and the development of a safe navigation monitoring system for the Northern Sea Route using an intelligent geoinformation system. A case study illustrates the adjustment of a ship's route on part of the Northern Sea Route in accordance with the ice situation using the developed monitoring system.
On the basis of a series of computing experiments, the efficiency of technical-analysis operating strategies based on the analysis of local trends in quasi-chaotic processes is considered. Long intervals of observations of currency quotations in the electronic Forex market are used as the data. It is established that the existence of a local trend is not a sufficient condition for creating an efficient operating strategy.
At the present stage of society's development, the process of designing, creating and implementing a modern management system is no longer a purely technical question. Obviously, implementing such a project without serious consideration, accurate calculation and a clear understanding of the risks and an evaluation of the resources (budget, staff, licenses, etc.) is impossible for a modern organization working in extremely tough competitive conditions. Moreover, government organizations must meet all of the above requirements while ensuring an enhanced regime of national security, which is confirmed by the requirements of the relevant legislation and by the practice of performing IT projects. In this paper we propose approaches to implementing decision support in selecting a development model for a modern organization at the design phase and in evaluating the appropriateness of the choice: the composition of management systems according to the applicable standards, and the appropriateness of certifying functions that ensure stable growth, the security of business processes, and the protection of valuable assets (including intangible ones), on the basis of ISO certification statistics.
Existing objective and subjective metrics for TV image quality assessment are considered. New metrics for digital image quality testing are proposed. A local entropy approach to forming objective TV image quality assessment metrics is demonstrated experimentally.
A method of conceptual modeling of technical systems (TS) is offered, based on the combined and joint application of the known methods of structural-functional analysis and object-oriented modeling. The resulting useful effect is a simplification of the problems of ordering and structuring the knowledge used to construct conceptual models of the design, the state-management processes, and the controlled functioning of the TS.
The analysis of security risks and the calculation of security metrics are important tasks for Security Information and Event Management (SIEM) systems, as they allow recognizing the current security situation and the necessary countermeasures. The paper considers a technique for calculating security metrics in near real time and demonstrates it on the example of recalculating attack potential.
The article is devoted to the development of a complex speaker model for text-independent speaker identification. The complex speaker model is based on the Gaussian mixture method. The model is formed from a preliminarily segmented speech signal, where each segment matches a certain broad phonetic class. A method of structuring speaker models is proposed: the models are structured as a tree, which makes it possible to identify a speaker without running a full search over the set of models. Research has shown that dividing the acoustic space of a speaker's voice into a set of classes representing phonetic events increases the efficiency of voice identification, and that the proposed structuring method accelerates the search operation.
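The building block of the model described above, a Gaussian mixture fitted by expectation-maximization, can be sketched in one dimension as follows (a generic two-component EM sketch on scalar features, not the authors' phonetic-class segmentation or tree-structured search; function names and initialization are assumptions):

```python
import math

def fit_gmm_1d(xs, n_iter=80, mu_init=(0.0, 1.0)):
    """EM for a two-component 1-D Gaussian mixture: the E-step soft-assigns
    each sample to the components, the M-step re-estimates the weights,
    means, and variances from those responsibilities."""
    w = [0.5, 0.5]
    mu = list(mu_init)
    var = [1.0, 1.0]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        resp = []
        for x in xs:
            dens = [w[k] / math.sqrt(2.0 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2.0 * var[k]))
                    for k in range(2)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate parameters from the responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(1e-6,
                         sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk)
    return w, mu, var
```

In a speaker identification setting, one such mixture (with many components over multidimensional acoustic features) is fitted per speaker and per broad phonetic class, and an utterance is scored by the log-likelihood under each model; the tree structuring proposed in the article then prunes which models need to be scored at all.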
Software product line engineering is a widespread approach to software development, and variability management, which aims at managing the various reusable assets of a product line, is one of its key activities. In this paper we try to answer the question: are variability management tools ready for industrial use? For that purpose we review the existing variability management tools TVL, SPLOT, FeatureIDE, XFeature, FMP, FeatureMapper, CVL Tool, PLUM, pure::variants, DocLine, and Gears. We estimate the current state, features and industrial applications of each tool, and outline the state of the area as a whole.
The practical necessity of expanding graph theory is shown, in particular for the modeling and investigation of technical systems. The term "eshgraph" is suggested; it expands the term "graph" and thus expands graph theory. Examples of the usage of eshgraphs and geometric eshgraphs are presented.
The paper describes the architecture of a logistics system that applies the smart space idea to finding fellow travelers for drivers. The ontology formed by the mobile devices of the system participants and the interconnections between them is presented. The paper also describes algorithms for finding appropriate fellow travelers for drivers, as well as for determining pick-up and drop-off points that meet the requirements of both drivers and passengers. Due to the rather large dimension of the problem, the usage of heuristics significantly reducing the dimension of the task is proposed. To demonstrate the possibilities of the architecture and its underlying components, a software prototype of the system has been developed and is described.
In this article the authors propose an algorithm for the qualitative analysis of the safety and reliability of systems and technological processes, as well as of probable risk and possible losses, using a special analysis method aimed at reducing the probability of failures and accidents, human casualties and economic losses. The basic advantages of the method, its system and components, and the causes of component failures are considered; a procedure for applying the method is proposed, and formulas for the calculations are presented.
The problem of the preliminary analysis of the structure of chaotic processes in the capital markets, which makes it possible to consistently choose the most effective asset-management strategy, is considered.