Optimal Terminal Diagnostics of Controlled Dynamic Systems
Abstract
Introduction: Published works on terminal diagnostics of dynamic systems discuss methods for calculating special test signals that keep the natural motion of the system within specified bounds. Such motion is produced by a specially calculated terminal control. Since there are many admissible terminal controls, it is necessary to pose and solve an optimal control problem which minimizes a given criterion and makes the system move along the optimal trajectory.
Purpose: Developing methods for calculating the optimal terminal control of linear discrete dynamic systems, using the lowest energy cost as the criterion.
Methods: The theory of optimal control of discrete systems is used, which yields formulas for an optimal control that keeps the system motion within specified boundaries at the lowest energy cost.
Results: Methods have been developed for calculating the optimal terminal control under given or free initial conditions of the system, based on the theory of optimal control for the Lagrange and Bolza problems. A technological process for the optimal terminal diagnostics of discrete dynamic systems has been described. The obtained results make it possible to perform test diagnostics of discrete dynamic systems which differs from known diagnostics in that the test motion of the system is natural, optimal within the given boundaries, and has the lowest energy cost. Computer simulation confirmed the theory.
Published
2018-06-01
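The lowest-energy terminal control described in the abstract can be illustrated with a standard least-norm construction for a linear discrete system x_{k+1} = A x_k + B u_k: stack the controls over the horizon and pick the control sequence of minimal energy (sum of squared inputs) that transfers a given initial state to a required terminal state. This is a generic minimum-energy sketch, not the paper's specific algorithm; the matrices A and B, the horizon N, and the states x0, xf below are illustrative assumptions.

```python
import numpy as np

# Illustrative linear discrete system x_{k+1} = A x_k + B u_k
# (example double-integrator-like dynamics; not the paper's data)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
N = 10                          # assumed control horizon
x0 = np.array([1.0, 0.0])       # given initial state
xf = np.array([0.0, 0.0])       # required terminal state

# Stacked reachability map: x_N = A^N x0 + G @ [u_0, ..., u_{N-1}]
# where column k of G is A^(N-1-k) B
G = np.hstack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])

# Least-norm (lowest-energy) control achieving the terminal condition:
# U* = G^+ (xf - A^N x0), with G^+ the Moore-Penrose pseudoinverse
U = np.linalg.pinv(G) @ (xf - np.linalg.matrix_power(A, N) @ x0)

# Simulate the closed sequence to check that x_N reaches xf
x = x0.copy()
for k in range(N):
    x = A @ x + B[:, 0] * U[k]

print("terminal state:", x)
print("control energy:", float(np.sum(U ** 2)))
```

For a controllable pair (A, B) with N at least the state dimension, G has full row rank and the pseudoinverse solution hits xf exactly while minimizing the control energy among all sequences that do so.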
How to Cite
Britov, G. (2018). Optimal Terminal Diagnostics of Controlled Dynamic Systems. Information and Control Systems, (3), 17-24. https://doi.org/10.15217/issn1684-8853.2018.3.17
Issue
Section
Information processing and control