Imprecise computation methods in measurement

A.M. Varkonyi-Koczy, T.P. Dobrowiecki, B. Pataki

Department of Measurement and Instrumentation
Technical University of Budapest
H-1521 Budapest, Hungary
E-mail: koczy@mmt.bme.hu



Measurement techniques are traditionally concerned with accuracy, and the usual aim is to design ever more precise measurement equipment. There are, however, applications where the measurement problem itself puts strong limits on the achievable accuracy, thus permitting the use of, and allowing gains from, various so-called imprecise computational methods.

Imprecise methods deliver, by definition, imprecise results; they compensate for this deficiency with increased speed, a more balanced use of computational resources, and a better description of the measurement problem.

One area of application where the measurement task can be severely handicapped by an excessive pursuit of computational accuracy is that of real-time monitoring and diagnosis problems. Measurement systems in such cases are typically distributed and equipped with concurrent real-time software. The varying information-processing load of the different system components and the changing real-time limits present a formidable challenge to the task schedulers, where any kind of accurate computation, as an aim in itself, is out of the question.

Real-time applications have led to two basic, accuracy-related results. The first is the family of so-called imprecise scheduling methods: an intelligent scheduler decides, according to the actual time limits, resources, and loads, on the accuracy level of each particular task performed within the system, its overall goal being to make the accuracy of the final results optimal. The second idea is that of any-time algorithms. An any-time algorithm can be interrupted at any point in time and still delivers an admissible result; the later such an algorithm is stopped, the better, i.e. the more accurate, the result it produces. A minimal sketch of the latter idea is given below.
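As an illustration (not taken from the presentation itself), the following minimal Python sketch shows the any-time idea: a Newton iteration for the square root keeps refining its estimate until a deadline expires, so an earlier interruption simply yields a less accurate, but still admissible, answer. The function name and the deadline parameter are hypothetical choices made for the example.

    import time

    def anytime_sqrt(a, deadline_s=0.001):
        # Illustrative any-time routine (assumes a > 0): each completed
        # Newton step refines the estimate of sqrt(a), so a later
        # interruption yields a more accurate, but never an inadmissible,
        # result.
        x = a if a > 1.0 else 1.0              # admissible first guess
        start = time.monotonic()
        while time.monotonic() - start < deadline_s:
            x = 0.5 * (x + a / x)              # one refinement step
        return x                               # best estimate available at interruption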

The second area, where imprecise computation tools are a good, if not the only, alternative, comprises those measurement applications whose problems lack a well-developed analytical description. Analytical models are the starting point for designing optimal information-processing algorithms, or serve as helping tools to select, run, and interpret pre-defined methods from an algorithm library. When a satisfactory analytical model is missing, the quality of the numerical results supplied by traditional numerical algorithms is difficult to judge, and often such results are simply meaningless. Models of relaxed accuracy, but of a more qualitative nature, are an interesting alternative, especially in biomedical, chemical, and other 'difficult' fields of engineering. Such methods are generally termed 'soft computing methods' and involve fuzzy logic, artificial neural networks, and genetic algorithms.

Fuzzy logic, introducing a novel notion of uncertainty and using a logic-based inference scheme, can support or even substitute for conventional numerical algorithms, producing strikingly good results. Neural networks, although they originated in medical research, present, from an engineering point of view, powerful nonlinear classification and decision tools. Genetic and other evolutionary algorithms, drawing strongly on biological analogies, yield alternative methods of solving optimization problems; a small illustrative sketch follows.
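As a hypothetical illustration of the last point (not part of the presentation), the Python sketch below implements a bare-bones genetic algorithm: a population of candidate parameter vectors is evolved by truncation selection, uniform crossover, and Gaussian mutation. All names and parameter values are assumptions made for the example.

    import random

    def genetic_optimize(fitness, bounds, pop_size=20, generations=50, mutation_rate=0.1):
        # Minimal genetic-algorithm sketch (illustrative only): evolves a
        # population of candidate parameter vectors towards higher fitness.
        pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)          # best candidates first
            survivors = pop[:pop_size // 2]              # truncation selection
            children = []
            while len(survivors) + len(children) < pop_size:
                p1, p2 = random.sample(survivors, 2)
                child = [random.choice(genes) for genes in zip(p1, p2)]   # uniform crossover
                child = [g + random.gauss(0.0, 0.5) if random.random() < mutation_rate else g
                         for g in child]                 # occasional Gaussian mutation
                children.append(child)
            pop = survivors + children
        return max(pop, key=fitness)

    # Example use: maximise a simple one-dimensional fitness function (optimum at 3.0).
    best = genetic_optimize(lambda v: -(v[0] - 3.0) ** 2, bounds=[(-10.0, 10.0)])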

For an engineer the most important knowledge is when and how such new tools can be applied, which decisions are left to the user, and how to judge the final results. The presentation focuses on these questions. The main properties of the various methods are briefly reviewed, and their advantages and disadvantages discussed.