Volume 1 / Issue 3 Autumn 2006 - Features

Designing a High-Performance Telemedicine System


A.V. Bogdanov

Organisation: Institute for High Performance Computing and Information Systems, Russia

Email: [email protected]

Website: www.csa.ru


A.B. Degtyarev

Organisation: Institute for High Performance Computing and Information Systems, Russia

Email: [email protected]

Website: www.csa.ru


Yu.I. Nechaev

Organisation: Institute for High Performance Computing and Information Systems, Russia

Email: [email protected]

Website: www.csa.ru


A.V. Valdenberg

Organisation: Leningrad Region Clinical Hospital, Russia

Email: [email protected]

Website: www.oblmed.spb.ru


For a copy of references contained in this article, please contact [email protected].


In the first part of this series (published Spring 2006), the steps taken towards designing a telemedicine system based on high-performance computer technologies for the Institute of High Performance Computing and Information Systems in St. Petersburg, Russia were explained. In the second article (published Summer 2006), the concept proposal for a telemedicine Internet portal was presented. In this final article of the series, examples of the system's application are presented.


Examples of Application – Complex Architecture

The conceptual basis for the creation of telemedicine intelligence complexes (ICs) rests on the fundamental principles defining the architecture of a system and the levels of its management. Such technology effectively combines the stored system of knowledge with new approaches and paradigms of artificial intelligence. The practical application of ICs provides communication between remote patients and leading scientific centres, which in turn improves the rendering of diagnostic and advisory aid.


The kernel of the telecommunication system is a real-time expert system functioning on the basis of a multiprocessor cluster under the control of the Linux operating system. Such systems prove efficient at resolving specialised problems, in particular as a “hot cluster” providing fast access to large volumes of information from various remote sources at irregular time intervals. The functions of the system kernel include: information gathering, control of coded information from remote users, and the processing and formation of initial data for inference management.


The methodological principles applied to problems of medical diagnostics are based on the multi-parameter analysis of symptoms, which in different situations do not have identical differential-diagnostic value, i.e. semantic information density.


Integrated Knowledge System

The concept of IC design determines the development of data and knowledge assimilation technology. As such, new generations of IC technology make wide use of ontologies and data mining. The result is the creation of features that have generated a new paradigm of computer data and knowledge processing, subsequently finding a niche within developing telemedicine intelligence technologies.


The mechanism of knowledge-based functioning uses various inference strategies, which are improved as actual medical information accumulates during the system engineering process. At the research stage, the stage-by-stage inference strategy is of greatest interest. Such a strategy minimises the time spent on diagnostics while maintaining (and in some cases even increasing) diagnostic accuracy, and it reduces the influence of a potentially less qualified attending physician on the conclusions.


The diagnosis represents a four-rank assessment:

+ Suspicions are not present,

+ Conditions are satisfactory,

+ Consultation of an expert is necessary, and

+ Consultation of the expert is urgent.

The probability estimation of each diagnosed illness is obtained from logic rules based on criteria convolution. In this case, a set of estimations (including negative ones) is attributed to each separate feature (symptom) in the structure of the concrete logic rule. Each estimation is characterised by a point value, which the expert attributes to the symptom.


Diagnosing has three stages: the automatic processing of measurement results, preliminary diagnostics on the basis of a case history, and interrogation of the patient. Corresponding simple symptoms measuring the current rank of the illness are used at each stage. A threshold value in points is then associated with each rank. The result of this inference work is an expert diagnosis, the conclusion of the examination report, and a record of statistical estimations in the system database.
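As a rough illustration, the point-based four-rank assessment can be sketched as follows (all symptom names, point values, and rank thresholds here are hypothetical placeholders; in the real system they are assigned by medical experts):

```python
# A minimal sketch of the point-based four-rank assessment described above.
# Symptom names, point values, and thresholds are hypothetical examples.

RANKS = [
    "Suspicions are not present",
    "Conditions are satisfactory",
    "Consultation of an expert is necessary",
    "Consultation of the expert is urgent",
]

# Expert-assigned points per symptom (possibly negative), representing the
# criteria convolution of one illustrative logic rule.
SYMPTOM_POINTS = {
    "chest_pain": 5,
    "shortness_of_breath": 3,
    "irregular_pulse": 4,
    "normal_ecg": -6,
}

# Threshold (in points) that must be reached to assign each rank.
RANK_THRESHOLDS = [0, 4, 8, 12]

def assess(observed_symptoms):
    """Sum the expert points of the observed symptoms and map the total
    score to the highest rank whose threshold is reached."""
    score = sum(SYMPTOM_POINTS.get(s, 0) for s in observed_symptoms)
    rank = 0
    for i, threshold in enumerate(RANK_THRESHOLDS):
        if score >= threshold:
            rank = i
    return RANKS[rank], score

conclusion, score = assess(["chest_pain", "irregular_pulse"])
```

The mapping of a scalar score to a small ordered set of conclusions keeps the inference transparent: an expert can audit exactly which symptoms contributed which points.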


Self-training of the system’s adaptive components is carried out by estimating the specificity and sensitivity of concrete attributes. These characteristics are objective and do not depend on the competence of the experts. Such a process permits the monitoring of inference results during the pre-production operation of the system, in comparison with expert assessments made during postponed consultations.


Foundations of Information Processing

An increase in the reliability of estimations and forecasts for clinical situations is achieved with this new approach to information processing, based on the development of the “soft computing” concept [6]. The approach foresees the use of two theoretical principles (see Figure 1) and provides a rational organisation of computing technology for measurement data processing in relation to the forecast and analysis of extreme situational developments. It also makes it possible to formalise the information stream at the point of realisation of fuzzy inference within a multiprocessor computing environment [4].

Figure 1. Information flow in a multiprocessor computing environment: MS – measurement system; CT – competitive technologies; AA – alternatives analysis; Φ1(•), …, ΦN(•) – measurement data supplied to the standard (SA) and artificial neural network (ANN) algorithms; α1, β1, …, αN, βN – output data of the SA and ANN; F1(•), …, FN(•) – situations determined as a result of alternatives analysis.

The competition principle provides a comparative analysis of situation estimation results by using traditional algorithms and neural network models. The principle of fuzzy information formalisation within a multiprocessor computing environment permits the realisation of parallel chains of crisp and fuzzy inference.
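The competition principle can be illustrated with a deliberately simplified, single-feature sketch, in which a conventional estimator and a stand-in for a neural network model compete, and an alternatives-analysis step selects the better-fitting one (both estimators and the data are illustrative, not the system's actual algorithms):

```python
# A simplified illustration of the competition principle: a standard
# algorithm (SA) and a stand-in for a neural network model (ANN) each
# estimate a quantity from the measurement stream, and an alternatives
# analysis (AA) step selects the estimate that better fits the most
# recent measurement. Both estimators here are toy stand-ins.

def sa_estimate(samples):
    # "Standard algorithm": arithmetic mean of the window.
    return sum(samples) / len(samples)

def ann_estimate(samples):
    # Stand-in for a trained ANN: exponentially weighted average,
    # weighting recent samples more heavily.
    alpha, est = 0.5, samples[0]
    for x in samples[1:]:
        est = alpha * x + (1 - alpha) * est
    return est

def alternatives_analysis(samples, estimators):
    """Pick the estimator whose prediction of the latest sample,
    made from the preceding samples, has the smallest error."""
    target = samples[-1]
    return min(estimators, key=lambda f: abs(f(samples[:-1]) - target))

window = [70, 72, 71, 90, 95]  # e.g. heart-rate samples, bpm
best = alternatives_analysis(window, [sa_estimate, ann_estimate])
```

Here the rapid trend in the data favours the recency-weighted estimator, showing how the comparative analysis can switch between the traditional algorithm and the neural network model as the situation develops.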


Complicated Situational Modelling

Let us consider the construction of a cardiac activity model and the methods of its identification with the help of a probabilistic approach. The subject of probabilistic modelling is the spatial and temporal variability of the electromagnetic field (EMF) of human cardiac activity, recorded in the form of an electrocardiogram (EKG). During EKG imaging, the EMF is usually registered in 8 leads. Therefore, let us pass from a model of a spatio-temporal field to a vector model.


A detailed analysis of the EKG form shows that it is necessary to take into account the following characteristics in a probabilistic model:

+ The synchronous variability of all EKG leads,

+ The cyclostationarity of cardiac activity processes,

+ Characteristic geometrical peculiarities of EKG elements, and

+ The variability of R-R intervals.

The final property is determinant, as it characterises the modulation of the process at higher scales of variability. This distinguishes it from the property of periodic non-stationarity.


Next, let us present the EKG model as a system of consecutively incorporated impulses of various lengths RR, with a set of parameters that changes from impulse to impulse. The sequence of random impulses is presented in the form of a factorial decomposition.


The identification of the probabilistic model is carried out using a standard approach to the determination of factorial model characteristics. In this case, the identification algorithm is as follows:

+ Values of the RR-intervals are calculated from the measured initial realisation of an 8-lead EKG,

+ EKG cycles are normalised to the RR-intervals (the result of this normalisation for the EKG of a somatically healthy man is shown in Figure 2a),

+ The matrix (8-lead) covariance function is calculated, with averaging carried out over cycle numbers, and

+ Natural orthogonal functions (NOF) are determined via the solution of an incomplete eigenvalue problem for the matrix Fredholm integral equation of the first kind.


The average value of the normalised impulse and estimates of the first and second NOF for a somatically healthy man are shown in Figure 2b. Further analysis involves the following steps:

+ An analysis of the main components is carried out,

+ The realisation of the coefficients of the NOF expansion is calculated, and

+ The matrix covariance functions of the expansion coefficients and RR-intervals are estimated. These make it possible to reproduce the EKG model with the help of a multidimensional autoregressive model of the parameters [2].
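Under simplifying assumptions (a single lead, synthetic cycle data, and power iteration in place of a full Fredholm eigensolver), the identification steps above can be sketched as:

```python
# A pure-Python sketch of the identification steps for a single lead:
# normalise each cycle to a common length by linear interpolation
# (normalisation to the RR-interval), estimate the covariance matrix
# over cycles, and extract the first natural orthogonal function (NOF)
# by power iteration. The cycle data are synthetic.

def resample(cycle, n):
    """Linearly interpolate a cycle onto n equally spaced points."""
    m = len(cycle)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)
        j = int(t)
        frac = t - j
        j2 = min(j + 1, m - 1)
        out.append(cycle[j] * (1 - frac) + cycle[j2] * frac)
    return out

def first_nof(cycles, iters=200):
    """Mean impulse and leading eigenvector of the sample covariance
    matrix (found by power iteration)."""
    n = len(cycles[0])
    mean = [sum(c[i] for c in cycles) / len(cycles) for i in range(n)]
    centred = [[c[i] - mean[i] for i in range(n)] for c in cycles]
    cov = [[sum(c[i] * c[j] for c in centred) / len(cycles)
            for j in range(n)] for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return mean, v

# Cycles of different lengths (different RR-intervals), normalised to 8 points.
raw_cycles = [[0, 1, 5, 1, 0, 0], [0, 1, 2, 6, 2, 1, 0, 0], [0, 2, 5, 1, 0]]
cycles = [resample(c, 8) for c in raw_cycles]
mean_impulse, nof1 = first_nof(cycles)
```

In practice the covariance function is a matrix over all 8 leads and the eigen problem is solved with a proper numerical library; the sketch only shows how normalisation to the RR-interval makes cycles of different lengths comparable before the NOF decomposition.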


High-Performance Computing

Let us consider the mapping of the probabilistic model onto a cluster architecture. The synthesis of an EKG ensemble by means of the probabilistic model (1-2) can be reduced to the numerical realisation of an autoregressive model of order R describing a system of L stationary time series. A general parallel algorithm can be described by the following sequence [7]:

+ Obtaining the model parameters on the main parallel branch (MPB) – operation “A”, and passing them to the other PBs, in accordance with the communication graph of the specific algorithm – operation “B”,

+ Calculation of time series realisation fragments of length n on p PBs (including the MPB) – operation “C”, and

+ Exchange of the calculated fragments between PBs (in accordance with the communication graph) – operation “D”, and their pairwise unification – operation “E”.


In the communication graph of the described algorithm, it is possible to realise the following two variants (Figure 3):

+ A centralised algorithm within the framework of a BSP model, executing only operation “C” on the parallel branches; the unification of all calculated fragments is realised only on the MPB, and

+ A “divide-and-conquer” algorithm (fan-in graph), which unites fragments pairwise on the PBs, with the final transmission of the result to the MPB.
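The two communication variants can be contrasted in a single-process sketch (the fragment computation is a stand-in for the autoregressive synthesis; operation labels follow the list above):

```python
# A single-process sketch contrasting the two communication variants.
# Each parallel branch (PB) computes a fragment of the time series
# (operation "C"); unification is then done either centrally on the
# main branch or pairwise along a fan-in tree (operations "D" and "E").
# compute_fragment is a stand-in for the autoregressive synthesis step.

def compute_fragment(branch, n):
    # Stand-in for operation "C": each PB produces n samples.
    return [branch * 10 + i for i in range(n)]

def centralised(fragments):
    """BSP-style variant: the main branch concatenates all fragments."""
    result = []
    for frag in fragments:
        result += frag
    return result

def fan_in(fragments):
    """Fan-in variant: pairwise unification, halving the number of
    active branches at every step (about log2(p) communication rounds)."""
    while len(fragments) > 1:
        paired = []
        for i in range(0, len(fragments) - 1, 2):
            paired.append(fragments[i] + fragments[i + 1])  # operation "E"
        if len(fragments) % 2:
            paired.append(fragments[-1])
        fragments = paired
    return fragments[0]

p, n = 4, 3
frags = [compute_fragment(b, n) for b in range(p)]
```

Both variants produce the same result; the difference lies in the communication pattern, with the fan-in tree spreading the unification work across the branches instead of serialising it on the MPB.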

Figure 2: EKG of a somatically healthy man. (a) result of normalisation to the values of the RR-intervals; (b) average value of the normalised impulse and estimates of the first and second NOF.

Benchmarking shows that, for cardiac activity modelling, irrespective of the model parameters R and L, algorithms based on the fan-in graph are more effective on supercomputers of the “SKIF” family. Testing shows satisfactory speed-ups, in good agreement with theoretical expectations.


The benchmarking results also show that at least 16 processors are required to achieve a suitable performance level.


Expected Results

As discussed in part two of this series (published Summer 2006), a distributed hardware-software complex is planned for installation in the peripheral clinic prophylactic organisations (CPOs) of the Leningrad region, the CRCH, and the JI&RC. The hardware and software will consist of computer and specialised medical equipment connected either by a high-speed telecommunications network or by standard links. Within the framework of a uniform telemedicine complex (development will mainly be devoted to cardiology), software supporting the following systems will be included:

+ A system of gathering and assimilating medical information,

+ A multi-agent system of medical and statistical information gathering,

+ A uniform database of the patients of the Leningrad region (electronic case history),

+ A database on medicines,

+ A database on preferential categories of citizens,

+ A database on medical experts,

+ Specialised medical systems of mathematical modelling,

+ Specialised expert systems and decision support systems,

+ A visualisation system and a system of virtual reality,

+ A system of situational modelling,

+ Information services and systems in the directions necessary for medical workers and patients of the Leningrad region, and

+ A complex for tele- and videoconferencing.

The majority of the listed services will be realised by means of a telemedicine Internet portal; alternatively, they will be connected to the Centre directly.


As a result of this work, a prototype of a global telemedicine network (a medical GRID) will be developed.


