Search results for: arrhythmia database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1619

1619 Detection of Cardiac Arrhythmia Using Principal Component Analysis and Xgboost Model

Authors: Sujay Kotwale, Ramasubba Reddy M.

Abstract:

Electrocardiography (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease that can lead to the death of the patient when left untreated, and early detection would help doctors treat the heart properly. In the past, various algorithms and machine learning (ML) models have been used for early detection of cardiac arrhythmia, but few have achieved good results. To improve performance, this paper combines principal component analysis (PCA) with an XGBoost model. PCA is applied to the raw ECG signals to suppress redundant information and extract significant features, which are then fed into the XGBoost model, and the performance of the model is evaluated. To validate the proposed technique, raw ECG signals from the standard MIT-BIH database were used for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
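The pipeline the abstract describes (PCA for redundancy suppression, then boosted trees on the reduced features) can be sketched roughly as follows. Synthetic data stands in for MIT-BIH beats, and scikit-learn's `GradientBoostingClassifier` is used as a stand-in for XGBoost; every name and number here is illustrative, not the authors':

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for raw ECG beats: 200 samples of 50 points each,
# with class-dependent structure in the first few dimensions
X = rng.normal(size=(200, 50))
y = np.repeat([0, 1], 100)
X[y == 1, :5] += 2.0

# PCA suppresses redundant information, keeping the significant components
pca = PCA(n_components=10).fit(X)
X_red = pca.transform(X)

# Gradient boosting (a stand-in for XGBoost) on the reduced features
clf = GradientBoostingClassifier(random_state=0).fit(X_red, y)
acc = clf.score(X_red, y)
print(round(acc, 2))
```

In a real replication, `X` would be windows of MIT-BIH beats and the classifier would be evaluated on a held-out split rather than the training data.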

Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost

Procedia PDF Downloads 88
1618 Bundle Block Detection Using Spectral Coherence and Levenberg Marquardt Neural Network

Authors: K. Padmavathi, K. Sri Ramakrishna

Abstract:

This study describes a procedure for the detection of Left and Right Bundle Branch Block (LBBB and RBBB) ECG patterns using the spectral coherence (SC) technique and an LM neural network. The coherence function finds common frequencies between two signals and evaluates their similarity. The QT variations of bundle blocks are observed in lead V1 of the ECG. The spectral coherence technique uses Welch's method for calculating the PSD. For the detection of normal and bundle block beats, the SC output values are given as input features to the LMNN classifier, whose overall accuracy is 99.5 percent. The data were collected from the MIT-BIH Arrhythmia database.
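The coherence computation described above can be sketched with SciPy, whose `scipy.signal.coherence` uses Welch's method internally to estimate the PSDs. The two signals below are synthetic stand-ins for ECG beats sharing a common frequency component:

```python
import numpy as np
from scipy.signal import coherence

fs = 360.0  # the MIT-BIH sampling rate, in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Two signals sharing a 12 Hz component buried in independent noise
common = np.sin(2 * np.pi * 12 * t)
x = common + 0.5 * rng.normal(size=t.size)
y = common + 0.5 * rng.normal(size=t.size)

# coherence() estimates Welch PSDs and returns values in [0, 1] per frequency
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
peak_freq = float(f[np.argmax(Cxy)])
print(round(peak_freq, 1))  # expected near 12 Hz, the shared component
```

High coherence values at a frequency indicate that the two signals share that component, which is the similarity measure fed to the classifier in the paper.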

Keywords: bundle block, SC, LMNN classifier, welch method, PSD, MIT-BIH, arrhythmia database

Procedia PDF Downloads 249
1617 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis

Authors: Arin Ghazarian, Cyril Rakovski

Abstract:

Differential privacy has become the leading technique for protecting the privacy of individuals in a database while still allowing useful analysis to be done and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy: it controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon.
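The accuracy/privacy tradeoff governed by epsilon can be illustrated with the Laplace mechanism, a standard differential-privacy building block (the paper does not state which mechanism it analyzes): noise is drawn with scale sensitivity/epsilon, so a smaller epsilon means a stronger privacy guarantee and a larger expected error. A minimal sketch with invented counts:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release true_value plus Laplace noise of scale sensitivity/epsilon."""
    return true_value + rng.laplace(scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
true_count = 120   # e.g. number of arrhythmia cases matching a query
sensitivity = 1    # adding/removing one patient changes a count by at most 1

# Smaller epsilon -> stronger privacy -> larger expected error
mean_err = {}
for eps in (0.1, 1.0, 10.0):
    errs = [abs(laplace_mechanism(true_count, sensitivity, eps, rng) - true_count)
            for _ in range(2000)]
    mean_err[eps] = float(np.mean(errs))
print(mean_err)
```

The mean absolute error of Laplace noise equals its scale, so the empirical errors should sit near 10, 1, and 0.1 for the three epsilon values.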

Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases

Procedia PDF Downloads 119
1616 Auto Classification of Multiple ECG Arrhythmic Detection via Machine Learning Techniques: A Review

Authors: Ng Liang Shen, Hau Yuan Wen

Abstract:

Arrhythmia analysis of the ECG signal plays a major role in diagnosing most cardiac diseases. A single electrocardiographic (ECG) record can contain multiple arrhythmia patterns, and supervised machine learning can be used to match each ECG beat to the corresponding arrhythmia class; the studies reviewed here use different features and classification methods to classify different arrhythmia types. A major problem in these studies is the fact that the symptoms of the disease do not show all the time in the ECG record, so a successful diagnosis might require the manual investigation of several hours of ECG recordings. This paper reviews investigations of cardiovascular ailments in ECG signals, in which irregular waveforms are examined beat by beat and matched to the corresponding arrhythmia using machine learning pattern recognition.

Keywords: electrocardiogram, ECG, classification, machine learning, pattern recognition, detection, QRS

Procedia PDF Downloads 337
1615 A Heart Arrhythmia Prediction Using Machine Learning’s Classification Approach and the Concept of Data Mining

Authors: Roshani S. Golhar, Neerajkumar S. Sathawane, Snehal Dongre

Abstract:

Background and objectives: cardiovascular illnesses are increasing and have become a leading cause of mortality worldwide, killing a great number of people each year. Arrhythmia is a type of cardiac illness characterized by a change in the regularity of the heartbeat. The goal of this study is to develop novel deep learning algorithms for successfully interpreting arrhythmia from a single one-second segment. Because the ECG signal reflects unique electrical heart activity over time, considerable changes between time intervals are observed. Such variation, together with the limited amount of learning data available for each arrhythmia, makes standard learning methods difficult to apply and impedes their generalization. Conclusions: the proposed method was able to outperform several state-of-the-art methods. The proposed technique is also an effective and convenient deep learning approach to heartbeat interpretation that could be used in real-time healthcare monitoring systems.
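How a convolutional layer extracts a beat-level feature from a one-second segment, the core operation of the CNN mentioned in the keywords, can be sketched with a single hand-written kernel standing in for a learned filter; the segment and kernel below are illustrative, not the study's:

```python
import numpy as np

fs = 360                    # samples per second (the MIT-BIH sampling rate)
segment = np.zeros(fs)      # a hypothetical one-second ECG segment
segment[180] = 1.0          # idealized R-peak spike at mid-segment

# A tiny edge-detecting kernel standing in for a learned convolutional filter
kernel = np.array([-1.0, 0.0, 1.0])
feature_map = np.convolve(segment, kernel, mode="same")

# The filter's strongest response lies around the R peak
peak_region = int(np.argmax(np.abs(feature_map)))
print(peak_region)
```

A trained CNN stacks many such filters with learned weights, but the sliding-window response shown here is the mechanism by which beat morphology becomes a feature.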

Keywords: electrocardiogram, ECG classification, neural networks, convolutional neural networks, portable document format

Procedia PDF Downloads 40
1614 A Survey on Concurrency Control Methods in Distributed Database

Authors: Seyed Mohsen Jameii

Abstract:

In recent years, remarkable improvements have been made in the performance of distributed database systems. A distributed database is composed of a number of sites connected to each other through network links. In such a system, if different transactions are not properly coordinated, the database may become inconsistent. Nowadays, because of the number of sites and the complexity of their connections, it is difficult to extend different models to distributed databases serially. The principal goal of concurrency control in a distributed database is to ensure that accesses to the shared database by different sites do not interfere with one another. Different concurrency control algorithms have been suggested for use in distributed database systems. In this paper, some of the available methods for concurrency control in distributed databases are introduced and compared.
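The two-phase locking protocol named in the keywords can be sketched as follows: each transaction acquires all its locks during a growing phase and releases them together at commit (strict 2PL), which guarantees serializable schedules. This minimal Python lock manager is an illustration under that assumption, not the survey's own code:

```python
import threading

class TwoPhaseLockTransaction:
    """Strict two-phase locking: all locks are acquired first (growing
    phase) and released together at commit (shrinking phase)."""

    def __init__(self, lock_table):
        self.lock_table = lock_table   # shared map: item -> threading.Lock
        self.held = []

    def lock(self, item):
        # Growing phase: acquiring a lock after commit would violate 2PL
        if self.held is None:
            raise RuntimeError("cannot lock after entering shrinking phase")
        self.lock_table.setdefault(item, threading.Lock()).acquire()
        self.held.append(item)

    def commit(self):
        # Shrinking phase: release every held lock at once (strict 2PL)
        for item in self.held:
            self.lock_table[item].release()
        self.held = None

lock_table = {}
t1 = TwoPhaseLockTransaction(lock_table)
t1.lock("account_a")
t1.lock("account_b")
t1.commit()

t2 = TwoPhaseLockTransaction(lock_table)  # can relock the released items
t2.lock("account_a")
t2.commit()
print(t2.held)
```

A real distributed implementation would add deadlock detection and a distributed lock manager; the growing/shrinking discipline itself is unchanged.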

Keywords: distributed database, two phase locking protocol, transaction, concurrency

Procedia PDF Downloads 318
1613 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea

Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das

Abstract:

This paper presents wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and their statistical nature has been compared with that of electrocardiograms from normal persons. Four heart diseases are considered: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia and Sleep Apnea. The statistical nature of the electrocardiograms in each case is characterized by the kurtosis values of two types of wavelet coefficients, approximation and detail, over nine wavelet decomposition levels. Kurtosis corresponding to both approximation and detail coefficients is computed for decomposition levels one to nine. Based on significant differences, a few decomposition levels are then chosen and used for classification.
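The paper's core measurement, kurtosis of wavelet approximation and detail coefficients, can be sketched with a hand-rolled one-level Haar DWT (the study does not specify its wavelet; Haar is used here only because it is the simplest) and SciPy's excess kurtosis:

```python
import numpy as np
from scipy.stats import kurtosis

def haar_dwt(signal):
    """One level of the Haar DWT: returns (approximation, detail)."""
    s = np.asarray(signal, dtype=float)
    s = s[: len(s) // 2 * 2]               # truncate to an even length
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
# A smooth "normal-like" trace vs. one with sparse spikes (heavier tails)
normal_like = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.normal(size=t.size)
spiky = normal_like.copy()
spiky[::128] += 3.0

kurt = {}
for name, sig in (("normal", normal_like), ("spiky", spiky)):
    _, detail = haar_dwt(sig)
    kurt[name] = float(kurtosis(detail))   # excess kurtosis of detail coeffs
print(kurt)
```

Sparse abnormalities concentrate energy in a few detail coefficients, inflating kurtosis, which is why the statistic can separate pathological from normal traces at suitable decomposition levels.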

Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea

Procedia PDF Downloads 100
1612 Metric Suite for Schema Evolution of a Relational Database

Authors: S. Ravichandra, D. V. L. N. Somayajulu

Abstract:

Stakeholders' requirements for adding more detail to the database are the main cause of schema evolution in relational databases, and this evolution destabilizes the database. Hence, this work aims to define a metric suite for schema evolution of a relational database. The metric suite calculates metrics based on the features of the database, analyses the queries on the database, and measures the coupling, cohesion and component dependencies of the schema for existing and evolved versions of the database. It also provides an indicator of problems related to the stability and usability of the evolved database. The degree of change in the schema of a database is presented in the form of graphs that act as indicators and also show the relations between the various parameters (metrics) of the database architecture. The acquired information is used to defend and improve the stability of the database architecture. The challenges that arise in incorporating these metrics, with their varying parameters, into a suitable metric suite are discussed. To validate the proposed metric suite, experiments have been performed on publicly available datasets.

Keywords: cohesion, coupling, entropy, metric suite, schema evolution

Procedia PDF Downloads 417
1611 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, heart rhythm disorders were identified from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial-immune-system-based artificial neural network (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-NN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-NN classifiers against ANN and AIS. For this purpose, the RR-interval data for normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) were obtained. These data were then combined into pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF), a discrete wavelet transform was applied to each group, and two data sets with 9 and 27 features were obtained after data reduction. Afterwards, the data were first shuffled, and 4-fold cross-validation was applied to create the training and testing data. The training and testing accuracy rates and training times were compared. As a result, the performance of the hybrid classification systems, AIS-ANN and PSO-NN, was seen to be close to that of the ANN system, and the hybrid systems performed much better than AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-NN, AIS-ANN and AIS, respectively. The features extracted from the data also affected the classification results significantly.

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 406
1610 Lipid Emulsion versus DigiFab in a Rat Model of Acute Digoxin Toxicity

Authors: Cansu Arslan Turan, Tuba Cimilli Ozturk, Ebru Unal Akoglu, Kemal Aygun, Ecem Deniz Kırkpantur, Ozge Ecmel Onur

Abstract:

Although its mechanism of action is not well understood, intravenous lipid emulsion (ILE) has been shown to be effective in the treatment of lipophilic drug intoxications. ILE probably sequesters lipophilic drugs away from target tissue by creating a lipid-rich compartment in the plasma; a second theory is that ILE provides energy to the myocardium through high-dose free fatty acids that activate the voltage-gated calcium channels in the myocytes. In this study, the effect of ILE treatment on digoxin overdose, which is frequently seen in emergency departments, was investigated in an animal model in terms of cardiac side effects and survival. The study was carried out at the Yeditepe University, Faculty of Medicine Experimental Animals Research Center Labs in December 2015. Forty Sprague-Dawley rats weighing 300-400 g were randomly divided into 5 groups. As pre-treatment, the first group received saline, the second group lipid, the third group DigiFab, and the fourth group DigiFab and lipid; digoxin was then infused into all groups except the control group until death. Times of first arrhythmia and cardiac arrest were recorded. As Group 5 received no arrhythmia-inducing medication, it was excluded from the statistical comparisons of first-arrhythmia and death times. Although the statistical analysis comparing the four groups showed no significant difference, when the rats exposed only to digoxin intoxication were compared with the rats pre-treated with ILE, a significant difference in first-arrhythmia and cardiac-arrest times was observed. According to our results, pre-treating rats exposed to digoxin intoxication with DigiFab, intralipid, or intralipid and DigiFab makes no significant difference in first-arrhythmia and death times. It is therefore not possible to say that, at the doses used in this study, ILE treatment is at least as successful as a known antidote. Since the significance observed between two groups did not hold across comparisons of all groups, the study should be repeated with larger groups.

Keywords: arrhythmia, cardiac arrest, DigiFab, digoxin intoxication

Procedia PDF Downloads 201
1609 Dynamic Store Procedures in Database

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

In recent years, different methods have been proposed to optimize query processing in databases. Although various optimization methods exist, most of them discard the query execution plan after the query has executed. This research attempts to solve that problem by combining two ways of communicating with the database (queries embedded in the program code, and stored procedures), making query processing in the database adaptive, and proposing a new approach to query-processing optimization based on the idea of dynamic stored procedures. The research creates dynamic stored procedures in the database according to the proposed algorithm. The method has been tested on applied software, and the results show a significant improvement in query processing time as well as a reduced DBMS workload. Other advantages of this algorithm include unifying the programming environment, eliminating the parametric limitations of stored procedures in the database, and making the stored procedures dynamic.

Keywords: relational database, agent, query processing, adaptable, communication with the database

Procedia PDF Downloads 337
1608 Programming Language Extension Using Structured Query Language for Database Access

Authors: Chapman Eze Nnadozie

Abstract:

Relational databases constitute a very vital tool for the effective management and administration of both personal and organizational data. Data access ranges from single-user database management software to more complex distributed server systems. This paper appraises the use of a programming language extension, structured query language (SQL), to establish links to a relational database (Microsoft Access 2013) from the Visual C++ 9 programming environment. The methodology involves creating tables to form a database using Microsoft Access 2013, which is Object Linking and Embedding (OLE) database compliant. SQL commands are used to query the tables in the database for easy extraction of the expected records inside the Visual C++ environment. The findings reveal that records can easily be accessed and manipulated to filter exactly what the user wants, such as retrieving records matching specified criteria, updating records, and deleting part or all of the records in a table.
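The retrieval, update, and deletion operations described above are plain SQL. Since the paper's Visual C++/OLE setup cannot be reproduced here, the hypothetical sketch below uses Python's built-in sqlite3 module purely to illustrate the SQL statements themselves; table and column names are invented:

```python
import sqlite3

# In-memory database standing in for the Microsoft Access file in the paper
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, score REAL)")
cur.executemany("INSERT INTO students (name, score) VALUES (?, ?)",
                [("Ada", 91.0), ("Ben", 67.5), ("Chi", 74.0)])

# Retrieval of records with a specified criterion
rows = cur.execute("SELECT name FROM students WHERE score >= 70").fetchall()

# Updating and deleting records
cur.execute("UPDATE students SET score = score + 5 WHERE name = 'Ben'")
cur.execute("DELETE FROM students WHERE name = 'Chi'")
remaining = cur.execute("SELECT COUNT(*) FROM students").fetchone()[0]
print(rows, remaining)
```

The same `SELECT ... WHERE`, `UPDATE`, and `DELETE` statements are what the paper issues through the OLE connection from C++.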

Keywords: data access, database, database management system, OLE, programming language, records, relational database, software, SQL, table

Procedia PDF Downloads 159
1607 Deploying a Platform as a Service Cloud Solution to Support Student Learning

Authors: Jiangping Wang

Abstract:

This presentation describes the design and implementation of PaaS (platform as a service) cloud-based labs used in database-related courses to teach students practical skills. Traditionally, all labs are implemented in a desktop-based environment where students have to install heavy client software to access database servers. To release students from that burden, we have successfully deployed a cloud-based solution through which students and teachers can practice and learn database topics in various database courses via cloud access. With its development environment, execution runtime, web server, database server, and collaboration capability, it offers a shared pool of configurable computing resources and a comprehensive environment that supports students' needs without the complexity of maintaining the infrastructure.

Keywords: PaaS, database environment, e-learning, web server

Procedia PDF Downloads 234
1606 Diagnostic and Prognostic Use of Kinetics of Microrna and Cardiac Biomarker in Acute Myocardial Infarction

Authors: V. Kuzhandai Velu, R. Ramesh

Abstract:

Background and objectives: Acute myocardial infarction (AMI) is the most common cause of mortality and morbidity. Over the last decade, microRNAs (miRs) have emerged as potential markers for detecting AMI. The current study evaluates the kinetics and importance of miRs in the differential diagnosis of ST-segment elevated MI (STEMI) and non-STEMI (NSTEMI), their correlation with conventional biomarkers, and their ability to predict the immediate outcome of AMI for arrhythmias and left ventricular (LV) dysfunction. Materials and Method: A total of 100 AMI patients were recruited for the study. Routine cardiac biomarkers and miRNA levels were measured at diagnosis and serially at admission, 6, 12, 24, and 72 hours. The baseline biochemical parameters were analyzed, and the expression of miRs was compared between STEMI and NSTEMI at different time intervals. The diagnostic utility of miR-1, miR-133, miR-208, and miR-499 levels was analyzed using RT-PCR and diagnostic statistical tools such as ROC curves, odds ratios, and likelihood ratios. Results: miR-1, miR-133, and miR-499 showed peak concentrations at 6 hours, whereas miR-208 showed highly significant differences at all time intervals. miR-133 demonstrated the maximum area under the curve at different time intervals in the differential diagnosis of STEMI and NSTEMI, followed by miR-499 and miR-208. Evaluation of miRs for predicting arrhythmia and LV dysfunction using the admission sample demonstrated that miR-1 (OR = 8.64; LR = 1.76) and miR-208 (OR = 26.25; LR = 5.96) showed the maximum odds ratio and likelihood ratio, respectively. Conclusion: Circulating miRNAs showed highly significant differences between STEMI and NSTEMI in AMI patients, peaking much earlier than the conventional biomarkers. miR-133, miR-208, and miR-499 can be used in the differential diagnosis of STEMI and NSTEMI, whereas miR-1 and miR-208 could be used in the prediction of arrhythmia and LV dysfunction, respectively.
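The diagnostic statistics used above, the odds ratio and the positive likelihood ratio, follow the standard definitions from a 2×2 contingency table. The counts below are illustrative, not the study's data:

```python
# 2x2 contingency table for a hypothetical marker test vs. outcome:
#            outcome+   outcome-
# test+          a          b
# test-          c          d
a, b, c, d = 21, 4, 9, 66        # illustrative counts only

odds_ratio = (a * d) / (b * c)   # cross-product ratio

sensitivity = a / (a + c)
specificity = d / (b + d)
positive_lr = sensitivity / (1 - specificity)   # LR+ = sens / (1 - spec)

print(round(odds_ratio, 2), round(positive_lr, 2))
```

An LR+ well above 1 means a positive marker substantially raises the post-test odds of the outcome, which is how the study ranks miR-1 and miR-208 as predictors.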

Keywords: myocardial infarction, cardiac biomarkers, microRNA, arrhythmia, left ventricular dysfunction

Procedia PDF Downloads 98
1605 HRV Analysis Based Arrhythmic Beat Detection Using kNN Classifier

Authors: Onder Yakut, Oguzhan Timus, Emine Dogru Bolat

Abstract:

Heart diseases critically affect human life and quality of life, and sudden death events can be prevented through early diagnosis and treatment. The electrical signal recorded from the body by non-invasive methods, showing the activity of the heart, is called the electrocardiogram (ECG); clinicians use the ECG signal to follow the daily activity of the heart. Heart rate variability (HRV) is a physiological parameter describing the variation between heart beats. ECG data taken from the MIT-BIH Arrhythmia Database are used in the model employed in this study, which aims to detect arrhythmic heart beats using features extracted from HRV time-domain parameters. The developed model provides satisfactory performance, with ~89% accuracy, 91.7% sensitivity and 85% specificity for the detection of arrhythmic beats.
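A kNN classifier over HRV time-domain features can be sketched in a few lines of numpy. The feature values below (mean RR and SDNN per window) are invented for illustration; the paper does not list its exact feature set:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    return int(np.bincount(nearest).argmax())

# Hypothetical HRV time-domain features per window: [mean RR (s), SDNN (s)]
X_train = np.array([[0.80, 0.04], [0.82, 0.05], [0.78, 0.03],   # normal
                    [0.55, 0.15], [0.60, 0.18], [0.50, 0.20]])  # arrhythmic
y_train = np.array([0, 0, 0, 1, 1, 1])

p_normal = knn_predict(X_train, y_train, np.array([0.79, 0.04]))
p_arrhythmic = knn_predict(X_train, y_train, np.array([0.57, 0.17]))
print(p_normal, p_arrhythmic)
```

In practice the features should be standardized before distance computation, since RR means and SDNN live on different scales.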

Keywords: arrhythmic beat detection, ECG, HRV, kNN classifier

Procedia PDF Downloads 315
1604 Young Female’s Heart Was Bitten by Unknown Ghost (Isolated Cardiac Sarcoidosis): A Case Report

Authors: Heru Al Amin

Abstract:

Sarcoidosis is a granulomatous inflammatory disorder of unclear etiology that can affect multiple organ systems. Isolated cardiac sarcoidosis is a very rare condition that causes lethal arrhythmia and heart failure, and a definite diagnosis remains challenging; multimodality imaging plays a pivotal role in diagnosing this entity. Case summary: we discuss a 50-year-old woman who presented with recurrent palpitation, dizziness, vertigo and presyncope. Electrocardiogram revealed variable heart blocks, including first-degree AV block, second-degree AV block, high-degree AV block, complete AV block, trifascicular block and occasionally supraventricular arrhythmia. Twenty-four-hour Holter monitoring showed atrial bigeminy, first-degree AV block and trifascicular block. Transthoracic echocardiography showed thinning of the basal anteroseptal and inferior septum with LV dilatation and a reduction of global longitudinal strain. A dual-chamber pacemaker was implanted. CT coronary angiography showed no coronary artery disease. Cardiac magnetic resonance revealed basal anteroseptal and inferior septal thinning with focal edema and LGE suggestive of sarcoidosis. Computed tomography of the chest showed no lymphadenopathy or pulmonary infiltration. Whole-body 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) was performed. We started steroids and followed up with the patient. Conclusion: this case highlights the challenges in identifying and managing isolated cardiac sarcoidosis in a patient with recurrent syncope and variable heart block. Early, or even late, initiation of steroids can improve arrhythmia as well as left ventricular function.

Keywords: cardiac sarcoidosis, conduction abnormality, syncope, cardiac MRI

Procedia PDF Downloads 54
1603 Secured Embedding of Patient’s Confidential Data in Electrocardiogram Using Chaotic Maps

Authors: Butta Singh

Abstract:

This paper presents a chaotic-map-based approach for the secure embedding of a patient's confidential data in the electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters, and the sample-value-difference method effectively hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through various statistical and clinical performance measures; the statistical metrics comprise Percentage Root Mean Square Difference (PRD) and Peak Signal to Noise Ratio (PSNR). A comparative analysis between the proposed method and existing approaches clearly demonstrated the superiority of the proposed method.
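Location generation with a chaotic map can be sketched with the logistic map, a common choice in chaos-based steganography (the paper does not state which map it uses). The control parameters act as the secret key: only a party who knows them can regenerate the embedding locations:

```python
def chaotic_locations(n_samples, n_locations, r=3.99, x0=0.7):
    """Generate embedding indices from the logistic map
    x_{k+1} = r * x_k * (1 - x_k); the pair (r, x0) acts as the secret key."""
    x = x0
    locations, seen = [], set()
    while len(locations) < n_locations:
        x = r * x * (1 - x)
        idx = int(x * n_samples)       # map the chaotic state to a sample index
        if idx not in seen:            # keep each embedding location unique
            seen.add(idx)
            locations.append(idx)
    return locations

# ~650,000 samples corresponds to a 30-minute MIT-BIH record at 360 Hz
locs = chaotic_locations(n_samples=650000, n_locations=10)
print(locs)
```

Because the map is deterministic, the receiver regenerates the identical index sequence from the shared key; because it is chaotic, a slightly wrong key yields entirely different locations.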

Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram

Procedia PDF Downloads 146
1602 Development of Database for Risk Assessment Appling to Ballast Water Managements

Authors: Eun-Chan Kim, Jeong-Hwan Oh, Seung-Guk Lee

Abstract:

Billions of tonnes of ballast water, including various aquatic organisms, are carried around the world by ships. When ballast water is discharged into new environments, some of the aquatic organisms discharged with it may become invasive and severely disrupt the native ecology; thus, the International Maritime Organization (IMO) adopted the Ballast Water Management Convention in 2004. Regulation A-4 of the convention states that a government may grant exemptions from ballast water management requirements in waters under its jurisdiction, but only to a ship or ships on a voyage or voyages between specified ports or locations, or to a ship which operates exclusively between specified ports or locations. To grant exemptions, a risk assessment should be conducted based on the guidelines for risk assessment developed by the IMO, and for this it is essential to collect the relevant information and establish a database system. This paper studies a database system for ballast water risk assessment consisting of a shipping database, a ballast water database, a port environment database and a species database. The shipping database has been established from data collected from the port management information system of the Korean Government. For the ballast water database, ballast water discharge has so far only been estimated from the loading/unloading of cargoes, as the convention has not yet come into effect. The port environment database and species database are being established from reference documents and from existing and newly collected monitoring data. The database system has proved useful, capable of appropriately supporting risk assessment in all ports of Korea.

Keywords: ballast water, IMO, risk assessment, shipping, environment, species

Procedia PDF Downloads 476
1601 Performance-Based Quality Evaluation of Database Conceptual Schemas

Authors: Janusz Getta, Zhaoxi Pan

Abstract:

Performance-based quality evaluation of database conceptual schemas is an important aspect of the database design process. Different conceptual schemas yield different logical schemas, and the performance of user applications strongly depends on the logical and physical database structures. This work presents the entire process of performance-based quality evaluation of conceptual schemas. First, we present the format used for conceptual schemas. Then, the paper proposes a new specification of an object algebra for representing conceptual-level database applications. Transforming conceptual schemas, and expressing the object algebra in an implementation schema realized in a particular database system, allows for precise estimation of the processing costs of database applications and, as a consequence, for precise evaluation of the performance-based quality of conceptual schemas. Finally, we describe an experiment as a proof of concept for the evaluation procedure presented in the paper.

Keywords: conceptual schema, implementation schema, logical schema, object algebra, performance evaluation, query processing

Procedia PDF Downloads 261
1600 The Video Database for Teaching and Learning in Football Refereeing

Authors: M. Armenteros, A. Domínguez, M. Fernández, A. J. Benítez

Abstract:

The following paper describes the video database tool used by the Fédération Internationale de Football Association (FIFA) as part of a research project developed in collaboration with the Carlos III University of Madrid. The database project began in 2012, with the aim of creating an educational tool for the training of instructors, referees and assistant referees, and it has been used in all FUTURO III courses since 2013. The platform now contains 3,135 video clips of different match situations from FIFA competitions and has 1,835 users (FIFA instructors, referees and assistant referees). This work describes the main features of the database, such as the search tool and the creation of multimedia presentations and video quizzes. The database has been developed in MySQL, ActionScript, Ruby on Rails and HTML. The tool has been rated by users as "very good" in all courses, which prompts us to recommend it as an ideal tool for any other sport that requires video analysis.

Keywords: assistant referees, cloud computing, e-learning, instructors, FIFA, referees, soccer, video database

Procedia PDF Downloads 407
1599 Myocardial Reperfusion Injury during Percutaneous Coronary Intervention in Patient with Triple-Vessel Disease in Limited Resources Hospital: A Case Report

Authors: Fanniyah Anis, Bram Kilapong

Abstract:

Myocardial reperfusion injury is defined as the cellular damage that results from a period of ischemia followed by the re-establishment of the blood supply to the infarcted tissue. Ventricular tachycardia is one of the most commonly encountered reperfusion arrhythmias, one type of myocardial reperfusion injury. Prompt, early treatment can reduce mortality in high-risk patients with triple-vessel disease, despite limited hospital resources. Case report: a 53-year-old male was diagnosed with NSTEMI, three-vessel disease (3VD) and comorbid hypertension, and underwent revascularization by percutaneous coronary intervention. Ventricular tachycardia leading to cardiac arrest occurred right after the stent was inserted; resuscitation was performed for almost 2 hours until spontaneous circulation returned. The patient was admitted to the ICU with refractory cardiogenic shock despite a combination of inotropic and vasopressor agents, under standard non-invasive monitoring owing to the limitations of the hospital. Angiography was performed again 5 hours later to exclude other possible blockages of the coronary arteries and to confirm the diagnosis of myocardial reperfusion injury. The patient continued to be managed with a combination of antiplatelet agents and maintenance doses of anti-arrhythmic agents; management focused on supportive care and on preventing further deterioration of the condition. The patient showed clinical improvement and regained consciousness within 24 hours, was successfully discharged from the ICU within 3 days without any neurological sequelae, and was discharged from the hospital after 3 further days of observation on the general ward. The hospital's limited resources did not prevent the physicians from attaining a good outcome in this case of myocardial reperfusion injury, and angiography alone sufficed to confirm the diagnosis.

Keywords: limited resources hospital, myocardial reperfusion injury, prolonged resuscitation, refractory cardiogenic shock, reperfusion arrhythmia, revascularization, triple-vessel disease

Procedia PDF Downloads 273
1598 Exposure to Ionizing Radiation Resulting from the Chernobyl Fallout and Childhood Cardiac Arrhythmia: A Population Based Study

Authors: Geraldine Landon, Enora Clero, Jean-Rene Jourdain

Abstract:

In 2005, the Institut de Radioprotection et de Sûreté Nucléaire (IRSN, France) launched a research program named EPICE (acronym for 'Evaluation of Pathologies potentially Induced by CaEsium') to collect scientific information on non-cancer effects possibly induced by chronic exposure to low doses of ionizing radiation, with a view to addressing a question raised by several French NGOs about the health consequences of the Chernobyl nuclear accident in children. The implementation of the program was preceded by a pilot phase to ensure that the project would be feasible and to determine the conditions for implementing an epidemiological study on a population of several thousand children. The EPICE program, focused on childhood cardiac arrhythmias, ran for 4 years from May 2009 in partnership with the Russian Bryansk Diagnostic Center. The purpose of this cross-sectional study was to determine the prevalence of cardiac arrhythmias in the Bryansk oblast (depending on the contamination of the territory and the caesium-137 whole-body burden) and to assess whether caesium-137 was a factor associated with the onset of cardiac arrhythmias. To address these questions, a study bringing together 18,152 children aged 2 to 18 years was initiated; each child received three medical examinations (ECG, echocardiography, and caesium-137 whole-body activity measurement), and some were also given 24-hour Holter monitoring and blood tests. The findings of the study, currently submitted to an international journal (so no results can be given at this stage), allow us to answer clearly the question of radiation-induced childhood arrhythmia, a subject that has been debated for many years. Our results should be helpful for health professionals responsible for monitoring populations exposed to the releases from the Fukushima Dai-ichi nuclear power plant, and useful for future comparative studies of children exposed to ionizing radiation in other contexts, such as cancer radiation therapy.

Keywords: Caesium-137, cardiac arrhythmia, Chernobyl, children

Procedia PDF Downloads 218
1597 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection

Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine

Abstract:

Atrial fibrillation (AF) is the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues, and automatic, early, and fast AF detection remains a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation, but the published results do not show satisfactory classification accuracy. This work aims to resolve this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (the AF Termination Challenge Database, the MIT-BIH AF Database, the Normal Sinus Rhythm RR Interval Database, and the MIT-BIH Normal Sinus Rhythm Database) were used for assessment. All time series were segmented into 1-min RR-interval windows, and four specific features were calculated for each window. Two pattern recognition methods, Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find the features that best discriminate between AF and normal sinus rhythm. Despite its very simple structure, the LVQ model performs better on the analyzed databases than existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detector has several attractive properties and can be implemented with just a few arithmetic operations, which makes it a suitable choice for telecare applications.
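The pipeline described above (windowing the RR series, computing per-window features, and classifying with a nearest-prototype LVQ model) can be sketched as follows. The four HRV features used here (mean RR, SDNN, RMSSD, and a pNN50-style fraction) are illustrative stand-ins, since the abstract does not name the paper's exact features, and the PCA step is omitted for brevity:

```python
import math
import random

def segment_rr(rr_s, win_s=60.0):
    """Split an RR-interval series (in seconds) into roughly 1-minute windows."""
    windows, cur, t = [], [], 0.0
    for rr in rr_s:
        cur.append(rr)
        t += rr
        if t >= win_s:
            windows.append(cur)
            cur, t = [], 0.0
    return windows

def hrv_features(win):
    """Four illustrative HRV features (hypothetical; the paper does not name its four)."""
    n = len(win)
    mean_rr = sum(win) / n
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in win) / n)
    diffs = [win[i + 1] - win[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    pnn50 = sum(1 for d in diffs if abs(d) > 0.05) / len(diffs)
    return [mean_rr, sdnn, rmssd, pnn50]

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def lvq_fit(X, y, lr=0.05, epochs=30):
    """Minimal LVQ1: one prototype per class, initialized at the class mean,
    pulled toward same-class samples and pushed away from others."""
    protos = {}
    for c in sorted(set(y)):
        members = [x for x, yi in zip(X, y) if yi == c]
        protos[c] = [sum(col) / len(members) for col in zip(*members)]
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            c = min(protos, key=lambda k: dist(xi, protos[k]))
            s = lr if c == yi else -lr
            protos[c] = [p + s * (v - p) for p, v in zip(protos[c], xi)]
    return protos

def lvq_predict(protos, x):
    """Classify by the nearest prototype."""
    return min(protos, key=lambda k: dist(x, protos[k]))
```

As the abstract notes, the decision itself costs only a few arithmetic operations: one distance per class prototype.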

Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine

Procedia PDF Downloads 237
1596 Attentional Engagement for Movie

Authors: Wuon-Shik Kim, Hyoung-Min Choi, Jeonggeon Woo, Sun Jung Kwon, SeungHee Lee

Abstract:

Research on attentional engagement (AE) in movies using physiological signals is rare and controversial, so whether physiological responses can be used to evaluate AE in actual movies is unclear. To clarify this, we measured the electrocardiogram and electroencephalogram (EEG) of 16 Japanese university students as they watched the American movie Iron Man. After the viewing, we evaluated the subjective AE and affection levels for 11 film content segments in Iron Man. Based on the self-reports for AE, we selected two film content segments as stimuli: Film Content 9, depicting Tony Stark (the main character) flying through the night sky (the highest AE score), and Film Content 1, depicting Tony Stark and his colleagues telling indecent jokes (the lowest score). We divided each of these two content segments into two time intervals. Results indicated that the Film Content by Interval interaction for HR was significant, F(1, 11) = 35.64, p < .001, η2 = .76; while HR in Film Content 1 decreased, HR in Film Content 9 increased. In Film Content 9, the main effects of Interval for respiratory sinus arrhythmia (RSA) (F(1, 11) = 5.91, p < .05, η2 = .35) and for the attention index of the EEG (F(1, 11) = 5.23, p < .05, η2 = .37) were significant. The increase in RSA was significant (p < .05) as well, whereas that of the EEG attention index was nearly significant (p = .069). In conclusion, while RSA increases, HR decreases when people direct their attention toward ordinary film content; however, while paying attention to a film segment evoking excitement, HR as well as RSA can increase.

Keywords: attentional engagement, electroencephalogram, movie, respiratory sinus arrhythmia

Procedia PDF Downloads 335
1595 The Incidence of Cardiac Arrhythmias Using Trans-Telephonic, Portable Electrocardiography Recorder, in Out-Patients Faculty of Medicine Ramathibodi Hospital

Authors: Urasri Imsomboon, Sopita Areerob, Kanchaporn Kongchauy, Tuchapong Ngarmukos

Abstract:

Objective: Trans-telephonic electrocardiography (ECG) monitoring is used to diagnose infrequent cardiac arrhythmias and to improve outcomes through early detection and treatment in patients with suspected cardiac disease. The objectives of this study were to explore the incidence of cardiac arrhythmia detected by trans-telephonic monitoring, and the time to the first symptomatic episode and the first documented cardiac arrhythmia, in outpatients. Methods: A descriptive research study was conducted between February 1, 2016, and December 31, 2016. A total of 117 patients who visited the outpatient clinic were purposively selected. The research instruments were a personal data questionnaire and a record form for the incidence of cardiac arrhythmias using the trans-telephonic ECG recorder. Results: The 117 patients were aged 15-92 years (mean age 52.7 ± 17.1 years), and the majority were women (64.1%). In total, 387 ECGs (mean 2.88 ECGs/person, SD = 3.55, range 0-21) were sent to the Cardiac Monitoring Center at the Coronary Care Unit. Of these, normal sinus rhythm was the most common finding (46%). The five most frequent cardiac arrhythmias documented at the time of symptoms were sinus tachycardia 43.5%, premature atrial contraction 17.7%, premature ventricular contraction 14.3%, sinus bradycardia 11.5%, and atrial fibrillation 8.6%. Presenting symptoms were tachycardia 94%, palpitation 83.8%, dyspnea 51.3%, chest pain 19.6%, and syncope 14.5%. The most common activities during symptoms were no activity 64.8%, sleep 55.6%, and work 25.6%. The first symptomatic episode occurred on average after 6.88 ± 7.72 days (median 3 days), and the first documented cardiac arrhythmia after 9 ± 7.92 days (median 7 days). After their actual arrhythmias were known, 68% of patients were managed by self-observation, 15% continued the same medications, 7 patients underwent further investigation, and 5 had the causes of their arrhythmias corrected via invasive cardiac procedures.
Conclusion: The trans-telephonic portable ECG recorder is effective in the diagnosis of suspected symptomatic cardiac arrhythmias in the outpatient clinic.

Keywords: cardiac arrhythmias, diagnosis, outpatient clinic, trans-telephonic: portable ECG recorder

Procedia PDF Downloads 166
1594 A Query Optimization Strategy for Autonomous Distributed Database Systems

Authors: Dina K. Badawy, Dina M. Ibrahim, Alsayed A. Sallam

Abstract:

A distributed database is a collection of logically related databases that cooperate in a transparent manner, with query processing using a communication network to transmit data between sites; such processing is one of the long-standing challenges in the database world. The development of sophisticated query optimization technology underlies the commercial success of database systems, and its complexity and cost increase with the number of relations in a query. Mariposa, query trading (QT), and query trading with processing-task trading (QTPT) are strategies developed for autonomous distributed database systems, but they incur high optimization cost because all nodes are involved in generating the optimal plan. In this paper, we propose a modification of the autonomous strategy K-QTPT in which the seller nodes with the lowest costs are given gradually higher priorities, reducing the optimization time. We implement the proposed strategy and present the results and an analysis based on them.
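As a rough illustration of the proposed idea (not the paper's actual implementation), the priority assignment could look like the following sketch, where each seller node bids a cost for a subquery and only the cheapest, highest-priority sellers take part in plan generation:

```python
def assign_priorities(seller_bids):
    """Rank seller nodes by bid cost: the cheapest node gets priority 1.
    Illustrative sketch of the K-QTPT modification described in the abstract."""
    ranked = sorted(seller_bids.items(), key=lambda kv: kv[1])
    return {node: rank + 1 for rank, (node, _) in enumerate(ranked)}

def select_sellers(seller_bids, k):
    """Limit plan generation to the k highest-priority (cheapest) sellers,
    instead of involving every node in the network."""
    prio = assign_priorities(seller_bids)
    return [n for n, _ in sorted(prio.items(), key=lambda kv: kv[1])][:k]
```

Restricting plan generation to a cost-ranked subset of sellers is what reduces the optimization time relative to strategies that consult every node.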

Keywords: autonomous strategies, distributed database systems, high priority, query optimization

Procedia PDF Downloads 490
1593 A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints

Authors: Safa Adi

Abstract:

This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, which retain the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which combines PSP and GTC. The results show that PSP and GTC are more efficient than GSP; between the two, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes over the database and the GTC approach for the following scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.
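To make the time constraints concrete, the sketch below checks GSP-style containment of a candidate pattern in a data sequence under minGap/maxGap constraints between consecutive pattern elements. It is an illustrative reconstruction of the constraint semantics, not code from GTC or PSP:

```python
def contains(data_seq, pattern, min_gap=0, max_gap=float("inf")):
    """Return True if `pattern` (a list of itemsets) occurs in `data_seq`
    (a list of (timestamp, itemset) pairs) such that each pattern element
    is a subset of some transaction, and consecutive matched transactions
    satisfy gap > min_gap and gap <= max_gap (GSP-style constraints)."""
    def match(p_idx, prev_t):
        if p_idx == len(pattern):
            return True  # every pattern element has been placed
        for t, items in data_seq:
            if prev_t is not None:
                gap = t - prev_t
                if gap <= min_gap or gap > max_gap:
                    continue  # violates the time constraints
            if pattern[p_idx] <= items and match(p_idx + 1, t):
                return True
        return False
    return match(0, None)
```

With `min_gap=0` the default behavior reduces to ordinary ordered containment; tightening either bound prunes occurrences, which is exactly what the constraint-aware candidate counting in GTC and PSP must handle efficiently.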

Keywords: database, GTC algorithm, PSP algorithm, sequential patterns, time constraints

Procedia PDF Downloads 354
1592 Local Boundary Analysis for Generative Theory of Tonal Music: From the Aspect of Classic Music Melody Analysis

Authors: Po-Chun Wang, Yan-Ru Lai, Sophia I. C. Lin, Alvin W. Y. Su

Abstract:

The Generative Theory of Tonal Music (GTTM) provides systematic approaches to recognizing local boundaries in music, and its rules have been implemented in several automated melody segmentation algorithms. There are also deep learning methods that apply GTTM features to boundary detection tasks. However, these studies face constraints such as scarce or inconsistent label data. The GTTM database is currently the most widely used GTTM resource, including manually labeled GTTM rules and local boundaries. Even so, we found problems with these labels: they sometimes conflict with the GTTM rules, and because the scores were labeled at different times by multiple musicians, the labels are not always consistent in scope. Therefore, in this paper, we examine this database with musicians from the perspective of classical music and relabel the scores. The relabeled database, GTTM Database v2.0, will be released for academic research use. Although the experimental and statistical results show that the relabeled database is more consistent, the improvement in boundary detection is not substantial; it seems that clues beyond the GTTM rules will be needed for boundary detection in the future.

Keywords: dataset, GTTM, local boundary, neural network

Procedia PDF Downloads 103
1591 Implementing a Database from a Requirement Specification

Authors: M. Omer, D. Wilson

Abstract:

Creating a database schema is essentially a manual process: from a requirement specification, the information it contains has to be analyzed and reduced into a set of tables, attributes, and relationships. This is a time-consuming process that goes through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool that produces a database schema from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that a first draft of a relational database schema can be extracted from a requirement specification using NLP tools and techniques with minimal user intervention. This method is therefore a step toward a solution that requires little or no user intervention.
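The idea of mapping requirement sentences to tables and attributes can be illustrated with a deliberately simplified, regex-based sketch; the paper itself uses Stanford CoreNLP's parsing, which this toy heuristic only gestures at, and the sentence forms it handles are an assumption:

```python
import re

def draft_schema(spec_text):
    """Toy heuristic: from sentences like 'A customer has a name and an email.'
    derive {table: [attributes]}. A stand-in for a real NLP pipeline."""
    tables = {}
    sentence = re.compile(r"(?:a|an|each|every)\s+(\w+)\s+has\s+(.+?)\.", re.I)
    for entity, rest in sentence.findall(spec_text):
        attrs = re.findall(r"(?:a|an|the)\s+(\w+)", rest, re.I)
        tables.setdefault(entity.lower(), []).extend(a.lower() for a in attrs)
    return tables

def to_sql(tables):
    """Emit a first-draft CREATE TABLE statement per extracted entity."""
    stmts = []
    for name, attrs in tables.items():
        cols = ["id INTEGER PRIMARY KEY"] + [f"{a} TEXT" for a in dict.fromkeys(attrs)]
        stmts.append(f"CREATE TABLE {name} ({', '.join(cols)});")
    return stmts
```

A parser-based pipeline such as CoreNLP would replace the regexes with dependency relations, but the overall shape (extract entities, attach attributes, emit DDL as a first draft for human review) is the same.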

Keywords: information extraction, natural language processing, relation extraction

Procedia PDF Downloads 231
1590 Railway Accidents: Using the Global Railway Accident Database and Evaluation for Risk Analysis

Authors: Mathias Linden, André Schneider, Harald F. O. von Korflesch

Abstract:

The risk of train accidents is an ongoing concern for railway organizations, governments, insurance companies, and other dependent sectors. Safety technologies are installed to reduce and prevent the potential damage of train accidents. Since the safety budgets of railway organizations are limited, it is necessary not only to achieve high availability and a high safety standard but also to be cost-effective. An economic assessment of safety technologies is therefore fundamental to an accurate risk analysis. In order to conduct such an assessment and to quantify the costs of accident causes, the Global Railway Accident Database & Evaluation (GRADE) has been developed. The aim of this paper is to describe the structure of this accident database and to show how it can be used for risk analyses. A number of risk analysis methods, such as probabilistic safety assessment (PSA), were applied to demonstrate the database's different possibilities for risk analysis. In conclusion, these analyses would not be as accurate without GRADE, as the information gathered in the accident database was not previously available in this form. Our findings are relevant for railway operators, safety technology suppliers, insurers, governments, and other concerned railway organizations.
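A PSA-style use of such a database can be sketched as ranking accident causes by expected annual loss (frequency times consequence), which indicates where a limited safety budget buys the most risk reduction. The cause names and figures below are invented for illustration and are not from GRADE:

```python
def expected_annual_loss(causes):
    """Risk = frequency x consequence for each accident cause
    (the simplest quantitative PSA view). `causes` maps a cause name
    to (accidents per year, average cost per accident)."""
    return {c: f * cost for c, (f, cost) in causes.items()}

def rank_causes(causes):
    """Rank causes by expected annual loss, highest first."""
    eal = expected_annual_loss(causes)
    return sorted(eal.items(), key=lambda kv: kv[1], reverse=True)
```

A full PSA would also model event sequences and uncertainties, but even this ranking is only as good as the frequency and cost data behind it, which is the gap a database like GRADE is meant to fill.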

Keywords: accident causes, accident costs, accident database, global railway accident database & evaluation, GRADE, probabilistic safety assessment, PSA, railway accidents, risk analysis

Procedia PDF Downloads 329