Search results for: patient database
285 Comparison of Real-Time PCR and FTIR with Chemometrics Technique in Analysing Halal Supplement Capsules
Authors: Mohd Sukri Hassan, Ahlam Inayatullah Badrul Munir, M. Husaini A. Rahman
Abstract:
Halal authentication and verification of supplement capsules are highly required, as the gelatine available in the market can be from halal or non-halal sources. It is an obligation for Muslims to consume and use halal consumer goods. At present, real-time polymerase chain reaction (RT-PCR) is the most common technique used for the detection of porcine and bovine DNA in gelatine, owing to the high sensitivity of the technique and the higher stability of DNA compared to protein. In this study, twenty samples of supplement capsules from different products carrying different Halal logos were analyzed for porcine and bovine DNA using RT-PCR. Standard bovine and porcine gelatine from Eurofins, at concentrations ranging from 10⁻¹ to 10⁻⁵ ng/µl, were used to determine the linearity range, limit of detection and specificity of RT-PCR (SYBR Green method). RT-PCR detected porcine DNA (two samples), bovine DNA (four samples) and a mixture of porcine and bovine DNA (six samples). The samples were also tested using the FT-IR technique, where normalized peaks of the IR spectra were pre-processed using the Savitzky-Golay method before Principal Component Analysis (PCA) was performed on the database. The PCA scores plot shows three clusters of samples: bovine, porcine and mixture (bovine and porcine). RT-PCR and FT-IR with chemometrics were found to give the same results for porcine gelatine samples and can therefore be used for Halal authentication.
Keywords: Halal, real-time PCR, gelatin, FTIR and chemometrics.
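As an illustration of the FT-IR chemometrics step, a minimal sketch assuming the spectra are held in a NumPy array; the smoothing window, polynomial order, number of components and the random data are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

# hypothetical stand-in for the measured spectra: 20 capsule samples, 1800 absorbance points each
spectra = np.random.rand(20, 1800)

# Savitzky-Golay smoothing of each spectrum, then row-wise normalization
smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)
normalized = (smoothed - smoothed.mean(axis=1, keepdims=True)) / smoothed.std(axis=1, keepdims=True)

# PCA scores; with real spectra the scores plot should separate bovine, porcine and mixture clusters
pca = PCA(n_components=2)
scores = pca.fit_transform(normalized)
print(pca.explained_variance_ratio_)
```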
284 CFD Prediction of the Round Elbow Fitting Loss Coefficient
Authors: Ana Paula P. dos Santos, Claudia R. Andrade, Edson L. Zaparoli
Abstract:
Pressure loss in ductwork is an important factor to be considered in the design of engineering systems such as power plants, refineries and HVAC systems in order to reduce energy costs. Ductwork can be composed of straight ducts and different types of fittings (elbows, transitions, converging and diverging tees and wyes). Duct fittings are significant sources of pressure loss in fluid distribution systems. Fitting losses can be even more significant than equipment components such as coils, filters, and dampers. In the present work, a conventional 90° round elbow under turbulent incompressible airflow is studied. Mass, momentum, and k-ε turbulence model equations are solved employing the finite volume method. The SIMPLE algorithm is used for the pressure-velocity coupling. In order to validate the numerical tool, the elbow pressure loss coefficient is determined under the same conditions and compared with the ASHRAE database. Furthermore, the effect of Reynolds number variation on the elbow pressure loss coefficient is investigated. These results can be useful for better preliminary design of air distribution ductwork in air conditioning systems.
Keywords: Duct fitting, Pressure loss, Elbow.
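For orientation, the loss coefficient compared against the ASHRAE database is conventionally the total pressure drop across the fitting normalized by the dynamic pressure; a minimal sketch with illustrative numbers (not values from this study):

```python
def elbow_loss_coefficient(delta_p, rho, velocity):
    """Dimensionless fitting loss coefficient C = delta_p / (rho * V**2 / 2), the form tabulated for duct fittings."""
    dynamic_pressure = 0.5 * rho * velocity ** 2
    return delta_p / dynamic_pressure

# hypothetical values: 25 Pa total pressure drop, air at 1.2 kg/m^3, 5 m/s mean duct velocity
print(elbow_loss_coefficient(delta_p=25.0, rho=1.2, velocity=5.0))  # ~1.67
```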
283 A Neural Network Approach in Predicting the Blood Glucose Level for Diabetic Patients
Authors: Zarita Zainuddin, Ong Pauline, C. Ardil
Abstract:
Diabetes Mellitus is a chronic metabolic disorder in which improper management of the blood glucose level in diabetic patients leads to the risk of heart attack, kidney disease and renal failure. This paper attempts to enhance the accuracy of predicting the advancing blood glucose levels of diabetic patients by combining principal component analysis and a wavelet neural network. The proposed system makes separate blood glucose predictions for the morning, afternoon, evening and night intervals, using a dataset from one patient covering a period of 77 days. Comparisons of the predictive accuracy with other neural network models that use the same dataset are made. The comparison results showed overall improved accuracy, which indicates the effectiveness of the proposed system.
Keywords: Diabetes Mellitus, principal component analysis, time-series, wavelet neural network.
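A hedged sketch of the preprocessing-plus-prediction pipeline: PCA followed by a standard multilayer perceptron regressor as a stand-in for the paper's wavelet neural network; the data shapes, split and hyper-parameters below are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline

# hypothetical data: one row per day for one interval (e.g. morning), lagged glucose readings as features
rng = np.random.default_rng(1)
X = rng.normal(120, 30, size=(77, 8))        # 77 days, 8 lagged readings
y = X[:, -1] + rng.normal(0, 5, size=77)     # synthetic next-reading target, for illustration only

# PCA reduces the feature dimension; an MLP stands in here for the wavelet neural network of the paper
model = make_pipeline(PCA(n_components=4),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
model.fit(X[:60], y[:60])
print(model.score(X[60:], y[60:]))           # R^2 on the held-out days
```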
282 Organization Model of Semantic Document Repository and Search Techniques for Studying Information Technology
Authors: Nhon Do, Thuong Huynh, An Pham
Abstract:
Nowadays, organizing a repository of documents and resources for learning in a specialized field such as Information Technology (IT), together with search techniques based on domain knowledge or document content, is an urgent need in the practice of teaching, learning and researching. There have been several works related to methods of organization and search by content. However, the results are still limited and insufficient to meet users' demand for semantic document retrieval. This paper presents a solution for the organization of a repository that supports semantic representation and processing in search. The proposed solution is a model which integrates components such as an ontology describing domain knowledge, a database of the document repository, semantic representations for documents and a file system, together with problem definitions, semantic processing techniques and advanced search techniques based on measuring semantic similarity. The solution is applied to build an IT learning materials management system for a university, with a semantic search function serving students, teachers, and managers as well. The application has been implemented and tested at the University of Information Technology, Ho Chi Minh City, Vietnam and has achieved good results.
Keywords: document retrieval system, knowledge representation, document representation, semantic search, ontology.
281 VoIP and Database Traffic Co-existence over IEEE 802.11b WLAN with Redundancy
Authors: Rizik Al-Sayyed, Colin Pattinson, Tony Dacre
Abstract:
This paper presents the findings of two experiments that were performed on the Redundancy in Wireless Connection Model (RiWC) using the 802.11b standard. The experiments were simulated using OPNET 11.5 Modeler software. The first was aimed at finding the maximum number of simultaneous Voice over Internet Protocol (VoIP) users the model would support under the G.711 and G.729 codec standards when the packetization interval was 10 milliseconds (ms). The second experiment examined the model's VoIP user capacity using the G.729 codec standard along with background traffic, using the same packetization interval as in the first experiment. To determine the capacity of the model under the various experiments, we checked three metrics: jitter, delay and data loss. When background traffic was added, we checked the response time in addition to the previous three metrics. The findings of the first experiment indicated that the maximum number of simultaneous VoIP users the model was able to support under G.711 was 5, which is consistent with recent research findings. When using the G.729 codec, the model was able to support up to 16 VoIP users; similar experiments in the current literature have indicated a maximum of 7 users. The findings of the second experiment demonstrated that the maximum number of VoIP users the model was able to support was 12, with the existence of background traffic.
Keywords: WLAN, IEEE 802.11b, Codec, VoIP, OPNET, Background traffic, and QoS.
280 Concepts Extraction from Discharge Notes using Association Rule Mining
Authors: Basak Oguz Yolcular
Abstract:
A large amount of valuable information is available in plain-text clinical reports. New techniques and technologies are being applied to extract information from these reports. In this study, we developed a domain-based software system to transform 600 Otorhinolaryngology discharge notes into a structured form for extracting clinical data from the discharge notes. In order to decrease the system processing time, discharge notes were transformed into a data table after preprocessing. Several word lists were constituted to identify common sections in the discharge notes, including patient history, age, problems, and diagnosis. An n-gram method was used for discovering term co-occurrences within each section. Using this method, a dataset of concept candidates was generated for the validation step, and then the Predictive Apriori algorithm for Association Rule Mining (ARM) was applied to validate the candidate concepts.
Keywords: association rule mining, otorhinolaryngology, predictive apriori, text mining.
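As an illustration of the n-gram step, a minimal sketch with a hypothetical note fragment and count threshold; frequent co-occurrences like these are the concept candidates passed on to Predictive Apriori.

```python
from collections import Counter

def word_ngrams(tokens, n=2):
    """Return consecutive word n-grams from a token list."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# hypothetical fragment of the 'diagnosis' section of a discharge note
section = "chronic otitis media with effusion left ear chronic otitis media"
counts = Counter(word_ngrams(section.lower().split(), n=2))
candidates = [term for term, c in counts.items() if c >= 2]  # repeated bigrams become concept candidates
print(candidates)  # ['chronic otitis', 'otitis media']
```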
279 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification
Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh
Abstract:
Learning from very big datasets is a significant problem for most present data mining and machine learning algorithms. MicroRNA (miRNA) data constitute one of the important large genomic, non-coding datasets representing the genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Due to the variety of cancers and the high number of genes, analyzing the miRNA dataset has been a challenging problem for researchers. The number of features is high relative to the number of samples, and the data suffer from class imbalance. A feature selection method has been used to select the features with more ability to distinguish classes and to eliminate obscure features. Afterward, a Convolutional Neural Network (CNN) classifier for the classification of cancer types is utilized, which employs a Genetic Algorithm to find optimized hyper-parameters of the CNN. In order to make the classification process by the CNN faster, a Graphics Processing Unit (GPU) is recommended for carrying out the mathematical computations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.
Keywords: Cancer classification, feature selection, deep learning, genetic algorithm.
278 Using HMM-based Classifier Adapted to Background Noises with Improved Sounds Features for Audio Surveillance Application
Authors: Asma Rabaoui, Zied Lachiri, Noureddine Ellouze
Abstract:
Discrimination between different classes of environmental sounds is the goal of our work. The use of a sound recognition system can offer concrete potential for surveillance and security applications. The paper's first contribution to this research field is a thorough investigation of the applicability of state-of-the-art audio features in the domain of environmental sound recognition. Additionally, a set of novel features obtained by combining the basic parameters is introduced. The quality of the investigated features is evaluated by an HMM-based classifier, to which particular attention was paid. In fact, we propose to use a multi-style training system based on HMMs: one recognizer is trained on a database including different levels of background noise and is used as a universal recognizer for every environment. In order to enhance the system's robustness by reducing the environmental variability, we explore different adaptation algorithms, including Maximum Likelihood Linear Regression (MLLR), Maximum A Posteriori (MAP) and the MAP/MLLR algorithm that combines MAP and MLLR. Experimental evaluation shows that a rather good recognition rate can be reached, even under severe noise degradation conditions, when the system is fed with the convenient set of features.
Keywords: Sound recognition, HMM classifier, multi-style training, environmental adaptation, feature combinations.
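A minimal sketch of the multi-style training idea, assuming the hmmlearn library and MFCC-like feature matrices pooled over several background-noise levels; the MLLR/MAP adaptation step and the real feature set are not reproduced here.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
# hypothetical multi-condition training material: pooled feature frames per sound class
train = {"glass_break": rng.normal(0, 1, (300, 13)),
         "gunshot":     rng.normal(2, 1, (300, 13))}

# one HMM per environmental sound class, trained on the multi-style data
models = {label: GaussianHMM(n_components=3, covariance_type="diag", n_iter=20).fit(feats)
          for label, feats in train.items()}

# classify a new clip by the highest log-likelihood
clip = rng.normal(2, 1, (80, 13))
print(max(models, key=lambda label: models[label].score(clip)))  # the 'gunshot' model should score highest
```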
277 Orthosis and Finite Elements: A Study for Development of New Designs through Additive Manufacturing
Authors: M. Volpini, D. Alves, A. Horta, M. Borges, P. Reis
Abstract:
The gait pattern of people with motor limitations drives the demand for auxiliary locomotion devices. These artifacts for movement assistance vary according to their shape, size and functional features, following the desired clinical applications. Among lower-limb orthoses, the ankle-foot orthosis aims to improve the ability to walk in people with different neuromuscular limitations, although such devices do not always meet patients' expectations regarding their aesthetic and functional characteristics. The purpose of this study is to explore the possibility of using new designs in additive manufacturing to reproduce the shape and functional features of an ankle-foot orthosis in an efficient and modern way. Therefore, this work presents a study of the behavior of mechanical forces through finite element analysis of an ankle-foot orthosis. A study of the stress distribution on the orthopedic device is demonstrated, both in orthostatism and during movement in the course of the patient's walk.
Keywords: Additive manufacture, new designs, orthoses, finite elements.
276 Blood Elements Activation in Hemodialysis – Animal Model Studies
Authors: Karolina Grzeszczuk-Kuć, Jolanta Bujok, Tomasz Walski, Małgorzata Komorowska
Abstract:
Haemodialysis (HD) is a procedure that saves patients' lives around the world; unfortunately, it brings numerous complications. Oxidative stress is one of the major factors leading to erythrocyte destruction during extracorporeal circulation. Repeated HD procedures destroy blood elements, and the organism is not able to keep up with their production. Thirty HD procedures were performed on healthy sheep to evaluate the effects of such treatment. An oxidative stress study was performed together with an analysis of basic blood parameters and an empirical assessment of the dialyzer condition after the procedure. A reversible decline in the absolute leukocyte count during the first 30 min of HD was observed. Blood clots were formed in the area of the blood inlet and outlet of the dialyzer. Our results are consistent with outcomes presented throughout the literature, specifically with respect to the effects observed in humans, and will provide a basis for evaluating methods of blood protection during haemodialysis.
Keywords: Animal model, blood components, haemodialysis, leukocytes, oxidative stress, sheep.
275 Spatio-Temporal Data Mining with Association Rules for Lake Van
Authors: T. Aydin, M. F. Alaeddinoglu
Abstract:
People, throughout history, have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions. Therefore, different methods have been developed. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make the Apriori algorithm a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake of Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease as a result of changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, amount of rainfall and temperature parameters recorded in the Lake Van region throughout the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying the possible reasons for overflows and underflows may be used to alert experts to take precautions and make the necessary investments.
Keywords: Apriori algorithm, association rules, data mining, spatio-temporal data.
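A minimal sketch of the association rule step using the mlxtend library, with hypothetical discretized monthly records and illustrative support/confidence thresholds (not the parameters or data of the study):

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# each row is one hypothetical discretized record for a coastal observation point
records = [
    {"rain_high": True,  "temp_low": True,  "altitude_rise": True,  "overflow": True},
    {"rain_high": True,  "temp_low": False, "altitude_rise": True,  "overflow": True},
    {"rain_high": False, "temp_low": False, "altitude_rise": False, "overflow": False},
    {"rain_high": False, "temp_low": True,  "altitude_rise": False, "overflow": False},
]
df = pd.DataFrame(records)

frequent = apriori(df, min_support=0.5, use_colnames=True)             # frequent itemsets
rules = association_rules(frequent, metric="confidence", min_threshold=0.9)
print(rules[["antecedents", "consequents", "support", "confidence"]])  # e.g. {rain_high} -> {overflow}
```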
274 Speaker Independent Quranic Recognizer Based on Maximum Likelihood Linear Regression
Authors: Ehab Mourtaga, Ahmad Sharieh, Mousa Abdallah
Abstract:
An automatic speech recognition system for the formal Arabic language is needed. The Quran is the most formal spoken book in Arabic and is recited all over the world. In this research, a speaker-independent automatic speech recognizer for Quranic recitation was developed and tested. The system was developed based on tri-phone Hidden Markov Models and Maximum Likelihood Linear Regression (MLLR). MLLR computes a set of transformations which reduce the mismatch between an initial model set and the adaptation data. It uses a regression class tree and estimates a set of linear transformations for the mean and variance parameters of a Gaussian mixture HMM system. The 30th Chapter of the Quran, read by five of the most famous readers of the Quran, was used for training and testing. The chapter includes about 2000 distinct words. The advantages of using the Quranic verses as the database in this recognizer are the uniqueness of the words and the high level of orderliness between verses. The accuracy on the tested data ranged from 68% to 85%.
Keywords: Hidden Markov Model (HMM), Maximum Likelihood Linear Regression (MLLR), Quran, Regression Class Tree, Speech Recognition, Speaker-independent.
273 Image Indexing Using a Color Similarity Metric based on the Human Visual System
Authors: Angelo Nodari, Ignazio Gallo
Abstract:
The novelty proposed in this study is twofold and consists of the development of a new color similarity metric based on the human visual system and a new color indexing scheme based on a textual approach. The new color similarity metric is based on the color perception of the human visual system; consequently, the results returned by the indexing system can fulfill user expectations as much as possible. We developed a web application to collect users' judgments about the similarities between colors, and its results are used to estimate the metric proposed in this study. In order to index an image's colors, we used a text indexing engine to facilitate the integration of visual features into a database of text documents. The textual signature is built by weighting the image's colors according to their occurrence in the image. The use of a textual indexing engine provides a simple, fast and robust solution for indexing images. A typical usage of the proposed system is the development of applications whose data are both visual and textual. In order to evaluate the proposed method, we chose a price comparison engine as a case study, collecting a series of commercial offers containing the textual description and the image representing a specific commercial offer.
Keywords: Color Extraction, Content-Based Image Retrieval, Indexing
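A minimal sketch of the textual-signature idea, assuming Pillow and a small hypothetical palette; the plain RGB nearest-neighbour used here is only a stand-in for the perceptual similarity metric estimated from user judgments.

```python
from collections import Counter
from PIL import Image

# hypothetical palette of named reference colors (RGB)
PALETTE = {"red": (220, 30, 30), "green": (40, 160, 60), "blue": (40, 70, 200),
           "white": (245, 245, 245), "black": (10, 10, 10)}

def nearest_color_name(rgb):
    """Nearest palette entry by plain RGB distance (stand-in for a perceptual metric)."""
    return min(PALETTE, key=lambda name: sum((a - b) ** 2 for a, b in zip(rgb, PALETTE[name])))

def textual_color_signature(path, size=(32, 32)):
    """Repeat each color name in proportion to its occurrence, so a text engine can weight it."""
    pixels = Image.open(path).convert("RGB").resize(size).getdata()
    counts = Counter(nearest_color_name(px) for px in pixels)
    total = sum(counts.values())
    terms = []
    for name, c in counts.most_common():
        terms.extend([name] * max(1, round(100 * c / total)))
    return " ".join(terms)  # e.g. "blue blue ... white red", indexed like an ordinary text document
```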
272 Localisation of Anatomical Soft Tissue Landmarks of the Head in CT Images
Authors: M. Ovinis, D. Kerr, K. Bouazza-Marouf, M. Vloeberghs
Abstract:
In this paper, algorithms for the automatic localisation of two anatomical soft tissue landmarks of the head, the medial canthus (inner corner of the eye) and the tragus (a small, pointed, cartilaginous flap of the ear), in CT images are described. These landmarks are to be used as a basis for an automated image-to-patient registration system we are developing. The landmarks are localised on a surface model extracted from CT images, based on surface curvature and a rule-based system that incorporates prior knowledge of the landmark characteristics. The approach was tested on a dataset of near-isotropic CT images of 95 patients. The positions of the automatically localised landmarks were compared to the positions of the manually localised landmarks. The average difference was 1.5 mm and 0.8 mm for the medial canthus and tragus, with maximum differences of 4.5 mm and 2.6 mm respectively. The medial canthus and tragus can therefore be automatically localised in CT images, with performance comparable to manual localisation.
Keywords: Anatomical soft tissue landmarks, automatic localisation, Computed Tomography (CT).
271 Vehicle Risk Evaluation in Low Speed Accidents: Consequences for Relevant Test Scenarios
Authors: Philip Feig, Klaus Gschwendtner, Julian Schatz, Frank Diermeyer
Abstract:
Accident research projects are mostly focused on accidents involving personal injury. Property-damage-only accidents, however, have a high frequency of occurrence combined with a high economic impact. This paper describes the main parameters influencing the extent of damage and presents a repair cost model. For a prospective evaluation method of the monetary effect of advanced driver assistance systems (ADAS), it is necessary to be aware of and quantify all influencing parameters. Furthermore, this method allows the evaluation of vehicle concepts in combination with an ADAS at an early point in the product development process. In combination with a property damage database and the introduced repair cost model, relevant test scenarios for specific vehicle configurations and their individual property damage risk may be determined. Currently, equipment rates of ADAS are low, and a purchase incentive for customers would be beneficial. The next ADAS generation will prevent property damage to a large extent or at least reduce damage severity. Both effects may be a purchasing incentive for the customer and furthermore contribute to increased traffic safety.
Keywords: Property damage analysis, effectiveness, ADAS, damage risk, accident research, accident scenarios.
270 Low-Cost Robotic-Assisted Laparoscope
Authors: Ege Can Onal, Enver Ersen, Meltem Elitas
Abstract:
Laparoscopy is a surgical operation, well known as keyhole surgery. The operation is performed through small holes; hence, the patient's scars are much smaller, patients can recover in a short time, and the hospital stay is shorter in comparison to open surgery. Several tools are used in laparoscopic operations; among them, the laparoscope has a crucial role. It provides the vision during the operation, which is the main focus here. Since the operation area is very small, the motion of the surgical tools may be limited in laparoscopic operations compared to traditional surgeries. To overcome this limitation, most laparoscopic tools have become more precise, dexterous, multi-functional or automated. Here, we present a robotic-assisted laparoscope that is controlled with pedals directly by the surgeon. Thus, the movement of the laparoscope can be controlled better, so there will not be a need to calibrate the camera during the operation. The need for an assistant to control the movement of the laparoscope is eliminated, and the duration of the laparoscopic operation might be shorter since the surgeon directly operates the camera.
Keywords: Laparoscope, laparoscopy, low-cost, minimally invasive surgery, robotic-assisted surgery.
269 Application of Intuitionistic Fuzzy Cross Entropy Measure in Decision Making for Medical Diagnosis
Authors: Shikha Maheshwari, Amit Srivastava
Abstract:
In medical investigations, uncertainty is a major challenge for doctors/experts in making decisions to identify diseases that share a common set of symptoms, and it has been increasing extensively in medical diagnosis problems. The theory of cross entropy for intuitionistic fuzzy sets (IFS) is an effective approach for coping with uncertainty in decision making for medical diagnosis problems. The main focus of this paper is to propose a new intuitionistic fuzzy cross entropy measure (IFCEM), which aids in reducing the uncertainty so that doctors/experts can make their decisions more easily in the context of a patient's disease. It is shown that the proposed measure has some elegant properties, which demonstrate its potency. Further, the efficiency and utility of the proposed measure are exemplified in detail using a real-life case study of disease diagnosis in medical science.
Keywords: Intuitionistic fuzzy cross entropy (IFCEM), intuitionistic fuzzy set (IFS), medical diagnosis, uncertainty.
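For orientation, one commonly used definition of cross entropy between two IFSs A and B over a universe X = {x_1, ..., x_n} (a standard form from the literature; the specific IFCEM proposed in the paper may differ) is

\[
CE(A,B)=\sum_{i=1}^{n}\left[\mu_A(x_i)\,\ln\frac{\mu_A(x_i)}{\tfrac{1}{2}\bigl(\mu_A(x_i)+\mu_B(x_i)\bigr)}+\nu_A(x_i)\,\ln\frac{\nu_A(x_i)}{\tfrac{1}{2}\bigl(\nu_A(x_i)+\nu_B(x_i)\bigr)}\right],
\]

with the symmetric measure D(A,B) = CE(A,B) + CE(B,A). In a diagnosis setting, each disease is represented as an IFS over the symptom set, and the patient is assigned the disease whose D to the patient's symptom IFS is smallest.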
268 Brain Drain of Doctors; Causes and Consequences in Pakistan
Authors: Muhammad Wajid Tahir, Rubina Kauser, Majid Ali Tahir
Abstract:
Pakistani doctors (MBBS) are emigrating to developed countries for professional adjustments. This study aims to highlight the causes and consequences of doctors' brain drain from Pakistan. Primary data were collected at Mayo Hospital, Lahore by interviewing doctors (n=100) selected through a systematic random sampling technique. It found that various socio-economic and political conditions are working as push and pull factors for the brain drain of doctors in Pakistan. The majority of doctors (83%) declared poor remuneration and the professional infrastructure of the health department as push factors for doctors' brain drain. 81% claimed that continuous instability in the political situation and threats of terrorism are responsible for the emigration of doctors. 84% of respondents considered fewer opportunities for further studies responsible for their emigration. The brain drain of doctors is badly affecting the health sector's policies/programs, standard doctor-patient ratios and the quality of health services.
Keywords: Brain drain, emigration, remuneration, political instability, MBBS doctors.
267 Revised PLWAP Tree with Non-frequent Items for Mining Sequential Pattern
Authors: R. Vishnu Priya, A. Vadivel
Abstract:
Sequential pattern mining is a challenging task in the data mining area with many applications. One of those applications is mining patterns from weblogs. In recent times, weblogs are highly dynamic and some entries may become obsolete over time. In addition, users may frequently change the threshold value during the data mining process until acquiring the required output or mining interesting rules. Some of the recently proposed algorithms for mining weblogs build the tree with two scans and always consume a large amount of time and space. In this paper, we build a Revised PLWAP tree with Non-frequent Items (RePLNI-tree) with a single scan for all items. While mining sequential patterns, the links related to the non-frequent items are not considered. Hence, it is not required to delete or maintain the information of nodes while revising the tree for mining updated transactions. The algorithm supports both incremental and interactive mining. It is not required to re-compute the patterns each time the weblog is updated or the minimum support is changed. The performance of the proposed tree is better even when the size of the incremental database is more than 50% of the existing one. For evaluation purposes, we have used a benchmark weblog dataset and found that the performance of the proposed tree is encouraging compared to some of the recently proposed approaches.
Keywords: Sequential pattern mining, weblog, frequent and non-frequent items, incremental and interactive mining.
266 A Formal Approach for Proof Constructions in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this article we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer-prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this article shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
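For context, a plain (unverified) Python sketch of the Miller-Rabin probabilistic primality test, the algorithm whose formally verified Isabelle/HOL implementation the paper describes; this sketch is not the verified implementation itself.

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin test: declares n composite with certainty or prime with high probability."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 = 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False           # a witnesses compositeness
    return True                    # probably prime

print(is_probable_prime(2**61 - 1))  # True: a Mersenne prime
```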
265 Comparison between Associative Classification and Decision Tree for HCV Treatment Response Prediction
Authors: Enas M. F. El Houby, Marwa S. Hassan
Abstract:
Combined therapy using Interferon and Ribavirin is the standard treatment for patients with chronic hepatitis C. However, the number of responders to this treatment is low, whereas its cost and side effects are high. Therefore, there is a clear need to predict a patient's response to the treatment based on clinical information, to protect patients from drawbacks such as intolerable side effects and wasted money. Different machine learning techniques have been developed to fulfill this purpose. Among these techniques are Associative Classification (AC) and Decision Tree (DT). The aim of this research is to compare the performance of these two techniques in predicting the virological response to the standard treatment of HCV from clinical information. 200 patients treated with Interferon and Ribavirin were analyzed using AC and DT; 150 cases were used to train the classifiers and 50 cases to test them. The experimental results showed that the two techniques gave acceptable results; however, the best accuracy for AC reached 92%, whereas for DT it reached 80%.
Keywords: Associative Classification, Data mining, Decision tree, HCV, interferon.
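A minimal sketch of the decision tree side of the comparison with scikit-learn, using a synthetic stand-in for the clinical dataset and the paper's 150/50 train/test split; the features, labels and tree depth are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 12))                   # 200 patients, 12 hypothetical clinical features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)    # synthetic responder / non-responder label

X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))
```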
264 Computer Verification in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer-prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
263 Automatic Classification of the Stand-to-Sit Phase in the TUG Test Using Machine Learning
Authors: Y. A. Adla, R. Soubra, M. Kasab, M. O. Diab, A. Chkeir
Abstract:
Over the past several years, researchers have shown great interest in assessing the mobility of elderly people to measure their functional status. Usually, such an assessment is done by conducting tests that require the subject to walk a certain distance, turn around, and finally sit back down. Consequently, this study aims to provide an at-home monitoring system to assess the patient's status continuously. Thus, we propose a technique to automatically detect when a subject sits down while walking at home. In this study, we utilized a Doppler radar system to capture the motion of the subjects. More than 20 features were extracted from the radar signals, out of which 11 were chosen based on their Intraclass Correlation Coefficient (ICC > 0.75). Then, a sequential floating forward selection wrapper was applied to further narrow down the final feature vector. Finally, five features were introduced to a Linear Discriminant Analysis classifier, and an accuracy of 93.75% was achieved, as well as a precision and recall of 95% and 90%, respectively.
Keywords: Doppler radar system, stand-to-sit phase, TUG test, machine learning, classification
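A minimal sketch of the feature selection plus LDA stage, assuming scikit-learn; its forward SequentialFeatureSelector is a non-floating stand-in for the floating variant used in the paper, and the feature matrix below is synthetic.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# synthetic stand-in: 200 windows, 11 radar features kept after the ICC > 0.75 screen
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 11))
y = rng.integers(0, 2, size=200)                 # 1 = stand-to-sit window, 0 = other activity

lda = LinearDiscriminantAnalysis()
selector = SequentialFeatureSelector(lda, n_features_to_select=5, direction="forward", cv=5)
model = make_pipeline(selector, lda)             # select 5 features, then classify with LDA
print(cross_val_score(model, X, y, cv=5).mean())
```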
262 Comparative Advantage of Mobile Agent Application in Procuring Software Products on the Internet
Authors: Michael K. Adu, Boniface K. Alese, Olumide S. Ogunnusi
Abstract:
This paper brings to the fore the inherent advantages of applying mobile agents to procure software products rather than downloading software content on the Internet. It proposes a system whereby the products come on compact disk with a mobile agent as a deliverable. The client/user purchases a software product but must connect to the remote server of the software developer before installation. The user provides an activation code that activates the mobile agent, which is part of the software product on the compact disk. The validity of the activation code is checked on connection at the developer's end to ascertain authenticity and prevent piracy. The system is evaluated by downloading two different software products and comparing this with installing the same products from compact disk using the mobile agent application. Downloading software content from the developer's database, as in the traditional method, requires a continuously open connection between the client and the developer's end; a fixed network is not economically or technically feasible. A mobile agent, after being dispatched into the network, becomes independent of the creating process and can operate asynchronously and autonomously. It can reconnect later after completing its task and return to deliver its results. Response time and network load are very low with the application of mobile agents.
Keywords: Activation code, internet, mobile agent, software developer, software products.
261 Rotation Invariant Face Recognition Based on Hybrid LPT/DCT Features
Authors: Rehab F. Abdel-Kader, Rabab M. Ramadan, Rawya Y. Rizk
Abstract:
The recognition of human faces, especially those with different orientations, is a challenging and important problem in image analysis and classification. This paper proposes an effective scheme for rotation-invariant face recognition using combined Log-Polar Transform and Discrete Cosine Transform features. The rotation-invariant feature extraction for a given face image involves applying the log-polar transform to eliminate the rotation effect and to produce a row-shifted log-polar image. The discrete cosine transform is then applied to eliminate the row-shift effect and to generate the low-dimensional feature vector. A PSO-based feature selection algorithm is utilized to search the feature vector space for the optimal feature subset. Evolution is driven by a fitness function defined in terms of maximizing the between-class separation (scatter index). Experimental results, based on the ORL face database using test sets of images with different orientations, show that the proposed system outperforms other face recognition methods, the overall recognition rate for the rotated test images being 97%. This demonstrates that the extracted feature vector is an effective rotation-invariant feature set with a minimal number of selected features.
Keywords: Discrete Cosine Transform, Face Recognition, Feature Extraction, Log-Polar Transform, Particle Swarm Optimization.
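A minimal sketch of the LPT/DCT feature extraction, assuming OpenCV and SciPy; the image size, block size and interpolation flags are illustrative, and the PSO-based feature selection stage is not shown.

```python
import cv2
import numpy as np
from scipy.fft import dct

def lpt_dct_features(gray_face, block=8):
    """Log-polar transform (rotation becomes a row shift), then 2-D DCT; keep a low-frequency block."""
    h, w = gray_face.shape
    lp = cv2.warpPolar(gray_face.astype(np.float32), (w, h), (w / 2.0, h / 2.0),
                       min(h, w) / 2.0, cv2.WARP_POLAR_LOG + cv2.INTER_LINEAR)
    coeffs = dct(dct(lp, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:block, :block].flatten()       # candidate feature vector before PSO-based selection

face = np.random.rand(64, 64)                     # stand-in for an ORL face image
print(lpt_dct_features(face).shape)               # (64,)
```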
260 Kerma Profile Measurements in CT Chest Scans – A Comparison of Methodologies
Authors: Bruno B. Oliveira, Arnaldo P. Mourão, Teógenes A. da Silva
Abstract:
The Brazilian legislation has only established diagnostic reference levels (DRLs) in terms of the Multiple Scan Average Dose (MSAD) as a quality control parameter for computed tomography (CT) scanners. Compliance with DRLs can be verified by measuring the Computed Tomography Kerma Index (Ca,100) with a pencil ionization chamber, or by obtaining the kerma distribution in CT scans with radiochromic films or rod-shaped lithium fluoride thermoluminescent dosimeters (TLD-100). TL dosimeters were used to record kerma profiles and to determine MSAD values of a GE Bright Speed model CT scanner. Measurements were made with radiochromic films and TL dosimeters distributed in cylinders positioned in the center and in the four peripheral bores of a standard polymethylmethacrylate (PMMA) body CT dosimetry phantom. Irradiations were performed using a protocol for an adult chest. The maximum values were found at the midpoint of the longitudinal axis. The MSAD values obtained with the three dosimetric techniques were compared.
Keywords: Kerma profile, CT, MSAD, patient dosimetry.
259 Incorporating Semantic Similarity Measure in Genetic Algorithm: An Approach for Searching the Gene Ontology Terms
Authors: Razib M. Othman, Safaai Deris, Rosli M. Illias, Hany T. Alashwal, Rohayanti Hassan, Farhan Mohamed
Abstract:
The most important property of the Gene Ontology is its terms. These controlled vocabularies are defined to provide consistent descriptions of gene products that are shareable and computationally accessible by humans, software agents, or other machine-readable meta-data. Each term is associated with information such as a definition, synonyms, database references, amino acid sequences, and relationships to other terms. This information has made the Gene Ontology broadly applied in microarray and proteomic analysis. However, the process of searching the terms is still carried out using a traditional approach based on keyword matching. The weaknesses of this approach are that it ignores the semantic relationships between terms and depends heavily on a specialist to find similar terms. Therefore, this study combines a semantic similarity measure and a genetic algorithm to perform a better retrieval process for searching semantically similar terms. The semantic similarity measure is used to compute the similarity strength between two terms. The genetic algorithm is then employed to perform batch retrievals and to handle the large search space of the Gene Ontology graph. Computational results are presented to show the effectiveness of the proposed algorithm.
Keywords: Gene Ontology, semantic similarity measure, genetic algorithm, ontology search.
258 Pulse Oximeter Concept for Vascular Occlusion Test
Authors: Fatanah M. Suhaimi, J. Geoffrey Chase, Christopher G. Pretty, Rodney Elliott, Geoffrey M. Shaw
Abstract:
Microcirculatory dysfunction is very common in sepsis and may result in organ failure and an increased risk of death. Analyzing oxygen utilization can potentially assess the microcirculation function of an individual. In this study, a modified pulse oximeter is used to extract the signals produced by the absorption of red (R) and infrared (IR) light. The IR and R signals are related to the overall blood volume and to reduced hemoglobin, respectively; differences between these two signals thus represent the amount of oxygenated hemoglobin. A vascular occlusion test was conducted on healthy individuals to validate the pulse oximeter concept. In this test, both the R and IR signals changed rapidly according to the occlusion process. The pulse oximeter concept presented is capable of extracting valuable information to assess the microcirculation condition. Implementing this concept on ICU patients has the potential to aid sepsis diagnosis and provide more accurate tracking of patient state and sepsis status.
Keywords: Microcirculation, sepsis, sepsis diagnosis, oxygen extraction.
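For illustration, the classic 'ratio of ratios' computed from the pulsatile (AC) and baseline (DC) components of the two signals, with a generic textbook calibration; the constants and synthetic waveforms are assumptions, not values from this study.

```python
import numpy as np

def ratio_of_ratios(red, infrared):
    """R = (AC_R / DC_R) / (AC_IR / DC_IR) over short windows of the two photoplethysmograms."""
    ac = lambda x: np.ptp(x)    # peak-to-peak pulsatile component
    dc = lambda x: np.mean(x)   # baseline component
    return (ac(red) / dc(red)) / (ac(infrared) / dc(infrared))

def spo2_estimate(red, infrared):
    """Generic empirical calibration SpO2 ~ 110 - 25*R; real devices use device-specific curves."""
    return 110.0 - 25.0 * ratio_of_ratios(red, infrared)

# synthetic one-second windows with hypothetical pulsatile amplitudes
t = np.linspace(0, 1, 100)
red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
ir = 1.0 + 0.03 * np.sin(2 * np.pi * 1.2 * t)
print(round(spo2_estimate(red, ir), 1))
```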
257 Inflammatory Markers in the Blood and Chronic Periodontitis
Authors: Saimir Heta, Ilma Robo, Nevila Alliu, Tea Meta
Abstract:
Background: Plasma levels of inflammatory markers are an expression of the infectious burden of existing periodontitis, as well as of existing inflammation elsewhere in the body. Materials and Methods: The study consists of a clinical part measuring the inflammatory markers of 23 patients diagnosed with chronic periodontitis and recording the periodontal parameters of the patients' periodontal status, hemorrhage index and probing values, before and 7-10 days after non-surgical periodontal treatment. Results: The level of fibrinogen drops according to the categorization of disease progression, active and passive, with the largest share (18%-30%) in the 10-20 mg/dL fluctuation range. Fluctuations in fibrinogen level by patient age (under 40 years versus over 40 years) were 13% versus 26% in the 0-10 mg/dL range, 26% versus 22% in the 10-20 mg/dL range, and 9% versus 4% in the 20-40 mg/dL range. Conclusions: Non-surgical periodontal treatment significantly reduces the level of inflammatory markers in the blood. Oral health significantly reduces the potential source of periodontal bacteria, which have the potential to promote thromboembolism through interaction with thrombocytes.
Keywords: Chronic periodontitis, atherosclerosis, risk factor, inflammatory markers.
256 Speaker Identification using Neural Networks
Authors: R. V. Pawar, P. P. Kajave, S. N. Mali
Abstract:
The speech signal conveys information about the identity of the speaker. The area of speaker identification is concerned with extracting the identity of the person speaking an utterance. As speech interaction with computers becomes more pervasive in activities such as telephone use, financial transactions and information retrieval from speech databases, there is utility in automatically identifying a speaker based solely on vocal characteristics. This paper emphasizes text-dependent speaker identification, which deals with detecting a particular speaker from a known population. The system prompts the user to provide a speech utterance. The system identifies the user by comparing the codebook of the speech utterance with those stored in the database and lists the most likely speakers who could have given that speech utterance. The speech signal is recorded for N speakers, and the features are then extracted. Feature extraction is done by means of LPC coefficients, calculation of the AMDF, and the DFT. The neural network is trained by applying these features as input parameters. The features are stored in templates for further comparison. The features for the speaker to be identified are extracted and compared with the stored templates using the Backpropagation algorithm. Here, the trained network corresponds to the output; the input is the extracted features of the speaker to be identified. The network does the weight adjustment and the best match is found to identify the speaker. The number of epochs required to reach the target determines the network performance.
Keywords: Average Mean Distance Function, Backpropagation, Linear Predictive Coding, Multilayered Perceptron.
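A minimal sketch of the LPC-plus-neural-network idea, assuming librosa for the LPC coefficients and a scikit-learn multilayer perceptron in place of the custom backpropagation network; the AMDF and DFT features are omitted and the 'utterances' below are synthetic.

```python
import numpy as np
import librosa
from sklearn.neural_network import MLPClassifier

def lpc_features(signal, order=12):
    """LPC coefficients of one utterance (librosa's Burg-method LPC); drop the leading 1.0."""
    return librosa.lpc(signal.astype(float), order=order)[1:]

rng = np.random.default_rng(3)
speakers = {"speaker_a": lambda: rng.normal(0, 1, 4000),              # flat-spectrum toy 'voice'
            "speaker_b": lambda: np.cumsum(rng.normal(0, 1, 4000))}   # low-frequency-heavy toy 'voice'

X, y = [], []
for label, make_signal in speakers.items():
    for _ in range(20):
        X.append(lpc_features(make_signal()))
        y.append(label)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print(clf.predict([lpc_features(np.cumsum(rng.normal(0, 1, 4000)))]))  # likely ['speaker_b']
```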