Search results for: Fault Diagnosis of Transformer
169 Statistical Models of Network Traffic
Authors: Barath Kumar, Oliver Niggemann, Juergen Jasperneite
Abstract:
Model-based approaches have been applied successfully to a wide range of tasks such as specification, simulation, testing, and diagnosis. But one bottleneck often prevents the introduction of these ideas: manual modeling is a non-trivial, time-consuming task. Automatically deriving models by observing and analyzing running systems is one possible way to alleviate this bottleneck. To derive a model automatically, some a priori knowledge about the model structure, i.e. about the system, must exist. Such a model formalism would be used as follows: (i) by observing the network traffic, a model of the long-term system behavior could be generated automatically; (ii) test vectors could be generated from the model; (iii) while the system is running, the model could be used to diagnose abnormal system behavior. The main contribution of this paper is the introduction of a model formalism called 'probabilistic regression automaton' suitable for the tasks mentioned above.
Keywords: Model-based approach, probabilistic regression automata, statistical models, timed automata.
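As a rough illustration of the idea (a minimal sketch, not the paper's actual formalism: it is first-order only, ignores timing, and uses invented event names), the following learns transition probabilities from observed traffic traces and scores a running trace for anomaly:

```python
from collections import Counter, defaultdict

# Hypothetical observed network-traffic event traces (symbols are invented)
traces = [
    ["req", "ack", "data", "ack"],
    ["req", "ack", "data", "data", "ack"],
    ["req", "nack", "req", "ack", "data", "ack"],
]

# Learn transition frequencies of a simple probabilistic automaton whose
# state is the last observed symbol, a first-order simplification of the
# paper's probabilistic regression automaton (which also models timing)
counts = defaultdict(Counter)
for trace in traces:
    for prev, nxt in zip(["START"] + trace, trace):
        counts[prev][nxt] += 1

model = {s: {n: c / sum(cnt.values()) for n, c in cnt.items()}
         for s, cnt in counts.items()}
print(model["data"])                      # {'ack': 0.75, 'data': 0.25}

# Diagnosis: a running trace whose transitions are improbable under the
# learned model is flagged as non-normal behavior
def likelihood(trace, eps=1e-3):
    score = 1.0
    for prev, nxt in zip(["START"] + trace, trace):
        score *= model.get(prev, {}).get(nxt, eps)
    return score

print(likelihood(["req", "ack", "data", "ack"]))  # normal: high likelihood
print(likelihood(["req", "data"]))                # anomalous: near zero
```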
168 An Evaluation of Sputum Smear Conversion and Haematological Parameter Alteration in Early Detection Period of New Pulmonary Tuberculosis (PTB) Patients
Authors: Tasnuva Tamanna, Sanjida Halim Topa
Abstract:
Sputum smear conversion after one month of antituberculosis therapy in new smear-positive pulmonary tuberculosis patients (PTB+) is a vital indicator of treatment success. The objective of this study is to determine the rate of sputum smear conversion in new PTB+ patients after one month under treatment at the National Institute of Diseases of the Chest and Hospital (NIDCH). Sputum smear conversion was assessed by clinical re-examination and sputum smear microscopy after one month. Socio-demographic and haematological parameters were evaluated to assess their correlation with disease status. Among all enrolled patients, only 33.33% were available for follow-up diagnosis, and of these, only 42.86% converted to smear negative. This outcome is probably due to non-adherence to proper disease management. Low haemoglobin and packed cell volume levels were found in 66.67% and 78.78% of patients respectively, whereas elevated platelet counts and erythrocyte sedimentation rates were found in 80% and 93.33% of patients respectively.
Keywords: Followed-up patients, PTB+ patients, sputum smear conversion, sputum smear microscopic test.
167 A Real-Time Image Change Detection System
Authors: Madina Hamiane, Amina Khunji
Abstract:
Detecting changes in multiple images of the same scene has recently seen increased interest due to many contemporary applications, including smart security systems, smart homes, remote sensing, surveillance, medical diagnosis, weather forecasting, speed and distance measurement, and post-disaster forensics. These applications differ in the scale, nature, and speed of change. This paper presents an application of image processing techniques to implement a real-time change detection system. Change is identified by comparing the RGB representations of two consecutive frames captured in real time. The detection threshold can be controlled to account for various luminance levels. The comparison result is passed through a filter before decision making to reduce false positives, especially under low-luminance conditions. The system is implemented with a MATLAB Graphical User Interface (GUI) with several controls to manage its operation and performance.
Keywords: Image change detection, image processing, image filtering, thresholding, B/W quantization.
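A minimal NumPy/SciPy sketch of the frame-differencing scheme described above (the paper's system is MATLAB-based; the threshold, filter size, and decision rule here are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_change(prev_frame, curr_frame, threshold=30, min_changed_fraction=0.01):
    """Flag a change between two RGB frames (uint8 arrays of equal shape)."""
    # Per-pixel absolute difference, summed over the RGB channels
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16)).sum(axis=2)
    changed = diff > threshold                      # adjustable detection threshold
    # Median filtering suppresses isolated false positives (e.g. sensor
    # noise at low luminance) before the final decision
    changed = median_filter(changed.astype(np.uint8), size=3).astype(bool)
    return changed.mean() > min_changed_fraction, changed

# Example with synthetic frames
prev = np.zeros((120, 160, 3), dtype=np.uint8)
curr = prev.copy()
curr[40:80, 60:100] = 200                           # a bright object appears
flag, mask = detect_change(prev, curr)
print(flag, mask.sum())
```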
166 Concepts Extraction from Discharge Notes using Association Rule Mining
Authors: Basak Oguz Yolcular
Abstract:
A large amount of valuable information is available in plain-text clinical reports, and new techniques and technologies are being applied to extract information from them. In this study, we developed a domain-based software system to transform 600 otorhinolaryngology discharge notes into a structured form for extracting clinical data. In order to decrease processing time, the discharge notes were transformed into a data table after preprocessing. Several word lists were constructed to identify common sections in the discharge notes, such as patient history, age, problems, and diagnosis. An n-gram method was used to discover term co-occurrences within each section. Using this method, a dataset of candidate concepts was generated for the validation step, and the Predictive Apriori algorithm for Association Rule Mining (ARM) was then applied to validate the candidate concepts.
Keywords: Association rule mining, otorhinolaryngology, Predictive Apriori, text mining.
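A toy sketch of the pipeline's core idea, assuming hypothetical section contents and using plain rule confidence as a simplified stand-in for the Predictive Apriori measure:

```python
from collections import Counter

def ngrams(tokens, n=2):
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

# Hypothetical preprocessed "diagnosis" sections of discharge notes
sections = [
    "chronic otitis media left ear".split(),
    "chronic otitis media right ear".split(),
    "acute sinusitis of maxillary sinus".split(),
]

# Candidate concepts: bigrams counted once per section
bigram_support = Counter(g for s in sections for g in ngrams(s, 2))
token_support = Counter(t for s in sections for t in set(s))

# Rule confidence conf(a -> b) = support(a, b) / support(a); a simplified
# stand-in for the Predictive Apriori measure used in the paper
for (a, b), co in bigram_support.items():
    conf = co / token_support[a]
    if co >= 2 and conf >= 0.6:
        print(f"{a!r} -> {b!r}: support={co}, confidence={conf:.2f}")
```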
165 An Improved C-Means Model for MRI Segmentation
Authors: Ying Shen, Weihua Zhu
Abstract:
Medical images are important to help identify different diseases; for example, Magnetic Resonance Imaging (MRI) can be used to investigate the brain, spinal cord, bones, joints, breasts, blood vessels, and heart. In medical image analysis, image segmentation is usually the first step, identifying regions with similar color, intensity, or texture so that diagnosis can then be carried out based on these features. This paper introduces an improved C-means model to segment MRI images. The model uses information entropy to evaluate the segmentation results while achieving global optimization. The contributions are twofold. Firstly, a Genetic Algorithm (GA) is used to achieve global optimization, which the fuzzy C-means clustering algorithm (FCMA) cannot do on its own. Secondly, the information entropy after segmentation is used to measure the effectiveness of the MRI image processing. Experimental results show that the proposed model outperforms traditional approaches.
Keywords: Magnetic Resonance Image, C-means model, image segmentation, information entropy.
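A toy sketch of the approach, under the assumption that segmentation quality is scored by the mean intensity entropy of the resulting regions; the GA here uses selection and mutation only, so it is an illustration rather than the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def segment_entropy(image, centers):
    """Mean intensity entropy of the regions induced by nearest-center labeling."""
    labels = np.argmin(np.abs(image[..., None] - centers), axis=-1)
    total = 0.0
    for k in range(len(centers)):
        pix = image[labels == k]
        if pix.size == 0:
            return np.inf                      # penalize empty clusters
        hist, _ = np.histogram(pix, bins=32, range=(0, 255))
        p = hist[hist > 0] / pix.size
        total += -(p * np.log2(p)).sum()
    return total / len(centers)

def ga_cluster(image, n_clusters=3, pop=30, gens=40, mut=8.0):
    """Toy GA searching cluster centers that minimize segmentation entropy."""
    population = rng.uniform(0, 255, size=(pop, n_clusters))
    for _ in range(gens):
        fitness = np.array([segment_entropy(image, np.sort(c)) for c in population])
        elite = population[np.argsort(fitness)[:pop // 2]]          # selection
        children = elite[rng.integers(0, len(elite), pop - len(elite))]
        children = children + rng.normal(0, mut, children.shape)    # mutation
        population = np.vstack([elite, np.clip(children, 0, 255)])
    best = population[np.argmin([segment_entropy(image, np.sort(c)) for c in population])]
    return np.sort(best)

# Synthetic "MRI slice": three tissue intensities plus noise
img = rng.choice([40, 120, 200], size=(64, 64)) + rng.normal(0, 5, (64, 64))
print(ga_cluster(np.clip(img, 0, 255), n_clusters=3))
```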
164 Development of a Smart System for Measuring Strain Levels of Natural Gas and Petroleum Pipelines on Earthquake Fault Lines in Türkiye
Authors: Ahmet Yetik, Seyit Ali Kara, Cevat Özarpa
Abstract:
Load changes occur on natural gas and oil pipelines due to natural disasters. The source of this load change is the displacement of the soil around the pipes caused by erosive events such as earthquakes, landslides, and floods. The exposure of natural gas and oil pipes to variable loads causes deformation, cracks, and breaks in these pipes, which can cause significant damage to people and the environment, including the risk of explosions. Post-disaster examinations readily reveal which pipes have sustained the most damage in quake-affected regions, and it has been determined that earthquakes in Türkiye have caused permanent damage to pipelines. This project was initiated in response to the identification of cracks and gas leaks in the insulation gaskets placed in the pipelines, especially at the junction points. In this study, a SCADA (Supervisory Control and Data Acquisition) application has been developed to monitor load changes caused by natural disasters. The developed SCADA application monitors the changes along the x, y, and z axes of the stresses occurring in the pipes with the help of strain gauge sensors placed on them. Test setups conforming to the relevant standards were created during the fieldwork, integrated into the SCADA system, and the system was then monitored. With the developed SCADA system and its field application, load changes on the natural gas and oil pipes are monitored instantly, load-creating accumulations around the pipes can be addressed immediately, and new risks are prevented. The system contributes to energy supply security, asset management, holistic pipeline management, and overall sustainability in the industry.
Keywords: Earthquake, natural gas pipes, oil pipes, strain measurement, landslide.
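A minimal sketch of the monitoring loop, with a stand-in for the SCADA field I/O and invented alarm thresholds (real limits would come from the pipe material and the applicable standard):

```python
import random
import time

# Hypothetical alarm thresholds in microstrain for a buried steel pipe
AXIS_LIMITS = {"x": 800.0, "y": 800.0, "z": 1200.0}

def read_strain_gauges():
    """Stand-in for the SCADA field I/O; returns microstrain per axis."""
    return {axis: random.gauss(0, 150) for axis in AXIS_LIMITS}

def poll_once():
    sample = read_strain_gauges()
    for axis, value in sample.items():
        if abs(value) > AXIS_LIMITS[axis]:
            print(f"ALERT: {axis}-axis strain {value:.0f} microstrain exceeds limit")
    return sample

for _ in range(3):          # in the real system this loop runs continuously
    print(poll_once())
    time.sleep(0.1)
```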
163 Multidimensional Performance Management
Authors: David Wiese
Abstract:
In order to maximize the efficiency of an information management platform and to assist in decision making, the collection, storage and analysis of performance-relevant data has become of fundamental importance. This paper addresses the merits and drawbacks of the OLAP paradigm for efficiently navigating large volumes of performance measurement data hierarchically. System managers or database administrators navigate through adequately (re)structured measurement data aiming to detect performance bottlenecks, identify causes of performance problems, or assess the impact of configuration changes on the system and its representative metrics. Of particular importance is finding the root cause of an imminent problem threatening the availability and performance of an information system. By leveraging OLAP techniques, in contrast to traditional static reporting, this can be accomplished within a moderate amount of time and with little processing complexity. It is shown how OLAP techniques can help improve the understandability and manageability of measurement data and, hence, improve the whole performance analysis process.
Keywords: Data Warehousing, OLAP, Multidimensional Navigation, Performance Diagnosis, Performance Management, Performance Tuning.
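A small pandas sketch of the roll-up/drill-down navigation the paragraph describes, on an invented performance-measurement fact table:

```python
import pandas as pd

# Toy performance-measurement fact table; dimensions: hour, host, metric
df = pd.DataFrame({
    "hour":   ["08", "08", "09", "09", "08", "09"],
    "host":   ["db1", "db2", "db1", "db2", "db1", "db1"],
    "metric": ["cpu", "cpu", "cpu", "cpu", "io_wait", "io_wait"],
    "value":  [55, 40, 93, 45, 12, 70],
})

# Roll-up: aggregate over hosts to spot when the system degrades
print(df.pivot_table(values="value", index="metric", columns="hour",
                     aggfunc="mean"))

# Drill-down on the suspicious hour to locate the root cause (host db1 at 09)
print(df[df.hour == "09"].pivot_table(values="value", index="host",
                                      columns="metric"))
```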
162 Legal Basis for Water Resources Management in Brazil: Case Study of the Rio Grande Basin
Authors: Janaína F. Guidolini, Jean P. H. B. Ometto, Angélica Giarolla, Peter M. Toledo, Carlos A. Valera
Abstract:
The water crisis, a major problem of the 21st century, occurs mainly due to poor management. The central issue that should govern management is the integration of the various aspects that interfere with the use of water resources and their protection, supported by a legal basis. A watershed is a territorial unit in which water interacts with physical, biotic, social, economic, and cultural variables, and Brazilian law recognizes the river basin as the territorial management unit. Based on a diagnosis of the current situation of the water resources of the Rio Grande Basin, a discussion informed by the Brazilian legal basis was carried out to propose measures to combat or mitigate damage and environmental degradation in the Basin. To manage water resources more efficiently, conserve water, and optimize their multiple uses, the integration of acquired scientific knowledge and management is essential. Moreover, it is necessary to monitor compliance with environmental legislation.
Keywords: Conservation of soil and water, river basin, sustainability, water governance.
161 Hypogenic Karstification and Conduit System Controlling by Tectonic Pattern in Foundation Rocks of the Salman Farsi Dam in South-Western Iran
Authors: Mehran Koleini, Jan Louis Van Rooy, Adam Bumby
Abstract:
The Salman Farsi dam project is constructed on the Ghareh Agahaj River about 140 km south of Shiraz city in the Zagros Mountains of southwestern Iran. This tectonic province of south-western Iran is characterized by a simple folded sedimentary sequence. The dam foundation rocks comprise the Asmari Formation of Oligo-Miocene age, generally consisting of a variety of karstified carbonate rocks varying from strong to weak. Most of the rocks exposed at the dam site show primary porosity due to incomplete diagenetic recrystallization and compaction. In addition to these primary predispositions to weathering, the layering conditions (frequency and orientation of bedding) and the subvertical tectonic discontinuities preferentially channeled infiltration by deep-seated hydrothermal solutions. Consequently, the porosity has been enlarged by dissolution, and the rocks are expected to be karstified and to develop cavities along bedding planes, major joint planes, and fault zones. This kind of karst, termed hypogenic karst, is associated with ascending warm solutions. Field observations indicate strong karstification and vuggy intercalations, especially in the middle part of the Asmari succession. The biggest karst feature on the dam axis identified by speleological investigations is the Golshany Cave, with a volume of about 150,000 m3. The tendency of the Asmari limestone toward strong dissolution raises concern about seepage from the reservoir in the area of the dam.
Keywords: Asmari Limestone, Karstification, Salman Farsi Dam, Tectonic Pattern.
160 Wavelet based Image Registration Technique for Matching Dental x-rays
Authors: P. Ramprasad, H. C. Nagaraj, M. K. Parasuram
Abstract:
Image registration plays an important role in the diagnosis of dental pathologies such as dental caries, alveolar bone loss, and periapical lesions. This paper presents a new wavelet-based algorithm for registering noisy, poor-contrast dental X-rays. The proposed algorithm has two stages. The first is a preprocessing stage that removes noise from the X-ray images using a Gaussian filter. The second is a geometric transformation stage that uses two levels of affine transformation, correlating wavelet coefficients instead of gray values. The algorithm has been applied to a number of pre- and post-RCT (root canal treatment) periapical radiographs. Root Mean Square Error (RMSE) and Correlation Coefficient (CC) are used for quantitative evaluation. The proposed technique outperforms both a conventional multiresolution-strategy-based image registration technique and manual registration.
Keywords: Diagnostic imaging, geometric transformation, image registration, multiresolution.
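A sketch of the evaluation idea, assuming a brute-force search over integer translations as a stand-in for the paper's two-level affine search; the wavelet, filter width, and search range are illustrative:

```python
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter, shift

def wavelet_similarity(fixed, moving, wavelet="db2"):
    """Correlate approximation wavelet coefficients instead of raw gray values."""
    cA_f, _ = pywt.dwt2(fixed, wavelet)
    cA_m, _ = pywt.dwt2(moving, wavelet)
    return np.corrcoef(cA_f.ravel(), cA_m.ravel())[0, 1]

def rmse(a, b):
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

rng = np.random.default_rng(1)
fixed = rng.random((64, 64))
moving = shift(fixed, (2.0, -1.0))                 # simulated misalignment

# Stage 1: Gaussian pre-filtering to suppress noise, as in the paper
fixed_f, moving_f = gaussian_filter(fixed, 1.0), gaussian_filter(moving, 1.0)

# Stage 2: brute-force search over integer translations, standing in for
# the paper's two-level affine transformation search
best = max(((dy, dx) for dy in range(-4, 5) for dx in range(-4, 5)),
           key=lambda t: wavelet_similarity(fixed_f, shift(moving_f, t)))
aligned = shift(moving_f, best)
print("estimated shift:", best)
print("CC:", wavelet_similarity(fixed_f, aligned), "RMSE:", rmse(fixed_f, aligned))
```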
159 Neural Network based Texture Analysis of Liver Tumor from Computed Tomography Images
Authors: K. Mala, V. Sadasivam, S. Alagappan
Abstract:
Advances in clinical medical imaging have brought about the routine production of vast numbers of medical images that need to be analyzed. As a result, an enormous amount of computer vision research effort has been targeted at achieving automated medical image analysis. Computed Tomography (CT) is highly accurate for diagnosing liver tumors. This study aimed to evaluate the potential role of wavelets and neural networks in the differential diagnosis of liver tumors in CT images. The tumors considered in this study are hepatocellular carcinoma, cholangiocarcinoma, hemangioma, and hepatic adenoma. Each suspicious tumor region was automatically extracted from the abdominal CT images, and the textural information obtained was used to train a Probabilistic Neural Network (PNN) to classify the tumors. The results obtained were evaluated with the help of radiologists. The system differentiates the tumors with relatively high accuracy and is therefore clinically useful.
Keywords: Fuzzy c means clustering, texture analysis, probabilistic neural network, LVQ neural network.
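A minimal NumPy sketch of a Probabilistic Neural Network of the kind described, applied to invented two-dimensional texture features (real inputs would come from the extracted CT regions):

```python
import numpy as np

class PNN:
    """Minimal Probabilistic Neural Network: one Gaussian kernel per
    training pattern; class score = mean kernel response (Parzen estimate)."""
    def __init__(self, sigma=0.5):
        self.sigma = sigma
    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(self.y)
        return self
    def predict(self, X):
        X = np.asarray(X, float)
        d2 = ((X[:, None, :] - self.X[None, :, :]) ** 2).sum(-1)
        k = np.exp(-d2 / (2 * self.sigma ** 2))          # pattern-layer outputs
        scores = np.stack([k[:, self.y == c].mean(1) for c in self.classes], 1)
        return self.classes[scores.argmax(1)]

# Hypothetical 2-D texture features (e.g. wavelet energy, contrast) for two
# tumor classes
rng = np.random.default_rng(2)
X0 = rng.normal([0.2, 0.8], 0.1, (30, 2))
X1 = rng.normal([0.7, 0.3], 0.1, (30, 2))
X = np.vstack([X0, X1]); y = np.array([0] * 30 + [1] * 30)
pnn = PNN(sigma=0.2).fit(X, y)
print(pnn.predict([[0.25, 0.75], [0.65, 0.35]]))   # -> [0 1]
```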
158 Behaviour of Base-Isolated Structures with High Initial Isolator Stiffness
Authors: Ajay Sharma, R.S. Jangid
Abstract:
The analytical seismic response of multi-story buildings supported on base isolation systems is investigated under real earthquake motion. The superstructure is idealized as a shear-type flexible building with a lateral degree of freedom at each floor. The force-deformation behaviour of the isolation system is modelled as bi-linear, which can effectively represent all isolation systems in practice. The governing equations of motion of the isolated structural system are derived. The response of the system is obtained numerically by a step-by-step method under three real recorded earthquake motions and the pulse motions associated with near-fault earthquake motion. The variation of the top-floor acceleration, inter-story drift, base shear, and bearing displacement of the isolated building is studied under different initial stiffnesses of the bi-linear isolation system. It was observed that high initial stiffness of the isolation system excites higher modes in the base-isolated structure and generates high floor accelerations and story drift. Such behaviour of the base-isolated building, especially when supported on sliding-type isolation systems, can be detrimental to sensitive equipment installed in the building. On the other hand, the bearing displacement and base shear were found to reduce marginally with increasing initial stiffness of the isolation system. Further, the above behaviour of the base-isolated building was observed for different parameters of the bearing (i.e. post-yield stiffness and characteristic strength) and earthquake motions (i.e. real time histories as well as pulse-type motions).
Keywords: Base isolation, base shear, bi-linear, earthquake, floor accelerations, inter-story drift, multi-story building, pulse motion, stiffness ratio.
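A worked single-degree-of-freedom sketch of the bi-linear isolator model, assuming a rigid superstructure (the paper uses a multi-story shear model) and illustrative parameter values:

```python
import numpy as np

def simulate_bilinear_sdof(ag, dt, m=1.0, k1=200.0, k2=20.0, q=1.5, c=0.5):
    """Explicit (semi-implicit Euler) response of a rigid mass on a bilinear
    isolator: initial stiffness k1, post-yield stiffness k2, characteristic
    strength q. The isolator is modelled as a k2 spring in parallel with an
    elastic-perfectly-plastic element of stiffness (k1 - k2) and strength q."""
    n = len(ag)
    u = np.zeros(n); v = np.zeros(n)
    fz = 0.0                                  # force in the hysteretic element
    for i in range(n - 1):
        f = k2 * u[i] + fz                    # total isolator restoring force
        a = -ag[i] - (c * v[i] + f) / m
        v[i + 1] = v[i] + a * dt
        u[i + 1] = u[i] + v[i + 1] * dt
        # yield check: clip the hysteretic force at the characteristic strength
        fz = np.clip(fz + (k1 - k2) * (u[i + 1] - u[i]), -q, q)
    return u, v

# Pulse-type ground motion (one-cycle sine), as in near-fault records
dt = 0.005
t = np.arange(0, 4, dt)
ag = 5.0 * np.sin(2 * np.pi * 1.0 * t) * (t < 1.0)
u, v = simulate_bilinear_sdof(ag, dt)
print(f"peak bearing displacement: {np.abs(u).max():.4f}")
```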
157 A Predictive Rehabilitation Software for Cerebral Palsy Patients
Authors: J. Bouchard, B. Prosperi, G. Bavre, M. Daudé, E. Jeandupeux
Abstract:
Young patients suffering from Cerebral Palsy face difficult choices concerning major surgeries. The diagnosis settled on by surgeons can be complex, and the patient's decision about whether or not to undergo such a surgery requires considerable reflection. The proposed software, which combines predictions of suitable surgeries and post-surgery kinematic values with a 3D model representing the patient, is an innovative tool helpful for both patients and medical professionals. Beginning with the analysis and classification of kinematic values from a database of gait analyses into three separate clusters, it is possible to determine close similarity between patients. The surgery best adapted to improve a patient's gait is then predicted by a suitably preconditioned neural network. Finally, a 3D model of the patient, based on the analysis of kinematic values, is animated using the post-surgery kinematic vectors of the closest patient selected from the clustering.
Keywords: Cerebral Palsy, Clustering, Crouch Gait, 3-D Modeling.
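A minimal scikit-learn sketch of the cluster-then-match step, on invented kinematic feature vectors:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances_argmin

# Hypothetical kinematic feature vectors (e.g. sagittal knee/hip angles at
# key gait events) for previously treated patients
rng = np.random.default_rng(6)
gait_db = rng.normal(size=(90, 6))

# Step 1: partition the database into the paper's three clusters
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(gait_db)

# Step 2: for a new patient, find the cluster and the closest treated
# patient; that patient's post-surgery record drives the 3D animation
new_patient = rng.normal(size=(1, 6))
cluster = km.predict(new_patient)[0]
members = np.where(km.labels_ == cluster)[0]
closest = members[pairwise_distances_argmin(new_patient, gait_db[members])[0]]
print(f"cluster {cluster}, closest patient index {closest}")
```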
156 Heuristics Analysis for Distributed Scheduling using MONARC Simulation Tool
Authors: Florin Pop
Abstract:
Simulation is a very powerful method for high-performance, high-quality design in distributed systems, and currently perhaps the only one, considering the heterogeneity, complexity and cost of such systems. In Grid environments, for example, it is hard and even impossible to perform scheduler performance evaluation in a repeatable and controllable manner, as resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately-sized test-bed is expensive and time consuming. Scalability, reliability and fault-tolerance become important requirements for distributed systems in order to support distributed computation; a distributed system with such characteristics is called dependable. Large environments, like the Cloud, offer unique advantages, such as low cost and dependability, and satisfy QoS for all users. Resource management in large environments requires performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while at the same time considering the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation; the simulator is MONARC, a powerful tool for large-scale distributed systems simulation. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and sustain empirically-based decisions. The results could be used in decisions regarding optimizations to existing Grid DAG scheduling and for selecting the proper algorithm for DAG scheduling in various actual situations.
Keywords: Scheduling, simulation, performance evaluation, QoS, distributed systems, MONARC.
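For illustration, a minimal sketch of one classic heuristic of the kind such simulations compare (min-min scheduling of independent tasks); MONARC itself is a far richer simulator, and all names and numbers below are invented:

```python
def min_min(task_lengths, n_resources, speeds):
    """Min-min heuristic: repeatedly assign the task with the smallest
    earliest completion time to the resource that achieves it."""
    finish = [0.0] * n_resources                 # resource availability times
    schedule = []
    tasks = list(enumerate(task_lengths))
    while tasks:
        # completion time of each remaining task on each resource
        ct, tid, r = min((finish[r] + length / speeds[r], tid, r)
                         for tid, length in tasks for r in range(n_resources))
        finish[r] = ct
        schedule.append((tid, r, ct))
        tasks = [(t, l) for t, l in tasks if t != tid]
    return schedule, max(finish)                 # schedule and makespan

sched, makespan = min_min([40, 10, 25, 60, 5], 2, [1.0, 0.5])
print(sched)
print("makespan:", makespan)
```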
155 An Intelligent Human-Computer Interaction System for Decision Support
Authors: Chee Siong Teh, Chee Peng Lim
Abstract:
This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process such that humans' subjectivity can be incorporated into a computerized system while, at the same time, preserving the capability of the computerized system to process information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed. The potential of the proposed architecture as a useful decision support system is demonstrated.
Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.
154 A Hybrid Gene Selection Technique Using Improved Mutual Information and Fisher Score for Cancer Classification Using Microarrays
Authors: M. Anidha, K. Premalatha
Abstract:
Feature selection is significant for constructive classification in the area of cancer diagnosis. However, in microarray gene expression datasets, the large number of features compared to the number of samples makes classification computationally very hard and prone to errors. In this paper, we present an innovative method for selecting highly informative gene subsets of gene expression data that effectively classify the cancer data into tumorous and non-tumorous. The hybrid gene selection technique combines Mutual Information and the Fisher score to select informative genes. The gene selection is validated by classification using a Support Vector Machine (SVM), a supervised learning algorithm capable of solving complex classification problems. Improved Mutual Information and the Fisher score, with SVM as the classifier, produced efficient results.
Keywords: Gene selection, mutual information, Fisher score, classification, SVM.
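A compact scikit-learn sketch of the hybrid selection idea, using the ANOVA F statistic as a proxy for the Fisher score (the two are closely related) on a synthetic stand-in for a microarray matrix:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for a microarray matrix: 60 samples, 500 "genes"
X, y = make_classification(n_samples=60, n_features=500, n_informative=15,
                           random_state=0)

# Score every gene by mutual information and by the ANOVA F statistic,
# then combine the two rankings
mi = mutual_info_classif(X, y, random_state=0)
f_stat, _ = f_classif(X, y)
combined_rank = np.argsort(np.argsort(-mi)) + np.argsort(np.argsort(-f_stat))
top = np.argsort(combined_rank)[:20]                 # keep the 20 best genes

acc = cross_val_score(SVC(kernel="linear"), X[:, top], y, cv=5).mean()
print(f"CV accuracy with {len(top)} selected genes: {acc:.2f}")
```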
153 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review
Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha
Abstract:
Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. Natural language processing techniques have long been applied to text classification and unbiased decision making, yet proper classification of this textual information in a given context has remained difficult. A systematic review of previous literature on sentiment classification and AI-based techniques was therefore conducted. The aim was to better understand how to design and develop a robust, more accurate sentiment classifier that can correctly distinguish, in a given context, between hate speech and inverted compliments, using the knowledge gained from evaluating the different artificial intelligence techniques reviewed. The study evaluated over 250 articles from digital sources such as the ACM Digital Library, Google Scholar, and IEEE Xplore, and whittled the number down to 52 articles. Findings revealed that deep learning approaches such as Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), Bidirectional Encoder Representations from Transformers (BERT), and Long Short-Term Memory (LSTM) outperformed various machine learning techniques in terms of accuracy. A large dataset is also required to develop a robust sentiment classifier. Results also revealed that data can be obtained from sources such as Twitter, movie reviews, Kaggle, the Stanford Sentiment Treebank (SST), and SemEval Task 4, depending on the required domain. Hybrid deep learning techniques such as CNN+LSTM, CNN+Gated Recurrent Unit (GRU), and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java in terms of development simplicity and AI-based library functionality. Finally, the study recommended its findings for building robust sentiment classifiers in the future.
Keywords: Artificial Intelligence, Natural Language Processing, Sentiment Analysis, Social Network, Text.
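A minimal Keras sketch of one of the hybrid architectures the review found strongest (CNN+LSTM); vocabulary size, layer widths, and the dummy data are illustrative assumptions:

```python
import numpy as np
from tensorflow.keras import layers, models

VOCAB, MAXLEN = 20000, 100

# Hybrid CNN+LSTM sentiment classifier: the convolution extracts local
# n-gram features, the LSTM models their order
model = models.Sequential([
    layers.Input(shape=(MAXLEN,)),
    layers.Embedding(VOCAB, 128),
    layers.Conv1D(64, 5, activation="relu"),
    layers.MaxPooling1D(4),
    layers.LSTM(64),
    layers.Dense(1, activation="sigmoid"),   # e.g. hate speech vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

# Dummy batch standing in for tokenized tweets or movie reviews
x = np.random.randint(0, VOCAB, size=(32, MAXLEN))
y = np.random.randint(0, 2, size=(32, 1))
model.fit(x, y, epochs=1, verbose=0)
```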
152 Investigation of Failures in Wadi-Crossing Pipe Culverts, Sennar State, Sudan
Authors: Magdi M. E. Zumrawi
Abstract:
Crossing culverts are essential elements of rural roads. The paper aims to investigate failures of recently constructed wadi-crossing pipe culverts in Sennar State and to provide the necessary remedial measures. The investigation provides an extensive diagnostic study to identify the main structural and hydrological weaknesses of the culverts. Literature on steel pipe culverts, covering construction practices, common types of culvert failure, and appropriate mitigation measures, was reviewed. A detailed field survey was conducted to detect failures and defects in the existing culverts. The results revealed that seepage of water through the embankment and foundation of the culverts leads to excessive erosion and scouring, causing severe failures and damage. Design mistakes and poor construction were identified as the main causes of culvert failure. For the sustainability of the culverts, various remedial measures are recommended for urgent rehabilitation of the existing crossings.
Keywords: Culvert, erosion, failure, sustainability.
151 A Quantitative Study Identifying the Prevalence of Anxiety in Dyslexic Students in Higher Education
Authors: Amanda Abbott-Jones
Abstract:
Adult students with dyslexia in higher education can receive support for their cognitive needs but may also experience negative emotions, such as anxiety, related to their dyslexia in connection with their studies. This paper tests the hypothesis that adult dyslexic learners have a higher prevalence of academic and social anxiety than their non-dyslexic peers. A quantitative approach was used to measure differences in academic and social anxiety between 102 students with a formal diagnosis of dyslexia and 72 students with no history of learning difficulties. Academic and social anxiety were measured with a questionnaire based on the State-Trait Anxiety Inventory. Findings showed that dyslexic students had statistically significantly higher levels of academic, but not social, anxiety in comparison with the non-dyslexic sample. Dyslexic students in higher education show academic anxiety levels well above those of students without dyslexia. The implication for the dyslexia practitioner is that delivering strategies to deal with anxiety should be seen as equally important, if not more so, than interventions to deal with cognitive difficulties.
Keywords: Academic, anxiety, dyslexia, quantitative, students, university.
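A minimal sketch of the kind of group comparison the study reports, on simulated STAI-style scores (the means and spreads below are invented, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Hypothetical academic-anxiety scores (higher = more anxious)
dyslexic = rng.normal(52, 9, 102)       # n = 102, as in the study
control = rng.normal(44, 9, 72)         # n = 72

# Welch's t-test: compares group means without assuming equal variances
t, p = stats.ttest_ind(dyslexic, control, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```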
150 Optimization of Doubly Fed Induction Generator Equivalent Circuit Parameters by Direct Search Method
Authors: Mamidi Ramakrishna Rao
Abstract:
The doubly-fed induction generator (DFIG) is currently the choice for many wind turbines. These generators, when connected to the grid through a converter, are subjected to varied power system conditions such as voltage variation, frequency variation, and short-circuit faults. Further, many countries, such as Canada, Germany, the UK, and Scotland, have distinct grid codes relating to wind turbines. Accordingly, following network faults, wind turbines have to supply a definite reactive current. To satisfy the requirements, including reactive current capability, an optimum electrical design becomes a mandate for the DFIG to function. This paper optimizes the equivalent circuit parameters of an electrical design for satisfactory DFIG performance. The direct search method has been used for optimization of the parameters. The variables selected include the electromagnetic core dimensions (diameters and stack length), slot dimensions, the radial air gap between stator and rotor, and the winding copper cross-section area. Optimization of a 2 MW DFIG was executed separately for three objective functions: maximum reactive power capability (Case I), maximum efficiency (Case II), and minimum weight (Case III). In the optimization analysis program, voltage variations (10%), leading and lagging power factor (0.95), and speeds corresponding to slips of -0.3 to +0.3 have been considered. The optimum designs obtained for the three objective functions were compared. It can be concluded that the direct search method of optimization helps in determining an optimum electrical design for each objective function, whether efficiency, reactive power capability, or weight minimization.
Keywords: Direct search, DFIG, equivalent circuit parameters, optimization.
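A minimal sketch of a derivative-free direct search of the kind used here, applied to a toy objective (the real objective would evaluate the DFIG equivalent circuit over the stated voltage, power factor, and slip ranges):

```python
def direct_search(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=500):
    """Derivative-free coordinate (pattern) search: probe +/- step along each
    design variable, keep any improvement, shrink the step when stuck."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink          # refine the mesh around the current best
            if step < tol:
                break
    return x, fx

# Toy stand-in for the design objective: the variables might be core
# diameter, stack length, air gap, etc.; here a simple quadratic bowl
objective = lambda v: (v[0] - 3.0) ** 2 + 2 * (v[1] - 1.5) ** 2 + 5
print(direct_search(objective, [0.0, 0.0]))   # -> approx ([3.0, 1.5], 5.0)
```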
149 Acute Coronary Syndrome Prediction Using Data Mining Techniques - An Application
Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil
Abstract:
In this paper, we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values showing the presence or absence of disease. We applied binary logistic regression to the factors affecting the dependent variable. The data set was taken from two different cardiac hospitals in Karachi, Pakistan. There are sixteen variables in total, of which one is the dependent variable and the other 15 are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis were applied. Based on the results of the data reduction, only 14 of the sixteen factors were retained.
Keywords: Acute coronary syndrome (ACS), binary logistic regression analysis, myocardial ischemia (MI), principal component analysis, unstable angina (UA).
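A compact scikit-learn sketch of the PCA-plus-binary-logistic-regression pipeline on synthetic stand-in data (the real data set is not public here):

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the cardiac data set: 15 risk factors, binary
# diagnosis (ACS present / absent)
X, y = make_classification(n_samples=300, n_features=15, n_informative=8,
                           random_state=0)

# PCA reduces the 15 correlated factors before binary logistic regression,
# mirroring the paper's data-reduction step (they retained 14 factors)
clf = make_pipeline(StandardScaler(), PCA(n_components=14),
                    LogisticRegression(max_iter=1000))
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```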
148 Automatic Method for Exudates and Hemorrhages Detection from Fundus Retinal Images
Authors: A. Biran, P. Sobhe Bidari, K. Raahemifar
Abstract:
Diabetic Retinopathy (DR) is an eye disease that can lead to blindness. The earliest signs of DR are the appearance of red and yellow lesions on the retina called hemorrhages and exudates. Early diagnosis of DR prevents blindness; hence, many automated algorithms have been proposed to extract hemorrhages and exudates. In this paper, an automated algorithm is presented to extract hemorrhages and exudates separately from retinal fundus images using different image processing techniques, including the Circular Hough Transform (CHT), Contrast Limited Adaptive Histogram Equalization (CLAHE), Gabor filtering, and thresholding. Since the optic disc is the same color as the exudates, it is first localized and detected. The presented method has been tested on fundus images from the Structured Analysis of the Retina (STARE) and Digital Retinal Images for Vessel Extraction (DRIVE) databases using MATLAB code. The results show that the method reliably detects hard exudates and the highly probable soft exudates. It is also capable of detecting hemorrhages and distinguishing them from blood vessels.
Keywords: Diabetic retinopathy, fundus, CHT, exudates, hemorrhages.
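A scikit-image sketch of the optic disc masking and thresholding steps on a synthetic fundus-like image; the Gabor-filtering stage is omitted for brevity, and all thresholds are illustrative:

```python
import numpy as np
from skimage import draw, exposure, filters, transform

# Synthetic fundus-like image: a bright disc (stand-in for the optic disc)
# plus a small bright blob (stand-in for an exudate)
img = np.zeros((128, 128))
rr, cc = draw.disk((40, 40), 12); img[rr, cc] = 0.9          # optic disc
rr, cc = draw.disk((90, 80), 3);  img[rr, cc] = 0.8          # "exudate"
img += np.random.default_rng(0).normal(0, 0.02, img.shape)

# 1) CLAHE improves local contrast before lesion extraction
eq = exposure.equalize_adapthist(np.clip(img, 0, 1))

# 2) Localize the optic disc with the Circular Hough Transform and mask it,
#    since it shares the exudates' color
edges = filters.sobel(eq) > 0.15
radii = np.arange(8, 16)
hough = transform.hough_circle(edges, radii)
_, cx, cy, rad = transform.hough_circle_peaks(hough, radii, total_num_peaks=1)
rr, cc = draw.disk((cy[0], cx[0]), rad[0] + 2, shape=img.shape)
masked = eq.copy(); masked[rr, cc] = 0

# 3) Threshold the remaining bright structures as exudate candidates
candidates = masked > filters.threshold_otsu(masked) + 0.2
print("candidate exudate pixels:", candidates.sum())
```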
147 Frank Norris’ McTeague: An Entropic Melodrama
Authors: Mohsen Masoomi, Fazel Asadi Amjad, Monireh Arvin
Abstract:
According to Naturalistic principles, human destiny, in the form of blind chance and determinism, entraps the individual, so man is a defenceless creature unable to escape from the ruthless paws of a stoical universe. In Naturalism, nonetheless, melodrama mirrors a conscious alternative with a peculiar function. A typical American Naturalistic character thus cannot be a subject for social criticism of American society, since such characters are victims not of ongoing virtual slavery, the capitalist system, or a ruined milieu, but of their own volition and, more importantly, their frailty of character. Through a Postmodern viewpoint, each Naturalistic work can encompass entropic trends and changes culminating in total failure and devastation. Frank Norris in McTeague displays the futile struggles of ordinary men and how they end up brutes. McTeague encompasses intoxication, abuse, violation, and ruthless homicides. Norris' depiction of the falling individual as a demon represents the entropic dimension of Naturalistic novels. McTeague's defeat is somewhat his own fault, the result of his own blunders and resolution, not the result of sheer accident. Throughout the novel, each character is a kind of insane quester, indicating McTeague's decadence and, by inference, the decadence of Western civilisation. McTeague seems to designate Norris' solicitude for a community fabricated from the elements of negative human demeanours and conduct, exhibiting acute symptoms of infectious dehumanisation. The aim of this article is to illustrate how one specific negative human disposition can gradually, like a running fire, spread everywhere and burn everything in its path. The author applies the concept of entropy metaphorically to describe the individual devolutions that together comprise communal entropy in McTeague, a dying universe.
Keywords: Animal imagery, entropy, Gypsy, melodrama.
146 CRYPTO COPYCAT: A Fashion Centric Blockchain Framework for Eliminating Fashion Infringement
Authors: Magdi Elmessiry, Adel Elmessiry
Abstract:
The fashion industry represents a significant portion of the global gross domestic product; however, it is plagued by cheap imitators that infringe on trademarks, destroying the industry's hard work and investment. While the copycats are eventually found and stopped, by then the damage has already been done: sales are missed, and direct and indirect jobs are lost. The infringer thrives on two main facts: the time it takes to be discovered, and the lack of tracking technologies that can help the consumer distinguish fakes. Blockchain is a new, emerging technology that provides a distributed, encrypted, immutable, and fault-resistant ledger, and it presents a ripe technology for resolving the infringement epidemic facing the fashion industry. The significance of this study is that a new approach, leveraging state-of-the-art blockchain technology coupled with artificial intelligence, is used to create a framework addressing the fashion infringement problem. It shifts the current focus from legal enforcement, which is difficult at best, to consumer awareness, which is far more effective. The framework, Crypto CopyCat, creates an immutable digital asset representing the actual product and empowers the customer with a near-real-time query system. This combination reinforces the consumer's awareness and appreciation of the product's authenticity, while providing real-time feedback to the producer regarding fake replicas. The main finding of this study is that implementing this approach can delay the penetration of fake products into the original product's market, allowing the original product time to take advantage of the market. The resulting shift in fake adoption reduces the copycats' returns, which impedes their market and moves the emphasis back to original product innovation.
Keywords: Fashion, infringement, blockchain, artificial intelligence, textiles supply.
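A minimal hash-chained ledger sketch of the immutable digital asset idea (pure Python, invented field names; a production system would use a real blockchain platform):

```python
import hashlib
import json
import time

def make_block(payload, prev_hash):
    """Append-only block: tampering with any field changes the hash and
    breaks the chain, which is the property the framework relies on."""
    block = {"payload": payload, "prev": prev_hash, "ts": time.time()}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    """Consumer-side query: check every block's hash and every link."""
    for i, blk in enumerate(chain):
        body = {k: v for k, v in blk.items() if k != "hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if blk["hash"] != digest:
            return False
        if i and blk["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Register a garment as an immutable digital asset (fields are illustrative)
genesis = make_block({"sku": "JKT-001", "maker": "BrandX", "batch": 7}, "0" * 64)
transfer = make_block({"sku": "JKT-001", "event": "sold", "retailer": "StoreY"},
                      genesis["hash"])
print(verify([genesis, transfer]))    # -> True
```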
145 Proportion and Factors Associated with Presumptive Tuberculosis among Suspected Pediatric TB Patients
Authors: Naima Nur, Safa Islam, Saeema Islam, Md. Faridul Alam
Abstract:
The study addresses the increasing challenge of pediatric presumptive tuberculosis (TB), emphasizing the need to understand the factors associated with it. The research aims to determine the proportion of presumptive TB among suspected pediatric tuberculosis patients and the factors associated with it. A cross-sectional study was conducted at ICDDR-Bangladesh, collecting specimens from suspected pediatric patients and using logistic regression for data analysis. Specimens were collected from 84 suspected pediatric patients diagnosed with TB based on clinical symptoms and/or radiological findings, and microbiological tests such as smear microscopy, culture, and GeneXpert were used to distinguish presumptive TB from confirmed TB. The theoretical importance of the study lies in identifying factors associated with presumptive TB for better control and management strategies. The proportion of presumptive TB was high (85.7%) among suspected pediatric TB patients, but none of the factors examined was found to be statistically significantly associated with it. The study concludes that despite a high proportion of presumptive TB, no significant differences were found between presumptive and non-presumptive TB cases.
Keywords: Presumptive tuberculosis, confirmed tuberculosis, patient's characteristics, diagnosis.
144 A New Hybrid RMN Image Segmentation Algorithm
Authors: Abdelouahab Moussaoui, Nabila Ferahta, Victor Chen
Abstract:
The development of aid systems for medical diagnosis is not an easy task because of the presence of inhomogeneities in MRI, the variability of the data from one sequence to another, and other distortions from different sources that accentuate this difficulty. A new automatic, contextual, adaptive and robust segmentation procedure for MRI brain tissue classification is described in this article. A first phase consists in estimating the probability density of the data by the Parzen-Rosenblatt method. The classification procedure is completely automatic and makes no assumptions about either the number of clusters or their prototypes, since the latter are detected automatically by a mathematical morphology operator called skeleton by influence zones (SKIZ) detection. The problem of initializing the prototypes, as well as their number, is transformed into an optimization problem; moreover, the procedure is adaptive, since it takes into consideration the contextual information present in every voxel through an adaptive and robust non-parametric Markov Field (MF) model. The number of misclassifications is reduced by the use of the Maximum Posterior Marginal (MPM) criterion.
Keywords: Clustering, automatic classification, SKIZ, Markov Fields, image segmentation, Maximum Posterior Marginal (MPM).
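A minimal NumPy sketch of the first phase, Parzen-Rosenblatt density estimation with a Gaussian kernel, on invented voxel intensities:

```python
import numpy as np

def parzen_rosenblatt(samples, grid, h):
    """Parzen-Rosenblatt window estimate of a 1-D density with a Gaussian
    kernel of bandwidth h: p(x) = (1/(n*h)) * sum_i K((x - x_i) / h)."""
    samples = np.asarray(samples, float)
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

# Hypothetical voxel intensities drawn from two tissue classes
rng = np.random.default_rng(3)
intensities = np.concatenate([rng.normal(60, 8, 400), rng.normal(130, 12, 600)])
grid = np.linspace(0, 200, 201)
p = parzen_rosenblatt(intensities, grid, h=5.0)

# The modes of p correspond to the tissue clusters that the SKIZ operator
# would subsequently separate
modes = grid[1:-1][(p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])]
print("estimated modes:", modes)
```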
143 Voice Disorders Identification Using Hybrid Approach: Wavelet Analysis and Multilayer Neural Networks
Authors: L. Salhi, M. Talbi, A. Cherif
Abstract:
This paper presents a new strategy for the identification and classification of pathological voices using a hybrid method based on the wavelet transform and neural networks. After speech acquisition from a patient, the speech signal is analysed in order to extract acoustic parameters such as pitch, formants, jitter, and shimmer. The results obtained are compared with normal, standard values by means of a programmable database. Sounds were collected from healthy people and patients, and then classified into two different categories. The speech database consists of several pathological and normal voices collected from the national hospital Rabta, Tunis. The speech processing algorithm is conducted in a supervised mode, first for discrimination between normal and pathological voices, and then for classification between neural and vocal pathologies (Parkinson's, Alzheimer's, laryngeal, dyslexia, ...). Several simulation results are presented as a function of the disease and are compared with the clinical diagnosis in order to provide an objective evaluation of the developed tool.
Keywords: Formants, neural networks, pathological voices, pitch, wavelet transform.
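A sketch of the wavelet-plus-neural-network pipeline, using sub-band log-energies as features and synthetic stand-ins for normal and pathological voices (real data would come from recordings):

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_features(signal, wavelet="db4", level=4):
    """Sub-band log-energies from a multilevel DWT; a common compact
    feature vector for pathological-voice detection."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.log([np.sum(c ** 2) + 1e-12 for c in coeffs])

rng = np.random.default_rng(4)

def voice(pathological, n=2048, f0=150, fs=8000):
    """Synthetic voice: clean harmonic if normal, jitter-like phase noise
    (irregular pitch) if pathological."""
    t = np.arange(n) / fs
    phase = 2 * np.pi * f0 * t
    if pathological:
        phase += np.cumsum(rng.normal(0, 0.05, n))
    return np.sin(phase) + 0.05 * rng.normal(size=n)

X = np.array([wavelet_features(voice(p)) for p in [0] * 40 + [1] * 40])
y = np.array([0] * 40 + [1] * 40)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
print("train accuracy:", clf.score(X, y))
```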
142 Robust Heart Sounds Segmentation Based on the Variation of the Phonocardiogram Curve Length
Authors: Mecheri Zeid Belmecheri, Maamar Ahfir, Izzet Kale
Abstract:
Automatic cardiac auscultation is still a subject of research aimed at establishing an objective diagnosis. Recorded heart sounds, as Phonocardiogram (PCG) signals, can be automatically segmented into components that have clinical meaning: the first sound, S1, the second sound, S2, and the systolic and diastolic components, respectively. In this paper, an automatic method is proposed for the robust segmentation of heart sounds. The method is based on calculating an intermediate sawtooth-shaped signal from the variation in length of the recorded PCG signal in the time domain, and using its positive derivative function, a binary signal, to train a Recurrent Neural Network (RNN). Results obtained on a large database of PCGs, recorded together with simultaneous Electrocardiograms (ECGs) from different patients in clinical settings, including normal and abnormal subjects, show an average segmentation test performance of 76% sensitivity and 94% specificity.
Keywords: Heart sounds, PCG segmentation, event detection, Recurrent Neural Networks, PCG curve length.
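A NumPy sketch of the intermediate curve-length signal and its positive-derivative binary signal, on a synthetic PCG; the window length and burst parameters are illustrative:

```python
import numpy as np

def curve_length_signal(pcg, fs, win_s=0.02):
    """Sliding-window curve length of a PCG signal: the sum of the arc
    lengths sqrt(dt^2 + (x[k+1] - x[k])^2) within the window. High-energy
    events (S1, S2) lengthen the curve and produce the sawtooth shape."""
    dt = 1.0 / fs
    seg = np.sqrt(dt ** 2 + np.diff(pcg) ** 2)         # per-sample arc length
    win = max(1, int(win_s * fs))
    cl = np.convolve(seg, np.ones(win), mode="same")
    binary = (np.diff(cl) > 0).astype(int)             # positive-derivative signal
    return cl, binary

# Synthetic PCG: two bursts standing in for S1 and S2 within one cycle
fs = 1000
t = np.arange(0, 1.0, 1 / fs)
pcg = 0.02 * np.random.default_rng(5).normal(size=t.size)
for centre in (0.10, 0.45):                            # S1, S2 locations
    burst = np.abs(t - centre) < 0.03
    pcg[burst] += 0.8 * np.sin(2 * np.pi * 60 * t[burst])
cl, binary = curve_length_signal(pcg, fs)
# In the paper, this binary signal is what trains the RNN segmenter
print("curve length peaks near sample:", cl.argmax())
```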
141 An Anatomically-Based Model of the Nerves in the Human Foot
Authors: Muhammad Zeeshan UlHaque, Peng Du, Leo K. Cheng, Marc D. Jacobs
Abstract:
Sensory nerves in the foot play an important part in the diagnosis of various neuropathy disorders, especially in diabetes mellitus. However, a detailed description of the anatomical distribution of the nerves is currently lacking. A computational model of the afferent nerves in the foot may be a useful tool for the study of diabetic neuropathy. In this study, we present the development of an anatomically-based model of various major sensory nerves of the sole and dorsal sides of the foot. In addition, we present an algorithm for generating synthetic somatosensory nerve networks in the big-toe region of a right foot model. The algorithm is based on a modified version of the Monte Carlo algorithm, with the capability of varying the intra-epidermal nerve fiber density in different regions of the foot model. Preliminary results from the combined model show the realistic anatomical structure of the major nerves as well as the smaller somatosensory nerves of the foot. The model may now be developed to investigate the functional outcomes of structural neuropathy in diabetic patients.
Keywords: Diabetic neuropathy, Finite element modeling, Monte Carlo Algorithm, Somatosensory nerve networks
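A minimal sketch of the Monte Carlo idea, using acceptance-rejection to place fiber origins with a region-dependent density (the density function below is invented):

```python
import numpy as np

rng = np.random.default_rng(7)

def fiber_density(x, y):
    """Target intra-epidermal nerve fiber density (relative, 0..1);
    hypothetically higher toward the big-toe region near the origin."""
    return 0.2 + 0.8 * np.exp(-(x ** 2 + y ** 2) / 0.1)

def sample_fiber_origins(n_target, bounds=(-1, 1)):
    """Monte Carlo acceptance-rejection: propose uniform sites and accept
    them in proportion to the local density, so denser regions receive
    more fibers, as in the modified algorithm described above."""
    accepted = []
    while len(accepted) < n_target:
        x, y = rng.uniform(*bounds, size=2)
        if rng.uniform() < fiber_density(x, y):
            accepted.append((x, y))
    return np.array(accepted)

origins = sample_fiber_origins(500)
print(origins.shape, origins.mean(axis=0))   # points concentrate near (0, 0)
```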
140 Security Architecture for At-Home Medical Care Using Sensor Network
Authors: S. S. Mohanavalli, Sheila Anand
Abstract:
This paper proposes a novel architecture for at-home medical care which enables senior citizens, patients with chronic ailments, and patients requiring post-operative care to be remotely monitored in the comfort of their homes. The architecture is implemented using sensors and wireless networking to transmit patient data to hospitals and health-care centers for monitoring by medical professionals. Patients are equipped with sensors to measure physiological parameters, such as blood pressure and pulse rate, and a Wearable Data Acquisition Unit is used to transmit the patient sensor data. Medical professionals can be alerted to any abnormal variations in these values for diagnosis and suitable treatment. Security threats and challenges inherent to wireless communication and sensor networks are discussed, and a security mechanism to ensure data confidentiality and source authentication is proposed. The symmetric key algorithm AES is used for encrypting the data, and the patent-free, two-pass block cipher mode CCFB is used to implement semantic security.
Keywords: Data confidentiality, integrity, remote monitoring, source authentication.
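A sketch of the encrypt-and-authenticate step using PyCryptodome; EAX mode is used here as a stand-in, since the CCFB mode adopted in the paper is not available in mainstream libraries (both provide confidentiality plus origin authentication):

```python
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

# Shared symmetric key provisioned to the Wearable Data Acquisition Unit
# and the hospital server (key distribution is out of scope here)
key = get_random_bytes(16)                 # AES-128

reading = b'{"patient":"anon-042","bp":"118/76","pulse":71}'

# Sender side: encrypt and authenticate the sensor reading
cipher = AES.new(key, AES.MODE_EAX)
ciphertext, tag = cipher.encrypt_and_digest(reading)
packet = (cipher.nonce, ciphertext, tag)

# Receiver (hospital) side: decrypt and verify the source before accepting;
# decrypt_and_verify raises ValueError if the data or tag were tampered with
nonce, ciphertext, tag = packet
plain = AES.new(key, AES.MODE_EAX, nonce=nonce).decrypt_and_verify(ciphertext, tag)
print(plain)
```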