Search results for: MYSQL database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1604

794 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals

Authors: Naser Safdarian, Nader Jafarnia Dabanloo

Abstract:

In this paper, we use four features, the Q-wave integral, QRS-complex integral, T-wave integral and total integral, extracted from normal and patient ECG signals, to detect and localize myocardial infarction (MI) in the left ventricle of the heart. Our research focuses on the detection and localization of MI in the standard 12-lead ECG. The Q-wave and T-wave integrals are used because they carry important information for MI detection. We apply pattern recognition methods such as Artificial Neural Networks (ANN) to detect and localize MI, since these methods offer good accuracy for classifying normal and abnormal signals. We use a type of Radial Basis Function (RBF) network called the Probabilistic Neural Network (PNN) because of its nonlinearity, and compare it with other classifiers such as k-Nearest Neighbors (KNN), Multilayer Perceptron (MLP) and Naive Bayes. The PhysioNet database serves as our training and test data. On the test data, we reach over 80% accuracy for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy, and classification accuracy can be further improved by adding more features. A simple method based on only four features extracted from the standard ECG is presented, with good accuracy in MI localization.
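As an illustration of the feature-extraction-plus-classifier pipeline described above, the sketch below computes four integral features from segmented beats and classifies them with KNN. It is a minimal sketch on synthetic data: the wave-window indices, beats and labels are placeholders for illustration, not the authors' actual PhysioNet processing.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
FS = 500  # sampling rate in Hz (assumed)

def integral_features(beat, q_win, qrs_win, t_win):
    """Trapezoidal integrals over assumed wave windows (sample-index pairs)
    of one beat: Q wave, QRS complex, T wave and the whole beat."""
    t = np.arange(len(beat)) / FS
    def seg(win):
        return np.trapz(beat[win[0]:win[1]], t[win[0]:win[1]])
    return [seg(q_win), seg(qrs_win), seg(t_win), np.trapz(beat, t)]

# Placeholder beats and labels (0 = normal, 1 = MI) standing in for the
# segmented PhysioNet records; window indices are illustrative only.
beats = rng.normal(size=(200, 300))
labels = rng.integers(0, 2, size=200)
X = np.array([integral_features(b, (40, 60), (40, 100), (150, 230)) for b in beats])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```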

Keywords: ECG signal processing, myocardial infarction, features extraction, pattern recognition

Procedia PDF Downloads 439
793 Causes of Variation Orders in the Egyptian Construction Industry: Time and Cost Impacts

Authors: A. Samer Ezeldin, Jwanda M. El Sarag

Abstract:

Variation orders are of great importance in any construction project. A variation order is defined as any change in the scope of works of a project, which can be an addition, omission, or modification. This paper investigates the variation orders that occur during construction projects in Egypt. The literature review presents a comparison of the causes of variation orders among Egypt, Tanzania, Nigeria, Malaysia and the United Kingdom, and classifies them into owner-related factors, consultant-related factors and other factors. These classified events that lead to variation orders were introduced in a survey of 19 events to observe their frequency of occurrence and their time and cost impacts. The survey data were obtained from 87 participants, including clients, consultants, and contractors, and a database of 42 scenarios was created. A model is then developed to assist project managers in predicting the frequency of variations, budgeting for additional costs, and minimizing the delays that can take place. Two experts with more than 25 years of experience were given the model to verify that it works effectively. The model was then validated on a residential compound completed in July 2016 to confirm that it produces acceptable results.

Keywords: construction, cost impact, Egypt, time impact, variation orders

Procedia PDF Downloads 161
792 Dual-Channel Multi-Band Spectral Subtraction Algorithm Dedicated to a Bilateral Cochlear Implant

Authors: Fathi Kallel, Ahmed Ben Hamida, Christian Berger-Vachon

Abstract:

In this paper, a speech enhancement algorithm based on the Multi-Band Spectral Subtraction (MBSS) principle is evaluated for Bilateral Cochlear Implant (BCI) users. Specifically, a dual-channel noise power spectral estimation algorithm using the Power Spectral Densities (PSD) and Cross Power Spectral Densities (CPSD) of the observed signals is studied. The enhanced speech signal is obtained using the Dual-Channel Multi-Band Spectral Subtraction (DC-MBSS) algorithm. For performance evaluation, an objective speech assessment based on the Perceptual Evaluation of Speech Quality (PESQ) score is performed to determine the optimal number of frequency bands needed in the DC-MBSS algorithm. To evaluate speech intelligibility, subjective listening tests are conducted with 3 deafened BCI patients. Experimental results obtained using the French Lafon database corrupted by additive babble noise at different Signal-to-Noise Ratios (SNR) showed that the DC-MBSS algorithm improves speech understanding for single and multiple interfering noise sources.
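To make the multi-band spectral subtraction idea concrete, the sketch below applies a single-channel, per-band oversubtraction of an estimated noise spectrum to the STFT of a noisy signal. It is a minimal illustration with a synthetic signal and assumed band edges, oversubtraction factors and noise-only frames; it does not reproduce the dual-channel PSD/CPSD noise estimator evaluated in the paper.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 16000
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)   # 1 s tone as stand-in "speech"
noisy = clean + 0.3 * rng.normal(size=fs)               # additive noise

f, t, Z = stft(noisy, fs=fs, nperseg=512)
mag, phase = np.abs(Z), np.angle(Z)

# Noise magnitude spectrum estimated from the first few (assumed noise-only) frames.
noise_mag = mag[:, :5].mean(axis=1, keepdims=True)

# Split the spectrum into a few bands, each with its own oversubtraction factor
# (larger factors in lower bands, a common heuristic).
bands = np.array_split(np.arange(len(f)), 4)
alphas = [4.0, 3.0, 2.0, 1.5]

enhanced_mag = mag.copy()
for idx, alpha in zip(bands, alphas):
    sub = mag[idx] ** 2 - alpha * noise_mag[idx] ** 2    # power-domain subtraction
    sub = np.maximum(sub, 0.01 * mag[idx] ** 2)          # spectral floor
    enhanced_mag[idx] = np.sqrt(sub)

_, enhanced = istft(enhanced_mag * np.exp(1j * phase), fs=fs, nperseg=512)
print("enhanced signal length:", len(enhanced))
```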

Keywords: speech enhancement, spectral subtraction, noise estimation, cochlear implant

Procedia PDF Downloads 533
791 Seismic Hazard Prediction Using Seismic Bumps: Artificial Neural Network Technique

Authors: Belkacem Selma, Boumediene Selma, Tourkia Guerzou, Abbes Labdelli

Abstract:

Natural disasters have caused, and will continue to cause, human and material damage, so "preventing" them outright will never be possible. Their prediction, however, becomes feasible as technology advances: even if natural disasters are effectively inevitable, their consequences may be partly controlled. The rapid growth of artificial intelligence (AI) has had a major impact on the prediction of natural disasters and on the risk assessment necessary for effective disaster reduction. Predicting earthquakes is important for preventing the loss of human lives and property damage, which is why it is crucial to develop techniques for forecasting this natural disaster. The present study analyzes the ability of artificial neural networks (ANNs) to predict earthquakes occurring in a given area. The data describe the problem of forecasting high-energy (greater than 10^4 J) seismic bumps in a coal mine, using two longwalls as an example. For this purpose, seismic-bump data obtained from mines were analyzed. The results show that the ANN was able to predict earthquake parameters with high accuracy: the classification accuracy of the neural networks exceeds 94%, and the models developed are efficient, robust, and only weakly dependent on the initial database.
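A minimal sketch of the kind of ANN classifier described above is given below, using scikit-learn's multilayer perceptron on placeholder data shaped like seismic-bump shift records; the features, network size and synthetic labels are assumptions for illustration, not the authors' configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Placeholder data: rows are shifts, columns are seismic/seismoacoustic readings
# (e.g. energy, pulse counts); the label marks whether a high-energy (>10^4 J)
# bump occurred in the following shift.
X = rng.normal(size=(2000, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=2000) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0),
)
model.fit(X_tr, y_tr)
print(f"classification accuracy: {model.score(X_te, y_te):.2%}")
```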

Keywords: earthquake prediction, ANN, seismic bumps

Procedia PDF Downloads 108
790 The Role of Marketing Information System on Decision-Making: An Applied Study on Algeria Telecoms Mobile "MOBILIS"

Authors: Benlakhdar Mohamed Larbi, Yagoub Asma

Abstract:

Purpose: This study aims to highlight the significance and importance of utilizing a marketing information system (MKIS) in decision-making, by clarifying the need for quick and efficient decisions that save time and prevent duplication of work. Design/methodology/approach: The study shows the role of each part of the MKIS in developing marketing strategy, which presents a real challenge to individuals and institutions in an era characterized by uncertainty, and clarifies the importance of each part separately, depending on the decision type and the nature of the situation. The empirical research instrument, a questionnaire, was evaluated by specialized experts, and correlation analysis was employed to test the validity of the procedure. Results: The empirical findings confirmed a positive relationship between the level of adoption of the decision support system and marketing intelligence and the success of organizational decision-making, which provides the organization with a competitive advantage by allowing it to solve problems. Originality/value: The study offers a better understanding of performance, treating increased market share as an organizational decision grounded in the marketing information system.

Keywords: database, marketing research, marketing intelligence, decision support system, decision-making

Procedia PDF Downloads 305
789 Multichannel Surface Electromyography Trajectories for Hand Movement Recognition Using Intrasubject and Intersubject Evaluations

Authors: Christina Adly, Meena Abdelmeseeh, Tamer Basha

Abstract:

This paper proposes a system for hand movement recognition using multichannel surface EMG (sEMG) signals obtained from 40 subjects performing 40 different exercises, available in the Ninapro (Non-Invasive Adaptive Prosthetics) database. First, we applied processing methods to the raw sEMG signals to convert them to their amplitudes. Second, we used deep learning methods, passing the preprocessed signals to fully connected neural networks (FCNN) and recurrent neural networks (RNN) with Long Short-Term Memory (LSTM). Under intrasubject evaluation, the FCNN reaches 72% accuracy with a training time of about 76 minutes, while the RNN reaches 79.9% in 8 minutes and 22 seconds. Third, we applied postprocessing methods such as majority voting (MV) and the Movement Error Rate (MER) to improve accuracy. After applying MV, the accuracy is 75% and 86% for the FCNN and RNN, respectively. The MER value has an inverse relationship with the prediction delay as the window length used for the MV is varied. The final part of the study uses the RNN with intersubject evaluation. The experimental results showed that, to obtain good test accuracy with reasonable processing time, around 20 subjects should be used.
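The majority-voting step mentioned above can be implemented as a sliding window over the per-frame class predictions; the short sketch below is a generic illustration with made-up labels and an assumed window length, not the exact postprocessing pipeline used in the paper.

```python
import numpy as np

def majority_vote(pred_labels, window):
    """Replace each frame's label with the most frequent label in a trailing
    window of predictions; a longer window smooths more but adds delay."""
    smoothed = np.empty_like(pred_labels)
    for i in range(len(pred_labels)):
        start = max(0, i - window + 1)
        counts = np.bincount(pred_labels[start:i + 1])
        smoothed[i] = counts.argmax()
    return smoothed

# Example: noisy per-frame predictions for a 3-class movement problem.
frames = np.array([0, 0, 1, 0, 0, 2, 0, 1, 1, 1, 2, 1, 1, 1])
print(majority_vote(frames, window=5))
```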

Keywords: hand movement recognition, recurrent neural network, movement error rate, intrasubject evaluation, intersubject evaluation

Procedia PDF Downloads 119
788 Lower Risk of Ischemic Stroke in Hormone Therapy Users with Use of Chinese Herbal Medicine

Authors: Shu-Hui Wen, Wei-Chuan Chang, Hsien-Chang Wu

Abstract:

Background: Little is known about the benefits and risks of Chinese herbal medicine (CHM) use, in conditions related to hormone therapy (HT) use, on the risk of ischemic stroke (IS). The aim of this study is to explore the risk of IS in menopausal women treated with HT and CHM. Materials and methods: A total of 32,441 menopausal women without surgical menopause, aged 40-65 years, were selected from 2003 to 2010 using the 2-million random sample of the National Health Insurance Research Database in Taiwan. According to their usage of HT and CHM, the current and recent users were divided into two groups: an HT use-only group (n = 4,989) and an HT/CHM group (n = 9,265). Propensity-score-matched samples (4,079 pairs) were further created to deal with confounding by indication. The adjusted hazard ratios (HR) of IS during HT or CHM treatment were estimated by a robust Cox proportional hazards model. Results: The incidence rate of IS in the HT/CHM group was significantly lower than in the HT group (4.5 vs. 12.8 per 1000 person-years, p < 0.001). Multivariate analysis indicated that additional CHM use was significantly associated with a lower risk of IS (HR = 0.3; 95% confidence interval, 0.21-0.43). Further subgroup analyses and sensitivity analyses yielded similar findings. Conclusion: We found that combined use of HT and CHM was associated with a lower risk of IS than HT use only. Further study is needed to examine possible mechanisms underlying this association.
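For readers unfamiliar with the hazard-ratio estimate quoted above, the sketch below fits a Cox proportional hazards model with the lifelines package on a synthetic cohort; the variable names, follow-up times and covariates are illustrative assumptions, not the National Health Insurance Research Database analysis itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000

# Synthetic cohort: 'chm' flags additional CHM use on top of HT, 'age' is a
# covariate, 'time' is follow-up in years and 'stroke' the IS event indicator.
chm = rng.integers(0, 2, size=n)
age = rng.uniform(40, 65, size=n)
hazard = 0.02 * np.exp(0.03 * (age - 50) - 1.0 * chm)   # true HR for chm ~ exp(-1.0)
time = rng.exponential(1.0 / hazard)
censor = rng.uniform(1, 8, size=n)
stroke = (time <= censor).astype(int)
time = np.minimum(time, censor)

df = pd.DataFrame({"time": time, "stroke": stroke, "chm": chm, "age": age})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="stroke")
print(cph.hazard_ratios_)   # the HR for 'chm' should come out well below 1
```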

Keywords: Chinese herbal medicine, hormone therapy, ischemic stroke, menopause

Procedia PDF Downloads 345
787 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification

Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh

Abstract:

Learning from very big datasets is a significant problem for most current data mining and machine learning algorithms. MicroRNA (miRNA) data form one of the important large genomic, non-coding datasets derived from genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Owing to the variety of cancers and the high number of genes, analyzing the miRNA dataset has been a challenging problem for researchers: the number of features is high relative to the number of samples, and the data are imbalanced. A feature selection method is used to select the features that best distinguish the classes and to eliminate obscuring features. Afterward, a Convolutional Neural Network (CNN) classifier is used for the classification of cancer types, with a Genetic Algorithm employed to find optimized hyper-parameters for the CNN. To make classification with the CNN faster, a Graphics Processing Unit (GPU) is recommended for performing the mathematical computations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.
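The genetic-algorithm hyper-parameter search can be illustrated with a compact sketch like the one below, which evolves a small population of CNN hyper-parameter settings under a stand-in fitness function; the parameter grid, GA settings and fitness are assumptions, and in practice the fitness would be the validation accuracy of the CNN trained with those settings.

```python
import random

random.seed(0)

SPACE = {
    "filters": [16, 32, 64, 128],
    "kernel": [3, 5, 7],
    "dropout": [0.2, 0.3, 0.5],
    "lr": [1e-2, 1e-3, 1e-4],
}

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    # Placeholder: in the real pipeline this would train the CNN on the miRNA
    # training split and return its validation accuracy.
    return (-abs(ind["filters"] - 64) / 64
            - abs(ind["dropout"] - 0.3)
            - abs(ind["kernel"] - 5) / 5)

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

population = [random_individual() for _ in range(12)]
for generation in range(10):
    population.sort(key=fitness, reverse=True)
    parents = population[:4]                      # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(population) - len(parents))]
    population = parents + children

print("best hyper-parameters:", max(population, key=fitness))
```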

Keywords: cancer classification, feature selection, deep learning, genetic algorithm

Procedia PDF Downloads 95
786 Vehicle Risk Evaluation in Low Speed Accidents: Consequences for Relevant Test Scenarios

Authors: Philip Feig, Klaus Gschwendtner, Julian Schatz, Frank Diermeyer

Abstract:

Accident research projects mostly focus on accidents involving personal injury. Property-damage-only accidents, however, occur with high frequency and have a high economic impact. This paper describes the main parameters influencing the extent of damage and presents a repair cost model. For a prospective method of evaluating the monetary effect of advanced driver assistance systems (ADAS), it is necessary to be aware of and quantify all influencing parameters. Furthermore, this method allows the evaluation of vehicle concepts in combination with an ADAS at an early stage of the product development process. In combination with a property damage database and the introduced repair cost model, relevant test scenarios for specific vehicle configurations and their individual property damage risk may be determined. Currently, equipment rates of ADAS are low, and a purchase incentive for customers would be beneficial. The next ADAS generation will prevent property damage to a large extent or at least reduce damage severity. Both effects may be a purchase incentive for the customer and will furthermore contribute to increased traffic safety.

Keywords: accident research, accident scenarios, ADAS, effectiveness, property damage analysis

Procedia PDF Downloads 325
785 Reproductive Health of Women After Taking Chemotherapy for Gestational Trophoblastic Disease

Authors: Ezeh Chukwunonso Peter Excel, Akruti Vg

Abstract:

Aim/Background: To show that the reproductive health of women remains intact even after 1-5 courses of chemotherapy for gestational trophoblastic disease (GTD), and that they conceive successfully afterwards. Method: A retrospective cohort analysis using data from the Lugansk regional maternity hospital database for the years 1993-2013, in which n = 18 women had GTD and underwent 1-5 courses of chemotherapy. Results and Discussion: GTD was rare. All 18 patients were aged 17-39 years, covering a wide range of reproductive ages. Of the 18 patients, 15 had hydatidiform mole (HM) and 3 had choriocarcinoma (CC). In their histories, one CC patient had early pre-eclampsia at 24 weeks and one had late postpartum bleeding in the 4th week, while all HM patients had genital inflammatory diseases; one HM patient had a high hCG during follow-up and underwent curettage three times in 5 months. The 18 women became pregnant 25 times after chemotherapy. Chemotherapy was given for a high hCG level, a luteal cyst larger than 6 cm, or on the basis of the pathomorphological results of curettage. The 3 CC patients had 2 spontaneous abortions (SA), 2 term cesarean sections (CS) and 1 preterm CS. The 15 HM patients had 3 induced abortions, 2 SA, 7 CS (5 term and 2 preterm) and 8 vaginal deliveries (7 term and 1 preterm). Conclusion: We observed 22.2% preterm deliveries and 55.6% CS, which is higher than in normal cases, but all 18 women were able to have children successfully after chemotherapy. We therefore conclude that chemotherapy for GTD preserved the women's reproductive health.

Keywords: reproductive health, chemotherapy, gestational trophoblastic disease, women

Procedia PDF Downloads 369
784 Using Closed Frequent Itemsets for Hierarchical Document Clustering

Authors: Cheng-Jhe Lee, Chiun-Chieh Hsu

Abstract:

Due to the rapid development of the Internet and the increased availability of digital documents, the excess of information on the Internet has led to an information overflow problem. To address this problem and enable effective information retrieval, document clustering has become a popular research topic in text mining. Clustering is the unsupervised classification of data items into groups without the need for training data. Many conventional document clustering methods perform inefficiently on large document collections because they were originally designed for relational databases; they are therefore impractical for real-world document clustering and require special handling of high dimensionality and high volume. We employ the FIHC (Frequent Itemset-based Hierarchical Clustering) method, a hierarchical clustering method developed for document clustering; the intuition behind FIHC is that each cluster shares some common words. FIHC uses such words to cluster documents and builds a hierarchical topic tree. In this paper, we combine the FIHC algorithm with an ontology to address the semantic problem and mine the meaning behind the words in documents. Furthermore, we use closed frequent itemsets instead of all frequent itemsets, which increases efficiency and scalability. The experimental results show that our method is more accurate than well-known document clustering algorithms.
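To illustrate the difference between frequent and closed frequent itemsets mentioned above, the sketch below enumerates both on a toy set of "documents" (word sets); an itemset is closed when no proper superset has the same support. The transactions and support threshold are made up for illustration only.

```python
from itertools import combinations

docs = [
    {"data", "mining", "cluster"},
    {"data", "mining", "tree"},
    {"data", "cluster", "tree"},
    {"data", "mining", "cluster", "tree"},
]
min_support = 2
items = sorted(set().union(*docs))

def support(itemset):
    return sum(itemset <= doc for doc in docs)

# All frequent itemsets (support >= min_support).
frequent = [frozenset(c)
            for r in range(1, len(items) + 1)
            for c in combinations(items, r)
            if support(set(c)) >= min_support]

# Closed frequent itemsets: no proper superset has the same support.
closed = [s for s in frequent
          if not any(s < t and support(t) == support(s) for t in frequent)]

print(len(frequent), "frequent vs", len(closed), "closed itemsets")
for s in closed:
    print(sorted(s), "support =", support(s))
```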

Keywords: FIHC, documents clustering, ontology, closed frequent itemset

Procedia PDF Downloads 377
783 Using Analytics to Redefine Athlete Resilience

Authors: Phil P. Wagner

Abstract:

There is an overwhelming amount of athlete-centric information available to sport practitioners in this era of technology and big data, yet protocols in athletic rehabilitation remain arbitrary. It is commonly assumed that tissue heals at the same rate in all individuals, yielding protocols that are entirely time-based. Progressing athletes through rehabilitation programs that lack individualization can expose them to stimuli they are not prepared for or unnecessarily lengthen their recovery period. A 7-year aggregated and anonymized database was used to develop reliable and valid assessments of athletic resilience. Each assessment uses force-plate technology with proprietary protocols and analysis to provide key thresholds for injury risk and recovery. Using a T score to analyze movement qualities, much like the Z score used for bone density in a DEXA scan, specific prescriptions are provided to mitigate the athlete's inherent injury risk. In addition to complying with surgical clearance, practitioners must put in place a clearance protocol guided by standardized assessments and the achievement of strength thresholds. To truly hold individuals (practitioners, athletic trainers, performance coaches, etc.) accountable, success in improving pre-defined key performance indicators must be frequently assessed and analyzed.
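As a rough illustration of the T-score idea referenced above, the sketch below standardizes a measurement against a reference population and rescales it using one common convention (mean 50, SD 10); the reference statistics and the example measurement are assumptions, not values from the database described in the abstract.

```python
def t_score(value, ref_mean, ref_sd):
    """Standardize a measurement against a reference population and rescale the
    z score to the common T-score convention (mean 50, SD 10)."""
    z = (value - ref_mean) / ref_sd
    return 50 + 10 * z

# Example: an athlete's countermovement-jump peak force (N) vs. a hypothetical
# reference group mean of 2200 N with SD 250 N.
print(round(t_score(value=2400.0, ref_mean=2200.0, ref_sd=250.0), 1))  # ~58.0
```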

Keywords: analytics, athlete rehabilitation, athlete resilience, injury prediction, injury prevention

Procedia PDF Downloads 207
782 Intercultural Education through Literature Reception: An in-Depth Study of the Cultural and Literary Relations of Romania and China during 1948-2018

Authors: Iulia Elena Gîță

Abstract:

According to the sociological theory of literature, constraints on the creation and sharing of cultural works can be placed between two extremes: one with a high level of politicization and the other with a high level of commercialization. The overall objective of the present research is to follow the principles of the Sociology of Translation to closely map and analyse the publishing activity of Romania concerning China and Chinese literature during four stages of Romanian history between 1948 and 2018. This paper thus proposes an extended approach to literature and to its cultural, political and economic reception. In achieving the proposed objectives, the research expands far beyond the literary text itself to its macro context, analysing, through quantitative research methods, a statistical database created in two phases: the first part contains literary and non-fictional works that address and discuss issues related to China; the second part includes literary translations of Chinese literature into Romanian, either by direct translation or through an intermediate language. Throughout this paper, we map not only the number of works but also the topics approached by writers across the two periods of Romania's political life.

Keywords: bilateral relations, Chinese literature, intercultural understanding, international relations, socio-cultural reception, socio-political constraints, publishing

Procedia PDF Downloads 116
781 Correlation of Material Mechanical Characteristics Obtained by Means of Standardized and Miniature Test Specimens

Authors: Vaclav Mentl, P. Zlabek, J. Volak

Abstract:

New methods of mechanical testing based on miniature test specimens (e.g. the Small Punch Test) have been developed recently. The most important advantage of these methods is the nearly non-destructive withdrawal of test material and the small size of the test specimens, which is valuable for remaining-lifetime assessment when a sufficient volume of representative material cannot be withdrawn from the component in question. Conversely, their most important disadvantage stems from the necessity to correlate the results with those of standardised test procedures and to build up a database of in-service material data. Correlations between the miniature-specimen data and the results of standardised tests are therefore necessary. The paper describes the results of fatigue tests performed on miniature test specimens in comparison with traditional fatigue tests for several steels used in the power-producing industry. Special fixtures for the miniature test specimens were designed and manufactured for fatigue testing on a Zwick/Roell 10HPF5100 testing machine. The miniature test specimens were machined from the traditional test specimens. Seven different steels were fatigue loaded (R = 0.1) at room temperature.

Keywords: mechanical properties, miniature test specimens, correlations, small punch test, micro-tensile test, mini-charpy impact test

Procedia PDF Downloads 515
780 Development of Quality Assessment Tool to Gauge Fire Response Activities of Emergency Personnel in Denmark

Authors: Jennifer E. Lynette

Abstract:

The purpose of this study is to develop a nation-wide assessment tool to gauge the quality and efficiency of response activities by emergency personnel to fires in Denmark. Current fire incident reports lack detailed information that could lead to breakthroughs in research and improve emergency response efforts. Information generated from the report database is analyzed and assessed for efficiency and quality. By utilizing the information-collection gaps in the incident reports, an improved, in-depth, and streamlined quality gauging system is developed for use by fire brigades. This study pinpoints previously unrecorded factors involved in the response phases of a fire. Variables are recorded and ranked based on their influence on the event outcome. By assessing and measuring these data points, quality standards are developed. These quality standards include details of the response phase previously overlooked, which individually and cumulatively impact the overall success of a fire response effort. Through the application of this tool and the implementation of the associated quality standards in Denmark's fire brigades, there is potential to increase efficiency and quality in the preparedness and response phases, thereby saving additional lives, property, and resources.

Keywords: emergency management, fire, preparedness, quality standards, response

Procedia PDF Downloads 303
779 An Insight into the Conformational Dynamics of Glycan through Molecular Dynamics Simulation

Authors: K. Veluraja

Abstract:

Glycans of glycolipids and glycoproteins play a significant role in living systems, particularly in molecular recognition processes. These processes are attributed to their occurrence on the cell surface, the sequential arrangement and types of sugar residues in the oligosaccharide structure, glycosidic linkage diversity (glycoinformatics) and conformational diversity (glycoconformatics). Molecular dynamics simulation is a theoretical and computational tool successfully used to establish the glycoconformatics of glycans. Studies of various oligosaccharides clearly indicate that they exist in multiple conformational states, which arise from the flexibility associated with the glycosidic torsional angles (φ, ψ). For example, the single disaccharide NeuNAcα(2-3)Gal exists in three different conformational states owing to differences in the preferred values of the glycosidic torsional angles (φ, ψ). Hence, establishing three-dimensional structural and conformational models for glycans (Cartesian coordinates of every atom of an oligosaccharide in a preferred conformation) is crucial for understanding molecular recognition processes such as glycan-toxin and glycan-virus interactions. The glycoconformatics models obtained for various glycans through molecular dynamics simulation are stored in 3DSDSCAR (3DSDSCAR.ORG), a public-domain database, and their utility in understanding molecular recognition processes and in drug design will be discussed.

Keywords: glycan, glycoconformatics, molecular dynamics simulation, oligosaccharide

Procedia PDF Downloads 118
778 Code Evaluation on Web-Shear Capacity of Prestressed Hollow-Core Slabs

Authors: Min-Kook Park, Deuck Hang Lee, Hyun Mo Yang, Jae Hyun Kim, Kang Su Kim

Abstract:

Prestressed hollow-core slabs (HCS) are structurally optimized precast units with light-weight hollowed sections; they are very economical because of mass production by a unique production method and have thus been widely used in precast concrete construction in many countries around the world. It is, however, difficult to provide shear reinforcement in HCS units produced by the extrusion method, so all shear forces must be resisted solely by the concrete webs of the HCS units. This means that, for HCS units, it is very important to estimate the contribution of the web concrete to the shear resistance accurately. In design codes, however, the shear strengths of HCS units are estimated by the same equations used for typical prestressed concrete members, which were calibrated against experimental results for conventional prestressed concrete members other than HCS units. In this study, therefore, shear test results of HCS members covering a wide range of influential variables were collected, and the shear strength equations in design codes were thoroughly examined by comparing them to the experimental results in the shear database of HCS members. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2016R1A2B2010277).
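For context on what such code equations look like, the sketch below evaluates an ACI 318-style web-shear cracking strength, Vcw = (3.5·λ·√f'c + 0.3·fpc)·bw·dp + Vp in psi/lb units; the section properties are hypothetical, and the expression is quoted here only as an example of the code-type equations being examined, not as the paper's proposal.

```python
import math

def vcw_aci(fc_psi, fpc_psi, bw_in, dp_in, vp_lb=0.0, lam=1.0):
    """ACI 318-style web-shear cracking strength of a prestressed member (lb).

    fc_psi: concrete compressive strength; fpc_psi: compressive stress at the
    centroid due to prestress; bw_in: (summed) web width; dp_in: depth to the
    prestressing steel; vp_lb: vertical component of the prestress force;
    lam: lightweight-concrete factor."""
    return (3.5 * lam * math.sqrt(fc_psi) + 0.3 * fpc_psi) * bw_in * dp_in + vp_lb

# Hypothetical hollow-core section: summed web width 10 in., dp = 6 in.,
# f'c = 6000 psi, fpc = 700 psi, straight strands (Vp = 0).
print(f"Vcw ~ {vcw_aci(6000, 700, 10, 6) / 1000:.1f} kips")
```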

Keywords: hollow-core, web-shear, precast concrete, prestress, capacity

Procedia PDF Downloads 489
777 Performance Comparison of Thread-Based and Event-Based Web Servers

Authors: Aikaterini Kentroti, Theodore H. Kaskalis

Abstract:

Today, web servers are expected to serve thousands of client requests concurrently within stringent response-time limits. In this paper, we experimentally evaluate and compare the performance and resource utilization of popular web servers that differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the contents of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times the throughput of Node.js and five times that of Apache and Nginx in the 20th-Fibonacci-term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that the Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are asked to respond by performing CPU-intensive tasks under increasing concurrent load.
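The kind of benchmark described above can be driven by a simple load generator that issues concurrent requests and records throughput and latency; the sketch below is a minimal illustration that assumes a hypothetical local endpoint computing a Fibonacci term, and it is not the measurement harness used by the authors.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Hypothetical endpoint: a server route that computes the n-th Fibonacci term
# (CPU-bound work), similar in spirit to the benchmark described above.
URL = "http://localhost:8080/fib/20"

def one_request(_):
    t0 = time.perf_counter()
    with urlopen(URL) as resp:
        resp.read()
    return time.perf_counter() - t0

def run(concurrency, total_requests):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(one_request, range(total_requests)))
    elapsed = time.perf_counter() - start
    print(f"c={concurrency:4d}  throughput={total_requests / elapsed:8.1f} req/s  "
          f"mean latency={1000 * sum(latencies) / len(latencies):6.1f} ms")

if __name__ == "__main__":
    for c in (10, 50, 100):   # increasing concurrent load
        run(concurrency=c, total_requests=500)
```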

Keywords: apache, Go, Nginx, node.js, web server benchmarking

Procedia PDF Downloads 76
776 Quality and Yield of Potato Seed Tubers as Influenced by Plant Growth Promoting Rhizobacteria

Authors: Muhammad Raqib Rasul, Tavga Sulaiman Rashid

Abstract:

Fertilization increases efficiency and improves product quality in agriculture. However, fertilizer consumption has increased exponentially throughout the world, causing severe environmental problems. Biofertilizers can be a practical approach to minimizing chemical fertilizer use and ultimately developing soil fertility. This study was carried out to isolate, identify and characterize bacteria from the rhizosphere of the medicinal plants Rumex tuberosus L. and Verbascum sp. for in vivo screening. Twenty-five bacterial isolates were obtained and several biochemical tests were performed. Two isolates that were positive for most biochemical tests were chosen for the field experiment. The isolates were identified as GO1, Alcaligenes faecalis (Accession No. OP001725), and T11, Bacillus sp., based on 16S rRNA sequence analysis compared with related bacteria in the GenBank database using MEGA 6.1. For the field trial, isolates GO1 and T11 were applied separately and mixed, with NPK as a positive control. Both isolates increased plant height, chlorophyll content, the number of tubers, and tuber weight. The results demonstrate that these two bacterial isolates can potentially replace chemical fertilizers in potato production.

Keywords: biofertilizer, Bacillus subtilis, Alcaligenes faecalis, potato tubers, in vivo screening

Procedia PDF Downloads 91
775 Experimental Study of Structural Insulated Panel under Lateral Load

Authors: H. Abbasi, K. Sennah

Abstract:

A Structural Insulated Panel (SIP) is a structural element consisting of a foam insulation core sandwiched between two oriented strand boards (OSB), plywood boards, steel sheets or fibre-cement boards. Superior insulation, exceptional strength and fast installation are characteristics of a SIP-based structure. There are many other benefits as well, such as lower total construction costs, speed of construction, less expensive HVAC equipment, and favourable energy-efficient mortgages compared to wood-framed houses. This paper presents an experimental analysis of selected foam-timber SIPs to study their structural behaviour when used as walls in residential construction under lateral loading. The experimental program also included several stud panels in order to compare the performance of SIPs with the conventional wood-frame system. The results of the lateral tests performed in this study establish a database that can be used to develop design tables for SIP walls subjected to lateral loading caused by wind or earthquake. A design table for walls subjected to lateral loading was developed. The experimental results proved that the tested SIPs are 'as good as' the conventional wood-frame system.

Keywords: structural insulated panel, experimental study, lateral load, design tables

Procedia PDF Downloads 303
774 Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions

Authors: Achut Manandhar, Kenneth D. Morton, Peter A. Torrione, Leslie M. Collins

Abstract:

The current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions, using multiple feature modalities, to represent affect in terms of continuous dimensions, to incorporate spatio-temporal correlation among affect dimensions, and to provide fast affect predictions. These research efforts have been propelled by a growing push to develop affect recognition systems that can enable seamless real-time human-computer interaction in a wide variety of applications. Motivated by these desired attributes of an affect recognition system, this work proposes a multi-dimensional affect prediction approach that integrates the multivariate Relevance Vector Machine (MVRVM) with the recently developed Output-associative Relevance Vector Machine (OARVM). The resulting approach provides fast continuous affect predictions by jointly modeling the multiple affect dimensions and their correlations. Experiments on the RECOLA database show that the proposed approach performs competitively with the OARVM while providing faster predictions during testing.

Keywords: dimensional affect prediction, output-associative RVM, multivariate regression, fast testing

Procedia PDF Downloads 266
773 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer

Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack

Abstract:

We propose an automated analysis of flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC) optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contributions to the cost function for the investigated control laws, and (3) visualization of the dynamics. Key enablers are the classification and feature extraction methods of machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices, with their subsequent vortex pairing, by higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human-interpretable, fully automated analysis of MLC that identifies qualitatively different actuation regimes, distills the corresponding coherent structures, and develops a digital twin of the plant.

Keywords: machine learning control, mixing layer, feedback control, model-free control

Procedia PDF Downloads 205
772 Integrating Building Information Modeling into Facilities Management Operations

Authors: Mojtaba Valinejadshoubi, Azin Shakibabarough, Ashutosh Bagchi

Abstract:

Facilities such as residential buildings, office buildings, and hospitals house a high density of occupants, so a low-cost facility management program (FMP) should be used to provide a satisfactory built environment for these occupants. Facility management (FM) has recently been treated as a critical task in building projects and has been effective in reducing the operation and maintenance costs of these facilities. Information integration and visualization capabilities are critical for reducing the complexity and cost of FM. Building information modeling (BIM) can be used as a strong visual modeling tool and database in FM. The main objective of this study is to examine the applicability of BIM to the FM process during a building's operational phase. For this purpose, a seven-storey office building was modeled in the Autodesk Revit software. The authors integrated a cloud-based environment using the visual programming tool Dynamo, for the purpose of real-time cloud-based communication between the facility managers and the participants involved in the project. An appropriate and effective integrated data source and visual model such as BIM can reduce a building's operation and maintenance costs by managing the building life cycle properly.

Keywords: building information modeling, facility management, operational phase, building life cycle

Procedia PDF Downloads 137
771 Understanding the Impact of Climate Change on Farmer's Technical Efficiency in Mali

Authors: Christelle Tchoupé Makougoum

Abstract:

In agriculture, differences across localities in terms of climate change can create systematic variation in farmers' technical efficiency. Failure to account for climate variability could lead to wrong conclusions about farmers' technical efficiency and could bias the ranking of farmers according to their managerial performance. The literature on agricultural productivity has given little attention to this issue, although it is necessary for establishing the extent to which climate affects farmers' efficiency. This article contributes to the previous literature in two ways. First, it proposes a new econometric model that accounts for the influence of climate change on technical efficiency in agriculture. Second, it estimates the inefficiency due to climate change and the real managerial performance of Malian farmers. Using Mali's agricultural census data and the CRU TS3 climate database, we implemented an adjusted stochastic frontier methodology to account for the impact of environmental factors. The results yield three main findings. First, instability in temperature and rainfall decreases technical efficiency on average. Second, climate change modifies the ranking of farmers according to their efficiency scores. Third, although climate change is partly responsible for deviations from the frontier, farmers' limited capacity to combine inputs in optimal proportions undermines efficiency even more. The study concludes that improving farmer efficiency should include fostering their resilience to climate change.

Keywords: agriculture, climate change, stochastic production function, technical efficiency

Procedia PDF Downloads 501
770 Data Mining of Students' Performance Using Artificial Neural Network: Turkish Students as a Case Study

Authors: Samuel Nii Tackie, Oyebade K. Oyedotun, Ebenezer O. Olaniyi, Adnan Khashman

Abstract:

Artificial neural networks have been used in different fields of artificial intelligence, and more specifically in machine learning. Although other machine learning options are feasible in most situations, the ease with which neural networks lend themselves to different problems, including pattern recognition, image compression, classification, computer vision and regression, has earned them a remarkable place in the machine learning field. This research exploits neural networks as a data mining tool to predict the number of times a student repeats a course, considering attributes relating to the course itself, the teacher, and the particular student. Neural networks are used here to map the relationship between attributes of students' course assessment and the number of times a student will possibly repeat a course before passing. The hope is that the ability to predict students' performance from such complex relationships can help facilitate the fine-tuning of academic systems and policies implemented in learning environments. To validate the power of neural networks in data mining, a Turkish students' performance database was used; feedforward and radial basis function networks were trained for this task, and their performance was evaluated in terms of achieved recognition rates and training time.
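A minimal sketch of the feedforward-network setup described above is shown below using scikit-learn; the student attributes, repeat-count labels and network size are placeholders for illustration, not the Turkish database used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)

# Placeholder attributes: attendance rate, midterm score, homework score,
# teacher rating, prior GPA. Label: how many times the course is repeated (0-2).
X = rng.normal(size=(1200, 5))
score = X[:, 0] + X[:, 1] + rng.normal(scale=0.3, size=1200)
y = (score < -0.5).astype(int) + (score < -1.5).astype(int)   # 0, 1 or 2 repeats

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0),
)
net.fit(X_tr, y_tr)
print("recognition rate on held-out students:", round(net.score(X_te, y_te), 3))
```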

Keywords: artificial neural network, data mining, classification, students’ evaluation

Procedia PDF Downloads 591
769 Energy Consumption and Economic Growth: Testimony of Selected Sub-Saharan Africa Countries

Authors: Alfred Quarcoo

Abstract:

The main purpose of this paper is to examine the causal relationship between energy consumption and economic growth in Sub-Saharan Africa using panel data techniques. Annual data on energy consumption and economic growth (proxied by real gross domestic product per capita) spanning 1990 to 2016, taken from the World Bank indicators database, were used. The results of the Augmented Dickey-Fuller unit root test show that the series for all countries are not stationary in levels. The log of economic growth in Benin and Congo becomes stationary after first differencing, the log of energy consumption becomes stationary for all countries, and the log of economic growth in Kenya and Zimbabwe is found to be stationary after second differencing of the panel series. The findings of the Johansen cointegration test demonstrate that log energy consumption and log economic growth are not cointegrated for Kenya and Zimbabwe, so no long-run relationship between the variables was established in any country. The Granger causality test indicates a unidirectional causality running from energy use to economic growth in Kenya and no causal linkage between energy consumption and economic growth in Benin, Congo and Zimbabwe.
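The unit-root, cointegration and causality steps above follow a standard time-series recipe; the sketch below runs the same three tests with statsmodels on synthetic series standing in for log energy use and log GDP per capita (the data, lag choices and deterministic terms are illustrative assumptions, not the paper's estimation).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(3)
n = 27  # annual observations, 1990-2016

# Synthetic stand-ins: log energy use and log GDP per capita as random walks.
log_energy = np.cumsum(rng.normal(0.02, 0.05, n))
log_gdp = np.cumsum(rng.normal(0.03, 0.05, n)) + 0.3 * log_energy
df = pd.DataFrame({"log_energy": log_energy, "log_gdp": log_gdp})

# 1. Augmented Dickey-Fuller unit-root test on levels and first differences.
for col in df:
    print(col, "level p =", round(adfuller(df[col])[1], 3),
          "| 1st diff p =", round(adfuller(df[col].diff().dropna())[1], 3))

# 2. Johansen cointegration test (constant term, 1 lagged difference).
jres = coint_johansen(df, det_order=0, k_ar_diff=1)
print("trace statistics:", np.round(jres.lr1, 2), "95% critical:", jres.cvt[:, 1])

# 3. Granger causality: does energy use help predict GDP growth?
grangercausalitytests(df[["log_gdp", "log_energy"]].diff().dropna(), maxlag=2)
```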

Keywords: cointegration, Granger causality, Sub-Saharan Africa, World Bank Development Indicators

Procedia PDF Downloads 31
768 Optimizing SCADA/RTU Control System Alarms for Gas Wells

Authors: Mohammed Ali Faqeeh

Abstract:

A SCADA system alarm optimization process was recently introduced and applied in several stages. First, the MODBUS communication protocol between the RTUs and SCADA was improved at the level of I/O point scanning intervals. Then, some technical issues related to manufacturing limitations were resolved. Afterward, another approach was followed to decide on the configured alarm database: a series of meetings and workshops was held among all system stakeholders, which resulted in an agreement to disable unnecessary (diagnostic) alarms. Moreover, a further step was taken to segregate the SCADA operator graphics so that they show only process-related alarms, while other graphics ensure the availability of field alarms needed for maintenance and engineering purposes. This overall system management and optimization has had a strong positive impact on operations, maintenance, and engineering. It has reduced unneeded open tickets for maintenance crews, which in turn reduced the mileage driven. The practice has also improved operational reaction and response to emergency situations, as SCADA operators can stay vigilant on real alarms rather than be distracted by noisy ones. The SCADA system alarm optimization process was executed using all applicable in-house resources among the engineering, maintenance, and operations crews, and the methodology of the entire enhanced scope is carried out in various stages.

Keywords: SCADA, RTU Communication, alarm management system, SCADA alarms, Modbus, DNP protocol

Procedia PDF Downloads 149
767 Spatio-Temporal Data Mining with Association Rules for Lake Van

Authors: Tolga Aydin, M. Fatih Alaeddinoğlu

Abstract:

Throughout history, people have made estimates and inferences about the future by drawing on their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge at hand for strategic decisions, and different methods have been developed for this purpose. Data mining with association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make Apriori a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin, which causes the volume of the lake to increase or decrease with changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, rainfall and temperature parameters recorded in the Lake Van region over the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying possible reasons for overflows and underflows may be used to alert experts to take precautions and make the necessary investments.
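The sketch below shows the Apriori step on a toy spatio-temporal data set in which each "transaction" is a month of discretized observations for a coastal zone (items such as "high_rainfall" or "lake_level_rise"); the items, thresholds and data are illustrative assumptions using the mlxtend implementation, not the study's actual records.

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules
from mlxtend.preprocessing import TransactionEncoder

# Each transaction is one month of discretized measurements for a coastal zone.
months = [
    ["high_rainfall", "low_evaporation", "lake_level_rise"],
    ["high_rainfall", "high_humidity", "lake_level_rise"],
    ["low_rainfall", "high_evaporation", "lake_level_fall"],
    ["high_rainfall", "high_humidity", "lake_level_rise"],
    ["low_rainfall", "high_temperature", "lake_level_fall"],
    ["high_rainfall", "low_evaporation", "lake_level_rise"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit_transform(months), columns=te.columns_)

frequent = apriori(onehot, min_support=0.3, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```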

Keywords: apriori algorithm, association rules, data mining, spatio-temporal data

Procedia PDF Downloads 356
766 Comparative Advantage of Mobile Agent Application in Procuring Software Products on the Internet

Authors: Michael K. Adu, Boniface K. Alese, Olumide S. Ogunnusi

Abstract:

This paper brings to the fore the inherent advantages of applying mobile agents to procure software products rather than downloading software content over the Internet. It proposes a system in which the product is delivered on a compact disk with a mobile agent as part of the deliverable. The client/user purchases a software product but must connect to the remote server of the software developer before installation. The user provides an activation code that activates the mobile agent, which is part of the software product on the compact disk. The validity of the activation code is checked on connection at the developer's end to ascertain authenticity and prevent piracy. The system is evaluated by downloading two different software products and comparing this with installing the same products from compact disk with the mobile agent. Downloading software content from the developer's database, as in the traditional method, requires a continuously open connection between the client and the developer's end, and such a fixed connection is not economically or technically feasible. A mobile agent, after being dispatched into the network, becomes independent of the creating process and can operate asynchronously and autonomously; it can reconnect later, after completing its task, and return to deliver its result. Response time and network load are very minimal with the mobile agent approach.

Keywords: software products, software developer, internet, activation code, mobile agent

Procedia PDF Downloads 292
765 Semiautomatic Calculation of Ejection Fraction Using Echocardiographic Image Processing

Authors: Diana Pombo, Maria Loaiza, Mauricio Quijano, Alberto Cadena, Juan Pablo Tello

Abstract:

In this paper, we present a semi-automatic tool for calculating the ejection fraction from an echocardiographic video signal derived from a database, in DICOM format, of Clinica de la Costa, Barranquilla. Each of the steps and methods used to obtain the calculation is described, including acquisition and formation of the test samples, processing, and finally the calculation of the parameters needed to obtain the ejection fraction. Two image segmentation methods were compared within a methodological framework that is similar only in the initial processing stages (filtering and image enhancement) and differs in the algorithms implemented at the end (active contour and region growing). The results were compared with the measurements obtained by two cardiologists who calculated the ejection fraction of the study samples using the traditional method, which consists of drawing the region of interest directly on the echocardiography equipment and applying a simple equation to calculate the desired value. The results showed that when the quality of the video samples is good (i.e., pre-processing yields a clear improvement in contrast), the values provided by the tool are substantially close to those reported by the physicians; moreover, the correlation between the physicians does not vary significantly.
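The "simple equation" referred to above is the standard ejection fraction formula EF = (EDV − ESV) / EDV × 100, computed from the end-diastolic and end-systolic volumes obtained from the segmented frames; the sketch below shows the calculation with hypothetical volumes.

```python
def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction (%) from end-diastolic and end-systolic volumes (mL)."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# Hypothetical volumes estimated from the segmented end-diastolic and
# end-systolic frames of one echocardiographic study.
print(round(ejection_fraction(edv_ml=120.0, esv_ml=50.0), 1))  # ~58.3 %
```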

Keywords: echocardiography, DICOM, processing, segmentation, EDV, ESV, ejection fraction

Procedia PDF Downloads 413