Search results for: test data.
8735 Morphemic Analysis Awareness: Impact on ESL Students’ Vocabulary Learning Strategy
Authors: Chandrakala Varatharajoo, Adelina Binti Asmawi, Nabeel Abdallah Mohammad Abedalaziz
Abstract:
The research explored the effect of morphemic analysis awareness on ESL secondary school students’ vocabulary acquisition. The quasi-experimental study was conducted with 100 ESL secondary school students in two experimental groups (inflectional and derivational) and one control group. The students’ vocabulary acquisition was assessed through two measures, the Morph-Analysis Test and the Morph-Vocabulary Test, administered as a pretest and posttest before and after an intervention programme. Results of ANCOVA revealed that both experimental groups achieved significantly higher scores on the Morph-Analysis Test and the Morph-Vocabulary Test, with the inflectional group scoring somewhat higher than the derivational group. The findings are therefore discussed in two main areas. First, separate instruction in the two types of morphemic awareness produced significant gains in inflectional and derivational awareness among the ESL secondary school students, although derivational morphology showed a significant but smaller effect on morphological awareness than inflectional morphology. Second, awareness of inflectional and derivational morphology was significantly related to the students’ vocabulary achievement, with inflectional morphemic awareness having the stronger effect on vocabulary acquisition. Based on these findings, the study suggests that morphemic analysis awareness can serve as an alternative strategy for ESL secondary school students in acquiring English vocabulary.
Keywords: Morphemic analysis, vocabulary, ESL students.
8734 The Use of SD Bioline TB AgMPT64® Detection Assay for Rapid Characterization of Mycobacteria in Nigeria
Authors: S. Ibrahim, U. B. Abubakar, S. Danbirni, A. Usman, F. M. Ballah, C. A. Kudi, L. Lawson, G. H. Abdulrazak, I. A. Abdulkadir
Abstract:
Performing culture and characterization of mycobacteria in low-resource settings like Nigeria is a very difficult task because very few laboratories carry out such work; this is largely due to the stringent and laborious nature of the tests. Hence, a rapid, simple and accurate test for characterization is needed. The SD BIOLINE TB Ag MPT64 Rapid® is a simple and rapid immunochromatographic test used to differentiate members of the Mycobacterium tuberculosis complex (MTBC) from non-tuberculous mycobacteria (NTM). One hundred sputa were obtained from patients suspected of tuberculosis who presented at hospitals for check-up and treatment. The samples were cultured in a class III biosafety cabinet following level III biosafety practices. Forty isolates were obtained from the cultured sputa and identified as acid-fast bacilli (AFB) using the Ziehl-Neelsen acid-fast stain. All the AFB-positive isolates were then subjected to the SD BIOLINE assay. A total of 31 (77.5%) were characterized as MTBC, while nine (22.5%) were NTM. The total turnaround time for the rapid assay was just 30 minutes, compared with days for phenotypic and genotypic methods. It is a simple, rapid and reliable test for differentiating MTBC from NTM.
Keywords: Culture, mycobacteria, non-tuberculous mycobacteria, SD bioline.
8733 The Study on the Stationarity of Housing Price-to-Rent and Housing Price-to-Income Ratios in China
Authors: Wen-Chi Liu
Abstract:
This paper examines whether a bubble is present in the housing market of China, using the housing price-to-income ratios and housing price-to-rent ratios of 35 cities from 1998 to 2010. The panel KSS unit root test with a Fourier function and the SPSM procedure are applied. The panel KSS test with a Fourier function accounts for non-linearity and structural changes, while the SPSM procedure prevents stationary series from dominating the results and biasing them. The empirical study shows that the housing price-to-income ratios are stationary in 34 of the 35 cities; only Xining is non-stationary. The housing price-to-rent ratios are stationary in 32 of the 35 cities; Chengdu, Fuzhou, and Zhengzhou are non-stationary. Overall, housing bubbles were not a serious problem in China over the sample period.
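As a rough illustration of the testing idea, the sketch below implements a KSS-type nonlinear unit root regression for a single series, with one Fourier frequency approximating smooth structural change. It is an assumption-laden simplification, not the authors' implementation: the panel pooling, the SPSM sequential elimination and the simulated critical values are all omitted, and the placeholder series stands in for a city ratio.

```python
# Illustrative single-series KSS-type test with a Fourier term (a sketch only;
# the paper's panel version and SPSM procedure are not reproduced here).
import numpy as np

def kss_fourier_tstat(y, k=1):
    """t-statistic on delta in: dy_t = a + b*sin + c*cos + delta*y_{t-1}^3 + e_t."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    t = np.arange(1, T)                      # time index for the regression sample
    dy = np.diff(y)                          # dependent variable
    X = np.column_stack([
        np.ones(T - 1),
        np.sin(2 * np.pi * k * t / T),       # Fourier terms capture smooth breaks
        np.cos(2 * np.pi * k * t / T),
        y[:-1] ** 3,                         # KSS nonlinear (ESTAR-motivated) regressor
    ])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[-1] / np.sqrt(cov[-1, -1])   # compare with simulated critical values

rng = np.random.default_rng(0)
ratio = np.cumsum(rng.normal(size=52))       # placeholder series standing in for a city ratio
print(kss_fourier_tstat(ratio, k=1))         # strongly negative values point toward stationarity
```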
Keywords: Housing Price-to-Income Ratio, Housing Price-to-Rent Ratio, Housing Bubbles, Panel Unit-Root Test.
8732 IMDC: An Image-Mapped Data Clustering Technique for Large Datasets
Authors: Faruq A. Al-Omari, Nabeel I. Al-Fayoumi
Abstract:
In this paper, we present a new algorithm for clustering data in large datasets using image processing approaches. First, the dataset is mapped onto a binary image plane. The synthesized image is then processed with efficient image processing techniques to cluster the data in the dataset. In this way, the algorithm avoids an exhaustive search to identify clusters: it considers only a small subset of the data that contains the critical boundary information sufficient to identify the contained clusters. Compared to available data clustering techniques, the proposed algorithm produces results of similar quality while outperforming them in execution time and storage requirements.
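A minimal sketch of the image-mapping idea follows: 2-D points are quantised onto a binary image and connected pixel groups are read back as clusters. The grid size and the use of connected-component labelling are assumptions for illustration; the paper's own algorithm additionally restricts processing to cluster-boundary information, which is not shown here.

```python
# Sketch of mapping a dataset onto a binary image plane and clustering via
# connected components (assumed details, not the authors' exact algorithm).
import numpy as np
from scipy import ndimage

def image_mapped_clusters(points, grid=32):
    points = np.asarray(points, dtype=float)
    lo, hi = points.min(axis=0), points.max(axis=0)
    # Map each data point to a pixel of a grid x grid binary image.
    pix = ((points - lo) / (hi - lo + 1e-12) * (grid - 1)).astype(int)
    image = np.zeros((grid, grid), dtype=bool)
    image[pix[:, 0], pix[:, 1]] = True
    # Connected pixels form clusters; label them with 8-connectivity.
    labels, n_groups = ndimage.label(image, structure=np.ones((3, 3)))
    return labels[pix[:, 0], pix[:, 1]], n_groups

pts = np.vstack([np.random.randn(200, 2), np.random.randn(200, 2) + 8])
assignment, k = image_mapped_clusters(pts)
print(k)  # number of pixel-connected groups; the two dense blobs dominate
```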
Keywords: Data clustering, Data mining, Image-mapping, Pattern discovery, Predictive analysis.
8731 Effect of Sedimentation on Torque Transmission in the Larger Radius Magnetorheological Clutch
Authors: Manish Kumar Thakur, Chiranjit Sarkar
Abstract:
Sedimentation of magnetorheological (MR) fluid affects its performance. MR fluid is a smart fluid with unique qualities such as quick responsiveness and easy controllability, and it is used in MR dampers, MR brakes, and MR clutches. In this work, the effect of sedimentation on torque transmission in a shear-mode-operated MR clutch is investigated. A test rig is developed to measure this effect. The torque transmission capability of the MR clutch is measured under two conditions to confirm the influence of sedimentation: the first experiment is performed just after filling and the second after one week. It is observed that the transmitted torque decreases after sedimentation. Hence, sedimentation affects the working of the MR clutch.
Keywords: Clutch, magnetorheological fluid, sedimentation, torque.
8730 Digital Twin of Real Electrical Distribution System with Real Time Recursive Load Flow Calculation and State Estimation
Authors: Anosh Arshad Sundhu, Francesco Giordano, Giacomo Della Croce, Maurizio Arnone
Abstract:
Digital Twin (DT) is a technology that generates a virtual representation of a physical system or process, enabling real-time monitoring, analysis, and simulation. A DT of an Electrical Distribution System (EDS) can perform online analysis by integrating static and real-time data in order to show the current grid status, and predictions of the future status, to the Distribution System Operator (DSO), producers and consumers. DT technology for EDS also offers the DSO the opportunity to test hypothetical scenarios. This paper discusses the development of a DT of an EDS through a Smart Grid Controller (SGC) application, which is developed using open-source libraries and languages. The developed application can be integrated with the Supervisory Control and Data Acquisition (SCADA) system of any EDS to create the DT. The paper shows the performance of the tools inside the application, tested on a real EDS, for grid observability, Smart Recursive Load Flow (SRLF) calculation and state estimation of loads in MV feeders.
Keywords: Digital Twin, Distribution System Operator, Electrical Distribution System, Smart Grid Controller, Supervisory Control and Data Acquisition System, Smart Recursive Load Flow.
8729 Structural Modelling of the LiCl Aqueous Solution: Using the Hybrid Reverse Monte Carlo (HRMC) Simulation
Authors: M. Habchi, S.M. Mesli, M. Kotbi
Abstract:
The Reverse Monte Carlo (RMC) simulation is applied to the study of the aqueous electrolyte LiCl·6H2O. On the basis of the available experimental neutron scattering data, RMC computes pair radial distribution functions in order to explore the structural features of the system. The obtained results include some unrealistic features. To overcome this problem, we use the Hybrid Reverse Monte Carlo (HRMC) method, which incorporates an energy constraint in addition to the constraints commonly derived from experimental data. Our results show good agreement between experimental and computed partial distribution functions (PDFs) as well as a significant improvement in the pair partial distribution curves. This kind of study can be considered a useful test of a given interaction model for conventional simulation techniques.
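The core difference between RMC and HRMC can be pictured as a modified acceptance rule: the usual chi-square against the measured pair distribution functions is augmented with a potential-energy penalty, so configurations that fit the data but are physically unrealistic become unlikely. The sketch below is a schematic of that rule under assumed weighting, not the authors' code or their screened potential.

```python
# Schematic Metropolis acceptance for an HRMC-style move (a sketch only):
# data misfit (chi-square) plus a weighted energy penalty form the total cost.
import numpy as np

def hrmc_accept(chi2_old, chi2_new, energy_old, energy_new,
                weight=1.0, kT=1.0, rng=np.random.default_rng()):
    """Accept/reject a trial particle move under data + energy constraints."""
    cost_old = chi2_old + weight * energy_old / kT
    cost_new = chi2_new + weight * energy_new / kT
    # Downhill moves are always accepted; uphill moves with Boltzmann-like probability.
    return cost_new <= cost_old or rng.random() < np.exp(-(cost_new - cost_old) / 2.0)
```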
Keywords: RMC simulation, HRMC simulation, energy constraint, screened potential, glassy state, liquid state, partial distribution function, pair partial distribution function.
8728 Estimation of Aquifer Properties Using Pumping Tests: Case Study of Pydibhimavaram Industrial Area, Srikakulam, India
Authors: G. Venkata Rao, P. Kalpana, R. Srinivasa Rao
Abstract:
Adequate and reliable estimates of aquifer parameters are of utmost importance for the proper management of vital groundwater resources. In the present scenario, groundwater is polluted by industrial waste disposed of on land, and the contaminants are transported within the aquifer from one area to another depending on the characteristics of the aquifer and of the contaminants. To model this contaminant transport, accurate estimation of aquifer properties is essential. Conventionally, these properties are estimated through pumping tests carried out on water wells. The occurrence and movement of groundwater in the aquifer are characteristically defined by the aquifer parameters. The pumping (aquifer) test is the standard technique for estimating various hydraulic properties of aquifer systems, viz., transmissivity (T), hydraulic conductivity (K) and storage coefficient (S), for which the graphical method is widely used. The study area for the pumping tests is the Pydibheemavaram industrial area near the coastal belt of Srikakulam, AP, India. The main objective of the present work is to estimate the aquifer properties for developing a contaminant transport model for the study area.
Keywords: Aquifer, contaminant transport, hydraulic conductivity, industrial waste, pumping test.
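For reference, the graphical estimation mentioned above is commonly done with the Cooper-Jacob straight-line method: the drawdown-per-log-cycle of time gives T, and the zero-drawdown time intercept gives S. The values below are illustrative, not the field results of this study.

```python
# Cooper-Jacob straight-line estimates of transmissivity T and storage
# coefficient S from drawdown vs. log(time) data (illustrative numbers only).
import numpy as np

def cooper_jacob(Q, times, drawdowns, r):
    """Q: pumping rate [m3/day]; times [days]; drawdowns [m]; r: distance to observation well [m]."""
    slope, intercept = np.polyfit(np.log10(times), drawdowns, 1)   # drawdown per log-cycle
    T = 2.303 * Q / (4.0 * np.pi * slope)                          # transmissivity [m2/day]
    t0 = 10 ** (-intercept / slope)                                # time intercept where s = 0
    S = 2.25 * T * t0 / r**2                                       # storage coefficient [-]
    return T, S

T, S = cooper_jacob(Q=500.0,
                    times=np.array([0.01, 0.05, 0.1, 0.5, 1.0]),
                    drawdowns=np.array([0.8, 1.5, 1.8, 2.5, 2.8]),
                    r=20.0)
print(round(T, 1), f"{S:.2e}")
```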
8727 The New Method of Concealed Data Aggregation in Wireless Sensor: A Case Study
Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie
Abstract:
Wireless sensor networks (WSN) consists of many sensor nodes that are placed on unattended environments such as military sites in order to collect important information. Implementing a secure protocol that can prevent forwarding forged data and modifying content of aggregated data and has low delay and overhead of communication, computing and storage is very important. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided to virtual cells, nodes within each cell produce a shared key to send and receive of concealed data with each other. Considering to data aggregation in each cell is locally and implementing a secure authentication mechanism, data aggregation delay is very low and producing false data in the network by malicious nodes is not possible. To evaluate the performance of our proposed protocol, we have presented computational models that show the performance and low overhead in our protocol.
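To make the "concealed aggregation" idea concrete, the sketch below shows a widely used additively homomorphic construction (in the style of Castelluccia et al.): each node masks its reading with a key shared with the sink, the aggregator adds ciphertexts without decrypting, and the sink removes the summed keys. This is an illustration of the general CDA concept, not the cell structure, key agreement or authentication of the proposed protocol.

```python
# Illustrative additively homomorphic concealed data aggregation (generic CDA
# building block; not the authors' exact scheme).
import random

M = 2**32                       # modulus large enough for the sum of all readings

def encrypt(reading, key):      # node side: ciphertext = reading + key (mod M)
    return (reading + key) % M

def aggregate(ciphertexts):     # aggregator adds ciphertexts without decrypting
    return sum(ciphertexts) % M

def decrypt_sum(agg, keys):     # sink removes the sum of all node keys
    return (agg - sum(keys)) % M

readings = [17, 23, 5, 42]                       # plaintext sensor values
keys = [random.randrange(M) for _ in readings]   # keys shared between nodes and the sink
agg = aggregate(encrypt(r, k) for r, k in zip(readings, keys))
assert decrypt_sum(agg, keys) == sum(readings)   # sink recovers the true aggregate
```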
Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.
8726 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized, i.e., information must be converted from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, supported in Germany by strong policy instruments for its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved gap between data science experts and manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized, and a legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. This gap between manufacturing operations and the information technology/data analytics departments within enterprises was borne out by the results of many of the case studies reviewed as part of this work. To test whether this gap exists, the researchers initiated an industrial case study in which they embedded themselves between the subject-matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory; however, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.
Keywords: Analytics, digitization, industry 4.0, manufacturing.
8725 Study on a New Formulation of Domestic Metro Synthetic Brake Shoe
Authors: Yang Chengmei
Abstract:
In this paper, taking the synthetic brake shoes of the Chinese Nanjing Metro ALSTOM vehicles as an example, extensive research and performance testing were carried out on the formulation components and final product performance of metro synthetic brake shoes. A new type of domestic metro brake shoe was produced by a hot-pressing process, using hybrid fibre as the reinforcing material, modified phenolic resin as the matrix material, and friction modifiers as fillers. The performance indicators of the tested product show that it can replace similar foreign products.
Keywords: Metro, synthetic brake shoe, component analysis, formula research.
8724 Simulation of the Large Hadrons Collisions Using Monte Carlo Tools
Authors: E. Al Daoud
Abstract:
In many cases, theoretical treatments are available for models for which there is no perfect physical realization. In this situation, the only possible test of an approximate theoretical solution is to compare it with data generated from a computer simulation. In this paper, Monte Carlo tools are used to study and compare elementary particle models. All the experiments are implemented using 10000 events, and the simulated energy is 13 TeV. The means and the curves of several variables are calculated for each model using MadAnalysis 5. Anomalies in the results can be seen in the muon masses of the minimal supersymmetric standard model and the two-Higgs-doublet model.
Keywords: Feynman rules, hadrons, Lagrangian, Monte Carlo, simulation.
8723 Peakwise Smoothing of Data Models using Wavelets
Authors: D Sudheer Reddy, N Gopal Reddy, P V Radhadevi, J Saibaba, Geeta Varadan
Abstract:
Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular method of smoothing data; a generalization of it led to the development of the Savitzky-Golay filter. Many window smoothing methods were developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and the Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also gives smoothed data. Wavelets likewise smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology, called peakwise smoothing, that smooths the data to any desired level without losing the major peak features.
Keywords: Smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.
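As a quick illustration of the baseline smoothers surveyed above (and of how a wide moving-average window flattens a peak while Savitzky-Golay preserves it better), here is a small comparison on a synthetic noisy peak. The proposed peakwise smoothing method itself is not reproduced here.

```python
# Baseline smoothers mentioned in the abstract, compared on a synthetic peak;
# this is for illustration only and does not implement peakwise smoothing.
import numpy as np
from scipy.signal import savgol_filter

x = np.linspace(0, 10, 500)
signal = np.exp(-((x - 5) ** 2) / 0.05) + 0.1 * np.random.randn(x.size)  # sharp peak + noise

window = 51
moving_avg = np.convolve(signal, np.ones(window) / window, mode="same")  # flattens the peak
sav_gol = savgol_filter(signal, window_length=window, polyorder=3)       # retains more of the peak

print(signal.max(), moving_avg.max(), sav_gol.max())
```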
8722 A New Precautionary Method for Measurement and Improvement the Data Quality
Authors: Seyed Mohammad Hossein Moossavizadeh, Mehran Mohsenzadeh, Nasrin Arshadi
Abstract:
Data quality is a complex and unstructured concept that concerns information systems managers. The reason for this attention is the high cost of maintaining and cleaning inefficient data. Beyond these costs, poor-quality data causes wrong statistics, analyses and decisions in organizations. Therefore, managers intend to improve the quality of their information systems' data. One of the basic subjects of quality improvement is evaluating its level. In this paper, we present a precautionary method whose application gives the data of information systems better quality. Our method covers different dimensions of data quality and therefore has the necessary integrity. The presented method has been tested on three dimensions, accuracy, value-added and believability, and the results confirm the improvement and integrity of the method.
Keywords: Data quality, precaution, information system, measurement, improvement.
8721 A Cell-centered Diffusion Finite Volume Scheme and it's Application to Magnetic Flux Compression Generators
Authors: Qiang Zhao, Yina Shi, Guangwei Yuan, Zhiwei Dong
Abstract:
A cell-centered finite volume scheme for discretizing diffusion operators on distorted quadrilateral meshes has recently been designed and added to APMFCG to enable that code to be used as a tool for studying explosive magnetic flux compression generators. This paper describes this scheme. Comparisons with analytic results for 2-D test cases are presented, as well as 2-D results from a test of a "realistic" generator configuration.
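To fix ideas, the one-dimensional analogue of a cell-centered finite volume diffusion update is sketched below with harmonic-mean face diffusivities and a two-point flux. This is only the orthogonal-mesh special case under assumed boundary treatment; the non-orthogonality corrections needed for the distorted quadrilateral meshes addressed by the paper are omitted.

```python
# 1-D cell-centered finite volume diffusion step (illustrative sketch only;
# distorted-mesh corrections from the paper's scheme are not included).
import numpy as np

def diffuse_step(u, D, dx, dt):
    """Explicit update of cell averages u with cell diffusivities D."""
    # Harmonic mean gives the face diffusivity between neighbouring cells.
    D_face = 2.0 * D[:-1] * D[1:] / (D[:-1] + D[1:])
    flux = -D_face * (u[1:] - u[:-1]) / dx          # two-point flux at interior faces
    dudt = np.zeros_like(u)
    dudt[1:-1] = -(flux[1:] - flux[:-1]) / dx       # flux divergence per interior cell
    return u + dt * dudt                            # boundary cells held fixed here

u = np.zeros(50); u[25] = 1.0                       # initial spike of the diffused quantity
D = np.ones(50)
for _ in range(200):
    u = diffuse_step(u, D, dx=1.0, dt=0.25)         # dt below the explicit stability limit
print(u.sum())                                      # rough conservation check
```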
Keywords: Cell-centered FVM, distorted meshes, diffusion scheme, MFCG.
8720 2D-Modeling with Lego Mindstorms
Authors: Miroslav Popelka, Jakub Nožička
Abstract:
The whole work is based on the possibility of using Lego Mindstorms robotics systems to reduce costs. Lego Mindstorms consists of a wide variety of hardware components necessary to simulate, programme and test robotics systems in practice. The development environment supplied with the kit was used to programme the algorithm that maps the space using the ultrasonic sensor, and Matlab was used to render the values measured by the sensor. The algorithm created for this paper uses theoretical knowledge from the area of signal processing. The data processed by the algorithm are collected by an ultrasonic sensor that scans the 2D space in front of it. The ultrasonic sensor is placed on the moving arm of the robot, which provides horizontal movement of the sensor, while vertical movement is provided by the wheel drive. The robot follows a map in order to position the measured data correctly. Based on the findings, Lego Mindstorms can be considered a capable, low-cost kit for real-time modelling.
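The central data-processing step described above (turning arm angle and ultrasonic distance readings into a 2-D point map) amounts to a polar-to-Cartesian conversion. The sketch below uses made-up readings and plain Python purely for illustration; the paper performs this rendering in Matlab.

```python
# Scan-to-map conversion with hypothetical readings (the paper does this in
# Matlab; Python is used here only to illustrate the geometry).
import math

def scan_to_points(readings_cm, sensor_height_cm):
    """readings_cm: list of (arm_angle_deg, distance_cm) from the ultrasonic sensor."""
    points = []
    for angle_deg, dist in readings_cm:
        if dist >= 255:                 # the EV3 ultrasonic sensor reports roughly 255 cm when out of range
            continue
        a = math.radians(angle_deg)
        points.append((dist * math.cos(a), dist * math.sin(a), sensor_height_cm))
    return points

scan = [(0, 40), (15, 42), (30, 255), (45, 38)]   # hypothetical sweep at one height
print(scan_to_points(scan, sensor_height_cm=10))
```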
Keywords: LEGO Mindstorms, ultrasonic sensor, Real-time modeling, 2D object, low-cost robotics systems, sensors, Matlab, EV3 Home Edition Software.
8719 An Efficient Data Mining Approach on Compressed Transactions
Authors: Jia-Yu Dai, Don-Lin Yang, Jungpin Wu, Ming-Chuan Hung
Abstract:
In an era of knowledge explosion, the amount of data grows rapidly day by day. Since data storage is a limited resource, reducing the data space used in the process becomes a challenging issue. Data compression provides a good solution that can lower the required space. Data mining has had many useful applications in recent years because it can help users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process, but they lack the ability to decompress the data to their original state and to improve data mining performance. In this research, a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships among transactions to merge related transactions and builds a quantification table to prune candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.
Keywords: Association rule, data mining, merged transaction, quantification table.
8718 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information
Authors: Haifeng Wang, Haili Zhang
Abstract:
Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based, analytical approach to stock the right movies for local audiences and retain more customers. We used classification models that extract features from thousands of customers’ demographic, behavioral and social information to predict their movie genre preference. In the implementation, a Gaussian-kernel support vector machine (SVM) classification model and a logistic regression model were established, and their in-sample test errors were compared. Out-of-sample errors were also compared under different Vapnik-Chervonenkis (VC) dimensions in the machine learning algorithm to detect and prevent overfitting. The Gaussian-kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases, and the accuracy increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers’ preferences from a small data set and to design prediction tools for these enterprises.
Keywords: Computational social science, movie preference, machine learning, SVM.
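A minimal sketch of the classification setup described above follows, with random placeholder features standing in for the customers' demographic, behavioural and social attributes. The feature dimensions, labels and hyperparameters are assumptions for illustration, not the study's data or tuned model.

```python
# Gaussian-kernel SVM vs. logistic regression on placeholder customer features
# (a sketch of the setup, not the study's data or parameters).
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 12))                 # 12 customer features (placeholder)
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)   # 1 = prefers the target genre (synthetic label)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)      # Gaussian kernel SVM
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)          # logistic regression baseline

print("SVM out-of-sample accuracy:", svm.score(X_te, y_te))
print("Logistic regression accuracy:", logit.score(X_te, y_te))
```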
8717 Weigh-in-Motion Data Analysis Software for Developing Traffic Data for Mechanistic Empirical Pavement Design
Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder
Abstract:
Currently, few user-friendly Weigh-in-Motion (WIM) data analysis software packages are available that can produce the traffic input data required by the recently developed AASHTOWare pavement Mechanistic-Empirical (ME) design software. Moreover, these packages have only rudimentary Quality Control (QC) processes and therefore cannot properly deal with erroneous WIM data. As pavement performance predictions are highly sensitive to the quality of WIM data, a more refined QC process on raw WIM data is highly recommended. This study develops a user-friendly software package that produces traffic input for the ME design software. The software takes the raw data (class and weight data) collected from the WIM station and processes it with a sophisticated QC procedure. Traffic data such as traffic volume, traffic distribution and axle load spectra can be obtained from this software and used directly in the ME design software.
Keywords: Weigh-in-motion, software, axle load spectra, traffic distribution, AASHTOWare.
8716 Bayesian Networks for Earthquake Magnitude Classification in a Early Warning System
Authors: G. Zazzaro, F.M. Pisano, G. Romano
Abstract:
During the last decades, researchers worldwide have dedicated efforts to developing machine-based seismic Early Warning systems, aiming to reduce huge human losses and economic damages. The elaboration time of seismic waveforms has to be reduced in order to increase the time interval available for the activation of safety measures. This paper proposes a Data Mining model able to estimate the dangerousness of a running seismic event correctly and quickly. Several thousand seismic recordings of Japanese and Italian earthquakes were analyzed, and a model was obtained by means of a Bayesian Network (BN), which was tested on just the first recordings of seismic events in order to reduce the decision time; the test results were very satisfactory. The model was integrated within an Early Warning System prototype able to collect and elaborate data from a seismic sensor network, estimate the dangerousness of the running earthquake and promptly take the decision to activate the warning.
Keywords: Bayesian Networks, Decision Support System, Magnitude Classification, Seismic Early Warning System.
8715 The Role of Brand Loyalty in Generating Positive Word of Mouth among Malaysian Hypermarket Customers
Authors: S. R. Nikhashemi, L. Haj Paim, Ali Khatibi
Abstract:
Structural Equation Modeling (SEM) was used to test a hypothesized model explaining how Malaysian hypermarket customers’ perceptions of brand trust (BT), customer perceived value (CPV) and perceived service quality (PSQ) build their brand loyalty (CBL) and generate positive word-of-mouth communication (WOM). Self-administered questionnaires were used to collect data from 374 Malaysian hypermarket customers of Mydin, Tesco, Aeon Big and Giant in Kuala Lumpur, a metropolitan city of Malaysia. The data strongly supported the model, showing that BT, CPV and PSQ are prerequisite factors in building customer brand loyalty, with PSQ having the strongest effect on the prediction of customer brand loyalty. Moreover, the present study suggests that the effect of these factors, acting through customer brand loyalty, strongly contributes to generating positive word-of-mouth communication.
Keywords: Brand trust, perceived value, perceived service quality, brand loyalty, positive word of mouth communication.
8714 Optimization of Gentamicin Production: Comparison of ANN and RSM Techniques
Authors: M.Rajasimman, S.Subathra
Abstract:
In this work, statistical experimental design was applied to the optimization of the medium constituents for gentamicin production by Micromonospora echinospora subsp. pallida (MTCC 708) in a batch reactor, and the results were compared with the values predicted by an artificial neural network (ANN). Using a central composite design, 50 experiments were carried out for five test variables: starch, soya bean meal, K2HPO4, CaCO3 and FeSO4. The optimum conditions were found to be: starch (8.9 g/L), soya bean meal (3.3 g/L), K2HPO4 (0.8 g/L), CaCO3 (4 g/L) and FeSO4 (0.03 g/L). At these optimized conditions, the yield of gentamicin was 1020 mg/L. The R2 values were 1 for the ANN training set, 0.9953 for the ANN test set, and 0.9286 for RSM.
Keywords: Gentamicin, optimization, Micromonospora echinospora, ANN, RSM.
8713 A Procedure to Assess Streamflow Rating Curves and Streamflow Sequences
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study aims to provide sub-hourly streamflow predictions and associated rating curves for small catchments with an intermittent and torrential flow regime, characterized by flash floods occurring especially during April and November. The methodology entails two lumped conceptual hydrological models that work in series. The total model is based on eleven parameters and shows good flexibility in handling different input sets. The runoff coefficient has contributed to improving the model’s performance and is treated as an additional parameter, while sensitivity analysis has highlighted how slight changes in the model’s input can lead to changes in the model’s output. The adopted procedure is robust and provides very practical engineering information while remaining parsimonious both in input data and in the number of parameters. In view of the results obtained, the authors encourage testing this combined procedure on different hydrological scenarios in order to provide information for poorly monitored catchments and sites whose records are not updated.
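Stage-discharge rating curves such as those assessed here are conventionally expressed as a power law, Q = a(h - h0)^b. The sketch below fits that form to synthetic gaugings with scipy; the stage and discharge values are invented for illustration and are not the study's catchment data, nor do they represent the two conceptual models used in the paper.

```python
# Power-law rating curve Q = a * (h - h0)**b fitted to synthetic gaugings
# (generic illustration, not the study's data or models).
import numpy as np
from scipy.optimize import curve_fit

def rating_curve(h, a, h0, b):
    return a * np.clip(h - h0, 1e-9, None) ** b

stage = np.array([0.4, 0.6, 0.9, 1.3, 1.8, 2.4])        # gauged stage [m]
discharge = np.array([0.3, 1.1, 3.2, 7.9, 17.0, 33.5])  # gauged discharge [m3/s]

params, _ = curve_fit(rating_curve, stage, discharge, p0=[5.0, 0.2, 2.0], maxfev=5000)
a, h0, b = params
print(f"Q = {a:.2f} * (h - {h0:.2f})^{b:.2f}")
print("Q at h = 2.0 m:", rating_curve(2.0, *params))
```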
Keywords: Streamflow rating curve, chronological data, streamflow sequences, conceptual models.
8712 Examining the Performance of Three Multiobjective Evolutionary Algorithms Based on Benchmarking Problems
Authors: Konstantinos Metaxiotis, Konstantinos Liagkouras
Abstract:
The objective of this study is to examine the performance of three well-known multiobjective evolutionary algorithms for solving optimization problems. The first algorithm is the Non-dominated Sorting Genetic Algorithm-II (NSGA-II), the second one is the Strength Pareto Evolutionary Algorithm 2 (SPEA-2), and the third one is the Multiobjective Evolutionary Algorithms based on decomposition (MOEA/D). The examined multiobjective algorithms are analyzed and tested on the ZDT set of test functions by three performance metrics. The results indicate that the NSGA-II performs better than the other two algorithms based on three performance metrics.
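For reference, the ZDT benchmark problems on which NSGA-II, SPEA-2 and MOEA/D are compared have standard closed-form definitions in the literature; ZDT1 is reproduced below as an illustration. The performance-metric computations and the algorithms themselves are not shown.

```python
# ZDT1 benchmark function (standard textbook definition used by the ZDT suite).
import numpy as np

def zdt1(x):
    """x: decision vector in [0, 1]^n (n = 30 in the usual setting)."""
    x = np.asarray(x, dtype=float)
    f1 = x[0]
    g = 1.0 + 9.0 * np.sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return f1, f2

# The true Pareto front is reached when x[1:] = 0, giving f2 = 1 - sqrt(f1).
print(zdt1(np.array([0.25] + [0.0] * 29)))   # a Pareto-optimal point: (0.25, 0.5)
```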
Keywords: MOEAs, Multiobjective optimization, ZDT test functions, performance metrics.
8711 Sensitivity Comparison between Rapid Immuno-Chromatographic Device Test and ELISA in Detection and Sero-Prevalence of HBsAg and Anti-HCV antibodies in Apparently Healthy Blood Donors of Lahore, Pakistan
Authors: Natasha Hussain, Maleeha Aslam, Robina Farooq
Abstract:
Hepatitis B and hepatitis C are among the most significant hepatic infections worldwide and may lead to hepatocellular carcinoma. This study was performed for the first time at the blood transfusion centre of Omar Hospital, Lahore. It aims to determine the sero-prevalence of these diseases by screening apparently healthy blood donors, who might be carriers of HBV or HCV and pose a high transmission risk. It also aims to compare the sensitivity of two diagnostic tests: a one-step immunochromatographic test device and the Enzyme-Linked Immunosorbent Assay (ELISA). Blood serum of 855 apparently healthy blood donors was screened for hepatitis B surface antigen (HBsAg) and for anti-HCV antibodies. SPSS version 12.0 and the X2 (chi-square) test were used for statistical analysis. The seroprevalence of HCV was 8.07% by the device method and 9.12% by ELISA, and that of HBV was 5.6% by the device and 6.43% by ELISA. The unavailability of vaccination against HCV makes it the more prevalent of the two. Comparing the two diagnostic methods, ELISA proved to be more sensitive.
Keywords: ELISA, sensitivity comparison of diagnostic tests, seroprevalence of hepatitis B and C.
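As a worked illustration of the comparison performed here, device sensitivity can be computed against ELISA taken as the reference test, and the association checked with a chi-square test. The 2x2 counts below are hypothetical placeholders, not the study's raw tallies.

```python
# Sensitivity of the rapid device vs. ELISA as reference, plus a chi-square
# test of association (hypothetical counts for illustration only).
import numpy as np
from scipy.stats import chi2_contingency

#                 ELISA +   ELISA -
table = np.array([[69,        0],      # device +
                  [ 9,      777]])     # device -
tp, fn = table[0, 0], table[1, 0]

sensitivity = tp / (tp + fn)           # device positives among ELISA positives
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"device sensitivity vs ELISA: {sensitivity:.1%}")
print(f"chi-square = {chi2:.1f}, p = {p_value:.3g}")
```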
8710 Intelligent Aid-Analysis Based on the Use of Digital Twin: Application to Electronic Warfare System
Authors: L. Chaussy, M. Nouvel
Abstract:
The workload of system engineers during the Integration, Validation and Verification process of Electronic Warfare Systems (EWS) is growing with the complexity of the systems and with the diversity of tested cases (the diversity of operational scenarios presented to the EWS). Even though the use of a Digital Twin eases the conception and development phases in terms of planning and test equipment availability, the time needed to analyze test results is still too long and the analysis too complex. The idea, in order to reduce the system engineers' workload and improve test coverage, is to introduce intelligent aid-analysis algorithms to improve this step.
Keywords: Analysis tools, automatic testing, digital twin, electronic warfare system.
8709 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data
Authors: Sedigheh Mirzaei S., Debasis Sengupta
Abstract:
Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered vital for fitting a parametric model to individual-specific data and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate, through simulations in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method with data collected by the Indian Statistical Institute.
Keywords: Preece-Baines growth model, MCMC method, mixed effect model.
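For reference, the Preece-Baines model (model 1, in its standard published form; the paper's exact parameterization may differ) expresses size h at age t as

```latex
% Preece-Baines model 1: h_1 is adult size, h_\theta the size at age \theta,
% and s_0, s_1 are rate constants.
h(t) = h_1 - \frac{2\,\bigl(h_1 - h_\theta\bigr)}
             {\exp\!\bigl[s_0 (t - \theta)\bigr] + \exp\!\bigl[s_1 (t - \theta)\bigr]}
```

so the biological quantities mentioned above (peak size velocity and age at peak size velocity) follow from the maximum of the derivative of h(t).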
8708 Semantic Support for Hypothesis-Based Research from Smart Environment Monitoring and Analysis Technologies
Authors: T. S. Myers, J. Trevathan
Abstract:
Improvements in the data fusion and data analysis phase of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community to explore efficient methods for reuse, correlation and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the inferred knowledge from the Semantic Reef system are ingested to the Tropical Data Hub for data discovery, reuse, curation and publication.
Keywords: Information architecture, semantic technologies, sensor networks, ontologies.
8707 Data Migration between Document-Oriented and Relational Databases
Authors: Bogdan Walek, Cyril Klimes
Abstract:
Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases in which the relational schema of the target (relational) database is automatically created from a collection of XML documents. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the relational database management system (RDBMS) MySQL.
Keywords: Data migration, database, document-oriented database, XML, relational schema.
8706 Sandwich Structure Composites: Effect of Kenaf on Mechanical Properties
Authors: M. N. Othman, M. Bukhari, Z. Halim, S. A. Mohammad, K. Khalid
Abstract:
Sandwich structure composites produced with an epoxy core and aluminium skins were developed as potential building materials. The interface bonding between core and skin was controlled by varying the kenaf content. Five different weight percentages of kenaf loading, ranging from 10 wt% to 50 wt%, were employed in the core manufacturing in order to study the mechanical properties of the sandwich composite. The properties of the aluminium skin bonded with epoxy were found to be affected by the drying time of the adhesive. The mechanical behavior of the manufactured sandwich composites in relation to the properties of the constituent materials was studied. It was found that 30 wt% kenaf loading increased the flexural strength and flexural modulus up to 102 MPa and 32 GPa, respectively. Analyses were done on the flatwise and edgewise compression tests. For the flatwise test, it was found that the composite with 30 wt% fiber loading could withstand a maximum force of 250 kN, with a compressive strength of 96.94 MPa. However, in the edgewise compression test, the sandwich composite with the same fiber loading could only withstand a maximum load of 31 kN, with a compressive strength of 62 MPa.
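The strengths quoted above follow from the usual reductions of test load to stress. Under the common assumptions of three-point bending for the flexural test and a simple load-over-area definition for the compression tests (the abstract does not state the test standard or specimen dimensions, so these expressions are indicative only):

```latex
% Indicative test-data reductions: F_max is the peak load, L the support span,
% b and d the specimen width and thickness, and A the loaded cross-section.
\sigma_f = \frac{3\,F_{\max}\,L}{2\,b\,d^{2}}, \qquad
\sigma_c = \frac{F_{\max}}{A}
```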
Keywords: Aluminium, kenaf fiber epoxy, sandwich structure composite.