Search results for: regression models.
2635 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In Wireless Sensor Networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental conditions and aggregates the data to a designated destination, called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption due to the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For the problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise Ratio (SINR), have been adopted with different power models, uniform power and non-uniform power (with or without power control), and different antenna models, the omni-directional and directional antenna models. In this survey article, as the problem has been proven NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models, using latency as the performance measure.
Keywords: Data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional.
2634 Forecasting Rainfall in Thailand: A Case Study of Nakhon Ratchasima Province
Authors: N. Sopipan
Abstract:
In this paper, we study rainfall time series from weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods that enable us to analyse the behaviour of rainfall in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA models and Holt-Winters exponential smoothing models were built, and all of them proved to be adequate. It is therefore possible to provide information that can help decision makers establish strategies for the proper planning of agriculture, drainage systems and other water resource applications in Nakhon Ratchasima province. The best forecasting performance was obtained with the ARIMA(1,0,1)(1,0,1)12 model.
Keywords: ARIMA models, exponential smoothing, Holt-Winters model.
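For readers who want to reproduce a fit of this form, a minimal sketch with statsmodels is given below; the monthly series is synthetic, standing in for the station data, and the model orders simply mirror the ARIMA(1,0,1)(1,0,1)12 choice reported above.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic monthly "rainfall" with a 12-month seasonal cycle;
# NOT the Nakhon Ratchasima station data.
rng = np.random.default_rng(0)
months = 120
seasonal = 100 + 80 * np.sin(2 * np.pi * np.arange(months) / 12)
rainfall = np.clip(seasonal + rng.normal(0, 25, months), 0, None)

# Seasonal ARIMA(1,0,1)(1,0,1)12 with a constant term.
model = sm.tsa.SARIMAX(rainfall, order=(1, 0, 1),
                       seasonal_order=(1, 0, 1, 12), trend="c")
fit = model.fit(disp=False)
forecast = fit.forecast(steps=12)   # forecast the next 12 months
print(fit.aic, forecast[:3])
```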
2633 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level these organizations show some deficiencies, partly because there is little interest in extracting knowledge from their data sources and partly because they lack the operational capabilities to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the base repositories for creating models or patterns (behaviour of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. This paper presents a structured and simple methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational and spatial data models and on a baseline of data modeling under UML and Big Data, in order to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, in particular for generating patterns and models derived from the structured fact objects.
Keywords: Data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse.
2632 Investigation of Dissolution in Diammonium Hydrogen Phosphate Solutions of Gypsum
Authors: Turan Çalban, Nursel Keskin, Sabri Çolak, Soner Kuşlu
Abstract:
Gypsum (CaSO4.2H2O) is a mineral that is found in large quantities in Turkey and in the world. In this study, the dissolution of this mineral in diammonium hydrogen phosphate solutions has been studied. The dissolution and dissolution kinetics of gypsum in diammonium hydrogen phosphate solutions will be useful for evaluating solid wastes containing gypsum. Parameters affecting the dissolution rate of gypsum in diammonium hydrogen phosphate solutions, such as diammonium hydrogen phosphate concentration, temperature and stirring speed, were investigated, and the experimental studies examined the effectiveness of the selected parameters. The dissolution of gypsum was examined in two parts, at low and at high temperatures. The experimental results were successfully correlated by linear regression using the Statistica program. The dissolution curves were evaluated with shrinking-core models for solid-fluid systems. The activation energy was found to be 34.58 kJ/mol for the low temperatures and 44.45 kJ/mol for the high temperatures. The dissolution of gypsum was controlled by the chemical reaction at both low and high temperatures.
Keywords: Diammonium hydrogen phosphate, Dissolution, Gypsum, Kinetics.
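The activation-energy step described above is typically an Arrhenius plot of rate constants fitted by linear regression; a minimal sketch follows, with invented rate constants rather than the study's measurements.

```python
import numpy as np
from scipy import stats

R = 8.314  # gas constant, J/(mol K)

# Hypothetical apparent rate constants k at several temperatures (K);
# NOT the experimental values of the gypsum study.
T = np.array([293.0, 303.0, 313.0, 323.0])
k = np.array([1.2e-3, 1.9e-3, 3.0e-3, 4.5e-3])

# Arrhenius law: ln k = ln A - Ea / (R T), i.e. linear in 1/T.
fit = stats.linregress(1.0 / T, np.log(k))
Ea = -fit.slope * R / 1000.0   # activation energy in kJ/mol
print(f"Activation energy = {Ea:.1f} kJ/mol, R^2 = {fit.rvalue**2:.3f}")
```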
2631 MABENA Strategic Management Model for Local Companies
Authors: Kaveh Mohammad Cyrus, Shadi Sanagoo
Abstract:
The MABENA model is a complementary model compared with traditional models such as HCMS, CMS, etc. New factors that affect the preparation of strategic plans, and their sequential order in the MABENA model, form the platform of the road map presented in this paper. The literature review shows that factors such as emerging new critical success factors for strategic planning, the improvement of international strategic models, the increasing maturity of companies and emerging new needs lead to the design of a new model that can handle the new critical factors and overcome the limitations of previous strategic management models. Preparing a strategic plan requires more factors than those introduced in traditional models. The required factors include determining future critical success factors and competencies, defining key processes, determining the maturity of the processes, and considering all aspects of the external environment. Describing these requirements, their outcomes and their order develops the MABENA road map presented in this paper. The study presents a road map for the strategic planning of Iranian organizations.
Keywords: Competitive advantage, process maturity, strategic planning, strategic potential.
2630 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper are to determine the Interoperability Maturity Models to consider when using School Management Systems (SMS). This is intended to inform and help schools to know which Interoperability Maturity Model is best suited to their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012 to 2019, and the different types of Interoperability Maturity Models are compared in detail, including their background information, levels of interoperability, and areas of consideration. The literature was obtained from the IEEE Xplore and Scopus databases and from the Harzing's and Google Scholar search engines. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table that shows the focus area of concern for each Maturity Model, based on the scoping review in which only 24 of the 740 publications initially identified in the field were found to be suitable for the paper. This resulted in the most discussed Interoperability Maturity Models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organisational Interoperability Maturity Model for C2 (OIM).
Keywords: Interoperability, Interoperability Maturity Model, School Management System, scoping review.
2629 A Study of Hamilton-Jacobi-Bellman Equation Systems Arising in Differential Game Models of Changing Society
Authors: Weihua Ruan, Kuan-Chou Chen
Abstract:
This paper is concerned with a system of Hamilton-Jacobi-Bellman equations coupled with an autonomous dynamical system. The mathematical system arises in the differential game formulation of political economy models as an infinite-horizon continuous-time differential game with discounted instantaneous payoff rates and continuously and discretely varying state variables. The existence of a weak solution of the PDE system is proven, and a computational scheme for approximate solutions is developed for a class of such systems. A model of democratization is analyzed mathematically as an illustration of the application.
Keywords: Differential games, Hamilton-Jacobi-Bellman equations, infinite horizon, political-economy models.
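For reference, the generic discounted, infinite-horizon Hamilton-Jacobi-Bellman equation that such differential-game formulations couple has the standard form below; the notation is the usual textbook one, not necessarily the authors'.

```latex
% Generic discounted, infinite-horizon HJB equation for player i:
% V_i is the value function, \rho the discount rate, x the state,
% u_i the control of player i (u_{-i} the others' controls),
% f_i the instantaneous payoff, and g the state dynamics \dot{x} = g(x, u).
\rho V_i(x) = \max_{u_i} \Big\{ f_i(x, u_i, u_{-i})
              + \nabla V_i(x) \cdot g(x, u_i, u_{-i}) \Big\}
```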
2628 Comparison of Two Interval Models for Interval-Valued Differential Evolution
Authors: Hidehiko Okada
Abstract:
The author previously proposed an extension of differential evolution (DE). The proposed method extends the processes of DE to handle interval numbers as genotype values so that DE can be applied to interval-valued optimization problems. The interval DE can employ either of two interval models, the lower-and-upper (LU) model or the center-and-width (CW) model, for specifying genotype values. The ability of the interval DE to search for solutions may depend on the model. In this paper, the author compares the two models to investigate which contributes more to finding better solutions. The interval DE is applied to the evolutionary training of interval-valued neural networks. The results of a preliminary study indicate that the CW model is better than the LU model: the interval DE with the CW model could evolve better neural networks.
Keywords: Evolutionary algorithms, differential evolution, neural network, neuroevolution, interval arithmetic.
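The difference between the two genotype encodings can be illustrated with a minimal sketch; the DE operators that act on these genotypes, where the models actually diverge, are omitted.

```python
# Illustrative sketch (not the author's code): the two common ways to encode
# an interval genotype, assuming "LU" stores (lower, upper) and "CW" stores
# (center, width). Converting between them is lossless.

def lu_to_cw(lower: float, upper: float) -> tuple[float, float]:
    """Lower/upper representation -> center/width representation."""
    center = (lower + upper) / 2.0
    width = upper - lower
    return center, width

def cw_to_lu(center: float, width: float) -> tuple[float, float]:
    """Center/width representation -> lower/upper representation."""
    half = width / 2.0
    return center - half, center + half

# Example: the same interval [0.2, 0.8] in both encodings.
print(lu_to_cw(0.2, 0.8))   # approximately (0.5, 0.6)
print(cw_to_lu(0.5, 0.6))   # approximately (0.2, 0.8)
```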
2627 Time Series Forecasting Using a Hybrid RBF Neural Network and AR Model Based on Binomial Smoothing
Authors: Fengxia Zheng, Shouming Zhong
Abstract:
The ANN-ARIMA hybrid, which combines the autoregressive integrated moving average (ARIMA) model and the artificial neural network (ANN) model, is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network and an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased by using an RBF neural network based on binomial smoothing, called BS-RBF, and that the hybrid BS-RBFAR model can be an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.
Keywords: Binomial smoothing (BS), hybrid, Canadian Lynx data, forecasting accuracy.
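One common reading of binomial smoothing is convolution with normalized binomial-coefficient weights (repeated application of a [1, 2, 1]/4 kernel); the small sketch below is written under that assumption and is not the authors' implementation.

```python
import numpy as np

def binomial_smooth(x: np.ndarray, passes: int = 1) -> np.ndarray:
    """Smooth a 1-D series by repeated convolution with the [1, 2, 1]/4 kernel.
    Edges are handled by padding with the end values."""
    kernel = np.array([1.0, 2.0, 1.0]) / 4.0
    y = np.asarray(x, dtype=float)
    for _ in range(passes):
        padded = np.concatenate([y[:1], y, y[-1:]])
        y = np.convolve(padded, kernel, mode="valid")
    return y

# Example: smooth a noisy series before fitting an AR or RBF model to it.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 6, 50)) + 0.2 * rng.standard_normal(50)
smoothed = binomial_smooth(series, passes=2)
```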
2626 Operating System Based Virtualization Models in Cloud Computing
Authors: Dev Ras Pandey, Bharat Mishra, S. K. Tripathi
Abstract:
Cloud computing is ready to transform the structure of businesses and learning by supplying real-time applications and providing immediate help to small and medium-sized businesses. The ability to run a hypervisor inside a virtual machine is an important feature of virtualization, called nested virtualization. In today's growing field of information technology, many virtualization models are available and provide a convenient approach to implementation, but selecting a single model is difficult. This paper explains the applications of operating-system-based virtualization in cloud computing and identifies a suitable model with respect to different specifications and user requirements. The most popular models are selected, with the selection based on container- and hypervisor-based virtualization. The selected models were compared against a wide range of user requirements, such as the number of CPUs, memory size, nested virtualization support, live migration and commercial support, and the most suitable virtualization model was identified.
Keywords: Virtualization, OS based virtualization, container and hypervisor based virtualization.
2625 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because of non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M^2) to O(M log M). The partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
Keywords: Integral differential equations, American options, jump-diffusion model, rational approximation.
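The O(M log M) matrix-vector product mentioned above applies when the discretized operator is circulant or Toeplitz, as is typical for the jump integral; a minimal illustrative sketch (not the authors' solver) follows.

```python
import numpy as np

def circulant_matvec(first_col: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Multiply a circulant matrix (defined by its first column) by x in
    O(M log M) using the FFT, instead of forming the full M x M matrix."""
    return np.real(np.fft.ifft(np.fft.fft(first_col) * np.fft.fft(x)))

# Check against the dense O(M^2) product.
M = 8
c = np.random.default_rng(1).standard_normal(M)
x = np.random.default_rng(2).standard_normal(M)
dense = np.array([[c[(i - j) % M] for j in range(M)] for i in range(M)])
assert np.allclose(dense @ x, circulant_matvec(c, x))
```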
2624 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis
Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior
Abstract:
Mathematical drying models are used to understand the drying process and to determine important parameters for the design and operation of the dryer. Jackfruit is a highly perishable fruit with high consumption in the Northeast region, and techniques are needed to preserve it for longer so that it can be spread to regions with low consumption. This study analyzed several mathematical models (Page, Lewis, and Midilli) to indicate the one that best fits the conditions of the convective drying process, using performance indicators associated with each model: the accuracy factor (Af), the bias factor (Bf), the root mean square error (RMSE) and the standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. The Midilli model was the most accurate, with Af = 1.39, Bf = 1.33, RMSE = 0.01% and SEP = 5.34. However, the Midilli model is not appropriate for process control purposes because it needs four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model.
Keywords: Drying, models, jackfruit.
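A minimal sketch of fitting the Page thin-layer model MR = exp(-k t^n) and computing its RMSE is shown below; the drying data are invented for illustration, not the experimental values of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def page_model(t, k, n):
    """Page thin-layer drying model: moisture ratio MR = exp(-k * t**n)."""
    return np.exp(-k * t**n)

# Hypothetical drying data (hours, dimensionless moisture ratio);
# NOT the measured values from the paper.
t = np.array([0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)
mr = np.array([0.90, 0.80, 0.62, 0.48, 0.37, 0.28, 0.21, 0.16, 0.12, 0.09])

params, _ = curve_fit(page_model, t, mr, p0=(0.1, 1.0))
k, n = params
pred = page_model(t, k, n)
rmse = np.sqrt(np.mean((mr - pred) ** 2))
print(f"k = {k:.4f}, n = {n:.4f}, RMSE = {rmse:.4f}")
```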
2623 Analysis of Textual Data Based on Multiple 2-Class Classification Models
Authors: Shigeaki Sakurai, Ryohei Orihara
Abstract:
This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. The method acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints, and uses the models to infer whether or not the viewpoints are assigned to new items. The method extracts expressions from the new items classified into the viewpoints and extracts characteristic expressions corresponding to the viewpoints by comparing the frequency of expressions among the viewpoints. This paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
Keywords: Text mining, Multiple viewpoints, Differential analysis, Questionnaire data
2622 Quality Parameters of Offset Printing Wastewater
Authors: Kiurski S. Jelena, Kecić S. Vesna, Aksentijević M. Snežana
Abstract:
Samples of tap water and wastewater were collected in three offset printing facilities in Novi Sad, Serbia. Ten physicochemical parameters were analyzed in all collected samples: pH, conductivity, m-alkalinity, p-alkalinity, acidity, carbonate concentration, hydrogen carbonate concentration, active oxygen content, chloride concentration and total alkali content. All measurements were conducted using standard analytical and instrumental methods. Comparing the results for tap water and wastewater, a clear quality difference was noticeable, since all physicochemical parameters were significantly higher in the wastewater samples. The study also involves the application of simple linear regression analysis to the obtained data set. Using the ORIGIN 5 software package, the pH value was correlated with the other physicochemical parameters. Based on the obtained values of the Pearson correlation coefficient, a strong correlation between chloride concentration and pH (r = -0.943), as well as between acidity and pH (r = -0.855), was determined. In addition, a statistically significant relationship with pH was obtained only for acidity and chloride concentration, since the F values (247.634 and 182.536) were higher than F-critical (5.59). In this way, the statistical analysis highlighted acidity and chloride concentration as the most influential parameters of water contamination in offset printing. The results showed that the variable dependence could be represented by the general regression model y = a0 + a1x + k, together with the corresponding regression plots.
Keywords: Pollution, printing industry, simple linear regression analysis, wastewater.
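A minimal sketch of the Pearson-correlation and simple-linear-regression step follows; the measurements are hypothetical, not the study's samples.

```python
import numpy as np
from scipy import stats

# Hypothetical measurements (NOT the study's data): pH and chloride
# concentration for a handful of wastewater samples.
ph = np.array([6.9, 7.2, 7.8, 8.1, 8.6, 9.0])
chloride = np.array([410.0, 380.0, 330.0, 300.0, 255.0, 220.0])

# Pearson correlation coefficient and p-value.
r, p_value = stats.pearsonr(ph, chloride)

# Simple linear regression: chloride = a0 + a1 * pH.
result = stats.linregress(ph, chloride)
print(f"r = {r:.3f}, p = {p_value:.4f}")
print(f"chloride = {result.intercept:.1f} + {result.slope:.1f} * pH")
```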
2621 Comparison of Different Channel Modeling Techniques Used in the BPLC Systems
Authors: Justinian Anatory, Nelson Theethayi
Abstract:
The paper compares different channel models used for modeling Broadband Power-Line Communication (BPLC) systems. The models compared are those of Zimmermann and Dostert, Philipps, Anatory et al., and the generalized transmission line (TL) model of Anatory et al. The validity of each model was compared in the time domain with the ATP-EMTP software, which uses a transmission-line approach. It is found that, for a power-line network with a minimum number of branches, all the models give similar signal/pulse time responses compared with the ATP-EMTP software; however, the Zimmermann and Dostert model indicates the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the results of the generalized TL theory approach are comparable with the ATP-EMTP results. A Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implications of such behavior for the modulation schemes. It is observed that using the Philipps model on an underground cable can predict a performance up to 25 dB better than the other channel models, which can misrepresent the actual performance of the system. Also, the modified Zimmermann and Dostert model under multipath can predict a performance about 5 dB better than that predicted by the generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on the generalized TL theory be used.
Keywords: Broadband power-line channel models, load impedance, branched network.
2620 Vision Based Hand Gesture Recognition Using Generative and Discriminative Stochastic Models
Authors: Mahmoud Elmezain, Samar El-shinawy
Abstract:
Many approaches to pattern recognition are founded on probability theory and can be broadly characterized as either generative or discriminative according to whether or not they model the distribution of the image features. Generative and discriminative models have very different characteristics, as well as complementary strengths and weaknesses. In this paper, we study these models to recognize the patterns of alphabet characters (A-Z) and numbers (0-9). To handle isolated patterns, a generative model, the Hidden Markov Model (HMM), and discriminative models such as the Conditional Random Field (CRF), Hidden Conditional Random Field (HCRF) and Latent-Dynamic Conditional Random Field (LDCRF), with different window sizes, are applied to the extracted pattern features. The gesture recognition rate improves initially as the window size increases, but degrades as the window size increases further. Experimental results show that the LDCRF gives the best results compared with CRF, HCRF and HMM at a window size of 4. Additionally, our results show that the overall recognition rates are 91.52%, 95.28%, 96.94% and 98.05% for CRF, HCRF, HMM and LDCRF, respectively.
Keywords: Statistical Pattern Recognition, Generative Model, Discriminative Model, Human Computer Interaction.
2619 Dynamic Load Modeling for Khuzestan Power System Voltage Stability Studies
Authors: M. Sedighizadeh, A. Rezazadeh
Abstract:
Based on the component approach, three kinds of dynamic load models, a single-motor model, a two-motor model and a composite load model, have been developed for stability studies of the Khuzestan power system. The study results are presented in this paper. Voltage instability is a dynamic phenomenon and therefore requires a dynamic representation of the power system components. Industrial loads contain a large fraction of induction machines, and several models of different complexity are available for describing them in such investigations. This study evaluates the dynamic performance of several dynamic load models in combination with the dynamics of a load-changing transformer. The case study is a steel industry substation in the Khuzestan power system.
Keywords: Dynamic load, modeling, voltage stability.
2618 The Effectiveness of Mineral Fertilization of Winter Wheat by Nitrogen in the Soil and Climatic Conditions in the CR
Authors: Václav Voltr, Jan Leština
Abstract:
The basis of the study is a survey of 500 in the years 2002-2010, selected according to the homogeneity of land cover, in which 1090 revenues were evaluated. For the achieved yields of winter wheat, a multicriterial regression function is obtained depending on the major factors influencing the consumption of nitrogen. The coefficient of determination of the established model is 0.722. The increase in fertilization efficiency is associated with the supply of organic nutrients, tillage, soil pH, past weather, the humus content in the subsoil and the content of grains up to 0.001 mm. The decrease in efficiency was mainly influenced by the total dose of mineral nitrogen (even though it was divided into multiple doses), the proportion of loamy particles up to 0.01 mm, and rainy or, conversely, dry weather during the vegetation period. The efficiency of nitrogen was found to be the smallest on undeveloped soils and the highest on chernozem and alluvial soils.
Keywords: Nitrogen efficiency, winter wheat, regression model.
2617 Stochastic Learning Algorithms for Modeling Human Category Learning
Authors: Toshihiko Matsuka, James E. Corter
Abstract:
Most neural network (NN) models of human category learning use a gradient-based learning method, which assumes that locally optimal changes are made to model parameters on each learning trial. This method tends to underpredict variability in individual-level cognitive processes. In addition, many recent models of human category learning have been criticized for not being able to replicate the rapid changes in categorization accuracy and attention processes observed in empirical studies. In this paper we introduce stochastic learning algorithms for NN models of human category learning and show that use of the algorithms can result in (a) rapid changes in accuracy and attention allocation, and (b) different learning trajectories and more realistic variability at the individual level.
Keywords: Category learning, cognitive modeling, radial basis function, stochastic optimization.
2616 Personal Information Classification Based on Deep Learning in Automatic Form Filling System
Authors: Shunzuo Wu, Xudong Luo, Yuanxiu Liao
Abstract:
Recently, the rapid development of deep learning has made artificial intelligence (AI) penetrate many fields, replacing manual work there. In particular, AI systems have also become a research focus in the field of office automation. To meet real needs in office automation, in this paper we develop an automatic form-filling system. Specifically, it uses two classical neural network models and several word embedding models to classify various relevant information elicited from the Internet. When training the neural network models, we use less noisy and balanced data. We conduct a series of experiments to test our system, and the results show that it can achieve better classification results.
Keywords: Personal information, deep learning, auto fill, NLP, document analysis.
2615 Elastic and Plastic Collision Comparison Using Finite Element Method
Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier
Abstract:
The prediction of post-impact conditions and of the behavior of the bodies during impact has been the object of several collision models. The formulation from Hertz's theory, dating from the 19th century, is generally used. These models consider the repulsive force as proportional to the deformation of the bodies in contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying plastic deformation to the model according to the material definition: the Johnson-Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows the permanent deformation caused by impact to be analyzed, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the model based on Hertz's theory.
Keywords: Collision, finite element method, Hertz’s Theory, impact models.
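For reference, the Johnson-Cook flow-stress law mentioned above has the standard form shown below; the symbols follow the usual convention, not necessarily the authors' notation.

```latex
% Johnson-Cook flow stress:
% A, B, n, C, m are material constants fitted from empirical tests,
% \varepsilon_p is the equivalent plastic strain,
% \dot{\varepsilon}^* = \dot{\varepsilon}/\dot{\varepsilon}_0 is the dimensionless strain rate,
% T^* = (T - T_{room})/(T_{melt} - T_{room}) is the homologous temperature.
\sigma = \left( A + B\,\varepsilon_p^{\,n} \right)
         \left( 1 + C \ln \dot{\varepsilon}^* \right)
         \left( 1 - {T^*}^{\,m} \right)
```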
2614 The Association between C-Reactive Protein and Hypertension of Different United States Participants Categorized by Ethnicity: Applying the National Health and Nutrition Examination Survey from 1999-2010
Authors: Ghada Abo-Zaid
Abstract:
Objectives: The main objective of this study was to examine the association between elevated levels of C-reactive protein (CRP) and the incidence of hypertension before and after adjustment for age, BMI, gender, SES, smoking, diabetes, LDL cholesterol and HDL cholesterol, and to determine whether the association differs by race. Method: Cross-sectional data for participants aged 17 to 74 years, included in the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2010, were analyzed. The CRP level was classified into three categories (> 3 mg/L, between 1 mg/L and 3 mg/L, and < 1 mg/L). Blood pressure was categorized using the JNC 7 criteria. Hypertension is defined as either systolic blood pressure (SBP) of 140 mmHg or more or diastolic blood pressure (DBP) of 90 mmHg or more, or otherwise a self-reported prior diagnosis by a physician. Pre-hypertension was defined as 139 ≥ SBP > 120 or 89 ≥ DBP > 80. A multinomial regression model was used to measure the association between CRP level and hypertension. Results: In univariable models, CRP concentrations > 3 mg/L were associated with a 73% greater risk of incident hypertension compared with CRP concentrations < 1 mg/L (hypertension: odds ratio [OR] = 1.73; 95% confidence interval [CI], 1.50-1.99). Ethnic comparisons showed that American Mexicans had the highest risk of incident hypertension (OR = 2.39; 95% CI, 2.21-2.58). This risk was statistically insignificant after controlling for the other variables (hypertension: OR = 0.75; 95% CI, 0.52-1.08), or when categorized by race [American Mexican: OR = 1.58; 95% CI, 0.58-4.26; other Hispanic: OR = 0.87; 95% CI, 0.19-4.42; non-Hispanic white: OR = 0.90; 95% CI, 0.50-1.59; non-Hispanic black: OR = 0.44; 95% CI, 0.22-0.87]. The same results were found for pre-hypertension, and the non-Hispanic black segment showed the highest significant risk for pre-hypertension (OR = 1.60; 95% CI, 1.26-2.03). When CRP concentrations were between 1.0 and 3.0 mg/L in unadjusted models, prehypertension was associated with a higher likelihood of elevated CRP (OR = 1.37; 95% CI, 1.15-1.62). The same relationship was maintained in non-Hispanic whites, non-Hispanic blacks and the other-race group (non-Hispanic white: OR = 1.24; 95% CI, 1.03-1.48; non-Hispanic black: OR = 1.60; 95% CI, 1.27-2.03; other race: OR = 2.50; 95% CI, 1.32-4.74), while the association was insignificant for American Mexicans and other Hispanics. In the adjusted model, the relationship between CRP and prehypertension was no longer present. In contrast, hypertension was not independently associated with elevated CRP, and the results were the same after grouping by race or adjusting for the possible confounder variables. The same results were obtained when SBP or DBP were treated as continuous measures. Conclusions: This study confirmed an association between hypertension, prehypertension and elevated CRP levels; however, the association was no longer present after adjusting for other variables. Ethnic group differences were statistically significant in the univariable models but disappeared after controlling for other variables.
Keywords: CRP, hypertension, ethnicity, NHANES, blood pressure.
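A minimal sketch of the multinomial logistic step, producing odds ratios for CRP categories, is given below; the data frame is synthetic, standing in for the NHANES records, and the covariates only loosely mirror those named above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data: CRP category and two covariates against a
# three-level blood-pressure outcome (0 = normotensive, 1 = pre-hypertension,
# 2 = hypertension). NOT the NHANES records analyzed in the paper.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "bp_cat": rng.integers(0, 3, n),
    "crp_cat": rng.integers(0, 3, n),   # 0: <1, 1: 1-3, 2: >3 mg/L
    "age": rng.uniform(17, 74, n),
    "bmi": rng.normal(27, 5, n),
})

# Dummy-code the CRP categories (reference: <1 mg/L) and add covariates.
X = sm.add_constant(
    pd.get_dummies(df["crp_cat"], prefix="crp", drop_first=True)
      .astype(float)
      .join(df[["age", "bmi"]]))

model = sm.MNLogit(df["bp_cat"], X).fit(disp=False)
odds_ratios = np.exp(model.params)   # one column per non-reference outcome
print(odds_ratios)
```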
2613 Analytical and Experimental Study on the Effect of Air-Core Coil Parameters on Magnetic Force Used in a Linear Optical Scanner
Authors: Loke Kean Koay, Horizon Gitano-Briggs, Mani Maran Ratnam
Abstract:
Today, air-core coils (ACC) are a viable alternative to ferrite-core coils in a range of applications due to their low induction effect. An analytical study was carried out, and the results were used as a guide to understand the relationship between the magnet-coil distance and the resulting attractive magnetic force. Four different ACC models were fabricated for the experimental study. The variations among the models included the dimensions, the number of coil turns and the current supplied to the coil. Comparison between the analytical and experimental results for all the models shows an average discrepancy of less than 10%. An optimized ACC design that provides the maximum magnetic force was selected for the scanner.
Keywords: Air-core coils, electromagnetic, linear optical scanner.
2612 Effect of Alloying Elements and Hot Forging/Rolling Reduction Ratio on Hardness and Impact Toughness of Heat Treated Low Alloy Steels
Authors: Mahmoud M. Tash
Abstract:
The present study was carried out to investigate the effect of alloying elements and thermo-mechanical treatment (TMT), i.e. hot rolling and forging with different reduction ratios, on the hardness (HV) and impact toughness (J) of heat-treated low alloy steels. By understanding the combined effect of TMT and alloying elements, and by measuring the hardness and impact toughness resulting from different heat treatments following TMT of the low alloy steels, it is possible to determine which conditions yield optimum mechanical properties and a high strength-to-weight ratio. Experimental correlations between the hot-work reduction ratio, hardness and impact toughness of the thermo-mechanically treated and heat-treated low alloy steels are analyzed quantitatively, and both regression and mathematical models of hardness and impact toughness are developed.
Keywords: Hot forging, hot rolling, heat treatment, hardness (HV), impact toughness (J), microstructure, low alloy steels.
2611 Performance Analysis of Software Reliability Models Using Matrix Method
Authors: RajPal Garg, Kapil Sharma, Rajive Kumar, R. K. Garg
Abstract:
This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, the software failure rate and software reliability. The selection of the optimal SRM for use in a particular case has been an area of interest for researchers in the field of software reliability. Tools and techniques for software reliability model selection found in the literature cannot be used with a high level of confidence, as they rely on a limited number of model selection criteria. A real data set of a middle-sized software project from published papers is used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent of the criteria matrix formed for each model from the comparison criteria. The software reliability model with the highest value of the permanent is ranked first, and so on.
Keywords: Matrix method, model ranking, model selection, model selection criteria, software reliability models.
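The permanent of the criteria matrix can be computed exactly with Ryser's formula; a small sketch follows (illustrative only, the criteria matrices themselves are the paper's).

```python
from itertools import combinations
import numpy as np

def matrix_permanent(a) -> float:
    """Permanent of a square matrix via Ryser's formula, O(2^n * n)."""
    a = np.asarray(a, dtype=float)
    n = a.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            # Product over rows of the sums restricted to the chosen columns.
            row_sums = a[:, list(subset)].sum(axis=1)
            total += (-1) ** k * np.prod(row_sums)
    return (-1) ** n * total

# Example: the permanent of [[1, 2], [3, 4]] is 1*4 + 2*3 = 10.
print(matrix_permanent([[1, 2], [3, 4]]))   # 10.0
```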
2610 Anticipation of Bending Reinforcement Based on Iranian Concrete Code Using Meta-Heuristic Tools
Authors: Seyed Sadegh Naseralavi, Najmeh Bemani
Abstract:
In this paper, different concrete codes, including those of America, New Zealand, Mexico, Italy, India, Canada, Hong Kong, the Euro Code and Britain, are compared with the Iranian concrete design code. First, by using an Adaptive Neuro-Fuzzy Inference System (ANFIS), the codes having the most correlation with the ninth issue of the Iranian national regulation are determined. Consequently, two prediction methods are used for comparing the codes: an Artificial Neural Network (ANN) and multi-variable regression. The results show that the ANN performs better. Prediction is done using only the tensile steel ratio, ignoring the compression steel ratio.
Keywords: Concrete design code, anticipate method, artificial neural network, multi-variable regression, adaptive neuro fuzzy inference system.
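A minimal sketch of the ANN-versus-regression comparison is given below with made-up data; the real inputs, codes and reinforcement values belong to the authors' study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: tensile steel ratio -> a reinforcement-related target.
# These values are invented for illustration, not taken from any design code.
rng = np.random.default_rng(0)
X = rng.uniform(0.002, 0.02, size=(300, 1))          # tensile steel ratio
y = 1200 * X[:, 0] + 15 * np.sqrt(X[:, 0]) + rng.normal(0, 0.05, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)
lin = LinearRegression().fit(X_tr, y_tr)

print("ANN R^2:", ann.score(X_te, y_te))
print("Linear regression R^2:", lin.score(X_te, y_te))
```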
2609 Designing of the Heating Process for Fiber-Reinforced Thermoplastics with Middle-Wave Infrared Radiators
Abstract:
Manufacturing components of fiber-reinforced thermoplastics requires three steps: heating the matrix, forming and consolidating the composite, and finally cooling the matrix. For the heating process, a pre-determined temperature distribution through the layers and the thickness of the pre-consolidated sheets is recommended to enable the forming mechanism. Thus, a design of the heating process for forming composites with thermoplastic matrices is necessary. To obtain a constant temperature through the thickness and width of the sheet, the heating process was analyzed with the help of the finite element method. The simulation models were validated by experiments with resistance thermometers as well as with an infrared camera. Based on the finite element simulation, heating methods for the infrared radiators have been developed. Using the numerical simulation, many iteration loops are required to determine the process parameters. Hence, a model for calculating the relevant process parameters was initiated by applying regression functions.
Keywords: Fiber-reinforced thermoplastics, heating strategies, middle-wave infrared radiator.
2608 CART Method for Modeling the Output Power of Copper Bromide Laser
Authors: Iliycho P. Iliev, Desislava S. Voynikova, Snezhana G. Gocheva-Ilieva
Abstract:
This paper examines the available experimental data for a copper bromide vapor laser (CuBr laser), emitting at two wavelengths, 510.6 and 578.2 nm. The laser output power is estimated based on 10 independent input physical parameters. A classification and regression tree (CART) model is obtained which describes 97% of the data. The resulting binary CART tree specifies which input parameters considerably influence each of the classification groups. This allows for a technical assessment that indicates which of these are the most significant for the manufacture and operation of the type of laser under consideration. The predicted values of the laser output power are also obtained depending on the classification. This aids the design and development processes considerably.
Keywords: Classification and regression trees (CART), Copper Bromide laser (CuBr laser), laser generation, nonparametric statistical model.
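A minimal CART regression sketch of the kind described, with ten synthetic input parameters standing in for the laser's physical parameters, is shown below.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Illustrative CART regression: 10 input parameters -> output power.
# The feature matrix is random noise plus a simple dependence, standing in
# for the laser experiment data, which are not reproduced here.
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, size=(200, 10))        # 10 physical input parameters
y = 50 * X[:, 0] + 20 * X[:, 3] ** 2 + rng.normal(0, 1, 200)   # output power

tree = DecisionTreeRegressor(max_depth=4, min_samples_leaf=10, random_state=0)
tree.fit(X, y)

print("R^2 on training data:", tree.score(X, y))
print("Feature importances:", np.round(tree.feature_importances_, 3))
```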
2607 The Application of an Ensemble of Boosted Elman Networks to Time Series Prediction: A Benchmark Study
Authors: Chee Peng Lim, Wei Yee Goh
Abstract:
In this paper, the application of multiple Elman neural networks to time series data regression problems is studied. An ensemble of Elman networks is formed by boosting to enhance the performance of the individual networks. A modified version of the AdaBoost algorithm is employed to integrate the predictions from multiple networks. Two benchmark time series data sets, i.e., the Sunspot and Box-Jenkins gas furnace problems, are used to assess the effectiveness of the proposed system. The simulation results reveal that an ensemble of boosted Elman networks can achieve a higher degree of generalization as well as performance than that of the individual networks. The results are compared with those from other learning systems, and implications of the performance are discussed.
Keywords: AdaBoost, Elman network, neural network ensemble, time series regression.
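A rough sketch of boosting neural-network regressors for one-step-ahead prediction follows; a feedforward MLP stands in for the Elman (recurrent) network, which scikit-learn does not provide, and the boosting variant is the library's AdaBoost.R2, not the modified algorithm of the paper.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.neural_network import MLPRegressor

# Synthetic series standing in for the Sunspot / gas-furnace benchmarks.
rng = np.random.default_rng(0)
series = np.sin(np.arange(300) * 0.2) + 0.1 * rng.standard_normal(300)

lag = 4   # embed the series: predict x[t] from x[t-4..t-1]
X = np.column_stack([series[i:len(series) - lag + i] for i in range(lag)])
y = series[lag:]

# The `estimator` argument requires scikit-learn >= 1.2.
ensemble = AdaBoostRegressor(
    estimator=MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                           random_state=0),
    n_estimators=10, loss="linear", random_state=0)
ensemble.fit(X[:250], y[:250])
print("Test R^2:", ensemble.score(X[250:], y[250:]))
```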
2606 Household Demand for Solid Waste Disposal Options in Malaysia
Authors: Pek Chuen-Khee, Jamal Othman
Abstract:
This paper estimates the economic values of household preferences for enhanced solid waste disposal services in Malaysia. The contingent valuation (CV) method estimates an average additional monthly willingness-to-pay (WTP) in solid waste management charges of €0.77 to €0.80 for improved waste disposal service quality. The finding of a slightly higher WTP from the generic CV question than from the label-specific one further reveals a higher WTP for sanitary landfill, at €0.90, than for incineration, at €0.63. This suggests that sanitary landfill is the more preferred alternative. The logistic regression estimation procedure reveals that households' concern about where their rubbish is disposed, age, house ownership, household income and the format of the CV question are significant factors influencing WTP.
Keywords: Contingent valuation, logistic regression, solid waste disposal, willingness-to-pay.
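A minimal sketch of the logistic-regression step of a contingent valuation study is given below with synthetic survey responses; the variables only loosely mirror those named above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic survey responses (NOT the Malaysian data): model the probability
# that a household accepts a proposed extra monthly charge ("bid").
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "accept_bid": rng.integers(0, 2, n),      # 1 = willing to pay the bid
    "bid": rng.choice([0.5, 0.8, 1.0, 1.5], n),
    "income": rng.normal(3000, 800, n),
    "age": rng.integers(20, 70, n),
    "owns_house": rng.integers(0, 2, n),
})

X = sm.add_constant(df[["bid", "income", "age", "owns_house"]])
logit = sm.Logit(df["accept_bid"], X).fit(disp=False)
print(logit.summary())
# Under a linear-in-bid specification, mean WTP can be estimated as
# -(intercept + other terms at their means) / bid coefficient.
```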