Search results for: time series regression.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7656

7206 Predictor Factors for Treatment Failure among Patients on Second Line Antiretroviral Therapy

Authors: Mohd. A. M. Rahim, Yahaya Hassan, Mathumalar L. Fahrni

Abstract:

A second line antiretroviral therapy (ART) regimen is used when patients fail their first line regimen. Many factors, such as non-adherence, drug resistance, and virological and immunological failure, lead to failure of second line highly active antiretroviral therapy (HAART) regimens. This study aimed to determine predictor factors for treatment failure with second line HAART and to analyse median survival times. An observational, retrospective study was conducted in Sungai Buloh Hospital (HSB) to assess the current status of HIV patients treated with second line HAART regimens. Convenience sampling was used, and 104 patients were included based on the study's inclusion and exclusion criteria. Data were collected for six months, from July until December 2013, and analysed using SPSS version 18. Kaplan-Meier and Cox regression analyses were used to estimate median survival times and to identify predictor factors for treatment failure. The study population consisted mainly of male subjects, aged 30-45 years, who were heterosexual and had had HIV infection for less than 6 years. The most common second line HAART regimen given was a lopinavir/ritonavir (LPV/r)-based combination. Kaplan-Meier analysis showed that patients on LPV/r had longer median survival times than patients on an indinavir/ritonavir (IDV/r)-based combination (p<0.001). The most common reason for treatment failure with second line HAART was non-adherence. Based on Cox regression analysis, the other predictor factors for treatment failure with a second line HAART regimen were age and mode of HIV transmission.
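The survival workflow described in this abstract (Kaplan-Meier median survival times per regimen plus a Cox proportional-hazards model for predictors of failure) can be sketched in Python with the lifelines library. This is an illustrative sketch only, not the study's code; the file name, column names and covariates are assumptions.

```python
# Minimal sketch (not the authors' code) of a Kaplan-Meier / Cox regression analysis.
# File and column names are hypothetical; covariates are assumed numerically encoded.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("second_line_haart.csv")   # assumed: one row per patient

# Kaplan-Meier median survival time per regimen (e.g. LPV/r vs IDV/r)
kmf = KaplanMeierFitter()
for regimen, grp in df.groupby("regimen"):
    kmf.fit(grp["months_to_failure"], event_observed=grp["failed"], label=regimen)
    print(regimen, "median survival:", kmf.median_survival_time_)

# Cox proportional-hazards model for predictor factors of treatment failure
cph = CoxPHFitter()
cph.fit(df[["months_to_failure", "failed", "age", "mode_of_transmission", "adherent"]],
        duration_col="months_to_failure", event_col="failed")
cph.print_summary()   # hazard ratios and p-values for each covariate
```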

Keywords: Adherence, antiretroviral therapy, second line, treatment failure.

7205 Using Support Vector Machine for Predicting Dynamic Voltage Collapse in an Actual Power System

Authors: Muhammad Nizam, Azah Mohamed, Majid Al-Dabbagh, Aini Hussain

Abstract:

This paper presents dynamic voltage collapse prediction on an actual power system using support vector machines. Dynamic voltage collapse prediction is first determined based on the PTSI calculated from information in the dynamic simulation output. Simulations were carried out on a practical 87-bus test system by considering load increase as the contingency. The data collected from the time domain simulation are then used as input to the SVM, in which support vector regression serves as the predictor of the dynamic voltage collapse indices of the power system. To reduce training time and improve the accuracy of the SVM, the kernel function type and kernel parameter are considered. To verify the effectiveness of the proposed SVM method, its performance is compared with a multilayer perceptron neural network (MLPNN). Studies show that the SVM gives faster and more accurate results for dynamic voltage collapse prediction than the MLPNN.
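As a rough illustration of the comparison described above, the sketch below fits a support vector regressor and a multilayer perceptron to the same features and compares test error. The kernel choice, its parameters, and all data names are assumptions for illustration, not the paper's settings.

```python
# Illustrative only: SVR vs. MLP for predicting a voltage-collapse index (PTSI).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

X = np.load("bus_features.npy")   # hypothetical features from time-domain simulation
y = np.load("ptsi.npy")           # hypothetical voltage-collapse indices
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Kernel type and kernel parameter (gamma) are the tuning knobs mentioned in the text.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, gamma=0.1))
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))

for name, model in [("SVR", svr), ("MLPNN", mlp)]:
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name} test RMSE: {rmse:.4f}")
```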

Keywords: Dynamic voltage collapse, prediction, artificial neural network, support vector machines

7204 How to Use E-Learning to Increase Job Satisfaction in Large Commercial Bank in Bangkok

Authors: Teerada Apibunyopas, Nithinant Thammakoranonta

Abstract:

Many organizations adopt e-Learning as a tool in their training and human development departments. It is becoming more popular because knowledge can be accessed at any time and because it provides rich content that can develop employees' skills efficiently. This study focuses on the factors that affect the efficient use of e-Learning and thereby increase job satisfaction. Questionnaires were sent to employees of large commercial banks in Bangkok that use e-Learning. The results from multiple linear regression analysis showed that employees' characteristics, the characteristics of e-Learning, and learning and growth influence job satisfaction.

Keywords: e-Learning, Job Satisfaction, Learning and growth.

7203 Choosing between the Regression Correlation, the Rank Correlation, and the Correlation Curve

Authors: Roger L Goodwin

Abstract:

This paper presents a rank correlation curve. The traditional correlation coefficient is valid for both continuous variables and for integer variables using rank statistics. Since the correlation coefficient has already been established in rank statistics by Spearman, such a calculation can be extended to the correlation curve. This paper presents two survey questions. The survey collected non-continuous variables. We will show weak to moderate correlation. Obviously, one question has a negative effect on the other. A review of the qualitative literature can answer which question and why. The rank correlation curve shows which collection of responses has a positive slope and which collection of responses has a negative slope. Such information is unavailable from the flat, "first-glance" correlation statistics.

Keywords: Bayesian estimation, regression model, rank statistics, correlation, correlation curve.

7202 Automatic Sleep Stage Scoring with Wavelet Packets Based on Single EEG Recording

Authors: Luay A. Fraiwan, Natheer Y. Khaswaneh, Khaldon Y. Lweesy

Abstract:

Sleep stage scoring is the process of classifying the sleep stage that the subject is in. Sleep is classified into two states based on the constellation of physiological parameters: non-rapid eye movement (NREM) and rapid eye movement (REM). NREM sleep is further classified into four stages (1-4). These stages and the state of wakefulness are distinguished from each other based on brain activity. In this work, a classification method for automated sleep stage scoring based on a single EEG recording using wavelet packet decomposition was implemented. Thirty-two polysomnographic recordings from the MIT-BIH database were used for training and validation of the proposed method. A single EEG recording was extracted and smoothed using a Savitzky-Golay filter. Wavelet packet decomposition up to the fourth level, based on a 20th-order Daubechies filter, was used to extract features from the EEG signal. A feature vector of 54 features was formed and then reduced to a size of 25 using the gain ratio method before being fed into a classifier of regression trees. The regression trees were trained using 67% of the available records, with the training records selected by cross validation. The remaining records were used for testing the classifier. The overall correct rate of the proposed method was found to be around 75%, which is acceptable compared to the techniques in the literature.
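A rough sketch (not the authors' implementation) of the feature pipeline described above follows: Savitzky-Golay smoothing, a level-4 db20 wavelet-packet decomposition, per-node energy features, and a tree classifier. The array names, epoch layout, smoothing window and energy features are assumptions.

```python
# Sketch of wavelet-packet feature extraction from EEG epochs plus a tree classifier.
import numpy as np
import pywt
from scipy.signal import savgol_filter
from sklearn.tree import DecisionTreeClassifier

def wp_energy_features(epoch, wavelet="db20", level=4):
    """Energy of each terminal wavelet-packet node of one smoothed EEG epoch."""
    smoothed = savgol_filter(epoch, window_length=11, polyorder=3)
    wp = pywt.WaveletPacket(data=smoothed, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(node.data ** 2)
                     for node in wp.get_level(level, order="natural")])

epochs = np.load("eeg_epochs.npy")     # hypothetical (n_epochs, n_samples) array
stages = np.load("sleep_stages.npy")   # hypothetical labels: W, REM, stages 1-4
X = np.vstack([wp_energy_features(e) for e in epochs])

# roughly two-thirds of the records for training, the rest for testing
clf = DecisionTreeClassifier(random_state=0).fit(X[:700], stages[:700])
print("correct rate:", clf.score(X[700:], stages[700:]))
```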

Keywords: Feature selection, regression trees, sleep stage scoring, wavelet packets.

7201 Research on the Optimization of the Facility Layout of Efficient Cafeterias for Troops

Authors: Qing Zhang, Jiachen Nie, Yujia Wen, Guanyuan Kou, Peng Yu, Kun Xia, Qin Yang, Li Ding

Abstract:

Background: A facility layout problem (FLP) is an NP-complete (non-deterministic polynomial) problem, for which it is hard to obtain an exact optimal solution. FLP has been widely studied in various limited spaces and workflows. For example, troop cafeterias with many types of equipment suffer chaotic processes during dining. Objective: This article aimed to optimize the layout of a troops' cafeteria and improve the overall efficiency of the dining process. Methods: First, the original cafeteria layout design scheme was analyzed from an ergonomic perspective and two new design schemes were generated. Next, three facility layout models were designed, and simulation was applied to compare the total time and density of troops under each scheme. Last, an experiment on the dining process with video observation and analysis verified the simulation results. Results: In simulation, the dining time under the second new layout was shortened by 2.25% and 1.89% (p<0.0001, p=0.0001) compared with the other two layouts, while troop-flow density and interference were both greatly reduced in the two new layouts. In the experiment, process completion time and the number of interferences were also reduced, which verified the corresponding simulation results. Conclusion: The two new layout schemes were shown to be superior by a series of simulations and field experiments. In future research, similar approaches could be applied while taking layout-design algorithm calculation into consideration.

Keywords: Troops’ cafeteria, layout optimization, dining efficiency, AnyLogic simulation, field experiment

7200 Analysis and Research of Two-Level Scheduling Profile for Open Real-Time System

Authors: Yongxian Jin, Jingzhou Huang

Abstract:

In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. A two-level scheduling scheme for open real-time systems is introduced, and it is pointed out that when hard and soft real-time applications are scheduled non-distinctively as the same type of real-time application, Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between hard real-time applications and soft ones, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend much effort to guarantee that all soft real-time applications meet their deadlines, and doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds a process for dealing with soft real-time applications is proposed. Finally, the scheduling mechanism is verified both theoretically and experimentally. The results indicate that the mechanism achieves the following objectives: (1) it reflects the difference in priority when scheduling hard and soft real-time applications; (2) it ensures the schedulability of hard real-time applications, i.e. their deadline miss rate is 0; (3) the overall deadline miss rate of soft real-time applications is less than 1; (4) no deadline is set for non-real-time applications, yet the scheduling algorithm used by server S0 avoids the "starvation" of jobs and increases QoS. As a result, the scheduling mechanism is more compatible with different types of applications and can be applied more widely.

Keywords: Hard real-time, two-level scheduling profile, open real-time system, non-distinctive schedule, soft real-time

7199 Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets

Authors: Mohammad Ali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi

Abstract:

In this paper, we first introduce the stable distribution, the stable process and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the α-stable distribution in telecommunications and computer science, such as network delays and signal processing, and in financial markets. Finally, we focus on using stable distributions to estimate measures of risk in stock markets and show simulated data obtained with statistical software.
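The risk-estimation idea in the last sentence can be sketched with SciPy's alpha-stable distribution: simulate heavy-tailed returns and read off an empirical Value-at-Risk, then compare with a Gaussian fit. The parameter values below are illustrative assumptions only.

```python
# Sketch: empirical VaR from simulated alpha-stable returns vs. a Gaussian fit.
import numpy as np
from scipy.stats import levy_stable, norm

alpha, beta = 1.7, 0.0            # heavy-tailed, symmetric (SαS when beta = 0); assumed
returns = levy_stable.rvs(alpha, beta, loc=0.0, scale=0.01,
                          size=50_000, random_state=0)

# 99% VaR: the loss threshold exceeded on 1% of days
var_stable = -np.percentile(returns, 1)
var_normal = -norm.ppf(0.01, loc=returns.mean(), scale=returns.std())
print(f"99% VaR, stable sample: {var_stable:.4f}   Gaussian fit: {var_normal:.4f}")
```

The heavier tails of the stable sample typically yield a noticeably larger VaR than the Gaussian fit, which is the practical point of using stable distributions for risk measurement.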

Keywords: Stable distribution, SαS, infinite variance, heavy-tailed networks, VaR.

7198 A 1H NMR-Linked PCR Modelling Strategy for Tracking the Fatty Acid Sources of Aldehydic Lipid Oxidation Products in Culinary Oils Exposed to Simulated Shallow-Frying Episodes

Authors: Martin Grootveld, Benita Percival, Sarah Moumtaz, Kerry L. Grootveld

Abstract:

Objectives/Hypotheses: The adverse health effect potential of dietary lipid oxidation products (LOPs) has evoked much clinical interest. Therefore, we employed a 1H NMR-linked Principal Component Regression (PCR) chemometrics modelling strategy to explore relationships between data matrices comprising (1) aldehydic LOP concentrations generated in culinary oils/fats when exposed to laboratory-simulated shallow frying practices, and (2) the prior saturated (SFA), monounsaturated (MUFA) and polyunsaturated fatty acid (PUFA) contents of such frying media (FM), together with their heating time-points at a standard frying temperature (180 °C). Methods: Corn, sunflower, extra virgin olive, rapeseed, linseed, canola, coconut and MUFA-rich algae frying oils, together with butter and lard, were heated according to laboratory-simulated shallow-frying episodes at 180 °C, and FM samples were collected at time-points of 0, 5, 10, 20, 30, 60, and 90 min (n = 6 replicates per sample). Aldehydes were determined by 1H NMR analysis (Bruker AV 400 MHz spectrometer). The first (dependent output variable) PCR data matrix comprised aldehyde concentration scores vectors (PC1* and PC2*), whilst the second (predictor) one incorporated those from the fatty acid content/heating time variables (PC1-PC4) and their first-order interactions. Results: Structurally complex trans,trans- and cis,trans-alka-2,4-dienals, 4,5-epoxy-trans-2-alkenals and 4-hydroxy-/4-hydroperoxy-trans-2-alkenals (group I aldehydes predominantly arising from PUFA peroxidation) strongly and positively loaded on PC1*, whereas n-alkanals and trans-2-alkenals (group II aldehydes derived from both MUFA and PUFA hydroperoxides) strongly and positively loaded on PC2*. PCR analysis of these scores vectors (SVs) demonstrated that PCs 1 (positively-loaded linoleoylglycerols and [linoleoylglycerol]:[SFA] content ratio), 2 (positively-loaded oleoylglycerols and negatively-loaded SFAs), 3 (positively-loaded linolenoylglycerols and [PUFA]:[SFA] content ratios), and 4 (exclusively orthogonal sampling time-points) all powerfully contributed to aldehydic PC1* SVs (p < 10^-3 to < 10^-9), as did all PC1-PC3 x PC4 interaction ones (p < 10^-5 to < 10^-9). PC2* was also markedly dependent on all the above PC SVs (PC2 > PC1 and PC3), and on the interactions of PC1 and PC2 with PC4 (p < 10^-9 in each case), but not on the PC3 x PC4 contribution. Conclusions: NMR-linked PCR analysis is a valuable strategy for (1) modelling the generation of aldehydic LOPs in heated cooking oils and other FM, and (2) tracking their unsaturated fatty acid (UFA) triacylglycerol sources therein.
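A much simplified sketch of a principal component regression of this kind is given below: PCA on the fatty-acid/heating-time predictor block, followed by least squares on the resulting scores. The file names, number of components and variable layout are assumptions, not the study's specification.

```python
# Simplified PCR sketch: PCA on the predictor block, then least squares on the scores.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

X = np.load("fm_composition_and_time.npy")   # hypothetical: SFA/MUFA/PUFA contents, heating time
Y = np.load("aldehyde_concentrations.npy")   # hypothetical: aldehyde class concentrations

pcr = make_pipeline(StandardScaler(), PCA(n_components=4), LinearRegression())
pcr.fit(X, Y)

print("variance explained by the 4 predictor PCs:",
      pcr.named_steps["pca"].explained_variance_ratio_)
print("R^2 of the aldehyde block on those scores:", pcr.score(X, Y))
```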

Keywords: Frying oils, frying episodes, lipid oxidation products, cytotoxic/genotoxic aldehydes, chemometrics, principal component regression, NMR Analysis.

7197 The Effects and Interactions of Synthesis Parameters on Properties of Mg Substituted Hydroxyapatite

Authors: S. Sharma, U. Batra, S. Kapoor, A. Dua

Abstract:

In this study, the effects and interactions of reaction time and capping agent assistance during sol-gel synthesis of magnesium substituted hydroxyapatite nanopowder (MgHA) on the hydroxyapatite (HA) to β-tricalcium phosphate (β-TCP) ratio, the Ca/P ratio and the mean crystallite size were examined experimentally as well as through statistical analysis. MgHA nanopowders were synthesized by the sol-gel technique at room temperature using an aqueous solution of calcium nitrate tetrahydrate, magnesium nitrate hexahydrate and potassium dihydrogen phosphate as starting materials. The reaction time for sol-gel synthesis was varied between 15 and 60 minutes. Two process routes were followed, with and without the addition of triethanolamine (TEA) to the solutions. The elemental compositions of the as-synthesized powders were determined using X-ray fluorescence (XRF) spectroscopy. The functional groups present in the as-synthesized MgHA nanopowders were established through Fourier Transform Infrared Spectroscopy (FTIR). The amounts of phases present, the Ca/P ratio and the mean crystallite sizes of the MgHA nanopowders were determined using X-ray diffraction (XRD). The HA content in the biphasic mixture of HA and β-TCP and the Ca/P ratio in the as-synthesized MgHA nanopowders increased effectively with the reaction time of the sols (p<0.0001, two-way ANOVA); however, these were independent of TEA addition (p>0.15, two-way ANOVA). The MgHA nanopowders synthesized with TEA assistance exhibited a 14 nm smaller crystallite size (p<0.018, 2-sample t-test) compared to the powder synthesized without TEA assistance.

Keywords: Capping agent, hydroxyapatite, regression analysis, sol-gel, 2-sample t-test, two-way ANOVA.

7196 Comparative Study - Three Artificial Intelligence Techniques for Rain Domain in Precipitation Forecast

Authors: Nabilah Filzah Mohd Radzuan, Andi Putra, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Precipitation forecasting is important for avoiding natural disasters, which can cause losses in the affected area. This review paper covers three artificial intelligence techniques, namely logistic regression, decision trees, and random forests, which are used in precipitation forecasting. These techniques are combined through a VAR model to identify the advantages and strengths of each technique in the forecasting process. The data contain variables from the rain domain. Applying artificial intelligence techniques to the rain domain makes the precipitation forecasting process easier and more systematic.
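A minimal sketch of comparing the three techniques named above on a rain/no-rain label follows. The feature and file names are placeholders, not the paper's data, and the comparison metric (cross-validated accuracy) is an assumption.

```python
# Sketch: compare logistic regression, a decision tree and a random forest on rain data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

X = np.load("rain_domain_features.npy")   # hypothetical meteorological variables
y = np.load("rain_occurrence.npy")        # hypothetical binary rain/no-rain label

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean 5-fold accuracy = {acc:.3f}")
```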

Keywords: Logistic regression, decision tree, random forest, VAR model.

7195 Determinants of the U.S. Current Account

Authors: Shuh Liang

Abstract:

This article provides empirical evidence on the effect of domestic and international factors on the U.S. current account deficit. Linear dynamic regression and vector autoregression models are employed to estimate the relationships over the period from 1986 to 2011. The findings of this study suggest that the current and lagged private saving rate and the foreign current accounts of East Asian economies have played a vital role in affecting the U.S. current account. Additionally, using Granger causality tests and variance decompositions, changes in productivity growth and foreign domestic demand are found to significantly influence changes in the U.S. current account. To summarize, the empirical relationship between the U.S. current account deficit and its determinants is sensitive to alternative regression models and specifications.
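The VAR toolkit referred to above is available in statsmodels; the sketch below shows the kind of Granger causality test and variance decomposition described, with made-up column names and lag settings rather than the article's specification.

```python
# Sketch: VAR fit, Granger causality test, and forecast error variance decomposition.
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("us_ca_quarterly.csv", index_col=0, parse_dates=True)
# hypothetical columns: current_account, saving_rate, productivity_growth, foreign_demand

res = VAR(df.diff().dropna()).fit(maxlags=4, ic="aic")   # lag order chosen by AIC

# Granger causality: does productivity growth help predict the current account?
print(res.test_causality("current_account", ["productivity_growth"], kind="f").summary())

# Variance decomposition over a 10-period horizon
res.fevd(10).summary()
```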

Keywords: Current account deficit, productivity growth, foreign demand, vector autoregression.

7194 Estimating Regression Effects in Com Poisson Generalized Linear Model

Authors: Vandna Jowaheer, Naushad A. Mamode Khan

Abstract:

The Com Poisson distribution is capable of modeling count responses irrespective of their mean-variance relation, and when the distribution is fitted to simple cross-sectional data its parameters can be efficiently estimated using the maximum likelihood (ML) method. In the regression setup, however, ML estimation of the parameters of the Com Poisson based generalized linear model is computationally intensive. In this paper, we propose to use the quasi-likelihood (QL) approach to estimate the effect of the covariates on the Com Poisson counts and investigate the performance of this method with respect to the ML method. QL estimates are consistent and almost as efficient as ML estimates. The simulation studies show that the efficiency loss in the estimation of all the parameters using the QL approach as compared to the ML approach is quite negligible, whereas the QL approach is computationally far less involved than the ML approach.

Keywords: Com Poisson, Cross-sectional, Maximum Likelihood, Quasi likelihood

7193 A Multiple Linear Regression Model to Predict the Price of Cement in Nigeria

Authors: Kenneth M. Oba

Abstract:

This study investigated factors affecting the price of cement in Nigeria and developed a mathematical model that can predict future cement prices. Cement is key to the Nigerian construction industry, and price changes caused by certain factors could affect economic and infrastructural development; hence there is a need for proper proactive planning. Secondary data were collected from published information on cement between 2014 and 2019. In addition, questionnaires were sent to some domestic cement retailers in Port Harcourt, Nigeria, to obtain the actual prices of cement over the same period. The study revealed that the most critical factors affecting the price of cement in Nigeria are the inflation rate, the population growth rate, and the Gross Domestic Product (GDP) growth rate. Using data from the United Nations, International Monetary Fund, and Central Bank of Nigeria databases, amongst others, a multiple linear regression model was formulated. The model was used to predict the price of cement for 2020-2025 and was tested at the 95% confidence level using a two-tailed t-test and an F-test, resulting in an R2 of 0.8428 and an adjusted R2 of 0.6069. The results of the tests and the correlation factors confirm the model to be fit and adequate. This study will equip researchers and stakeholders in the construction industry with information for planning, monitoring, and management of present and future construction projects that involve the use of cement.
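A hedged sketch of fitting and testing a multiple linear regression of cement price on the three factors identified follows; the CSV file, column names and prediction inputs are assumptions rather than the study's data.

```python
# Sketch: multiple linear regression of cement price with t- and F-tests via statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("cement_nigeria.csv")   # hypothetical yearly records, 2014-2019
model = smf.ols("price ~ inflation_rate + population_growth + gdp_growth",
                data=data).fit()

print(model.summary())   # reports R-squared, adjusted R-squared and the F-statistic
print(model.tvalues)     # two-tailed t-tests on each coefficient

# Predict the price for an assumed future scenario
forecast = model.predict(pd.DataFrame({"inflation_rate": [12.5],
                                       "population_growth": [2.6],
                                       "gdp_growth": [2.2]}))
print("predicted price:", float(forecast.iloc[0]))
```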

Keywords: Cement price, multiple linear regression model, Nigerian Construction Industry, price prediction.

7192 Static Priority Approach to Under-Frequency Based Load Shedding Scheme in Islanded Industrial Networks: Using the Case Study of Fatima Fertilizer Company Ltd - FFL

Authors: S. H. Kazmi, T. Ahmed, K. Javed, A. Ghani

Abstract:

In this paper, a static under-frequency based load shedding scheme is considered for chemical and petrochemical industries with islanded distribution networks that rely heavily on the primary commodity, in order to ensure minimum production loss, plant downtime or critical equipment shutdown. A simple methodology is proposed for in-house implementation of this scheme using under-frequency relays, and a step-by-step guide is provided, including the techniques to calculate maximum percentage overloads, frequency decay rates, the time-based frequency response and the frequency-based time response of the system. A case study of the FFL electrical system is utilized, presenting the actual system parameters and the employed load shedding settings following the same series of steps. The settings are then verified for the worst overload condition (loss of a generation source in this case) and the comprehensive system response is investigated.

Keywords: Islanding, under-frequency load shedding, frequency rate of change, static UFLS.

7191 Factors for Entry Timing Choices Using Principal Axis Factorial Analysis and Logistic Regression Model

Authors: Mat Isa, C. M., Mohd Saman, H., Mohd Nasir, S. R., Jaapar, A.

Abstract:

International market expansion involves a strategic process of market entry decisions through which a firm expands its operations from the domestic to the international domain. Hence, entry timing choices require balancing the risks of early entry against the opportunities lost as a result of late entry into a new market. Questionnaire surveys administered to 115 Malaysian construction firms operating in 51 countries worldwide resulted in a 39.1 percent response rate. Factor analysis was used to determine the most significant factors affecting the entry timing choices of the firms penetrating the international market. A logistic regression analysis of the firms' entry timing choices indicates that the model correctly classified 89.5 percent of cases as late movers. The findings reveal that the most significant factor influencing the construction firms' choices as late movers was the firm factor, related to the firm's international experience, resources, competencies and financing capacity. The study also offers valuable information to construction firms with the intention to internationalize their businesses.
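A loose sketch of the two-stage approach described above follows: reduce the survey items to a small set of factors, then classify early versus late movers with logistic regression. scikit-learn's maximum-likelihood FactorAnalysis is used here as a stand-in for principal axis factoring, and all data and column names are hypothetical.

```python
# Sketch: factor reduction of survey items followed by logistic regression classification.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X = np.load("survey_items.npy")     # hypothetical Likert-scale item responses
y = np.load("entry_timing.npy")     # hypothetical: 0 = early mover, 1 = late mover

clf = make_pipeline(StandardScaler(),
                    FactorAnalysis(n_components=5, random_state=0),
                    LogisticRegression(max_iter=1000))
clf.fit(X, y)
print("correctly classified:", clf.score(X, y))   # cf. the 89.5 percent quoted above
```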

Keywords: Factors, early movers, entry timing choices, late movers, Logistic Regression Model, Principal Axis Factorial Analysis, Malaysian construction firms.

7190 Measuring the Efficiency of Medical Equipment

Authors: Panagiotis H. Tsarouhas

Abstract:

The reliability analysis of medical equipment can help to increase the availability and efficiency of the systems. In this manuscript we present a simple decomposition method that can easily be applied to complex medical systems. Using this method, we can easily calculate the effect of subsystems or components on the reliability of the overall system. Furthermore, to investigate the effect of subsystems or components on system performance, we perform a numerical study in which, each time, the subsystem or component with the worst reliability is replaced by another with higher reliability. The method can also be useful to engineers and designers of medical equipment who wish to optimize complex systems.
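The decomposition idea rests on the standard series-parallel composition rules: component reliabilities multiply in series, while redundant branches combine as the complement of the product of their unreliabilities. The sketch below illustrates this and the "replace the worst component" numerical study on an assumed structure, not the paper's equipment data.

```python
# Minimal illustration of series-parallel reliability decomposition (assumed structure).
from math import prod

def series(reliabilities):
    """All subsystems must work: R = R1 * R2 * ... * Rn."""
    return prod(reliabilities)

def parallel(reliabilities):
    """At least one redundant branch must work: R = 1 - (1-R1)(1-R2)...(1-Rn)."""
    return 1.0 - prod(1.0 - r for r in reliabilities)

# Example: two redundant power supplies feeding a detector and a processor in series
system = series([parallel([0.90, 0.90]), 0.95, 0.98])
print(f"system reliability: {system:.4f}")

# Effect of a subsystem: swap the worst single component for a more reliable one
improved = series([parallel([0.90, 0.90]), 0.99, 0.98])
print(f"after upgrading the 0.95 subsystem to 0.99: {improved:.4f}")
```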

Keywords: Reliability, Availability, Series-parallel System, medical equipment.

7189 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict corn yield based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their modeling methodologies. The model-driven approaches are based on crop mechanistic modeling: they describe crop growth in interaction with the environment as dynamical systems. However, calibrating such a dynamical system is difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process but has strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso regression, principal components regression and partial least squares regression) and machine learning methods (random forest, k-nearest neighbor, artificial neural network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, the root mean square error of prediction (RMSEP) and the mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, random forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method to calibrate the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
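The data-driven comparison protocol (5-fold cross-validation scored with RMSEP and MAEP) can be sketched as below, shown here for only two of the regressors; the dataset names are assumptions, and MAEP is computed as a mean absolute percentage error, which is one plausible reading of the figures quoted above.

```python
# Sketch: 5-fold cross-validated comparison of two regressors with RMSEP and MAEP.
import numpy as np
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge

X = np.load("county_climate.npy")   # hypothetical climatic predictors, 720 records
y = np.load("corn_yield.npy")       # hypothetical county-scale corn yields

cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in [("Random Forest", RandomForestRegressor(random_state=0)),
                    ("Ridge", Ridge(alpha=1.0))]:
    pred = cross_val_predict(model, X, y, cv=cv)
    rmsep = np.sqrt(np.mean((y - pred) ** 2))
    maep = np.mean(np.abs(y - pred) / y) * 100   # in percent, as in the text
    print(f"{name}: RMSEP = {rmsep:.2f}, MAEP = {maep:.2f}%")
```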

Keywords: Crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest.

7188 ELD79-LGD2006 Transformation Techniques Implementation and Accuracy Comparison in Tripoli Area, Libya

Authors: Jamal A. Gledan, Othman A. Azzeidani

Abstract:

During the last decade, Libya established a new geodetic datum called the Libyan Geodetic Datum 2006 (LGD 2006) by using GPS, whereas the ground traversing method was used to establish the previous Libyan datum, the European Libyan Datum 79 (ELD79). The current research paper introduces an ELD79-to-LGD2006 coordinate transformation technique and an accuracy comparison between multiple regression equations and the three-parameter (Bursa-Wolf) transformation model. The results obtained show that the overall accuracy of the stepwise multiple regression equations is better than that achievable with the Bursa-Wolf transformation model.

Keywords: Geodetic datum, horizontal control points, traditional similarity transformation model, unconventional transformation techniques.

7187 The Influence of Interest, Beliefs, and Identity with Mathematics on Achievement

Authors: Asma Alzahrani, Elizabeth Stojanovski

Abstract:

This study investigated factors that influence mathematics achievement based on a sample of ninth-grade students (N = 21,444) from the High School Longitudinal Study of 2009 (HSLS09). Key aspects studied included efficacy in mathematics, interest in and enjoyment of mathematics, identity with mathematics, and future utility beliefs, and how these influence mathematics achievement. The predictability of mathematics achievement from these factors was assessed using correlation coefficients and multiple linear regression. Spearman rank correlations and multiple regression analyses indicated positive and statistically significant relationships between the explanatory variables (mathematics efficacy, identity with mathematics, interest in mathematics, and future utility beliefs) and the response variable, achievement in mathematics.

Keywords: Mathematics achievement, math efficacy, mathematics interest, identity.

7186 Free Fatty Acid Assessment of Crude Palm Oil Using a Non-Destructive Approach

Authors: Siti Nurhidayah Naqiah Abdull Rani, Herlina Abdul Rahim, Rashidah Ghazali, Noramli Abdul Razak

Abstract:

Near infrared (NIR) spectroscopy has always been of great interest in the food and agriculture industries. The development of prediction models has facilitated the estimation process in recent years. In this study, 110 crude palm oil (CPO) samples were used to build a free fatty acid (FFA) prediction model. 60% of the collected data were used for training purposes and the remaining 40% used for testing. The visible peaks on the NIR spectrum were at 1725 nm and 1760 nm, indicating the existence of the first overtone of C-H bands. Principal component regression (PCR) was applied to the data in order to build this mathematical prediction model. The optimal number of principal components was 10. The results showed R2=0.7147 for the training set and R2=0.6404 for the testing set.

Keywords: Palm oil, fatty acid, NIRS, regression.

7185 Existence and Uniqueness of Periodic Solution for a Discrete-time SIR Epidemic Model with Time Delays and Impulses

Authors: Ling Liu, Yuan Ye

Abstract:

In this paper, a discrete-time SIR epidemic model with nonlinear incidence rate, time delays and impulses is investigated. Sufficient conditions for the existence and uniqueness of periodic solutions are obtained by using contraction theorem and inequality techniques. An example is employed to illustrate our results.
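A toy simulation of the class of model studied here is sketched below: a discrete-time SIR model with a saturated (nonlinear) incidence, a fixed delay tau, and a periodic impulsive vaccination. The functional forms and all parameter values are invented for illustration and are not the paper's model.

```python
# Toy discrete-time SIR simulation with delay and periodic impulses (assumed forms).
import numpy as np

beta, gamma, mu = 0.4, 0.1, 0.01   # contact, recovery and renewal rates (assumed)
tau, T, p = 3, 30, 0.2             # delay, impulse period, vaccinated fraction (assumed)
steps, N = 300, 1.0

S, I, R = np.empty(steps), np.empty(steps), np.empty(steps)
S[:tau + 1], I[:tau + 1], R[:tau + 1] = 0.9, 0.1, 0.0   # constant initial history

for n in range(tau, steps - 1):
    incidence = beta * S[n] * I[n - tau] / (1.0 + I[n - tau])   # nonlinear, delayed
    S[n + 1] = S[n] + mu * (N - S[n]) - incidence
    I[n + 1] = I[n] + incidence - (gamma + mu) * I[n]
    R[n + 1] = R[n] + gamma * I[n] - mu * R[n]
    if (n + 1) % T == 0:                                        # impulsive vaccination
        S[n + 1], R[n + 1] = (1 - p) * S[n + 1], R[n + 1] + p * S[n + 1]

print("late-time infectious fraction:", I[-5:])   # settles toward a periodic pattern
```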

Keywords: Discrete-time SIR epidemic model, time delay, nonlinear incidence rate, impulse.

7184 On the Factors Influencing the Competitiveness of Chinese Service Trade after Entering WTO

Authors: Ying Wang

Abstract:

Service trade is an important force influencing economic development. A review of the related literature is presented first. Then, through the construction of a Diamond Model, the main factors that influence the competitiveness of Chinese service trade are determined. With the three competitiveness indexes serving as the reference series and the influencing factors as the comparative series, three grey incidence models are built to conduct an empirical analysis of the main factors influencing the competitiveness of Chinese service trade after entering the WTO. The results indicate that urbanization level, the degree of openness of the service industry, and foreign direct investment have the largest impacts on Chinese service trade competitiveness, followed in turn by service industry GDP and human capital, while commodity trade has the smallest impact. Further discussion provides a line of thought for upgrading the competitiveness of Chinese service trade.

Keywords: Service Trade, Competitiveness, Diamond Model, Grey Incidence Model.

7183 A Framework for the Development of a Suitable Method to Find Shoot Length at Maturity of Mustard Plant Using Soft Computing Model

Authors: Satyendra Nath Mandal, J. Pal Choudhury, Dilip De, S. R. Bhadra Chaudhuri

Abstract:

The production of a plant can be measured in terms of seeds, and the generation of seeds plays a critical role in our social and daily life. Fruit production, which generates seeds, depends on various parameters of the plant, such as shoot length, leaf number, root length and root number. When the plant is growing, some leaves may be lost and new leaves may appear, so it is very difficult to use the number of leaves to calculate the growth of the plant. It is also cumbersome to measure the number of roots and the growth of root length at several time instances continuously after a certain initial period, because roots grow deeper and deeper underground over time. On the contrary, the shoot length of the plant increases over time and can be measured at different time instances, so the growth of the plant can be measured using shoot-length data recorded at different time instances after plantation. Environmental parameters like temperature, rainfall, humidity and pollution also play a role in yield production, and soil, crop and distance management are taken care of to produce the maximum amount of yield. Data on the growth of shoot length of some mustard plants at the initial stage (7, 14, 21 and 28 days after plantation) are available from a statistical survey by a group of scientists under the supervision of Prof. Dilip De. In this paper, the initial shoot length of Ken (one type of mustard plant) has been used as the initial data. Statistical models, fuzzy logic methods and a neural network have been tested on this mustard plant, and based on error analysis (calculation of average error) the model with the minimum error has been selected; it can be used for the assessment of shoot length at maturity. Finally, all these methods have been tested with other types of mustard plants, and the particular soft computing model with the minimum error across all types has been selected for calculating the predicted shoot-length growth data. The shoot length at the stage of maturity of all types of mustard plants has been calculated by applying the statistical method to the predicted shoot-length data.

Keywords: Fuzzy time series, neural network, forecasting error, average error.

7182 Optimizing the Project Delivery Time with Time Cost Trade-offs

Authors: Wei Lo, Ming-En Kuo

Abstract:

While minimizing the overall project cost is always one of the objectives of construction managers, obtaining the maximum economic return is definitely one of the ultimate goals of project investors. Since there is a trade-off between project time and cost, and the project delivery time directly affects the timing of economic recovery of an investment project, providing a method that can quantify the relationship between delivery time and cost and identify the optimal delivery time that maximizes the economic return has long been a focus of researchers and industrial practitioners. Using genetic algorithms, this study introduces an optimization model that can quantify the relationship between the project delivery time and cost and, furthermore, determine the optimal delivery time to maximize the economic return of the project. The results provide an objective quantification for accurately evaluating the project delivery time and cost, and facilitate the analysis of the economic return of a project.

Keywords: Time-Cost Trade-Off, Genetic Algorithms, Resource Integration, Economic return.

7181 Hourly Electricity Load Forecasting: An Empirical Application to the Italian Railways

Authors: M. Centra

Abstract:

Due to the liberalization of countless electricity markets, load forecasting has become crucial to all public utilities for which electricity is a strategic variable. With the goal of contributing to the forecasting process inside public utilities, this paper addresses the issue of applying the Holt-Winters exponential smoothing technique and the time series analysis for forecasting the hourly electricity load curve of the Italian railways. The results of the analysis confirm the accuracy of the two models and therefore the relevance of forecasting inside public utilities.
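For an hourly load series with a daily (24-hour) seasonal cycle, the Holt-Winters technique mentioned above can be sketched with statsmodels as below. The CSV name, column and additive trend/seasonal choices are assumptions for illustration, not the railway data or the paper's specification.

```python
# Sketch: Holt-Winters exponential smoothing of an hourly load series with daily seasonality.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

load = pd.read_csv("hourly_load.csv", index_col=0, parse_dates=True)["load_mw"]

model = ExponentialSmoothing(load, trend="add", seasonal="add",
                             seasonal_periods=24).fit()

next_day = model.forecast(24)                          # next 24 hourly values
mape = (abs(model.fittedvalues - load) / load).mean() * 100
print(f"in-sample MAPE: {mape:.2f}%")
print(next_day.head())
```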

Keywords: ARIMA models, Exponential smoothing, Electricity, Load forecasting, Rail transportation.

7180 Stability Analysis of Mutualism Population Model with Time Delay

Authors: Rusliza Ahmad, Harun Budin

Abstract:

This paper studies the effect of time delay on the stability of a mutualism population model with limited resources for both species. First, the stability of the model without time delay is analyzed. The model is then improved by considering a time delay in the mechanism of the growth rate of the population. We analyze the effect of time delay on the stability of the stable equilibrium point. Results show that the time delay can induce instability of the stable equilibrium point, bifurcation and stability switches.

Keywords: Bifurcation, Delay margin, Mutualism population model, Time delay

7179 Genetic Algorithms Multi-Objective Model for Project Scheduling

Authors: Elsheikh Asser

Abstract:

Time and cost are the main goals of construction project management. The first schedule developed may not be a suitable schedule for beginning or completing the project to achieve the target completion time at a minimum total cost. In general, there are trade-offs between time and cost (TCT) to complete the activities of a project. This research presents a genetic algorithm (GA) multi-objective model for project scheduling, considering different scenarios such as least cost, least time, and target time.

Keywords: Genetic algorithms, Time-cost trade-off.

7178 Optimal Calculation of Partial Transmission Ratios of Four-Step Helical Gearboxes for Getting Minimal Gearbox Length

Authors: Vu Ngoc Pi

Abstract:

This paper presents a new study on the application of optimization and regression analysis techniques for the optimal calculation of the partial ratios of four-step helical gearboxes to obtain minimal gearbox length. In the paper, based on the moment equilibrium condition of a mechanical system including four gear units and their regular resistance condition, models for determining the partial ratios of the gearboxes are proposed. In particular, explicit models for calculating the partial ratios are derived using regression analysis. Using these models, the determination of the partial ratios is accurate and simple.

Keywords: Gearbox design, optimal design, helical gearbox, transmission ratio.

7177 Quality Parameters of Offset Printing Wastewater

Authors: Kiurski S. Jelena, Kecić S. Vesna, Aksentijević M. Snežana

Abstract:

Samples of tap water and wastewater were collected in three offset printing facilities in Novi Sad, Serbia. Ten physicochemical parameters were analyzed in all collected samples: pH, conductivity, m-alkalinity, p-alkalinity, acidity, carbonate concentration, hydrogen carbonate concentration, active oxygen content, chloride concentration and total alkali content. All measurements were conducted using standard analytical and instrumental methods. Comparing the obtained results for tap water and wastewater, a clear quality difference was noticeable, since all physicochemical parameters were significantly higher in the wastewater samples. The study also involves the application of simple linear regression analysis to the obtained dataset. Using the software package ORIGIN 5, the pH value was correlated with the other physicochemical parameters. Based on the obtained values of the Pearson correlation coefficient, strong correlations with pH were determined for chloride concentration (r = -0.943) and for acidity (r = -0.855). In addition, a statistically significant relationship with pH was obtained only for acidity and chloride concentration, since the values of the F statistic (247.634 and 182.536) were higher than Fcritical (5.59). In this way, the statistical analysis highlighted the most influential parameters of water contamination in offset printing: acidity and chloride concentration. The results showed that the variable dependence could be represented by the general regression model y = a0 + a1x + k, together with the corresponding regression plots.
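A small sketch of the correlation and simple-linear-regression step described above is given below, with made-up illustrative measurements (not the study's data); SciPy's linregress returns the slope, intercept, Pearson r and p-value in one call.

```python
# Sketch: simple linear regression of chloride concentration on pH with scipy.
import numpy as np
from scipy.stats import linregress

ph = np.array([6.9, 7.1, 7.4, 7.8, 8.0, 8.3])                   # hypothetical values
chloride = np.array([410.0, 370.0, 330.0, 280.0, 250.0, 215.0])  # hypothetical mg/L

fit = linregress(ph, chloride)
print(f"chloride = {fit.slope:.1f} * pH + {fit.intercept:.1f}")
print(f"r = {fit.rvalue:.3f}, R^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4f}")
```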

Keywords: Pollution, printing industry, simple linear regression analysis, wastewater.
