Search results for: Generalized Cross Validation.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1618

688 An Approach for Coagulant Dosage Optimization Using Soft Jar Test: A Case Study of Bangkhen Water Treatment Plant

Authors: Ninlawat Phuangchoke, Waraporn Viyanon, Setta Sasananan

Abstract:

Coagulation, which uses alum and poly aluminum chloride (PACL), is the most important process in a water treatment plant; prescribing the correct dosage of alum and PACL is therefore the key decision. This research applies an artificial neural network (ANN) trained with the Levenberg–Marquardt algorithm to create a mathematical model (Soft Jar Test) for predicting the coagulant doses of alum and PACL. The input data consist of turbidity, pH, alkalinity, conductivity, and oxygen consumption (OC) measured at the Bangkhen Water Treatment Plant (BKWTP), under the authority of the Metropolitan Waterworks Authority of Thailand. The data were collected from 1 January 2019 to 31 December 2019 in order to cover the changing seasons of Thailand. The input data of the ANN are divided into three groups: training set, test set, and validation set. The coefficient of determination and the mean absolute error are 0.73 and 3.18 for the alum model, and 0.59 and 3.21 for the PACL model, respectively.
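
A minimal, hypothetical sketch of the modelling step described above: a feed-forward network maps the five water-quality inputs to a coagulant dose. Scikit-learn's MLPRegressor with the L-BFGS solver stands in for Levenberg–Marquardt training, which that library does not provide; the data are synthetic stand-ins for the plant's records, not the paper's.

# Hypothetical sketch: predict alum dose from raw-water quality (not the authors' code).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score, mean_absolute_error

rng = np.random.default_rng(0)
# Stand-in data: turbidity, pH, alkalinity, conductivity, oxygen consumption (OC).
X = rng.uniform([5, 6.5, 40, 100, 1], [200, 8.5, 120, 400, 8], size=(365, 5))
y = 0.15 * X[:, 0] + 2.0 * (8.0 - X[:, 1]) + rng.normal(0, 2, 365)  # synthetic alum dose

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# L-BFGS stands in for Levenberg-Marquardt, which scikit-learn does not ship.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                                   max_iter=2000, random_state=0))
model.fit(X_train, y_train)
pred = model.predict(X_test)
print("R2 :", round(r2_score(y_test, pred), 2))
print("MAE:", round(mean_absolute_error(y_test, pred), 2))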

Keywords: Soft jar test, jar test, water treatment plant process, artificial neural network.

687 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the five-parameter Wakeby distribution as a theoretical reference model. The number and the quality of its parameters indicate that this distribution may be the appropriate choice for interpolating hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena that produce heavy tails. The estimation methods proposed for determining the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of probability weighted moments (PWM), although this has often shown difficulty of convergence, or rather convergence to an inappropriate configuration of parameters. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation settings. The effort is justified by the sampling and asymptotic properties of maximum likelihood estimators, which improve the estimates by accompanying them with indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
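
For reference, the five-parameter Wakeby distribution is usually defined through its quantile function; the standard textbook form (not reproduced from the paper) is

\[ x(F) = \xi + \frac{\alpha}{\beta}\left[1-(1-F)^{\beta}\right] - \frac{\gamma}{\delta}\left[1-(1-F)^{-\delta}\right], \qquad 0 \le F < 1, \]

with location parameter \(\xi\), scale parameters \(\alpha, \gamma\) and shape parameters \(\beta, \delta\); the term in \(\gamma, \delta\) governs the upper (heavy) tail. Because the distribution is specified only through \(x(F)\), the density has to be evaluated as \(f(x) = 1/x'(F)\), which is precisely what makes maximum likelihood estimation more demanding here.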

Keywords: Generalized extreme values (GEV), likelihood estimation, precipitation data, Wakeby distribution.

686 Correlation between Capacitance and Dissipation Factor used for Assessment of Stator Insulation

Authors: José Luis Oslinger, Luis Carlos Castro

Abstract:

Measurements of capacitance C and dissipation factor tan δ of the stator insulation system provide useful information about internal defects within the insulation. The index k is defined as the proportionality constant between the changes, at high voltage, of the capacitance (ΔC) and of the dissipation factor (Δtan δ). ΔC and Δtan δ values were highly correlated when small flat defects were present within the insulation, and that correlation was lost in the presence of large narrow defects such as electrical treeing. The discrimination between small and large defects is made by resorting to partial discharge (PD) phase-angle analysis. For the validation of the results, C and tan δ measurements were carried out on a 15 MVA, 4160 V steam turbine turbogenerator installed in a sugar mill. In addition, laboratory test results obtained by other authors were analyzed jointly. In those laboratory tests, model coil bars subjected to thermal cycling were highly degraded, and the ΔC and Δtan δ values were not correlated; thus, the index k could not be calculated.
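
As a minimal illustration of how the index k defined above could be obtained from paired measurements, the sketch below fits the proportionality ΔC = k·Δtan δ by least squares; the numerical values are invented for illustration and are not the paper's data.

# Illustrative sketch: estimate k such that dC ≈ k * d(tan δ) from paired high-voltage steps.
import numpy as np

# Hypothetical measurements at increasing test voltages (not the paper's data):
delta_tand = np.array([0.001, 0.003, 0.006, 0.010, 0.015])   # change in dissipation factor
delta_C    = np.array([2.1, 6.4, 12.3, 20.8, 30.9])          # change in capacitance [pF]

# Least-squares slope through the origin gives the index k.
k = np.sum(delta_C * delta_tand) / np.sum(delta_tand ** 2)

# The correlation tells whether the linear relation (small flat defects) actually holds.
r = np.corrcoef(delta_tand, delta_C)[0, 1]
print(f"k = {k:.1f} pF per unit tan-delta, correlation r = {r:.3f}")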

Keywords: Aging, capacitance, dissipation factor, electrical treeing, insulation condition, partial discharge.

685 Identification, Prediction and Detection of the Process Fault in a Cement Rotary Kiln by Locally Linear Neuro-Fuzzy Technique

Authors: Masoud Sadeghian, Alireza Fatehi

Abstract:

In this paper, we use a nonlinear system identification method to predict and detect process faults of a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To identify the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used. This model is trained with the LOLIMOT algorithm, an incremental tree-structured algorithm. Using this method, we obtained three distinct models for the normal and faulty situations of the kiln. One model covers the normal condition of the kiln with a 15-minute prediction horizon; the other two models cover the two faulty situations with a 7-minute prediction horizon. Finally, we detect these faults on validation data. The data used in this study were collected from the White Saveh Cement Company.
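
A toy sketch of the locally linear model structure that LOLIMOT trains is given below: the output is a weighted sum of local linear models, with normalized Gaussian validity functions as weights. The partition of the input space is fixed by hand here, whereas LOLIMOT grows and refines it incrementally as a tree; all numbers are illustrative.

# Toy locally linear (neuro-fuzzy) model: y(x) = sum_i phi_i(x) * (w_i . x + b_i),
# where phi_i are normalized Gaussian validity functions over the input space.
import numpy as np

centers = np.array([[0.25], [0.75]])      # centres of two local operating regions
sigmas  = np.array([[0.15], [0.15]])      # widths of the validity functions
weights = np.array([[2.0], [-1.0]])       # slope of each local linear model
biases  = np.array([0.0, 2.25])           # intercept of each local linear model

def validity(x):
    """Normalized Gaussian validity functions phi_i(x); rows = samples, cols = regions."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2 / sigmas[None, :, :] ** 2).sum(-1)
    mu = np.exp(-0.5 * d2)
    return mu / mu.sum(axis=1, keepdims=True)

def llnf_predict(x):
    phi = validity(x)                                  # (n_samples, n_regions)
    local = x @ weights.T + biases                     # outputs of the local linear models
    return (phi * local).sum(axis=1)                   # blended global prediction

x = np.linspace(0, 1, 5).reshape(-1, 1)
print(llnf_predict(x))   # smooth interpolation between the two local linear models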

Keywords: Cement Rotary Kiln, Fault Detection, Delay Estimation Method, Locally Linear Neuro Fuzzy Model, LOLIMOT.

684 Asymptotic Analysis of Instant Messaging Service with Relay Nodes

Authors: Muhammad T. Alam, Zheng Da Wu

Abstract:

In this paper, we provide a complete end-to-end delay analysis, including the relay nodes, for instant messages. The Message Session Relay Protocol (MSRP) is used to provide congestion control for large messages in the Instant Messaging (IM) service. Large messages are broken into several chunks, and these chunks may traverse at most two relay nodes before reaching the destination, according to the IETF specification of the MSRP relay extensions. We discuss the current solutions for sending large instant messages and introduce a proposal to reduce message flows in the IM service. We consider a virtual traffic parameter, i.e., the relay nodes are assumed to be stateless and non-blocking for scalability purposes, with input arriving at a constant bit rate. We provide a new scheduling policy that schedules chunks according to the delivery time-stamp tags of their previous node. Validation and analysis are shown for this scheduling policy. The performance analysis with the model introduced in this paper is simple and straightforward, and it leads to reduced message flows in the IM service.
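
The following minimal sketch illustrates the kind of scheduling policy described above: a relay serves queued chunks in order of the delivery time stamp tagged by the previous node. The chunk fields and values are hypothetical, not taken from the paper.

# Illustrative relay scheduler: forward chunks ordered by the previous node's
# delivery time-stamp tag (earliest tag first). Chunk contents are made up.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Chunk:
    prev_delivery_ts: float                       # time stamp set by the previous node
    msg_id: str = field(compare=False)
    seq: int = field(compare=False)

queue = []
for chunk in [Chunk(12.40, "msg-A", 2), Chunk(11.95, "msg-B", 1), Chunk(12.10, "msg-A", 1)]:
    heapq.heappush(queue, chunk)                  # arrival order does not matter

while queue:                                      # the relay forwards in time-stamp order
    c = heapq.heappop(queue)
    print(f"forward {c.msg_id} chunk {c.seq} (tagged {c.prev_delivery_ts})")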

Keywords: Instant messaging, stateless, chunking, MSRP.

683 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applies an asymmetric information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand has perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping approach (MEboot), based on a process that attempts to deal with imperfect information and to reduce uncertainty in the data observations (asymmetrical data). In addition, the tourism leakages were investigated by a simple model based on the injections and leakages concept. The empirical findings show that the parameters computed from the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.
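
For context, the injections and leakages concept used for the tourism leakage analysis can be summarized by the textbook open-economy multiplier (a standard formulation, not taken from the paper):

\[ k = \frac{1}{\mathrm{MPS} + \mathrm{MPT} + \mathrm{MPM}}, \]

where MPS, MPT and MPM are the marginal propensities to save, to pay taxes and to import; the larger these leakages, the smaller the stimulus that a unit of tourism revenue injects into the local economy.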

Keywords: Thailand tourism, maximum entropy bootstrapping approach, macroeconomic model, asymmetric information.

682 A Hybrid Neural Network and Traditional Approach for Forecasting Lumpy Demand

Authors: A. Nasiri Pour, B. Rostami Tabar, A. Rahimzadeh

Abstract:

Accurate demand forecasting is one of the key issues in the inventory management of spare parts. Modeling future consumption becomes especially difficult for lumpy patterns, which are characterized by intervals with no demand and periods with actual demand occurrences showing large variation in demand levels. Many forecasting methods perform poorly when demand for an item is lumpy. In this study, based on the characteristics of the lumpy demand patterns of spare parts, a hybrid forecasting approach has been developed which uses a multi-layered perceptron neural network and a traditional recursive method for forecasting future demand. In the described approach, the multi-layered perceptron is adapted to forecast occurrences of non-zero demands, and a conventional recursive method is then used to estimate the quantity of the non-zero demands. In order to evaluate the performance of the proposed approach, its forecasts were compared to those obtained by the Syntetos & Boylan approximation and by the multi-layered perceptron, generalized regression, and Elman recurrent neural networks recently employed in this area. The models were applied to forecast future demand of spare parts of the Arak Petrochemical Company in Iran, using 30 real data sets. The results indicate that the forecasts obtained using the proposed model are superior to those obtained using the other methods.
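
A compact sketch of the hybrid idea follows: a small MLP classifier predicts whether a non-zero demand will occur in the next period, while a simple recursive (exponential-smoothing) estimate of the non-zero demand size supplies the quantity. The synthetic series and parameter values are illustrative only and do not reproduce the paper's models.

# Hybrid lumpy-demand sketch: MLP predicts occurrence of non-zero demand,
# a recursive exponential-smoothing estimate supplies the non-zero demand size.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
occur = rng.random(200) < 0.3                       # lumpy: ~70% of periods have zero demand
demand = np.where(occur, rng.poisson(8, 200), 0)    # synthetic spare-part demand history

# Features: the last three occurrence flags; target: occurrence in the next period.
X = np.column_stack([occur[i:len(occur) - 3 + i] for i in range(3)]).astype(float)
y = occur[3:]
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)

# Recursive size estimate: smooth only over periods that actually had demand.
alpha, size = 0.2, demand[occur][0]
for d in demand[occur][1:]:
    size = alpha * d + (1 - alpha) * size

p_occur = clf.predict_proba(occur[-3:].astype(float).reshape(1, -1))[0, 1]
print(f"P(non-zero demand next period) = {p_occur:.2f}, expected size if it occurs = {size:.1f}")
print(f"point forecast = {p_occur * size:.2f}")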

Keywords: Lumpy Demand, Neural Network, Forecasting, Hybrid Approach.

681 Development of Anterior Lumbar Interbody Fusion (ALIF) PEEK Cage Based On the Korean Lumbar Anatomical Information

Authors: Chang Soo Chon, Cheol Woong Ko, Han Sung Kim

Abstract:

The aim of this study is to develop an anterior lumbar interbody fusion (ALIF) PEEK cage suitable for Korean people. CT images were obtained from a Korean male (173 cm, 71 kg), and 3D Korean lumbar models were reconstructed from the CT images to investigate anatomical characteristics. The major design parameters of the ALIF PEEK cage were selected using the morphological measurements of the Korean lumbar models. Through finite element analysis and mechanical tests, the developed ALIF PEEK cage prototype was compared with the Fidji Cage (Zimmer Inc., USA), and it was found that the ALIF prototype showed similar or superior mechanical performance compared to the Fidji Cage. In addition, clinical validation of the ALIF PEEK cage prototype was carried out to check for foreseeable problems in surgical operations. Finally, the convenience and stability of the prototype are considered to have been clinically verified.

Keywords: Interbody fusion, PEEK, implant, finite element analysis, lumbar, spine.

680 The Impact of Modeling Method of Moisture Emission from the Swimming Pool on the Accuracy of Numerical Calculations of Air Parameters in Ventilated Natatorium

Authors: Piotr Ciuman, Barbara Lipska

Abstract:

The aim of the presented research was to improve numerical predictions of the air parameter distribution in an actual natatorium by selecting the calculation formula for the mass flux of moisture emitted from the pool. The selected correlation should ensure the best agreement of the numerical results with the measurements of these parameters in the facility. A numerical model of the natatorium was developed, for which boundary conditions were prepared on the basis of measurements carried out in the actual facility. Numerical calculations were carried out using ANSYS CFX software, with six formulas implemented, which in various ways made the moisture emission dependent on the water surface temperature and the air parameters in the natatorium. The results of calculations with these formulas were compared for the distributions of specific humidity, velocity, and temperature in the facility. To select the best formula, the numerical results for these parameters in the occupied zone were validated against measurements carried out at selected points of this zone.

Keywords: Experimental validation, indoor swimming pool, moisture emission, natatorium, numerical calculations, CFD, thermal and humidity conditions, ventilation.

679 How CATV Survive in the Era of Convergence?

Authors: J. Park, J. Song, B. Lee

Abstract:

The purpose of this paper is to analyze the U.S. Pivot case and to suggest an appropriate model, including entry strategies and success factors, for the QPS of cable TV operators. Telecommunication companies have been operating QPS, including IPTV service, which enables them to cross over into broadcasting. Because of this, cable TV operators are now concerned and are planning to add QPS with a mobile service. Based on Porter's five forces model, an analytical framework is proposed for MVNOs in the cable TV industry in the United States. As a result of this study, an MVNO in the cable TV industry has to have a clear killer application with sufficient content. Subsequently, a direction for the future cable TV industry is proposed.

Keywords: CATV, MVNO, Pivot, QPS

678 Neural Network Implementation Using FPGA: Issues and Application

Authors: A. Muthuramalingam, S. Himavathi, E. Srinivasan

Abstract:

Hardware realization of a Neural Network (NN) depends to a large extent on the efficient implementation of a single neuron. FPGA-based reconfigurable computing architectures are suitable for the hardware implementation of neural networks, but FPGA realization of ANNs with a large number of neurons is still a challenging task. This paper discusses the issues involved in the implementation of a multi-input neuron with linear/nonlinear excitation functions using an FPGA. An implementation method with a resource/speed tradeoff is proposed to handle signed decimal numbers. The VHDL code developed is tested using a Xilinx XCV50hq240 chip. To improve the speed of operation, a lookup table method is used. The problems involved in using a lookup table (LUT) for a nonlinear function are discussed, and the percentage saving in resources and the improvement in speed with an LUT for a neuron are reported. An attempt is also made to derive a generalized formula for a multi-input neuron that allows the total resource requirement and achievable speed of a given multilayer neural network to be estimated approximately. This helps the designer choose the FPGA capacity for a given application. Using the proposed implementation method, a neural-network-based application, namely a space vector modulator for a vector-controlled drive, is presented.
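
A small sketch of the lookup-table approach discussed above: the nonlinear excitation function (here a sigmoid) is pre-computed into a fixed-point table so that the hardware only performs an index lookup. The table size and fixed-point format are arbitrary illustrative choices, not the paper's design values.

# Sketch of a lookup-table (LUT) sigmoid, as used to avoid computing exp() in hardware.
# The 256-entry table and 12 fractional bits are illustrative choices.
import numpy as np

N_ENTRIES, X_MIN, X_MAX, FRAC_BITS = 256, -8.0, 8.0, 12

xs = np.linspace(X_MIN, X_MAX, N_ENTRIES)
lut = np.round(1.0 / (1.0 + np.exp(-xs)) * (1 << FRAC_BITS)).astype(np.int32)

def sigmoid_lut(x):
    """Approximate sigmoid via table lookup; inputs outside the range saturate."""
    idx = np.clip(((x - X_MIN) / (X_MAX - X_MIN) * (N_ENTRIES - 1)).astype(int),
                  0, N_ENTRIES - 1)
    return lut[idx] / (1 << FRAC_BITS)

x = np.linspace(-6, 6, 7)
err = np.abs(sigmoid_lut(x) - 1.0 / (1.0 + np.exp(-x)))
print("max absolute error of the LUT approximation:", err.max())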

Keywords: FPGA implementation, multi-input neuron, neural network, NN-based space vector modulator.

677 Objective Assessment of Psoriasis Lesion Thickness for PASI Scoring using 3D Digital Imaging

Authors: M.H. Ahmad Fadzil, Hurriyatul Fitriyah, Esa Prakasa, Hermawan Nugroho, S.H. Hussein, Azura Mohd. Affandi

Abstract:

Psoriasis is a chronic inflammatory skin condition which affects 2-3% of the population worldwide. The Psoriasis Area and Severity Index (PASI) is the gold standard for assessing psoriasis severity as well as treatment efficacy. Although a gold standard, PASI is rarely used because it is tedious and complex. In practice, the PASI score is determined subjectively by dermatologists, so inter- and intra-rater variation is possible even among expert dermatologists. This research develops an algorithm to assess psoriasis lesions for objective PASI scoring. The focus of this research is thickness assessment, one of the four PASI parameters besides area, erythema and scaliness. Psoriasis lesion thickness is measured by averaging the total elevation from the lesion base to the lesion surface. Thickness values of 122 3D images taken from 39 patients are grouped into 4 PASI thickness scores using k-means clustering. Validation of the lesion base construction is performed using twelve body curvature models and shows good results, with a coefficient of determination (R2) equal to 1.
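
A minimal sketch of the clustering step described above: thickness measurements are grouped into four clusters with k-means and relabelled so that thicker lesions map to higher PASI thickness scores. The thickness values below are synthetic stand-ins, not the study's 3D measurements.

# Sketch: group lesion thickness measurements into 4 PASI thickness scores with k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
thickness_mm = np.concatenate([rng.normal(m, 0.08, 30) for m in (0.2, 0.6, 1.1, 1.8)])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(thickness_mm.reshape(-1, 1))

# Relabel clusters so that a thicker lesion gets a higher PASI thickness score (1..4).
order = np.argsort(km.cluster_centers_.ravel())
score = {cluster: rank + 1 for rank, cluster in enumerate(order)}
print([score[c] for c in km.predict(np.array([[0.25], [0.7], [1.0], [2.0]]))])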

Keywords: 3D digital imaging, base construction, PASI, psoriasis lesion thickness.

676 Obtaining Constants of Johnson-Cook Material Model Using a Combined Experimental, Numerical Simulation and Optimization Method

Authors: F. Rahimi Dehgolan, M. Behzadi, J. Fathi Sola

Abstract:

In this article, the Johnson-Cook material model constants for structural steel ST.37 have been determined by a method which integrates experimental tests, numerical simulation, and optimization. In the first step, a quasi-static test was carried out on a plain specimen. Next, the constants were calculated by minimizing the difference between the results acquired from the experiment and the numerical simulation. Then, a quasi-static tension test was performed on three notched specimens with different notch radii. Finally, in order to verify the results, the constants were used in numerical simulations of the notched specimens, and it was observed that the experimental and simulation results are in good agreement. The change in the diameter of the plain specimen in the necking area was set as the objective function in the optimization step. For final validation of the proposed method, the diameter variation was considered as a parameter, its sensitivity to a change in any of the model constants was examined, and the results fully corroborated the identified constants.
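
For reference, the Johnson-Cook flow stress model whose constants are being identified has the standard form (textbook notation, not copied from the paper):

\[ \sigma = \left(A + B\,\varepsilon_p^{\,n}\right)\left(1 + C \ln \dot{\varepsilon}^{*}\right)\left(1 - T^{*m}\right), \qquad \dot{\varepsilon}^{*} = \frac{\dot{\varepsilon}_p}{\dot{\varepsilon}_0}, \quad T^{*} = \frac{T - T_\mathrm{room}}{T_\mathrm{melt} - T_\mathrm{room}}, \]

where A, B, n, C and m are the material constants determined by the experiment-simulation-optimization loop; quasi-static tests at room temperature, as used here, mainly constrain A, B and n.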

Keywords: Constants, Johnson-Cook material model, notched specimens, quasi-static test, sensitivity.

675 The Role of Counselling Psychology on Expatriate Adjustment in East Asia: A Systematic Review

Authors: Panagiotis Platanitis

Abstract:

Purpose: This research paper reviews the empirical studies in the field of expatriate adjustment in East Asia in order to produce a thematic understanding of the current adjustment challenges, thus enabling practitioners to enrich their knowledge. Background: Learning to live, work, and function in a country and culture vastly different from that of one's upbringing can pose unique challenges in terms of adaptation and adjustment. This has led to a growing body of research on the adjustment of expatriate workers. Adjustment itself has been posited as a three-dimensional construct: work adjustment, interaction adjustment, and general or cultural adjustment. Methodology: This qualitative systematic review was conducted on all identified peer-reviewed empirical studies related to expatriate adjustment in East Asia. Five electronic databases (PsycINFO, Emerald, Scopus, EBSCO and JSTOR) were searched up to December 2015. Out of 625 identified records, thorough evaluation for eligibility resulted in 15 relevant studies being subjected to data analysis. The quality of the identified research was assessed according to the Standard Quality Assessment Criteria for Evaluating Primary Research Papers from a Variety of Fields. The data were analysed by means of thematic synthesis for systematic reviews of qualitative research. Findings: Data analysis revealed five key themes: (1) personality traits, (2) types of adjustment, (3) language, (4) culture, and (5) coping strategies. Types of adjustment included subthemes such as interaction, general, work, psychological, sociocultural, and cross-cultural adjustment. Conclusion: The present review supports previous literature on the different themes of adjustment, shifts the focus from work and general adjustment to the psychological challenges involved, and introduces psychological adjustment as a distinct dimension. It also offers a different perspective on the use of cross-cultural training and on the coping strategies expatriates use while abroad. This review helps counselling psychologists understand the importance of a multicultural approach when working with expatriates and to be aware of what expatriates might face when working and living in East Asia.

Keywords: Expatriates, adjustment, East Asia, counselling psychology.

674 Study of Rayleigh-Bénard-Brinkman Convection Using LTNE Model and Coupled, Real Ginzburg-Landau Equations

Authors: P. G. Siddheshwar, R. K. Vanishree, C. Kanchana

Abstract:

A local nonlinear stability analysis using an eight-mode expansion is performed to arrive at the coupled amplitude equations for Rayleigh-Bénard-Brinkman convection (RBBC) in the presence of LTNE effects. Streamlines and isotherms are obtained in the two-dimensional unsteady finite-amplitude convection regime. The influence of the parameters on heat transport is found to be more pronounced at small times than at long times. Results for Rayleigh-Bénard convection are obtained as a particular case of the present study. Additional modes are shown not to significantly influence the heat transport, leading us to infer that five minimal modes are sufficient for a study of RBBC. The present problem, which uses rolls as the pattern of manifestation of instability, is a needed first step towards a very general non-local study of two-dimensional unsteady convection. The results may be useful in determining the preferred range of parameter values when making rheometric measurements in fluids to ascertain fluid properties such as viscosity. The results for LTE are obtained as a limiting case of the LTNE results presented in the paper.
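
For orientation, the amplitude equations referred to in the title reduce, in their simplest uncoupled real form, to the cubic Stuart-Landau structure (generic form, not the paper's exact coefficients):

\[ \frac{dA}{d\tau} = Q_1 A - Q_2 A^{3}, \]

where A(τ) is the convection amplitude and Q_1, Q_2 collect the control parameters (Rayleigh number, Darcy number and the LTNE parameters); in such weakly nonlinear analyses the stationary amplitude \(A^{*} = \sqrt{Q_1/Q_2}\) typically feeds directly into the Nusselt-number estimate of heat transport.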

Keywords: Rayleigh-Bénard convection, heat transport, porous media, generalized Lorenz model, coupled Ginzburg-Landau model.

673 Thermophysical and Heat Transfer Performance of Covalent and Noncovalent Functionalized Graphene Nanoplatelet-Based Water Nanofluids in an Annular Heat Exchanger

Authors: Hamed K. Arzani, Ahmad Amiri, Hamid K. Arzani, Salim Newaz Kazi, Ahmad Badarudin

Abstract:

The new design of heat exchangers utilizing an annular distributor opens a new gateway to higher energy optimization. To realize this goal, graphene nanoplatelet-based water nanofluids with promising thermophysical properties were synthesized with covalent and noncovalent functionalization. Thermal conductivity, density, viscosity and specific heat capacity were investigated and employed as raw data for ANSYS Fluent in a two-phase approach. After validation of the obtained results by analytical equations, two particular parameters, the convective heat transfer coefficient and the pressure drop, were investigated. The study continued with other heat transfer parameters of the annular pass in the presence of the graphene nanoplatelet-based water nanofluids at different weight concentrations, input powers and temperatures. As a result, the heat transfer performance and friction loss are predicted for both synthesized nanofluids.

Keywords: Heat transfer, nanofluid, turbulent flow, forced convection flow, graphene nanoplatelet.

672 Dependability Tools in Multi-Agent Support for Failures Analysis of Computer Networks

Authors: Myriam Noureddine

Abstract:

During their activity, all systems must operate without failures and, in this context, the dependability concept is essential to avoid disruption of their function. As computer networks are systems with the same dependability requirements, this article deals with an analysis of failures for a computer network. The proposed approach integrates specific tools of the KB3 platform, usually applied in dependability studies of industrial systems. The methodology is supported by a multi-agent system formed by six agents grouped into three meta-agents and dealing with two levels. The first level concerns the modeling step, through a conceptual agent and a generating agent. The conceptual agent is dedicated to building the knowledge base from the system specifications written in the FIGARO language. The generating agent automatically produces both the structural model and a dependability model of the system. The second level, the simulation, shows the effects of the failures of the system through a simulation agent. The approach is validated by applying it to a specific computer network, giving an analysis of failures through their effects for the considered network.

Keywords: Computer network, dependability, KB3 platform, multi-agent system, failure.

671 Computer Modeling of Drug Distribution after Intravitreal Administration

Authors: N. Haghjou, M. J. Abdekhodaie, Y. L. Cheng, M. Saadatmand

Abstract:

Intravitreal injection (IVI) is the most common treatment for diseases of the posterior segment of the eye such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis, and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective and may be toxic at higher concentrations. Therefore, it is critical to know the drug distribution within the eye following intravitreal injection. With knowledge of the drug distribution, ophthalmologists can decide on the injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict the intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye. The model parameters were obtained from a literature review. To validate this numerical model, in vivo data on the spatial concentration profile from the lens to the retina were compared with the numerical results; the difference between the numerical and experimental data was less than 5%. This validation provides strong support for the numerical methodology and the associated assumptions of the current study.
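
A highly simplified sketch of the finite volume idea follows: explicit time stepping of one-dimensional diffusion of drug concentration across the vitreous, with a zero-flux boundary on the lens side and a clearance (sink) boundary on the retina side. The geometry, diffusivity and boundary treatment are illustrative assumptions, far cruder than the paper's eye model.

# Minimal 1D finite-volume diffusion sketch (drug spreading through the vitreous).
# Geometry, diffusivity and boundary conditions are illustrative, not the paper's model.
import numpy as np

L, n = 0.015, 60                         # vitreous depth ~15 mm, number of control volumes
dx = L / n
D = 5e-10                                # assumed diffusion coefficient [m^2/s]
dt = 0.4 * dx**2 / D                     # explicit stability limit with margin
c = np.zeros(n)
c[5:10] = 1.0                            # initial bolus near the injection site

for _ in range(20000):
    flux = -D * np.diff(c) / dx          # Fick's law at the internal faces
    c[1:-1] -= dt / dx * np.diff(flux)   # finite-volume balance for interior cells
    c[0]    -= dt / dx * flux[0]         # closed (zero-flux) boundary on the lens side
    c[-1]   += dt / dx * flux[-1]        # influx from the last internal face...
    c[-1]   -= dt / dx * (D * c[-1] / dx)  # ...plus clearance (sink) on the retina side

print("remaining fraction of the dose:", c.sum() / 5.0)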

Keywords: Posterior segment, Intravitreal injection (IVI), Pharmacokinetic, Modelling, Finite volume method.

670 Least Square-SVM Detector for Wireless BPSK in Multi-Environmental Noise

Authors: J. P. Dubois, Omar M. Abdul-Latif

Abstract:

Support Vector Machine (SVM) is a statistical learning tool developed around the concept of structural risk minimization (SRM). In this paper, SVM is applied to signal detection in communication systems in the presence of channel noise in various environments, in the form of Rayleigh fading, additive white Gaussian background noise (AWGN), and interference noise generalized as additive colored Gaussian noise (ACGN). The structure and performance of the SVM, in terms of the bit error rate (BER) metric, are derived and simulated for these advanced stochastic noise models, and the computational complexity of the implementation, in terms of average computational time per bit, is also presented. The performance of the SVM is then compared to a conventional optimal model-based detector for binary phase shift keying (BPSK) modulation. We show that the SVM performance is superior to that of conventional matched filter-, innovation filter-, and Wiener filter-driven detectors, even in the presence of random Doppler carrier deviation, especially in the low SNR (signal-to-noise ratio) range. For large SNR, the performance of the SVM is similar to that of the classical detectors; however, the convergence between SVM and maximum likelihood detection occurs at a higher SNR as the noise environment becomes more hostile.
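
A small sketch of the detection setup follows: BPSK symbols in AWGN, an SVM trained on known symbols used as the decision device, and a conventional threshold (matched filter) decision for comparison. An RBF-kernel SVC stands in for the least squares SVM formulation, and the channel here is plain AWGN rather than the paper's fading and colored-noise scenarios.

# Sketch: SVM as a BPSK decision device in AWGN, compared with a simple threshold detector.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
snr_db = 4.0
sigma = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))   # noise std for unit-energy BPSK (Eb/N0 form)

def bpsk_rx(n):
    bits = rng.integers(0, 2, n)
    return (2 * bits - 1 + sigma * rng.normal(size=n)).reshape(-1, 1), bits

X_train, y_train = bpsk_rx(2000)                    # training symbols (known bits)
X_test, y_test = bpsk_rx(20000)

svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
ber_svm = np.mean(svm.predict(X_test) != y_test)
ber_threshold = np.mean((X_test.ravel() > 0).astype(int) != y_test)  # matched-filter decision
print(f"BER  SVM: {ber_svm:.4f}   threshold: {ber_threshold:.4f}")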

Keywords: Colour noise, Doppler shift, innovation filter, least square-support vector machine, matched filter, Rayleigh fading, Wiener filter.

669 Numerical Analysis and Experimental Validation of a Downhole Stress/Strain Measurement Tool

Authors: Abhay Bodake, Ping Sui, Hafeez Syed, Ratish Kadam

Abstract:

Real-time measurement of applied forces, like tension, compression, torsion, and bending moment, identifies the transferred energies being applied to the bottomhole assembly (BHA). These forces are highly detrimental to measurement/logging-while-drilling tools and downhole equipment. Real-time measurement of the dynamic downhole behavior, including weight, torque, bending on bit, and vibration, establishes a real-time feedback loop between the downhole drilling system and drilling team at the surface. This paper describes the numerical analysis of the strain data acquired by the measurement tool at different locations on the strain pockets. The strain values obtained by FEA for various loading conditions (tension, compression, torque, and bending moment) are compared against experimental results obtained from an identical experimental setup. Numerical analyses results agree with experimental data within 8% and, therefore, substantiate and validate the FEA model. This FEA model can be used to analyze the combined loading conditions that reflect the actual drilling environment.

Keywords: FEA, M/LWD, Oil & Gas, Strain Measurement.

668 Validation of Contemporary Physical Activity Tracking Technologies through Exercise in a Controlled Environment

Authors: Reem I. Altamimi, Geoff D. Skinner

Abstract:

Extended periods of sedentary behavior increase the risk of becoming overweight and/or obese, which is linked to other health problems. Adding technology to the notion of 'active living' allows technology to play a part in promoting and facilitating habitual physical activity; it can either act as a barrier to, or facilitate, this lifestyle, depending on the chosen technology. Physical Activity Monitoring Technologies (PAMTs) are a popular example of such technologies. Different contemporary PAMTs have been evaluated based on customer reviews; however, there is a lack of published experimental research into the efficacy of PAMTs. This research investigates the reliability of four PAMTs: two wristbands (Fitbit Flex and Jawbone UP), a waist-clip (Fitbit One), and a mobile application (iPhone Health Application), for recording a specific distance (1.5 km) walked on a treadmill at constant speed. Physical activity tracking technologies vary in their recordings, even while measuring the same activity. This research demonstrates that the Jawbone UP band recorded the distance most accurately compared with the Fitbit One, the Fitbit Flex, and the iPhone Health Application.

Keywords: Fitbit, Jawbone UP, mobile tracking applications, physical activity tracking technologies.

667 Receive and Transmit Array Antenna Spacing and Their Effect on the Performance of SIMO and MIMO Systems by using an RCS Channel Model

Authors: N. Ebrahimi-Tofighi, M. ArdebiliPour, M. Shahabadi

Abstract:

In this paper, the effect of receive and/or transmit antenna spacing on the performance (BER vs. SNR) of multiple-antenna systems is determined by using an RCS (Radar Cross Section) channel model. In this physical model, the scatterers present in the propagation environment are modeled by their RCS so that the correlation of the received signal complex amplitudes, i.e., both magnitude and phase, can be estimated. The proposed RCS channel model is then compared with classical models.

Keywords: MIMO system, Performance of system, Signal correlation, SIMO system, Wireless channel model.

666 Automatic Translation of Ada-ECATNet Using Rewriting Logic

Authors: N. Boudiaf

Abstract:

One major difficulty facing developers of concurrent and distributed software is the analysis of concurrency-based faults such as deadlocks. Petri nets are used extensively in verifying the correctness of concurrent programs. ECATNets are a category of algebraic Petri nets based on a sound combination of algebraic abstract data types and high-level Petri nets. ECATNets have 'sound' and 'complete' semantics because of their integration in rewriting logic and its programming language Maude. Rewriting logic is considered one of the most powerful logics for the description, verification and programming of concurrent systems. We previously proposed a method for translating Ada-95 tasking programs into the ECATNet formalism (Ada-ECATNet) and showed that ECATNets provide a more compact translation of Ada programs than other approaches based on simple Petri nets or colored Petri nets. We also showed previously how the ECATNet formalism offers Ada many validation and verification tools such as simulation, model checking, accessibility analysis and static analysis. In this paper, we describe the implementation of our translation of Ada programs into ECATNets.

Keywords: Ada tasking, Analysis, Automatic Translation, ECATNets, Maude, Rewriting Logic.

665 A Framework for Scalable Autonomous P2P Resource Discovery for the Grid Implementation

Authors: Hesham A. Ali, Mofreh M. Salem, Ahmed A. Hamza

Abstract:

Recently, there have been considerable efforts towards the convergence of P2P and Grid computing in order to reach a solution that takes the best of both worlds by exploiting the advantages each offers. Augmenting the services of the Grid with the peer-to-peer model promises to eliminate bottlenecks and ensure greater scalability, availability, and fault tolerance. The Grid Information Service (GIS) directly influences the quality of service of grid platforms. Most proposed solutions for decentralizing the GIS are based on completely flat overlays. The main contributions of this paper are the investigation of a novel resource discovery framework for Grid implementations based on a hierarchy of structured peer-to-peer overlay networks, and the introduction of a discovery algorithm utilizing the proposed framework. The framework's performance is validated via simulation. Experimental results show that the proposed organization has the advantage of being scalable while providing fault isolation, effective bandwidth utilization, and hierarchical access control. In addition, it leads to a reliable, guaranteed sub-linear search which returns results within a bounded interval of time and with a smaller amount of generated traffic within each domain.

Keywords: Grid computing, grid information service, P2P, resource discovery.

664 A New Model of English-Vietnamese Bilingual Information Retrieval System

Authors: Chinh Trong Nguyen, Dang Tuan Nguyen

Abstract:

In this paper, we propose a new model of an English-Vietnamese bilingual information retrieval system. Although many CLIR systems have been researched and built, the accuracy of search results in the different languages that a CLIR system supports still needs to improve, especially in finding bilingual documents. The problems identified in this paper are the limitations of machine translation results and the extremely large collections of documents to be searched. We therefore try to establish a different model to overcome these problems.

Keywords: Bilingual Information Retrieval, Cross-lingual Information Retrieval, Bilingual Web sites.

663 A Systemic Maturity Model

Authors: Emir H. Pernet, Jeimy J. Cano

Abstract:

Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control and PDCA continuous improvement (Plan, Do, Check, Act). Frameworks developed around the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them related to the mechanistic and reductionist principles on which those models are built. As systems theory helps in understanding the dynamics of organizations and organizational change, the development of a systemic maturity model can help overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. From the systems theory perspective, maturity is conceptually defined as an emergent property of the organization, which arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of organizations and is finally validated by measuring the maturity of some organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational Governance, Risk and Compliance (GRC) processes.

Keywords: GRC, Maturity Model, Systems Theory, Viable System Model.

662 Adsorptive Waste Heat Based Air-Conditioning Control Strategy for Automotives

Authors: Indrasen Raghupatruni, Michael Glora, Ralf Diekmann, Thomas Demmer

Abstract:

As the trend in automotive technology moves fast towards hybridization and electrification to curb emissions as well as to improve fuel efficiency, air-conditioning systems in passenger cars have not caught up with this trend and still remain among the major energy consumers. Adsorption-based air-conditioning systems, e.g. with a silica gel-water pair, which are already in use for residential and commercial applications, are now being considered as a technology leap once proven feasible for passenger cars. In this paper we discuss a methodology, the challenges and the feasibility of implementing an adsorption-based air-conditioning system in a passenger car utilizing the exhaust waste heat. We also propose an optimized control strategy with interfaces to the engine control unit of the vehicle for operating this system with reasonable efficiency, supported by our simulation and validation results in a prototype vehicle and compared with existing simulation-based as well as experimental implementations. Finally, we discuss the influence of start-stop and hybrid systems on the operating strategy of the adsorption air-conditioning system.

Keywords: Adsorption air-conditioning, feasibility study, optimized control strategy, prototype vehicle.

661 Ranking Genes from DNA Microarray Data of Cervical Cancer by a local Tree Comparison

Authors: Frank Emmert-Streib, Matthias Dehmer, Jing Liu, Max Muhlhauser

Abstract:

The major objective of this paper is to introduce a new method to select genes from DNA microarray data. As a criterion for selecting genes, we suggest measuring the local changes in the correlation graph of each gene and selecting those genes whose local changes are largest. More precisely, we calculate correlation networks from DNA microarray data of cervical cancer, where each network represents a tissue of a certain tumor stage and each node in the network represents a gene. From these networks we extract one tree for each gene by a local decomposition of the correlation network. The interpretation of a tree is that its n-th level contains the n-nearest-neighbor genes, measured by the Dijkstra distance, and hence the tree gives the local embedding of a gene within the correlation network. For the obtained trees we measure the pairwise similarity between trees rooted at the same gene from normal to cancerous tissues. This evaluates the modification of the tree topology due to tumor progression. Finally, we rank the obtained similarity values from all tissue comparisons and select the top-ranked genes. For these genes, the local neighborhood in the correlation networks changes most between normal and cancerous tissues. As a result we find that the top-ranked genes are candidates suspected to be involved in tumor growth. This indicates that our method captures essential information from the underlying DNA microarray data of cervical cancer.
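
A rough sketch of the local decomposition step follows: build a correlation graph, extract the Dijkstra shortest-path neighborhood rooted at each gene in each condition, and score how much that local neighborhood changes between conditions. The expression data, edge weighting and Jaccard-based change score are illustrative simplifications of the paper's tree comparison.

# Rough sketch: Dijkstra-rooted local neighborhoods per gene and a simple change score.
# The expression data, edge weighting and Jaccard score are illustrative choices.
import numpy as np
import networkx as nx

def correlation_graph(expr):
    """Gene graph weighted by 1 - |correlation| (small weight = strongly correlated)."""
    corr = np.corrcoef(expr)
    g = nx.Graph()
    n = corr.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            g.add_edge(i, j, weight=1.0 - abs(corr[i, j]))
    return g

def local_neighborhood(g, gene, radius=0.5):
    """Nodes of the Dijkstra shortest-path tree rooted at `gene`, cut at distance `radius`."""
    dist = nx.single_source_dijkstra_path_length(g, gene, cutoff=radius, weight="weight")
    return set(dist)

rng = np.random.default_rng(4)
base = rng.normal(size=(4, 30))                                  # 4 latent programs, 30 samples
normal = np.repeat(base, 3, axis=0) + 0.4 * rng.normal(size=(12, 30))   # 12 co-expressed genes
tumor = normal.copy()
tumor[0] = rng.normal(size=30)                                   # gene 0 loses its partners

g_norm, g_tum = correlation_graph(normal), correlation_graph(tumor)
scores = {}
for gene in range(normal.shape[0]):
    a, b = local_neighborhood(g_norm, gene), local_neighborhood(g_tum, gene)
    scores[gene] = 1.0 - len(a & b) / len(a | b)                 # large score = large local change

print(sorted(scores, key=scores.get, reverse=True)[:3])          # top-ranked candidate genes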

Keywords: Graph similarity, generalized trees, graph alignment, DNA microarray data, cervical cancer.

660 Influence Analysis of Macroeconomic Parameters on Real Estate Price Variation in Taipei, Taiwan

Authors: Li Li, Kai-Hsuan Chu

Abstract:

It is well known that real estate prices depend on many factors. The current value of a house depends on its location, number of rooms, transportation, living convenience, year of construction and surrounding environment. Although housing agents use various experience-based models to estimate prices, these are case-by-case studies without an investigation of the overall dynamic variation. Many macroeconomic parameters may, however, influence real estate price variation to a greater or lesser extent. Here, the influence of the main macroeconomic parameters on real estate prices is investigated individually based on a least-squares scheme and a grey correlation strategy. The parameters are then classified into leading indices, simultaneous indices and laggard indices. In addition, the leading time period is evaluated based on the least-squares method. The important leading and simultaneous indices can be used to establish an artificial neural network model for predicting real estate price variation. Real estate price variation in Taipei, Taiwan during 2005-2017 is chosen for the data analysis and validation in this research. The results show that the proposed method provides a reasonable prediction function for real estate business reference.

Keywords: Real estate price, least-square, grey correlation, macroeconomics.

659 Application and Assessment of Artificial Neural Networks for Biodiesel Iodine Value Prediction

Authors: Raquel M. de Sousa, Sofiane Labidi, Allan Kardec D. Barros, Alex O. Barradas Filho, Aldalea L. B. Marques

Abstract:

Several parameters are established in order to measure biodiesel quality. One of them is the iodine value, an important parameter that measures the total unsaturation within a mixture of fatty acids. Limiting unsaturated fatty acids is necessary, since heating a larger quantity of them leads either to the formation of deposits inside the engine or to damage of the lubricant. Because determination of the iodine value by the official procedure tends to be very laborious, with high costs and toxic reagents, this study uses an artificial neural network (ANN) to predict the iodine value as an alternative to these problems. The network development methodology used 13 fatty acid esters as inputs, and back-propagation convergence algorithms were optimized in order to obtain an architecture for iodine value prediction. This study allowed us to demonstrate the neural networks' ability to learn the correlation between a biodiesel quality property, in this case the iodine value, and the molecular structures that make the biodiesel up. The model developed in the study reached a correlation coefficient (R) of 0.99 for both network validation and network simulation, with the Levenberg-Marquardt algorithm.

Keywords: Artificial Neural Networks, Biodiesel, Iodine Value, Prediction.
