Search results for: Normalized least mean square methods
3239 Transformer Top-Oil Temperature Modeling and Simulation
Authors: T. C. B. N. Assunção, J. L. Silvino, P. Resende
Abstract:
The winding hot-spot temperature is one of the most critical parameters that affect the useful life of power transformers. The winding hot-spot temperature can be calculated as a function of the top-oil temperature, which can be estimated from measured ambient temperature and transformer loading data. This paper proposes the estimation of the top-oil temperature by using a method based on the Least Squares Support Vector Machines approach. The estimated top-oil temperature is compared with measured data from a power transformer in operation. The results are also compared with methods based on the IEEE Standard C57.91-1995/2000 and Artificial Neural Networks. It is shown that the Least Squares Support Vector Machines approach presents better performance than the methods based on the IEEE Standard C57.91-1995/2000 and artificial neural networks.
Keywords: Artificial Neural Networks, Hot-spot Temperature, Least Squares Support Vector, Top-oil Temperature.
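As an illustration of the Least Squares Support Vector Machines regression the abstract relies on, here is a minimal sketch in Python. The RBF kernel, the regularization settings and the synthetic ambient-temperature/load data are assumptions for illustration; this is not the authors' model or dataset.

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Least Squares SVM regression: solve the KKT linear system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y] for an RBF kernel."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ alpha + b

# Toy data: top-oil temperature driven by ambient temperature (deg C) and per-unit load
rng = np.random.default_rng(0)
X = rng.uniform([0.0, 0.2], [35.0, 1.2], size=(200, 2))
y = 0.8 * X[:, 0] + 30.0 * X[:, 1] ** 1.6 + rng.normal(0, 1, 200)
b, alpha = lssvm_fit(X, y)
print(lssvm_predict(X, b, alpha, X[:5]))
```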
3238 A New Approach of Fuzzy Methods for Evaluating of Hydrological Data
Authors: Nasser Shamskia, Seyyed Habib Rahmati, Hassan Haleh, Seyyedeh Hoda Rahmati
Abstract:
The main design criteria for most hydraulic structures are essentially based on runoff or water discharge. Two of those important criteria are runoff and return period. Mostly, these measures are calculated or estimated from stochastic data. Another feature of hydrological data is their impreciseness. Therefore, in order to deal with uncertainty and impreciseness, a new fuzzy method of evaluating hydrological measures is developed, based on Buckley's estimation method. The method introduces triangular fuzzy numbers for the different measures, in which both the uncertainty and impreciseness concepts are considered. Besides, since another important consideration in most hydrological studies is the comparison of a measure across different months or years, a new fuzzy ranking method consistent with the special form of the proposed fuzzy numbers is also developed. Finally, to illustrate the methods more explicitly, the two algorithms are tested on one simple example and a real case study.
Keywords: Fuzzy Discharge, Fuzzy estimation, Fuzzy ranking method, Hydrological data
3237 Monotonicity of Dependence Concepts from Independent Random Vector into Dependent Random Vector
Authors: Guangpu Chen
Abstract:
When the failure function is monotone, monotonic reliability methods can be used to greatly simplify and facilitate reliability computations. However, these methods often work in a transformed iso-probabilistic space. To this end, a monotonic simulator or transformation is needed so that the transformed failure function remains monotone. This note first proves that the output distribution of the failure function is invariant under the transformation. It then presents some conditions under which the transformed function is still monotone in the newly obtained space. These conditions concern the copulas and the dependence concepts. In many engineering applications, Gaussian copulas are often used to approximate the real-world copulas when the available information on the random variables is limited to the set of marginal distributions and the covariances. Therefore, this note focuses on the conditional monotonicity of the commonly used transformation from an independent random vector into a dependent random vector with Gaussian copulas.
Keywords: Monotonic, Rosenblatt, Nataf transformation, dependence concepts, completely positive matrices, Gaussian copulas
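The transformation the note studies maps an independent standard normal vector into a dependent vector with a Gaussian copula and prescribed marginals; each component map is monotone increasing, which is the property being exploited. A minimal sketch of that mapping (the marginals and the correlation value are illustrative assumptions, not taken from the note):

```python
import numpy as np
from scipy import stats

def nataf_sample(n, corr_z, marginals, seed=0):
    """Map independent N(0,1) samples to a dependent vector with a Gaussian
    copula (correlation corr_z in standard-normal space) and given marginals."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(corr_z)
    u = rng.standard_normal((n, len(marginals)))     # independent standard normals
    z = u @ L.T                                      # correlated standard normals
    # component-wise monotone map: x_i = F_i^{-1}(Phi(z_i))
    return np.column_stack([m.ppf(stats.norm.cdf(z[:, i]))
                            for i, m in enumerate(marginals)])

corr_z = np.array([[1.0, 0.6], [0.6, 1.0]])
X = nataf_sample(10000, corr_z, [stats.lognorm(s=0.4), stats.gumbel_r(loc=2, scale=0.5)])
print(np.corrcoef(X, rowvar=False))
```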
3236 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System
Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi
Abstract:
Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate pilot-assisted channel estimation for DC-biased optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to enhance the bit-error-rate (BER) of PACE-DCO-OFDM. Results show that the DCO-OFDM system based on the PACE scheme achieves better BER performance than the conventional system without pilot-assisted channel estimation. Simulation results also show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm estimates the channel more accurately and achieves better BER performance than the LS-based PACE-DCO-OFDM and the traditional system without PACE. For the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ with LS-PACE, while it is about 2×10⁻¹ for the system without the PACE scheme.
Keywords: Channel estimation, OFDM, pilot-assist, VLC.
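The two estimators compared in the abstract have standard closed forms: the LS estimate divides the received pilots by the transmitted ones, and the LMMSE estimate additionally applies the channel covariance as a Wiener filter. A minimal frequency-domain sketch on synthetic data (unit-power pilots, a uniform power delay profile and a known noise variance are assumptions; this is not the authors' Simulink model):

```python
import numpy as np

rng = np.random.default_rng(1)
N, taps = 64, 8
noise_var = 10 ** (-25 / 10)                         # SNR = 25 dB

h = (rng.standard_normal(taps) + 1j * rng.standard_normal(taps)) / np.sqrt(2 * taps)
H = np.fft.fft(h, N)                                 # true frequency response
X = np.ones(N)                                       # unit-power pilots
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
Y = X * H + noise

H_ls = Y / X                                         # least-squares estimate
F = np.fft.fft(np.eye(N))[:, :taps]                  # partial DFT: taps -> subcarriers
R_hh = F @ (np.eye(taps) / taps) @ F.conj().T        # channel covariance (uniform PDP)
H_lmmse = R_hh @ np.linalg.solve(R_hh + noise_var * np.eye(N), H_ls)

mse = lambda est: np.mean(np.abs(est - H) ** 2)
print(f"LS MSE = {mse(H_ls):.2e}, LMMSE MSE = {mse(H_lmmse):.2e}")
```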
3235 Radiochemical Purity of 68Ga-BCA-Peptides: Separation of All 68Ga Species with a Single iTLC Strip
Authors: Anton A. Larenkov, Alesya Ya Maruk
Abstract:
In the present study, a highly effective single-strip iTLC method for the determination of the radiochemical purity (RCP) of 68Ga-BCA-peptides was developed (with no double development, change of eluents or other additional manipulation). In this method, iTLC-SG strips and the commonly used eluent TFA aq. (3-5% (v/v)) are used. The method allows determining each of the key radiochemical forms of 68Ga (colloidal, bound, ionic) separately, with the peak separation being no less than 4 σ: Rf = 0.0-0.1 for 68Ga-colloid; Rf = 0.5-0.6 for 68Ga-BCA-peptides; Rf = 0.9-1.0 for ionic 68Ga. The method is simple and fast: for a developing length of 75 mm, only 4-6 min is required (versus 18-20 min for the pharmacopoeial method). The method has been tested on various compounds (including 68Ga-DOTA-TOC, 68Ga-DOTA-TATE, 68Ga-NODAGA-RGD2, etc.). The cross-validation work for every specific form of 68Ga showed good correlation between the developed method and the control (pharmacopoeial) methods. The method can become a convenient and much more informative replacement for pharmacopoeial methods, including HPLC.
Keywords: DOTA-TATE, 68Ga, quality control, radiochemical purity, radiopharmaceuticals, iTLC.
3234 Phytoremediation of Cd and Pb by Four Tropical Timber Species Grown on an Ex-tin Mine in Peninsular Malaysia
Authors: Lai Hoe Ang, Lai Kuen Tang, Wai Mun Ho, Ting Fui Hui, Gary W. Theseira
Abstract:
Contamination of heavy metals in tin tailings has raised interest in scientific approaches to their remediation. One of the approaches is phytoremediation, which uses tree species to extract the heavy metals from the contaminated soils. Tin tailings comprise slime and sand tailings. This paper reports only the findings for four timber species, namely Acacia mangium, Hopea odorata, Intsia palembanica and Swietenia macrophylla, on the removal of cadmium (Cd) and lead (Pb) from the slime tailings. The methods employed for sampling and soil analysis are established methods. Six trees of each species were randomly selected from a 0.25 ha plot for extraction and determination of their heavy metals. The soil samples were systematically collected according to a 5 x 5 m grid from each plot. Results showed that the concentration of heavy metals in soils and trees varied according to species. A higher concentration of heavy metals was found in the stems than in the primary roots of all the species. A. mangium accumulated the highest total amount of Pb per hectare.
Keywords: Cd, Pb, Phytoremediation of slime tailings, timber species.
3233 Comparative Study of Transformed and Concealed Data in Experimental Designs and Analyses
Authors: K. Chinda, P. Luangpaiboon
Abstract:
This paper presents a comparative study of coded data methods for assessing the benefit of concealing natural data that constitute a trade secret. The influence of the number of replicates (rep), treatment effects (τ) and standard deviation (σ) on the efficiency of each transformation method is investigated. The experimental data are generated via computer simulations under the specified process conditions with a completely randomized design (CRD). Three data transformations are considered: the Box-Cox, arcsine and logit methods. The differences in the F statistic between coded and natural data (Fc-Fn) and the hypothesis testing results were determined. The experimental results indicate that the Box-Cox results are significantly different from the natural data for smaller numbers of replicates and seem to be improper when a negative lambda parameter is assigned. On the other hand, the arcsine and logit transformations are more robust and obviously provide more precise numerical results. In addition, alternative ways to select the lambda in the power transformation are also offered to achieve more appropriate outcomes.
Keywords: Experimental Designs, Box-Cox, Arcsine, Logit Transformations.
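A minimal sketch of the kind of comparison described: simulate a small completely randomized design, apply the three transformations, and look at the change in the one-way F statistic (Fc-Fn). The parameter values and the response range (0, 1), chosen so the arcsine and logit maps apply, are assumptions for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
rep, tau, sigma = 5, [0.0, 0.5, 1.0], 0.3               # replicates, treatment effects, sd

# CRD responses kept inside (0, 1) so all three transformations are defined
groups = [np.clip(0.4 + t * 0.2 + rng.normal(0, sigma, rep), 0.01, 0.99) for t in tau]
f_stat = lambda gs: stats.f_oneway(*gs).statistic

transforms = {
    "natural": lambda x: x,
    "box-cox": lambda x: stats.boxcox(x)[0],             # lambda fitted by max. likelihood
    "arcsine": lambda x: np.arcsin(np.sqrt(x)),          # for proportion-like data
    "logit":   lambda x: np.log(x / (1 - x)),
}
flat = np.concatenate(groups)
for name, f in transforms.items():
    gs = np.split(f(flat), len(tau))                     # transform, then regroup
    print(f"{name:8s} F = {f_stat(gs):6.2f}   Fc - Fn = {f_stat(gs) - f_stat(groups):+.2f}")
```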
3232 Effective Internal Control System in the Nasarawa State Tertiary Educational Institutions for Efficiency: A Case of Nasarawa State Polytechnic, Lafia
Authors: Ibrahim Dauda Adagye
Abstract:
An effective internal control system in the bursary unit of tertiary educational institutions is geared toward achieving a quality teaching, learning and research environment, as well as assisting the management of the institutions, particularly when decisions are to be made. While an internal control system exists in all institutions, the objectives outlined above are far from being achieved. The paper therefore assesses the effectiveness of the internal control system in tertiary educational institutions in Nasarawa State, Nigeria, with specific focus on the Nasarawa State Polytechnic, Lafia. The study is a survey; hence, a simple closed-ended questionnaire was developed and administered to a sample of twenty-seven (27) staff members from the Bursary and the Internal Audit unit of the Nasarawa State Polytechnic, Lafia, so as to obtain data for analysis and to test the study hypothesis. Responses to the questionnaire were analysed using simple percentages and chi-square. Findings show that the right people are not assigned to the right jobs in the department, that budgeting and management accounting were never used in the institution's operations, and that checking of subordinates by their superior officers is not regular. This renders the current internal control structure of the Polytechnic ineffective and weak. The paper therefore recommends that transparency be treated as significant as the institution works toward meeting its objectives; this means that the right staff should be assigned to the right jobs and regular checking of subordinates by their superiors should be ensured.
Keywords: Bursary unit, efficiency, Internal control, tertiary educational institutions.
3231 Context Modeling and Context-Aware Service Adaptation for Pervasive Computing Systems
Authors: Moeiz Miraoui, Chakib Tadj, Chokri ben Amar
Abstract:
Devices in a pervasive computing system (PCS) are characterized by their context-awareness. It permits them to provide proactively adapted services to the user and applications. To do so, context must be well understood and modeled in an appropriate form, which enhances its sharing between devices and provides a high level of abstraction. The most interesting methods for modeling context are those based on ontologies; however, the majority of the proposed methods fail to propose a generic ontology for context, which limits their usability and keeps them specific to a particular domain. The adaptation task must be done automatically and without explicit intervention of the user. Devices of a PCS must acquire some intelligence which permits them to sense the current context and trigger the appropriate service or provide a service in a more suitable form. In this paper, we propose a generic service ontology for context modeling and a context-aware service adaptation based on a service-oriented definition of context.
Keywords: Pervasive computing system, context, context-awareness, service, context modeling, ontology, adaptation, machine learning.
3230 Phase Equilibrium of Volatile Organic Compounds in Polymeric Solvents Using Group Contribution Methods
Authors: E. Muzenda
Abstract:
Group contribution methods such as UNIFAC are of major interest to researchers and engineers involved in synthesis, feasibility studies, design and optimization of separation processes, as well as other applications of industrial use. Reliable knowledge of the phase equilibrium behavior is crucial for the prediction of the fate of a chemical in the environment and for other applications. The objective of this study was to predict the solubility of selected volatile organic compounds (VOCs) in glycol polymers and biodiesel. Measurements can be expensive and time consuming, hence the need for thermodynamic models. The results obtained in this study for the infinite dilution activity coefficients compare very well with those published in the literature obtained through measurements. It is suggested that in preliminary design or feasibility studies of absorption systems for the abatement of volatile organic compounds, prediction procedures should be implemented, while accurate fluid phase equilibrium data should be obtained from experiment.
Keywords: Volatile organic compounds, Prediction, Phase equilibrium, Environmental, Infinite dilution.
3229 A Model to Determine Atmospheric Stability and its Correlation with CO Concentration
Authors: Kh. Ashrafi, Gh. A. Hoshyaripour
Abstract:
Atmospheric stability plays the most important role in the transport and dispersion of air pollutants. Different methods are used for stability determination, with varying degrees of complexity. Most of these methods are based on the relative magnitude of convective and mechanical turbulence in atmospheric motions. The Richardson number, the Monin-Obukhov length, the Pasquill-Gifford stability classification and the Pasquill–Turner stability classification are the most common parameters and methods. The Pasquill–Turner Method (PTM), which is employed in this study, makes use of observations of wind speed, insolation and the time of day to classify atmospheric stability with distinguishable indices. In this study, a model is presented for the determination of atmospheric stability conditions using the PTM. As a case study, meteorological data of Mehrabad station in Tehran from 2000 to 2005 are applied to the model. Three different categories are considered to deduce the pattern of stability conditions. First, the overall pattern of stability classification is obtained, and the results show that the atmosphere is in stable, neutral and unstable conditions 38.77%, 27.26% and 33.97% of the time, respectively. It is also observed that days are mostly unstable (66.50%) while nights are mostly stable (72.55%). Second, monthly and seasonal patterns are derived, and the results indicate that the relative frequency of stable conditions decreases from January to June and increases from June to December, while the results for unstable conditions behave in exactly the opposite manner. Autumn is the most stable season with a relative frequency of 50.69% for stable conditions, whilst it is 42.79%, 34.38% and 27.08% for winter, summer and spring, respectively. The hourly stability pattern is the third category, which points out that unstable conditions are dominant from approximately 03-15 GMT and 04-12 GMT for warm and cold seasons, respectively. Finally, the correlation between atmospheric stability and CO concentration is obtained.
Keywords: Atmospheric stability, Pasquill-Turner classification, convective turbulence, mechanical turbulence, Tehran.
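The PTM classification itself is a lookup on wind speed, insolation and time of day. A simplified sketch of that logic, using the standard Pasquill-Gifford class boundaries (the exact bins and night-time rules used for the Mehrabad data may differ in detail):

```python
def pasquill_class(wind_speed_ms, insolation=None, night_cloud_okta=None):
    """Simplified Pasquill-Gifford stability class (A = very unstable ... F = stable).
    Daytime: pass insolation as 'strong' | 'moderate' | 'slight'.
    Night: pass the cloud cover in oktas instead."""
    day_table = {
        "strong":   ["A", "A", "B", "C", "C"],
        "moderate": ["A", "B", "B", "C", "D"],
        "slight":   ["B", "C", "C", "D", "D"],
    }
    bins = [2, 3, 5, 6]                                  # wind-speed bins in m/s
    idx = sum(wind_speed_ms >= b for b in bins)
    if insolation is not None:                           # daytime
        return day_table[insolation][idx]
    if wind_speed_ms < 3:                                # night-time rules
        return "E" if night_cloud_okta >= 4 else "F"
    if wind_speed_ms < 5:
        return "D" if night_cloud_okta >= 4 else "E"
    return "D"

print(pasquill_class(2.5, insolation="strong"))          # -> 'A' (unstable daytime)
print(pasquill_class(2.0, night_cloud_okta=2))           # -> 'F' (stable clear night)
```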
3228 BIM Application Research Based on the Main Entrance and Garden Area Project of Shanghai Disneyland
Authors: Ying Yuken, Pengfei Wang, Zhang Qilin, Xiao Ben
Abstract:
Based on the main entrance and garden area (ME&G) project of Shanghai Disneyland, this paper introduces the application of BIM technology in this kind of low-rise comprehensive building with complex facade, electromechanical and decoration systems. BIM technology is applied to the whole process of design, construction and completion of the project. Based on the BIM application framework built for the whole project, the key points of the BIM modeling methods for the different systems and the integration and coordination of the BIM models are elaborated in detail. The specific application methods of BIM technology in similar complex low-rise building projects are sorted out. Finally, the paper summarizes the benefits of applying BIM technology and puts forward some suggestions on the BIM management mode and the practical application of similar projects in the future.
Keywords: BIM, complex low-rise building, BIM modeling, model integration and coordination, 3D scanning.
3227 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field
Authors: Nastaran Moosavi, Mohammad Mokhtari
Abstract:
Seismic inversion is a technique which has been in use for years, and its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it combines seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock properties such as P-impedance, S-impedance and density, while post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.
Keywords: Density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion.
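The post-stack branch of the comparison ultimately relates reflectivity to acoustic impedance through a simple recursion. A textbook sketch of that relation on a toy three-layer model (not the authors' workflow, which works on real seismic and well-log data):

```python
import numpy as np

def reflectivity_from_impedance(z):
    """Normal-incidence reflection coefficients from acoustic impedance."""
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def impedance_from_reflectivity(r, z0):
    """Recursive post-stack inversion: Z_{i+1} = Z_i * (1 + r_i) / (1 - r_i)."""
    z = [z0]
    for ri in r:
        z.append(z[-1] * (1 + ri) / (1 - ri))
    return np.array(z)

# Toy model with a low-impedance gas interval between two stiffer layers
z_true = np.array([6.0e6, 4.5e6, 7.0e6])                 # impedance, kg/(m^2 s)
r = reflectivity_from_impedance(z_true)
print(r)                                                  # negative coefficient at the gas top
print(impedance_from_reflectivity(r, z_true[0]))          # recovers the impedance profile
```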
3226 Kinetic Parameter Estimation from Thermogravimetry and Microscale Combustion Calorimetry
Authors: Rhoda Afriyie Mensah, Lin Jiang, Solomon Asante-Okyere, Xu Qiang, Cong Jin
Abstract:
Flammability analysis of extruded polystyrene (XPS) has become crucial due to its utilization as an insulation material for energy-efficient buildings. Using the Kissinger-Akahira-Sunose and Flynn-Wall-Ozawa methods, the degradation kinetics of two pure XPS samples from the local market, a red and a grey one, were obtained from the results of thermogravimetric analysis (TG) and microscale combustion calorimetry (MCC) experiments performed under the same heating rates. From the experiments, it was discovered that the red XPS released more heat than the grey XPS, and both materials showed two mass loss stages. Consequently, the kinetic parameters for the red XPS were higher than those for the grey XPS. A comparative evaluation of the activation energies from MCC and TG showed an insignificant degree of deviation, signifying an equivalent apparent activation energy from both methods. However, when the dependencies of the activation energies on the extent of conversion for TG and MCC were compared, different activation energy profiles appeared as a result of the different chemical pathways.
Keywords: Flammability, microscale combustion calorimetry, thermogravimetric analysis, thermal degradation, kinetic analysis.
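Both iso-conversional methods named in the abstract reduce to a linear regression across heating rates at a fixed conversion. A short sketch with synthetic temperatures (the heating rates and temperatures below are illustrative, not the paper's measurements):

```python
import numpy as np

R = 8.314  # J/(mol K)

def kas_activation_energy(betas, T_alpha):
    """Kissinger-Akahira-Sunose: slope of ln(beta/T^2) vs 1/T equals -Ea/R."""
    slope, _ = np.polyfit(1.0 / np.asarray(T_alpha),
                          np.log(np.asarray(betas) / np.asarray(T_alpha) ** 2), 1)
    return -slope * R / 1000.0                            # kJ/mol

def fwo_activation_energy(betas, T_alpha):
    """Flynn-Wall-Ozawa (Doyle approx.): slope of ln(beta) vs 1/T equals -1.052 Ea/R."""
    slope, _ = np.polyfit(1.0 / np.asarray(T_alpha), np.log(np.asarray(betas)), 1)
    return -slope * R / 1.052 / 1000.0                    # kJ/mol

betas = [5, 10, 20, 40]                                   # heating rates, K/min
T_50 = [653.0, 664.0, 676.0, 688.0]                       # temperatures at 50% conversion, K
print(f"KAS Ea ~ {kas_activation_energy(betas, T_50):.0f} kJ/mol")
print(f"FWO Ea ~ {fwo_activation_energy(betas, T_50):.0f} kJ/mol")
```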
3225 Face Recognition Based On Vector Quantization Using Fuzzy Neuro Clustering
Authors: Elizabeth B. Varghese, M. Wilscy
Abstract:
A face recognition system is a computer application for automatically identifying or verifying a person from a digital image or a video frame. Many algorithms have been proposed for face recognition. Vector quantization (VQ) based face recognition is a novel approach to the problem. Here, a new codebook generation method for VQ-based face recognition using Integrated Adaptive Fuzzy Clustering (IAFC) is proposed. IAFC is a fuzzy neural network which incorporates a fuzzy learning rule into a competitive neural network. The performance of the proposed algorithm is demonstrated using the publicly available AT&T database, Yale database, Indian Face database and a small face database, the DCSKU database, created in our lab. On all the databases the proposed approach achieved a higher recognition rate than most of the existing methods. In terms of Equal Error Rate (EER), the proposed codebook is also better than the existing methods.
Keywords: Face Recognition, Vector Quantization, Integrated Adaptive Fuzzy Clustering, Self Organization Map.
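The VQ part of the approach is easy to sketch: build one codebook per subject and classify a test face by the codebook that quantizes it with the least distortion. The sketch below uses plain k-means in place of the paper's IAFC clustering and random vectors in place of face-image blocks, so it only illustrates the VQ classification idea:

```python
import numpy as np

def train_codebook(vectors, n_codewords=16, iters=20, seed=0):
    """VQ codebook via k-means (a stand-in for the IAFC clustering in the paper)."""
    rng = np.random.default_rng(seed)
    cb = vectors[rng.choice(len(vectors), n_codewords, replace=False)]
    for _ in range(iters):
        d = ((vectors[:, None, :] - cb[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(1)
        for k in range(n_codewords):
            if np.any(assign == k):
                cb[k] = vectors[assign == k].mean(0)
    return cb

def distortion(vectors, codebook):
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.min(1).mean()

def identify(test_vectors, codebooks):
    """Assign the test face to the subject whose codebook gives the least distortion."""
    return min(codebooks, key=lambda s: distortion(test_vectors, codebooks[s]))

# Toy usage with random 16-dimensional "image block" vectors per subject
rng = np.random.default_rng(1)
codebooks = {s: train_codebook(rng.normal(s, 1.0, (200, 16))) for s in (0, 3)}
print(identify(rng.normal(3, 1.0, (50, 16)), codebooks))   # -> 3
```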
3224 Optimization of Transmission Lines Loading in TNEP Using Decimal Codification Based GA
Authors: H. Shayeghi, M. Mahdavi
Abstract:
Transmission network expansion planning (TNEP) is a basic part of power system planning that determines where, when and how many new transmission lines should be added to the network. Up to now, various methods have been presented to solve the static transmission network expansion planning (STNEP) problem. However, in all of these methods the adequacy rate of the lines at the end of the planning horizon has not been considered, i.e., the expanded network loses adequacy after some time and needs to be expanded again. In this paper, expansion planning is implemented by merging the line loading parameter into the STNEP and inserting the investment cost into the fitness function constraints using a genetic algorithm. The expanded network will possess maximum adequacy to supply the load demand and also the transmission lines that become overloaded later. Finally, an adequacy index can be defined and used to compare designs that have different investment costs and adequacy rates. In this paper, the proposed idea has been tested on the Garver network. The results show that the network possesses maximum economic efficiency.
Keywords: Adequacy Optimization, Transmission Expansion Planning, DCGA.
3223 A Robust Reception of IEEE 802.15.4a IR-TH UWB in Dense Multipath and Gaussian Noise
Authors: Farah Haroon, Haroon Rasheed, Kazi M Ahmed
Abstract:
The IEEE 802.15.4a impulse radio time-hopping ultra-wideband (IR-TH UWB) physical layer, due to its small duty cycle and very short pulse widths, is robust against multipath propagation. However, scattering and reflections from the large number of obstacles in indoor channel environments give rise to dense multipath fading. This poses a serious problem for optimum Rake receiver architectures, for which a very large number of fingers is needed. The presence of strong noise also affects the reception of fine pulses with extremely low power spectral density. A robust SRake receiver for IEEE 802.15.4a IR-TH UWB in dense multipath and additive white Gaussian noise (AWGN) is proposed to efficiently recover the weak signals with much reduced complexity. It adaptively increases the signal-to-noise ratio (SNR) by decreasing noise through a recursive least squares (RLS) algorithm. For simulation, the dense multipath environment of the IEEE 802.15.4a industrial non-line-of-sight (NLOS) channel is employed. The power delay profile (PDP) and the cumulative distribution function (CDF) for the respective channel environment are found. Moreover, the error performance of the proposed architecture is evaluated in comparison with conventional SRake and AWGN correlation receivers. The simulation results indicate a substantial performance improvement with a much smaller number of Rake fingers.
Keywords: Adaptive noise cancellation, dense multipath propagation, IEEE 802.15.4a, IR-TH UWB, industrial NLOS environment, SRake receiver
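The adaptive noise-reduction step described in the abstract rests on the standard recursive least squares update. A generic RLS noise-cancellation sketch on synthetic data (it does not reproduce the SRake finger structure or the 802.15.4a channel model):

```python
import numpy as np

def rls_filter(d, x, order=8, lam=0.99, delta=1e2):
    """Recursive least squares adaptive filter: estimate d from the reference x."""
    w = np.zeros(order)
    P = delta * np.eye(order)
    y = np.zeros_like(d)
    for n in range(order, len(d)):
        u = x[n - order:n][::-1]             # regressor, most recent sample first
        k = P @ u / (lam + u @ P @ u)        # gain vector
        y[n] = w @ u
        w = w + k * (d[n] - y[n])            # update with the a-priori error
        P = (P - np.outer(k, u) @ P) / lam
    return y

# Toy usage: sparse "pulses" buried in noise that is correlated with a reference x
rng = np.random.default_rng(0)
N = 4000
ref = rng.standard_normal(N)
noise = np.convolve(ref, [0.6, -0.3, 0.2], mode="same")
signal = (np.arange(N) % 200 == 0).astype(float)
d = signal + noise
cleaned = d - rls_filter(d, ref)             # subtract the estimated noise
print(np.var(d - signal), np.var(cleaned - signal))   # residual noise power drops
```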
3222 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning represents a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns, which are used to classify them into groups. The result of the analysis is a pattern which can be used for identification of a data set without the need to obtain the input data used for the creation of this pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the point of view of genetic diversity. In the case of missing pedigree information, other methods can be used for traceability of an animal's origin. The genetic diversity written in genetic data holds relatively useful information to identify animals originating from individual countries. We can conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and identifying an individual.
Keywords: Genetic data, Pinzgau cattle, supervised learning.
3221 Adaption Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements
Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath
Abstract:
Whereas human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Also, humans keep a minimal vocabulary, and its pronunciation variations are stored at the front of their memory for ready reference, while machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries take large amounts of manual effort, cost and time, and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. The performance of the system is measured using an adaptation model, and the precision metric is found to be better than 86 percent.
Keywords: Pronunciation variations, dynamic programming, machine learning, natural language processing.
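The distance measure at the heart of the adaptation model is a dynamic-programming alignment over phone sequences. A sketch in that spirit, with unit insertion/deletion/substitution costs standing in for the paper's phonemic distance table (which is not reproduced here):

```python
def dynamic_phone_warping(ref, hyp, sub_cost=None, ins_del_cost=1.0):
    """Dynamic-programming alignment distance between two phone sequences."""
    if sub_cost is None:
        sub_cost = lambda a, b: 0.0 if a == b else 1.0
    n, m = len(ref), len(hyp)
    D = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        D[i][0] = i * ins_del_cost
    for j in range(1, m + 1):
        D[0][j] = j * ins_del_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = min(D[i - 1][j] + ins_del_cost,                 # deletion
                          D[i][j - 1] + ins_del_cost,                 # insertion
                          D[i - 1][j - 1] + sub_cost(ref[i - 1], hyp[j - 1]))
    return D[n][m]

# Canonical vs. observed pronunciation (ARPAbet-like phones, purely illustrative)
canonical = ["T", "AH", "M", "EY", "T", "OW"]
variant   = ["T", "AH", "M", "AA", "T", "OW"]
print(dynamic_phone_warping(canonical, variant))   # small distance -> likely the same word
```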
3220 Exploring the Potential of Phase Change Memories as an Alternative to DRAM Technology
Authors: Venkataraman Krishnaswami, Venkatasubramanian Viswanathan
Abstract:
Scalability poses a severe threat to the existing DRAM technology. The capacitors that are used for storing and sensing charge in DRAM are generally not scaled beyond 42 nm. This is because the capacitors must be sufficiently large for reliable sensing and charge storage. This leaves DRAM memory scaling in jeopardy, as charge sensing and storage mechanisms become extremely difficult. In this paper we provide an overview of the potential and the possibilities of using Phase Change Memory (PCM) as an alternative to the existing DRAM technology. The main challenges encountered in using PCM are its limited endurance, high access latencies, and higher dynamic energy consumption than that of conventional DRAM. We then provide an overview of various methods which can be employed to overcome these drawbacks. Hybrid memories involving both PCM and DRAM can be used to achieve good tradeoffs between access latency and storage density. We conclude by presenting the results of these methods, which make PCM a potential replacement for the current DRAM technology.
Keywords: DRAM, Phase Change Memory.
3219 Use of GIS for the Performance Evaluation of Canal Irrigation System in Rice Wheat Cropping Zone
Authors: Umm-e-Kalsoom, M. Arshad, Sadia Iqbal, M. Usman, M. Adnan
Abstract:
The research study evaluated the performance of the irrigation system using scientific tools such as remote sensing and GIS technology, so that proper measures could be taken for sustainable agriculture and water management. Different performance evaluation parameters were calculated; for this purpose, data were gathered from field investigation and from different government and private organizations. According to the calculations, organic matter ranges from 0.19% (low value) to 0.76% (high value). In the flat irrigation system, wheat yield ranges from 3347.16 to 5260.39 kg/ha, while the total water applied to the wheat crop ranges from 252.94 to 279.19 mm and WUE ranges from 13.07 to 18.37 kg/ha/mm. For rice, yield ranges from 3347.47 to 5433.07 kg/ha, with total water supplied to the rice crop ranging from 764.71 to 978.15 mm and WUE ranging from 3.49 to 5.71 kg/ha/mm. Similarly, in the raised bed system, wheat yield ranges from 4569.13 to 6008.60 kg/ha, total water supplied ranges from 158.87 to 185.09 mm and WUE ranges from 27.20 to 33.54 kg/ha/mm, while for the rice crop, yield ranges from 5285.04 to 6716.69 kg/ha, total water supplied ranges from 600.72 to 755.06 mm and WUE ranges from 6.41 to 10.05 kg/ha/mm. Almost 51.3% water saving is observed in the bed irrigation system compared to the flat system. The smaller amount of water supplied to the beds is more effective, as its WUE value is higher than that of the flat system, where more water is supplied in both seasons. Similarly, the RWS values show a maximum water deficit, while only a minimum area is getting an adequate water supply. A greater yield is recorded in the bed system, as the number of plants per square meter is higher in the bed system than in the flat system. Thus, the integration of GIS tools to regularly compute performance indices could provide irrigation managers with the means for managing the irrigation system efficiently.
Keywords: Field survey, Relative Water Supply (RWS), Remote sensing maps, Water Use Efficiency (WUE).
3218 Parenting Styles and Their Relation to Videogame Addiction
Authors: Petr Květon, Martin Jelínek
Abstract:
We try to identify the role of various aspects of parenting style in the phenomenon of videogame playing addiction. Relevant self-report questionnaires were part of a wider set of methods focused on constructs related to videogame playing. The battery of methods was administered in school settings in paper-and-pencil form. The research sample consisted of 333 (166 males, 167 females) elementary and high school students aged between 10 and 19 years (m=14.98, sd=1.77). Using stepwise regression analysis, we assessed the influence of demographic variables (gender and age) and parenting styles. Age and gender together explained 26.3% of the game addiction variance (F(2,330)=58.81, p<.01). By adding four aspects of parenting style (inconsistency, involvement, control, and warmth), another 10.2% of the variance was explained (∆F(4,326)=13.09, p<.01). The significant predictors were the gender of the respondent, with males scoring higher on the game addiction scale (B=0.70, p<.01); age (β=-0.18, p<.01), with younger children showing a higher level of addiction; and parental inconsistency (β=0.30, p<.01), where the higher the inconsistency in upbringing, the more developed the game playing addiction.
Keywords: Gender, parenting styles, video games, addiction.
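The hierarchical (stepwise) regression reported above can be reproduced in structure with two nested OLS models and an F-change test. A sketch on simulated data whose coefficients merely mimic the reported ones (the real questionnaire data are of course not available here):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 333
gender = rng.integers(0, 2, n)                     # 1 = male
age = rng.uniform(10, 19, n)
inconsistency = rng.normal(0, 1, n)
addiction = 0.7 * gender - 0.18 * age + 0.3 * inconsistency + rng.normal(0, 1, n)

X1 = sm.add_constant(np.column_stack([gender, age]))                   # step 1: demographics
X2 = sm.add_constant(np.column_stack([gender, age, inconsistency]))    # step 2: + parenting
m1 = sm.OLS(addiction, X1).fit()
m2 = sm.OLS(addiction, X2).fit()

df_num = X2.shape[1] - X1.shape[1]
df_den = n - X2.shape[1]
f_change = ((m2.rsquared - m1.rsquared) / df_num) / ((1 - m2.rsquared) / df_den)
print(f"R2 step 1 = {m1.rsquared:.3f}, R2 step 2 = {m2.rsquared:.3f}, F change = {f_change:.2f}")
```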
3217 Glass Bottle Inspector Based on Machine Vision
Authors: Huanjun Liu, Yaonan Wang, Feng Duan
Abstract:
This paper studies an intelligent glass bottle inspector based on machine vision as a replacement for manual inspection. The system structure is illustrated in detail. The paper presents a method based on the watershed transform to segment possibly defective regions and to extract features of the bottle wall by rules. Then the wavelet transform is used to extract features of the bottle finish from images. After feature extraction, a fuzzy support vector machine ensemble is put forward as the classifier. To ensure that the fuzzy support vector machines have good classification ability, a GA-based ensemble method is used to combine the several fuzzy support vector machines. The experiments demonstrate that, when this inspector is used to inspect glass bottles, the accuracy rate may reach above 97.5%.
Keywords: Intelligent Inspection, Support Vector Machines, Ensemble Methods, watershed transform, Wavelet Transform
3216 Modal Analysis of Machine Tool Column Using Finite Element Method
Authors: Migbar Assefa
Abstract:
The performance of a machine tool is ultimately assessed by its ability to produce a component of the required geometry in minimum time and at small operating cost. It is customary to base the structural design of any machine tool primarily upon the requirements of static rigidity and minimum natural frequency of vibration. The operating properties of machines, such as cutting speed, feed and depth of cut, as well as the size of the workpiece, also have to be kept in mind by a machine tool structural designer. This paper presents a novel approach to the design of a machine tool column for static and dynamic rigidity requirements. Model evaluation is done effectively through the use of the general finite element analysis software ANSYS. Studies on machine tool columns are used to illustrate the finite-element-based concept evaluation technique. This paper also presents results obtained from the computations of thin-walled box-type columns subjected to torsional and bending loads in the static analysis, as well as results from the modal analysis. The columns analyzed are square- and rectangle-based tapered open columns, columns with cover plates, horizontal partitions and apertures. In total, 70 columns were analyzed for bending, torsion and modal behaviour. In this study it is observed that the orientation and aspect ratio of the apertures have no significant effect on the static and dynamic rigidity of the machine tool structure.
Keywords: Finite Element Modeling, Modal Analysis, Machine tool structure, Static Analysis.
3215 Zero Inflated Models for Overdispersed Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Zero-inflated models are usually used in modeling count data with excess zeros, where the excess zeros could be structural zeros or zeros which occur by chance. This type of data is commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences, including sexual health and dental epidemiology. The most popular zero-inflated models used by many researchers are the zero-inflated Poisson and zero-inflated negative binomial models. In addition, zero-inflated generalized Poisson and zero-inflated double Poisson models are also discussed and found in some of the literature. Recently, the zero-inflated inverse trinomial model and the zero-inflated strict arcsine model have been advocated and proven to serve as alternative models for overdispersed count data caused by excessive zeros and unobserved heterogeneity. The purpose of this paper is to review some related literature and provide a variety of examples from different disciplines in the application of zero-inflated models. Different model selection methods used in model comparison are discussed.
Keywords: Overdispersed count data, model selection methods, likelihood ratio, AIC, BIC.
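A minimal sketch of fitting a zero-inflated Poisson model and comparing it with a plain Poisson fit by AIC/BIC, on simulated counts with structural zeros (the data-generating values are arbitrary; statsmodels' ZeroInflatedPoisson is used here, not any code from the paper):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
lam = np.exp(0.3 + 0.5 * x)                    # Poisson mean for the at-risk group
structural_zero = rng.random(n) < 0.35         # source of the excess zeros
y = np.where(structural_zero, 0, rng.poisson(lam))

X = sm.add_constant(x)
poisson_fit = sm.Poisson(y, X).fit(disp=0)
zip_fit = sm.ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=0)

# The zero-inflated model should be preferred by both criteria on such data
print(f"Poisson  AIC = {poisson_fit.aic:.1f}  BIC = {poisson_fit.bic:.1f}")
print(f"ZIP      AIC = {zip_fit.aic:.1f}  BIC = {zip_fit.bic:.1f}")
```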
3214 Analytical and Experimental Methods of Design for Supersonic Two-Stage Ejectors
Authors: S. Daneshmand, C. Aghanajafi, A. Bahrami
Abstract:
In this paper, supersonic ejectors are studied experimentally and analytically. An ejector is a device that uses the energy of one fluid to move another fluid. This device works like a vacuum pump without the use of a piston, rotor or any other moving component. An ejector contains an active nozzle, a passive nozzle, a mixing chamber and a diffuser. Since the fluid viscosity is large and the flow in the mixing chamber is turbulent and three-dimensional, numerical methods require long computation times and high cost to analyze the flow in ejectors. Therefore, this paper presents a simple analytical method based on the precise governing equations of fluid mechanics. Based on the derived analytical relations, a computer code has been prepared to analyze the flow in the different components of the ejector. An experiment has been performed in the supersonic regime 1.5
3213 A Retrievable Genetic Algorithm for Efficient Solving of Sudoku Puzzles
Authors: Seyed Mehran Kazemi, Bahare Fatemi
Abstract:
Sudoku is a logic-based combinatorial puzzle game which is popular among people of different ages. Due to this popularity, computer software is being developed to generate and solve Sudoku puzzles with different levels of difficulty. Several methods and algorithms have been proposed and used in different software packages to efficiently solve Sudoku puzzles. Various search methods such as stochastic local search have been applied to this problem. The Genetic Algorithm (GA) is one of the algorithms which has been applied to this problem in different forms and in several works in the literature. In these works, chromosomes with little or no information were considered, and the obtained results were not promising. In this paper, we propose a new way of applying GA to this problem which uses more-informed chromosomes than other works in the literature. We optimize the parameters of our GA using puzzles with different levels of difficulty. Then we use the optimized values of the parameters to solve various puzzles and compare our results to another GA-based method for solving Sudoku puzzles.
Keywords: Genetic algorithm, optimization, solving Sudoku puzzles, stochastic local search.
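The "more-informed chromosome" idea can be sketched compactly: each chromosome is a full grid whose rows are permutations of 1-9 consistent with the clues, so the fitness only has to count column and box violations. The operators below are an illustrative sketch of that encoding, not the authors' tuned GA:

```python
import numpy as np

def make_chromosome(puzzle, rng):
    """Chromosome = full grid whose rows respect the clues and the row constraint."""
    grid = puzzle.copy()
    for r in range(9):
        missing = [v for v in range(1, 10) if v not in puzzle[r]]
        rng.shuffle(missing)
        grid[r, puzzle[r] == 0] = missing
    return grid

def fitness(grid):
    """Count column and 3x3-box constraint violations; 0 means solved."""
    bad = sum(9 - len(set(grid[:, c])) for c in range(9))
    for br in range(0, 9, 3):
        for bc in range(0, 9, 3):
            bad += 9 - len(set(grid[br:br + 3, bc:bc + 3].ravel()))
    return bad

def mutate(grid, puzzle, rng):
    """Swap two non-clue cells inside one row, preserving the row constraint."""
    child = grid.copy()
    r = rng.integers(9)
    free = np.flatnonzero(puzzle[r] == 0)
    if len(free) >= 2:
        i, j = rng.choice(free, 2, replace=False)
        child[r, i], child[r, j] = child[r, j], child[r, i]
    return child

rng = np.random.default_rng(0)
puzzle = np.zeros((9, 9), dtype=int)           # an empty puzzle, just to exercise the operators
population = [make_chromosome(puzzle, rng) for _ in range(50)]
best = min(population, key=fitness)
print("violations of best initial chromosome:", fitness(best))
print("after one mutation:", fitness(mutate(best, puzzle, rng)))
```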
3212 Peakwise Smoothing of Data Models using Wavelets
Authors: D Sudheer Reddy, N Gopal Reddy, P V Radhadevi, J Saibaba, Geeta Varadan
Abstract:
Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular method of smoothing data; its generalization led to the development of the Savitzky-Golay filter. Many window smoothing methods were developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also yields smoothed data. Wavelets also smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy the peaks and flatten them when the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology called peakwise smoothing that smooths the data to any desired level without losing the major peak features.
Keywords: smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.
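The peak-flattening effect that motivates the paper is easy to demonstrate with the standard smoothers it mentions; the comparison below shows how the recovered peak height degrades as the window grows (the proposed peakwise method itself is not reproduced here, and the test signal is synthetic):

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 1000)
clean = np.exp(-((x - 5.0) ** 2) / 0.01)               # narrow peak of height ~1
noisy = clean + rng.normal(0, 0.05, x.size)

def moving_average(y, w):
    return np.convolve(y, np.ones(w) / w, mode="same")

for w in (11, 31, 61):
    ma = moving_average(noisy, w)
    sg = savgol_filter(noisy, window_length=w, polyorder=3)
    print(f"window {w:3d}: peak height  MA = {ma.max():.2f}  SG = {sg.max():.2f}  (true ~ 1.00)")
```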
3211 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach
Authors: Aladdin Al-Tarawneh
Abstract:
The Quran, the sacred book of Islam, considered the literal word of God (Allah) in Arabic, has been widely translated into many languages; however, the foreignising or literal approach excessively stains the quality and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. Therefore, this study is conducted to propose a different approach that seeks to involve other methods according to a hybrid model. Indeed, this study challenges the binary adherence that is widely practised in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the meaning of the Quran can be communicated in any language and that the translation itself is not sacred, this paper approaches the translation of the Quran by blending different methods, such as domestication and foreignisation, in a systematic way, avoiding the binary choice made by many translators. To reach this aim, the paper has a conceptual part that seeks to elucidate and clarify the main methods employed in TS, and to criticise and modify them in order to propose the new hybrid approach (the hybrid model) for translating the Quran – that is, the deductive method. To support and validate the outcome of the previous part, a comparative model is employed in order to highlight the differences between the suggested translation and other widely used ones – that is, the inductive method. By applying this methodology, the paper proves that the foreignising approach is deficient in communicating the original meaning of the Quran. In conclusion, the paper suggests that producing a Quran translation has to take into account the adoption of many techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner to serve the intended target readers.
Keywords: Quran translation, hybrid approach, domestication, foreignisation, hybrid model.
3210 Effectiveness of Business Software Systems Development and Enhancement Projects versus Work Effort Estimation Methods
Authors: Beata Czarnacka-Chrobot
Abstract:
Execution of Business Software Systems (BSS) Development and Enhancement Projects (D&EP) is characterized by exceptionally low effectiveness, leading to considerable financial losses. The general reason for the low effectiveness of such projects is that they are inappropriately managed. One of the factors of proper BSS D&EP management is a suitable (reliable and objective) method of project work effort estimation, since this is what determines the correct estimation of its major attributes: project cost and duration. A BSS D&EP is usually considered to be accomplished effectively if a product of the planned functionality is delivered without cost and time overruns. The goal of this paper is to prove that the chosen approach to BSS D&EP work effort estimation has a considerable influence on the effectiveness of such projects' execution.
Keywords: Business software systems, development and enhancement projects, effectiveness, work effort estimation methods, software product size, software product functionality, project duration, project cost.