Search results for: High Pass Filter
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6328

5878 Exponentially Weighted Simultaneous Estimation of Several Quantiles

Authors: Valeriy Naumov, Olli Martikainen

Abstract:

In this paper, we propose a new method for simultaneously generating multiple quantiles corresponding to given probability levels from data streams and massive data sets. The method provides a basis for the development of single-pass, low-storage quantile estimation algorithms, which differ in complexity, storage requirements and accuracy. We demonstrate that such algorithms may perform well even for heavy-tailed data.
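
As a toy illustration of the single-pass idea (not the authors' estimator), the following Python sketch tracks several quantiles of a heavy-tailed stream with a stochastic-approximation update and an assumed constant step size:

```python
import numpy as np

def streaming_quantiles(stream, probs, step=0.01):
    """Single-pass stochastic-approximation quantile tracking.

    For each probability level p, the running estimate q is nudged up when a
    sample exceeds it and down otherwise, so that in the long run a fraction p
    of the samples falls below q.
    """
    q = None
    for x in stream:
        if q is None:
            q = np.full(len(probs), float(x))   # initialise all estimates
        for i, p in enumerate(probs):
            q[i] += step * (p - (x <= q[i]))    # sign-based update
    return q

rng = np.random.default_rng(0)
data = rng.standard_t(df=3, size=100_000)        # heavy-tailed sample
print(streaming_quantiles(data, [0.5, 0.9, 0.99]))
```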

Keywords: Quantile estimation, data stream, heavy-tailed distribution, tail index.

5877 Stochastic Control of Decentralized Singularly Perturbed Systems

Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan

Abstract:

Designing a controller for stochastic, decentralized, interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth impose further constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained using singular perturbation techniques. The paper develops an optimal control synthesis for an observer-based feedback controller using standard stochastic control theory, namely the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with reduced complexity and computational requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method.
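
For readers unfamiliar with the LQG machinery referred to above, a minimal Python sketch of the separation-principle design on a hypothetical two-state plant (not the paper's reduced-order model) might look as follows:

```python
import numpy as np
from scipy.linalg import solve_continuous_are, inv

# Hypothetical 2-state plant: dx = A x + B u + process noise, y = C x + measurement noise.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
Q, R = np.eye(2), np.array([[0.1]])           # LQR state / input weights
W, V = 0.01 * np.eye(2), np.array([[0.05]])   # process / measurement noise covariances

# Separation principle: design the LQR gain and the Kalman gain independently.
P = solve_continuous_are(A, B, Q, R)
K = inv(R) @ B.T @ P                          # state-feedback gain, u = -K x_hat

S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ inv(V)                          # Kalman estimator gain

print("LQR gain K:", K)
print("Kalman gain L:", L)
```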

Keywords: Decentralized, optimal control, output, singular perturbation.

5876 Analysis and Performance Evaluation of Noise-Reduction Transformer

Authors: Toshiaki Yanada, Kazumi Ishikawa

Abstract:

The present paper deals with the analysis and development of a noise-reduction transformer that has a filter function for conductive noise transmission. Two types of prototype noise-reduction transformers with two different output voltages are proposed. To determine an optimum design for the noise-reduction transformer, noise attenuation characteristics are discussed based on experiments and equivalent-circuit analysis. The analysis gives a relation between the circuit parameters and the noise attenuation. A high-performance step-down noise-reduction transformer for direct power supply to electronic equipment is developed. The input voltage of the transformer is 100 V and the output voltage is 5 V. Frequency characteristics of the noise attenuation are discussed, and prevention of pulse-noise transmission is demonstrated. Normal-mode noise attenuation of this transformer is –80 dB, and common-mode attenuation exceeds –90 dB. The step-down noise-reduction transformer eliminates pulse noise efficiently.
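
As a reminder of what the quoted figures mean, a few lines of Python converting a voltage ratio to attenuation in decibels (purely illustrative values, not measured data from the paper):

```python
import math

def attenuation_db(v_out, v_in):
    """Voltage attenuation in dB (negative values mean the noise is reduced)."""
    return 20.0 * math.log10(v_out / v_in)

# -80 dB of normal-mode attenuation corresponds to a 10^4 reduction in noise voltage.
print(attenuation_db(1e-4, 1.0))   # -> -80.0
```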

Keywords: conductive noise, EMC, EMI, noise attenuation, transformer.

5875 Gene Selection Guided by Feature Interdependence

Authors: Hung-Ming Lai, Andreas Albrecht, Kathleen Steinhöfel

Abstract:

Cancers can typically be marked by a number of differentially expressed genes, which show enormous potential as biomarkers for a given disease. In recent years, cancer classification based on the investigation of gene expression profiles derived from high-throughput microarrays has been widely used. The selection of discriminative genes is therefore an essential preprocessing step in carcinogenesis studies. In this paper, we propose a novel gene selector using information-theoretic measures for biological discovery. This multivariate filter is a four-stage framework comprising analyses of feature relevance, feature interdependence, feature redundancy-dependence and subset ranking, and it has been examined on the colon cancer data set. Our experimental results show that the proposed method outperforms other information-theoretic filters in all aspects of classification error and classification performance.
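
A hedged sketch of the kind of relevance analysis such a filter starts from, using mutual information on synthetic data; the paper's full four-stage framework is not reproduced here:

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Hypothetical expression matrix: 60 samples x 500 genes, binary class labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 500))
y = rng.integers(0, 2, size=60)

# Relevance ranking: mutual information between each gene and the class label.
relevance = mutual_info_classif(X, y, random_state=1)
top_genes = np.argsort(relevance)[::-1][:20]    # keep the 20 most relevant genes
print(top_genes)
```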

Keywords: Colon cancer, feature interdependence, feature subset selection, gene selection, microarray data analysis.

5874 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows

Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid

Abstract:

Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics can be very demanding in both time and cost, and computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged so that the model parameters can be evaluated from measured data. However, this approach is not always possible and suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data and obtain an accurate estimate of the topography. The main features of this method are, on one hand, the ability to handle different complex geometries with no need to rearrange the original model into an explicit form and, on the other hand, strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and observation locations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.
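
To make the ensemble Kalman update concrete, here is a generic stochastic EnKF analysis step on synthetic data; the shallow-water forward solver and the paper's adaptive control loop are assumed and not shown:

```python
import numpy as np

def enkf_update(ensemble, observations, obs_operator, obs_cov, rng):
    """One stochastic EnKF analysis step.

    ensemble: (n_members, n_state) state samples (e.g. bed elevations).
    observations: (n_obs,) measured free-surface data.
    obs_operator: H, (n_obs, n_state) linearised observation matrix.
    obs_cov: R, (n_obs, n_obs) observation-error covariance.
    """
    X = ensemble
    A = X - X.mean(axis=0)                        # state anomalies
    HX = X @ obs_operator.T
    HA = HX - HX.mean(axis=0)
    n = X.shape[0]
    Pxy = A.T @ HA / (n - 1)                      # state-observation cross covariance
    Pyy = HA.T @ HA / (n - 1) + obs_cov
    K = Pxy @ np.linalg.inv(Pyy)                  # Kalman gain
    # Perturbed observations (stochastic EnKF variant).
    Y = observations + rng.multivariate_normal(np.zeros(len(observations)), obs_cov, size=n)
    return X + (Y - HX) @ K.T

rng = np.random.default_rng(0)
H = np.eye(3)                                     # observe all 3 state components directly
ens = rng.normal(0.0, 1.0, size=(50, 3))          # 50-member ensemble, 3 unknown bed nodes
obs = np.array([0.4, -0.2, 0.1])
updated = enkf_update(ens, obs, H, 0.05 * np.eye(3), rng)
print(updated.mean(axis=0))
```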

Keywords: Optimal control, ensemble Kalman Filter, topography reconstruction, data assimilation, shallow water equations.

5873 Financing Decision and Productivity Growth for the Venture Capital Industry Using High-Order Fuzzy Time Series

Authors: Shang-En Yu

Abstract:

Human society involves many uncertainties, such as forecasting economic growth rates or financial crises. Since Song and Chissom introduced the concept of fuzzy time series in 1993, many scholars have applied different models of this kind to such problems. Previous studies, however, usually do not consider the selection of relevant variables, and the fuzzy discretization is based solely on subjective opinion, so the characteristics of the data set are not reflected objectively; in addition, forecasts often treat all fuzzy rules as equally important and fail to consider the importance of each rule. For these reasons, this study performs variable (factor) selection through a self-organizing map (SOM) and proposes a high-order weighted multivariate fuzzy time series model based on a fuzzy back-propagation neural network (Fuzzy-BPN), using the ordered weighted averaging (OWA) operator for weighted prediction. To verify the proposed method, the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) is used as the forecasting target, with the appropriate variables filtered in the experiment. Finally, a comparison with models from other recent studies shows that the predictive ability of the proposed method is further improved.
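
As an illustration of the ordered weighted averaging operator mentioned above, with hypothetical forecasts and weights (not values from the paper):

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: the weights are applied to the values
    sorted in descending order, not to particular inputs."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                  # normalise so the weights sum to 1
    return float(v @ w)

# Example: emphasise the larger forecasts produced by several fuzzy rules.
print(owa([7200.0, 7150.0, 7310.0], [0.5, 0.3, 0.2]))
```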

Keywords: Heterogeneity, residential mortgage loans, foreclosure.

5872 Unsupervised Feature Selection Using Feature Density Functions

Authors: Mina Alibeigi, Sattar Hashemi, Ali Hamzeh

Abstract:

Since dealing with high-dimensional data is computationally complex and sometimes even intractable, several feature reduction methods have recently been developed to reduce the dimensionality of the data and simplify analysis in applications such as text categorization, signal processing, image retrieval and gene expression analysis. Among feature reduction techniques, feature selection is one of the most popular methods because it preserves the original features. In this paper, we propose a new unsupervised feature selection method that removes redundant features from the original feature space using the probability density functions of the features. To show the effectiveness of the proposed method, popular feature selection methods have been implemented and compared. Experimental results on several datasets from the UCI repository illustrate the effectiveness of our proposed method in comparison with the other methods in terms of both classification accuracy and the number of selected features.
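
A minimal sketch of how feature probability density functions could flag redundant features, using kernel density estimates on synthetic data; the paper's exact selection criterion may differ:

```python
import numpy as np
from scipy.stats import gaussian_kde

def density_distance(f1, f2, grid):
    """Crude dissimilarity between two features: L1 distance between their
    estimated probability density functions evaluated on a common grid."""
    p1 = gaussian_kde(f1)(grid)
    p2 = gaussian_kde(f2)(grid)
    return float(np.sum(np.abs(p1 - p2)) * (grid[1] - grid[0]))

rng = np.random.default_rng(0)
a = rng.normal(0, 1, 300)
b = 0.98 * a + rng.normal(0, 0.05, 300)      # nearly redundant copy of a
c = rng.normal(3, 2, 300)                    # genuinely different feature
grid = np.linspace(-8, 10, 400)
print(density_distance(a, b, grid))          # small -> candidate for removal
print(density_distance(a, c, grid))          # large -> keep both features
```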

Keywords: Feature, Feature Selection, Filter, Probability Density Function

5871 An Improved Preprocessing for Biosonar Target Classification

Authors: Turgay Temel, John Hallam

Abstract:

An improved processing description to be employed in biosonar signal processing with a cochlea model is proposed and examined. It is compared to conventional models using a modified discriminant analysis, and both are tested. Their performances are evaluated with echo data captured from natural targets (trees). Results indicate that the phase characteristics of the low-pass filters employed in the echo processing have a significant effect on class separability for this data.
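
The phase effect discussed above can be reproduced with a small SciPy experiment comparing a causal and a zero-phase application of the same low-pass filter to a synthetic echo (illustrative parameters only, not the paper's cochlea model):

```python
import numpy as np
from scipy.signal import butter, lfilter, filtfilt

fs = 100_000.0                                   # sample rate in Hz
b, a = butter(4, 20_000.0, btype="low", fs=fs)   # 4th-order 20 kHz low-pass

# A short Gaussian-enveloped tone burst standing in for a target echo.
t = np.arange(0, 0.002, 1 / fs)
echo = np.exp(-((t - 0.001) ** 2) / (2 * (5e-5) ** 2)) * np.sin(2 * np.pi * 35_000 * t)

causal = lfilter(b, a, echo)       # causal filtering: nonzero phase lag / group delay
zero_phase = filtfilt(b, a, echo)  # zero-phase filtering: no phase distortion
print(np.argmax(np.abs(causal)) - np.argmax(np.abs(zero_phase)))  # peak shift in samples
```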

Keywords: Cochlea model, discriminant analysis, neuro-spike coding, classification.

5870 Iris Localization using Circle and Fuzzy Circle Detection Method

Authors: Marzieh Savoj, S. Amirhassan Monadjemi

Abstract:

Iris localization is a very important step in biometric identification systems. The identification process is usually implemented in three stages: iris localization, feature extraction, and finally pattern matching. The accuracy of iris localization, as the first step, affects all subsequent stages, which shows its importance in an iris-based biometric system. In this paper, we take the Daugman iris localization method as a standard, propose a new method in this field, and then analyze and compare the results of both on a standard set of iris images. The proposed method is based on detecting the circular edge of the iris and is refined using fuzzy circles and surface energy difference contexts. The method is easy to implement and, compared to the other methods, has rather high accuracy and speed. Test results show that the accuracy of our proposed method is about that of the Daugman method, while its computation speed is 10 times faster.
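
For comparison, a bare-bones circular-edge localization using OpenCV's Hough transform on a synthetic stand-in image; the fuzzy-circle and surface-energy refinements proposed in the paper are not included:

```python
import cv2
import numpy as np

# Synthetic stand-in for an eye image: a darker disc (pupil/iris) on a brighter background.
img = np.full((240, 320), 200, dtype=np.uint8)
cv2.circle(img, (160, 120), 60, 60, thickness=-1)    # filled darker disc
img = cv2.medianBlur(img, 5)                          # suppress noise before edge detection

# Detect candidate circular boundaries with the Hough transform.
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, 1, 50,
                           param1=100, param2=30, minRadius=20, maxRadius=120)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"circle centre=({x}, {y}), radius={r}")
```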

Keywords: Convolution, Edge detector filter, Fuzzy circle, Identification

5869 Improved Lung Nodule Visualization on Chest Radiographs using Digital Filtering and Contrast Enhancement

Authors: Benjamin Y. M. Kwan, Hon Keung Kwan

Abstract:

Early detection of lung cancer through chest radiography is a widely used method due to its relatively affordable cost. In this paper, an approach to improve lung nodule visualization on chest radiographs is presented. The approach makes use of a linear-phase high-frequency emphasis filter for digital filtering and histogram equalization for contrast enhancement. The results obtained indicate that a filtered image can reveal sharper edges and provide more detail. Contrast enhancement further improves the global (or local) visualization by equalizing the histogram of the pixel values within the whole image (or a region of interest). The work aims to improve lung nodule visualization on chest radiographs to aid the detection of lung cancer, which is currently the leading cause of cancer deaths worldwide.
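
A rough spatial-domain approximation of the described pipeline (high-frequency emphasis followed by histogram equalization), sketched on a synthetic image; the paper's exact linear-phase filter design is assumed rather than reproduced:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import exposure

def enhance(radiograph, sigma=5.0, boost=1.5):
    """High-frequency emphasis (unsharp-style) followed by histogram equalization."""
    img = radiograph.astype(float)
    low = gaussian_filter(img, sigma)           # low-pass component
    emphasized = img + boost * (img - low)      # add back amplified high frequencies
    emphasized = (emphasized - emphasized.min()) / np.ptp(emphasized)
    return exposure.equalize_hist(emphasized)   # global contrast enhancement

rng = np.random.default_rng(0)
fake_radiograph = rng.normal(0.5, 0.1, size=(256, 256))   # placeholder image
out = enhance(fake_radiograph)
print(out.shape, out.min(), out.max())
```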

Keywords: Chest radiographs, Contrast enhancement, Digital filtering, Lung nodule detection

5868 Green Bridges and Their Migration Potential

Authors: Jaroslav Žák, Aleš Florian

Abstract:

Green bridges enable wildlife to pass through linear structures, especially freeways. The term migration potential is used to quantify their functionality. The proposed methodology for determining migration potential eliminates the mathematical, systematic and ecological inaccuracies of previous methodologies and provides a reliable tool for designers and environmentalists. The methodology is suited especially to medium-sized and large mammals, is mathematically correct, and its correspondence with reality was tested by monitoring existing green bridges. 

Keywords: Green bridges, migration potential, partial probabilities, wildlife migration.

5867 Discrete Polynomial Moments and Savitzky-Golay Smoothing

Authors: Paul O'Leary, Matthew Harker

Abstract:

This paper presents a unified theory for local (Savitzky-Golay) and global polynomial smoothing. The algebraic framework can represent any polynomial approximation and is seamless from low-degree local to high-degree global approximations. Representing the smoothing operator as a projection onto orthonormal basis functions enables the computation of: the covariance matrix for noise propagation through the filter; the noise gain; and the frequency response of the polynomial filters. A virtually perfect Gram polynomial basis is synthesized, whereby polynomials of degree d = 1000 can be synthesized without significant errors. The perfect basis ensures that the filters are strictly polynomial-preserving. Given n points and a support length ls = 2m + 1, the smoothing operator is strictly linear phase for the points xi, i = m+1, ..., n-m. The method is demonstrated on geometric surface data lying on an invariant 2D lattice.
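
A quick way to see local polynomial smoothing and its polynomial-preserving property in practice, using SciPy's Savitzky-Golay filter (not the Gram-polynomial implementation of the paper):

```python
import numpy as np
from scipy.signal import savgol_filter

# Noisy samples of a cubic on a uniform lattice.
x = np.linspace(-1, 1, 201)
rng = np.random.default_rng(0)
y = x**3 - 0.5 * x + rng.normal(0, 0.05, x.size)

# Local polynomial smoothing: degree-3 fit over a 21-point support.
smooth = savgol_filter(y, window_length=21, polyorder=3)

# Polynomial preservation: a noiseless cubic passes through essentially unchanged
# (checked away from the boundaries, where the support is truncated).
exact = savgol_filter(x**3, window_length=21, polyorder=3)
print(np.max(np.abs(exact[10:-10] - x[10:-10]**3)))   # ~ machine precision
```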

Keywords: Gram polynomials, Savitzky-Golay Smoothing, Discrete Polynomial Moments

5866 An Inclusion Project for Deaf Children in a Northern Italian Context

Authors: G. Tamanza, A. Bossoni

Abstract:

84 deaf students (from primary school to college) and their families participated in this inclusion project, in cooperation with numerous institutions in northern Italy (Brescia, Lombardy). Participants were either congenitally deaf or their deafness was related to other pathologies. This research promoted the integration of deaf students as they pass from primary school to high school to college. Learning methods and processes were studied that focused on encouraging individual autonomy and socialization. The research team and its collaborators included school teachers, speech therapists, psychologists and home tutors, as well as teaching assistants, child neuropsychiatrists and other external authorities involved with social inclusion programs for deaf persons. Deaf children and their families were supported in terms of inclusion and were made aware of the research team's focus on Bisogni Educativi Speciali (BES, Special Educational Needs) (L.170/2010 - DM 5669/2011). This project included a diagnostic and evaluative phase as well as an operational one. Results demonstrated that deaf children were highly satisfied and confident; academic performance improved and collaboration in school increased. Deaf children felt that they had access to high school and college. Empowerment of the families of deaf children, in terms of networking among local services that deal with the deaf, improved, and family satisfaction also improved. We found that teachers and those who support deaf children increased their professional skills. Achieving autonomy and instrumental, communicative and relational abilities were also found to be crucial. Project success was determined by temporal continuity, a clear theoretical methodology, a strong alliance in the project direction and a resilient team response.

Keywords: Autonomy, inclusion, skills, well-being.

5865 Extended Study on Removing Gaussian Noise in Mechanical Engineering Drawing Images using Median Filters

Authors: Low Khong Teck, Hasan S. M. Al-Khaffaf, Abdullah Zawawi Talib, Tan Kian Lam

Abstract:

In this paper, an extended study is performed on the effect of different factors on the quality of vector data, building on a previous study. For the noise factor, one kind of noise that appears in document images, namely Gaussian noise, is studied, whereas the previous study involved only salt-and-pepper noise. High and low noise levels are studied. For the noise cleaning methods, algorithms not covered in the previous study are used, namely median filters and their variants. For the vectorization factor, one of the best available commercial raster-to-vector packages, namely VPstudio, is used to convert raster images into vector format. The performance of line detection is judged using an objective performance evaluation method. The output of the performance evaluation is then analyzed statistically to highlight the factors that affect vector quality.
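
A minimal example of the median filtering step on a synthetic drawing image corrupted with Gaussian noise; the vectorization and statistical analysis stages are outside the scope of this sketch:

```python
import numpy as np
from scipy.ndimage import median_filter

# Hypothetical drawing image corrupted by additive Gaussian noise.
rng = np.random.default_rng(0)
clean = np.zeros((128, 128))
clean[60:68, :] = 1.0                            # a horizontal line
noisy = clean + rng.normal(0.0, 0.2, clean.shape)

denoised = median_filter(noisy, size=3)          # 3x3 median filter

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

print("noisy RMSE   :", rmse(noisy, clean))
print("denoised RMSE:", rmse(denoised, clean))
```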

Keywords: Performance Evaluation, Vectorization, Median Filter, Gaussian Noise.

5864 Fast Factored DCT-LMS Speech Enhancement for Performance Enhancement of Digital Hearing Aid

Authors: Sunitha S. L., V. Udayashankara

Abstract:

Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for patients with sensorineural loss. Several investigations of speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal-hearing subjects. This paper describes a Discrete Cosine Transform power-normalized Least Mean Square (DCT-LMS) algorithm to improve the SNR and the convergence behaviour of the LMS for sensorineural loss patients. Since it requires only real arithmetic, it achieves a faster convergence rate than time-domain LMS, and the transformation improves the eigenvalue distribution of the input autocorrelation matrix of the LMS filter. The DCT has good orthonormality, separability, and energy compaction properties. Although the DCT does not separate frequencies, it is a powerful signal decorrelator. It is a real-valued transform and can thus be used effectively in real-time operation. The advantages of DCT-LMS over the standard LMS algorithm are shown via SNR and eigenvalue-ratio computations. Exploiting the symmetry of the basis functions, the DCT transform matrix [AN] can be factored into a series of ±1 butterflies and rotation angles. This factorization yields one of the fastest DCT implementations; of the different ways to obtain factorizations, this work uses the fast factored DCT algorithm developed by Chen and co-workers. The computer simulation results show superior convergence characteristics of the proposed algorithm, improving the SNR by at least 10 dB for input SNRs less than or equal to 0 dB, with faster convergence speed and better time and frequency characteristics.
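
A compact, hedged sketch of a DCT-domain power-normalized LMS filter on synthetic signals; it uses SciPy's DCT rather than the fast factored implementation described in the paper:

```python
import numpy as np
from scipy.fft import dct

def dct_lms(x, d, n_taps=16, mu=0.05, beta=0.99, eps=1e-6):
    """Transform-domain (DCT) power-normalized LMS adaptive filter.

    x: reference input (e.g. noise), d: desired signal (speech + noise).
    Returns the error signal e, which approximates the enhanced output once
    the filter has converged to the noise path.
    """
    w = np.zeros(n_taps)
    power = np.full(n_taps, eps)
    e = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]                 # most recent samples first
        z = dct(u, norm="ortho")                  # decorrelating transform
        e[n] = d[n] - w @ z
        power = beta * power + (1.0 - beta) * z**2
        w += mu * e[n] * z / (power + eps)        # per-bin normalised update
    return e

rng = np.random.default_rng(0)
noise = rng.normal(size=8000)
speech = np.sin(2 * np.pi * 0.01 * np.arange(8000))
d = speech + np.convolve(noise, [0.6, -0.3, 0.1])[:8000]   # noisy observation
print(np.mean(dct_lms(noise, d)[-1000:] ** 2))   # residual power after adaptation
```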

Keywords: Hearing Impairment, DCT Adaptive filter, Sensorineural loss patients, Convergence rate.

5863 Optimal Duty-Cycle Modulation Scheme for Analog-To-Digital Conversion Systems

Authors: G. Sonfack, J. Mbihi, B. Lonla Moffo

Abstract:

This paper presents an optimal duty-cycle modulation (ODCM) scheme for analog-to-digital conversion (ADC) systems. The overall ODCM-based ADC problem is decoupled into optimal DCM and digital filtering sub-problems, while taking into account the constraints of mutual design parameters between the two. Using a set of three lemmas and four morphological theorems, the ODCM sub-problem is modelled as a nonlinear cost function with nonlinear constraints. Then, a weighted least pth norm of the error between the ideal and predicted frequency responses is used as the cost function for the digital filtering sub-problem. In addition, the MATLAB fmincon and iirlnorm tools are used as the optimal DCM and least pth norm solvers, respectively. Furthermore, the virtual simulation scheme of an overall prototype ODCM-based ADC system is implemented and tested with the help of Simulink, according to a relevant set of design data, i.e., 3 kHz of modulating bandwidth, 172 kHz of maximum modulation frequency and 25 MHz of sampling frequency. Finally, the results obtained show that the ODCM-based ADC achieves, over the 3 kHz modulating bandwidth: 57 dBc of SINAD (signal-to-noise and distortion ratio), 58 dB of SFDR (spurious-free dynamic range), -80 dBc of THD (total harmonic distortion), and 10 bits of minimum resolution. These performance levels present a serious challenge within the class of oversampling ADC topologies, using only a 2nd-order IIR (infinite impulse response) decimation filter.
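
The general idea of recovering a signal from a duty-cycle-modulated waveform with an IIR low-pass filter can be sketched as follows (a plain PWM-style modulator with illustrative parameters, not the paper's optimal DCM law or decimation filter):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1_000_000.0                                        # modulator sample rate (Hz)
t = np.arange(0, 0.01, 1 / fs)
signal = 0.5 + 0.4 * np.sin(2 * np.pi * 1_000.0 * t)    # baseband signal in [0.1, 0.9]
carrier = (t * 50_000.0) % 1.0                          # 50 kHz sawtooth carrier in [0, 1)
pwm = (signal > carrier).astype(float)                  # duty cycle follows the signal

b, a = butter(4, 5_000.0, btype="low", fs=fs)           # 4th-order IIR reconstruction filter
recovered = filtfilt(b, a, pwm)                         # zero-phase filtering for the sketch

err = recovered[1000:-1000] - signal[1000:-1000]
print("RMS reconstruction error:", float(np.sqrt(np.mean(err ** 2))))
```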

Keywords: Digital IIR filter, morphological lemmas and theorems, optimal DCM-based ADC, virtual simulation, weighted least pth norm.

5862 Low Cost IMU/GPS Integration Using Kalman Filtering for Land Vehicle Navigation Application

Authors: Othman Maklouf, Abdurazag Ghila, Ahmed Abdulla, Ameer Yousef

Abstract:

Land vehicle navigation system technology is a subject of great interest today. The Global Positioning System (GPS) is a common choice for positioning in such systems. However, GPS alone is incapable of providing continuous and reliable positioning because of its inherent dependency on external electromagnetic signals. Inertial navigation is the use of inertial sensors to determine the position and orientation of a vehicle; as such, it has unbounded error growth, since the error accumulates at each step, and some form of external aiding is required to contain these errors. The availability of low-cost Micro-Electro-Mechanical-System (MEMS) inertial sensors now makes it feasible to develop an Inertial Navigation System (INS) using an inertial measurement unit (IMU) in conjunction with GPS to fulfill the demands of such systems. Typically, IMUs are very expensive systems; however, this INS uses low-cost components. Unfortunately, with low cost also comes low performance, which is the main reason for the inclusion of GPS and Kalman filtering in the system. The aim of this paper is to develop a GPS/MEMS INS integrated system that is able to provide a navigation solution with accuracy levels appropriate for land vehicle navigation. The primary equipment used was a MEMS-based Crista IMU (from Cloud Cap Technology Inc.) and a Garmin GPS 18 PC (which is both a receiver and an antenna). The integration of GPS with INS can be implemented using a Kalman filter in loosely coupled mode. In this integration mode, the INS error states, together with any navigation states (position, velocity, and attitude) and other unknown parameters of interest, are estimated using GPS measurements. All important equations regarding navigation are presented along with discussion.
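
A 1-D loosely coupled toy example of the described integration, with made-up noise figures, showing IMU-driven prediction corrected by occasional GPS position fixes:

```python
import numpy as np

dt = 0.01                                   # IMU step (100 Hz)
F = np.array([[1.0, dt], [0.0, 1.0]])       # state = [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])         # acceleration input model
H = np.array([[1.0, 0.0]])                  # GPS observes position only
Q = np.diag([1e-4, 1e-3])                   # process noise (IMU errors), assumed
R = np.array([[4.0]])                       # GPS position variance (m^2), assumed

x = np.zeros((2, 1))
P = np.eye(2)
rng = np.random.default_rng(0)
true_pos = true_vel = 0.0

for k in range(1, 1001):
    accel_true = 0.2 * np.sin(0.01 * k)
    true_vel += accel_true * dt
    true_pos += true_vel * dt
    accel_meas = accel_true + rng.normal(0, 0.05)      # noisy IMU reading

    # Prediction driven by the IMU measurement.
    x = F @ x + B * accel_meas
    P = F @ P @ F.T + Q

    # GPS update once per second (every 100 IMU steps).
    if k % 100 == 0:
        z = np.array([[true_pos + rng.normal(0, 2.0)]])
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P

print("estimated position:", float(x[0]), " true position:", true_pos)
```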

Keywords: GPS, IMU, Kalman Filter.

5861 Improved Estimation of Evolutionary Spectrum based on Short Time Fourier Transforms and Modified Magnitude Group Delay by Signal Decomposition

Authors: H K Lakshminarayana, J S Bhat, H M Mahesh

Abstract:

A new estimator for the evolutionary spectrum (ES) based on the short-time Fourier transform (STFT) and a modified group delay function (MGDF) with signal decomposition (SD) is proposed. The STFT, due to its built-in averaging, suppresses the cross terms, and the MGDF preserves the frequency resolution of the rectangular window while reducing the Gibbs ripple. The present work overcomes the magnitude distortion observed when estimating the ES of multi-component non-stationary signals with the STFT and MGDF by using SD. The SD is achieved either through a discrete cosine transform based harmonic wavelet transform (DCTHWT) or through perfect reconstruction filter banks (PRFB). The MGDF also improves the signal-to-noise ratio by removing associated noise. The performance of the present method is illustrated for crossing chirps and frequency shift keying (FSK) signals, indicating that it performs better than STFT-MGDF (STFT-GD) alone; its noise immunity is also better than that of the STFT. The SD-based methods, however, cannot bring out the frequency transition path from band to band clearly, as there is a gap in the contour plot at the transition. The PRFB-based STFT-SD shows better performance than the DCTHWT decomposition method for STFT-GD.
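
A minimal STFT-based time-varying spectrum estimate for a pair of crossing chirps, SciPy only; the MGDF and signal-decomposition stages of the proposed estimator are not reproduced:

```python
import numpy as np
from scipy.signal import stft

# Two crossing chirps: a standard multi-component non-stationary test signal.
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * (50 + 100 * t) * t) + np.sin(2 * np.pi * (400 - 100 * t) * t)

f, tau, Z = stft(x, fs=fs, nperseg=128, noverlap=96)
spectrogram = np.abs(Z) ** 2           # time-varying power estimate

# Dominant frequency in each of the first ten analysis frames.
print(f[np.argmax(spectrogram, axis=0)][:10])
```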

Keywords: Evolutionary Spectrum, Modified Group Delay, Discrete Cosine Transform, Harmonic Wavelet Transform, Perfect Reconstruction Filter Banks, Short Time Fourier Transform.

5860 Joint Microstatistic Multiuser Detection and Cancellation of Nonlinear Distortion Effects for the Uplink of MC-CDMA Systems Using Golay Codes

Authors: Peter Drotar, Juraj Gazda, Pavol Galajda, Dusan Kocur

Abstract:

The study in this paper underlines the importance of correctly and jointly selecting the spreading codes for the uplink of multicarrier code division multiple access (MC-CDMA) at the transmitter side and the detector at the receiver side in the presence of nonlinear distortion due to the high power amplifier (HPA). The bit error rate (BER) of the system for different spreading sequences (Walsh codes, Gold codes, orthogonal Gold codes, Golay codes and Zadoff-Chu codes) and different kinds of receivers (the minimum mean-square error receiver (MMSE-MUD) and the microstatistic multi-user receiver (MSF-MUD)) is compared by means of simulations of an MC-CDMA transmission system. The results of the analysis show that the application of MSF-MUD in combination with Golay codes can significantly outperform the other tested spreading codes and receivers for all commonly used HPA models.
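
Two of the spreading-code families compared above can be generated in a few lines; the sketch below builds Walsh codes and a Golay complementary pair and checks their defining properties, but does not simulate the MC-CDMA link:

```python
import numpy as np
from scipy.linalg import hadamard

# Walsh (Hadamard) spreading codes of length 8: the rows are mutually orthogonal.
W = hadamard(8)
print((W @ W.T == 8 * np.eye(8)).all())           # True: perfect orthogonality

# Golay complementary pair, built by the standard recursion (a|b, a|-b).
a, b = np.array([1, 1]), np.array([1, -1])
for _ in range(2):                                 # grow the pair to length 8
    a, b = np.concatenate([a, b]), np.concatenate([a, -b])

def acorr(s):
    """Aperiodic autocorrelation of a real sequence."""
    return np.correlate(s, s, mode="full")

# Complementary property: the autocorrelation sidelobes of a and b cancel out.
print(acorr(a) + acorr(b))                         # zeros everywhere except the centre (= 16)
```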

Keywords: HPA, MC-CDMA, microstatistic filter, multi-user receivers, PAPR.

5859 Air Pollution Control from Rice Shellers - A Case Study

Authors: S. M. Ahuja

Abstract:

A Rice Sheller is used for obtaining polished white rice from paddy. There are about 3,000 Rice Shellers in Punjab and 50,000 in India. During the shelling process, a lot of dust is emitted from different unit operations such as the paddy silo, paddy shaker, bucket elevators, huskers and paddy separator. These dust emissions have an adverse effect on the health of the workers, and the wear and tear of the shelling machinery is fast. All the dust emissions spewing out of these unit operations of a Rice Sheller were contained by providing suitable hoods and enclosures while ensuring their workability. The dust was extracted by an induced-draft fan followed by a high-efficiency cyclone separator with an overall dust collection efficiency of more than 90%. This cyclone separator replaced the two cyclone separators and the filter bag house that the Rice Sheller already had. The dust concentration in the stack after the installation of the cyclone separator is well within the stipulated standards. Besides controlling pollution, there is an improvement in the quality of products such as bran, and the life of the shelling machinery has been extended. The payback period of this technology is less than four shelling months.

Keywords: Air Pollution, Cyclone Separator, Pneumatic Conveying, Rice Sheller.

5858 Classifying Students for E-Learning in Information Technology Course Using ANN

Authors: S. Areerachakul, N. Ployong, S. Na Songkla

Abstract:

The objective of this research is to select the most accurate model, using the artificial neural network technique, for filtering potential students who enroll in the IT course by electronic learning at Suan Sunandha Rajabhat University. It is designed to help students select the appropriate courses by themselves. The results showed that the most accurate model used 100-fold cross-validation, with an accuracy of 73.58%.
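
A hedged sketch of the kind of cross-validated neural-network classification described, on entirely synthetic student data (the real features and 100-fold protocol are assumptions of this illustration):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical student features (e.g. grades, attendance, prior IT scores) and a
# binary label for whether e-learning is suitable.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 200) > 0).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)        # 10-fold cross-validated accuracy
print(scores.mean())
```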

Keywords: Artificial neural network, classification, students.

5857 Simulation of Complex-Shaped Particle Breakage Using the Discrete Element Method

Authors: Felix Platzer, Eric Fimbinger

Abstract:

In Discrete Element Method (DEM) simulations, the breakage behavior of particles can be simulated based on different principles. In the case of large, complex-shaped particles that show various breakage patterns depending on the scenario leading to the failure and often only break locally instead of fracturing completely, some of these principles do not lead to realistic results. The reason for this is that in said cases, the methods in question, such as the Particle Replacement Method (PRM) or Voronoi Fracture, replace the initial particle (that is intended to break) into several sub-particles when certain breakage criteria are reached, such as exceeding the fracture energy. That is why those methods are commonly used for the simulation of materials that fracture completely instead of breaking locally. That being the case, when simulating local failure, it is advisable to pre-build the initial particle from sub-particles that are bonded together. The dimensions of these sub-particles  consequently define the minimum size of the fracture results. This structure of bonded sub-particles enables the initial particle to break at the location of the highest local loads – due to the failure of the bonds in those areas – with several sub-particle clusters being the result of the fracture, which can again also break locally. In this project, different methods for the generation and calibration of complex-shaped particle conglomerates using bonded particle modeling (BPM) to enable the ability to depict more realistic fracture behavior were evaluated based on the example of filter cake. The method that proved suitable for this purpose and which furthermore  allows efficient and realistic simulation of breakage behavior of complex-shaped particles applicable to industrial-sized simulations is presented in this paper.

Keywords: Bonded particle model (BPM), DEM, filter cake, particle breakage, particle fracture.

5856 Performance Evaluation of GPS/INS Main Integration Approach

Authors: Othman Maklouf, Ahmed Adwaib

Abstract:

This paper introduces a comparative study of the main GPS/INS coupling schemes, including the loosely coupled and tightly coupled configurations, under several types of situations and operational conditions, in which the data fusion process is performed using Kalman filtering. The study also covers the importance of sensor calibration as well as the alignment of the strapdown inertial navigation system. The limitations of inertial navigation systems are investigated.

Keywords: GPS, INS, Kalman Filter.

5855 A Study on the Mobile Web Generating using Element of User Experience

Authors: Heeae Ko, Jongkeun Kim, Kunjung Sim, Kunho Sim, Yonghwan Lim

Abstract:

As the number of mobile service subscribers increases, mobile content services are becoming more and more varied. Therefore, mobile content development needs not only content design but also guidelines specific to mobile. When mobile content is developed, it is important to work within the limits and restrictions of mobile devices: small browser and screen size, limited download size, and uncomfortable navigation. Guidelines are therefore presented for each type of mobile content, aiming at user usability, ease of development and consistency of rules. This paper proposes a methodology consisting of such guidelines for each type of mobile content; a mobile web is then developed according to the proposed guidelines.

Keywords: Guideline, interface, mobile, mobile computing, user experience.

5854 SMaTTS: Standard Malay Text to Speech System

Authors: Othman O. Khalifa, Zakiah Hanim Ahmad, Teddy Surya Gunawan

Abstract:

This paper presents a rule-based text-to-speech (TTS) synthesis system for Standard Malay (SM), namely SMaTTS. The proposed system uses a sinusoidal method and some pre-recorded wave files to generate speech. The use of a phone database significantly decreases the amount of computer memory used, making the system very light and embeddable. The overall system comprises two phases. The first is Natural Language Processing (NLP), consisting of the high-level processing of text analysis, phonetic analysis, text normalization and a morphophonemic module; this module was designed specially for SM to overcome a few problems in defining the rules of the SM orthography system before the text is passed to the DSP module. The second phase is Digital Signal Processing (DSP), which operates on the low-level process of speech waveform generation. An intelligible and adequately natural-sounding formant-based speech synthesis system with a light and user-friendly Graphical User Interface (GUI) is introduced. A Standard Malay phoneme set and an inclusive phone database have been constructed carefully for this phone-based speech synthesizer. By applying generative phonology, comprehensive letter-to-sound (LTS) rules and a pronunciation lexicon have been developed for SMaTTS. For the evaluation tests, a Diagnostic Rhyme Test (DRT) word list was compiled and several experiments were performed to evaluate the quality of the synthesized speech by analyzing the Mean Opinion Score (MOS) obtained. The overall performance of the system, as well as the room for improvement, is thoroughly discussed.

Keywords: Natural Language Processing, Text-To-Speech (TTS), Diphone, source filter, low-/high-level synthesis.

5853 Evaluation of SSR Markers Associated with High Oleic Acid in Sunflower

Authors: Atitaya Singchai, Nooduan Muangsan, Thitiporn Machikowa

Abstract:

Sunflower oil with high oleic acid content is most desirable because of its high oxidative stability. Screening sunflowers for high oleic acid using conventional methods is laborious and time-consuming; therefore, the use of molecular markers as a screening tool is promising. The objective of this research was to evaluate SSR primers for high oleic acid content in sunflower. Two sunflower lines, 5A and PI 649855, were used as representatives of low and high oleic acid sunflowers, respectively, and thirty-seven SSR markers were used to identify the oleic acid content trait. The results revealed that 10 SSR primers were polymorphic between the high and low oleic acid lines and were thus informative. With these primers, therefore, it is possible to identify genetic markers associated with the high oleic acid trait in sunflower genotypes.

Keywords: Microsatellite, Helianthus annuus L., fatty acid composition, molecular markers.

5852 The Use of Minor Setups in an EPQ Model with Constrained Production Period Length

Authors: Behrouz Afshar Nadjafi

Abstract:

Extensive research has been devoted to the economic production quantity (EPQ) problem. However, no attention has been paid to problems where the production period length is constrained. In this paper, we address the problem of deciding the optimal production quantity and the number of minor setups within each cycle, where the production period length is constrained but a minor setup makes it possible to satisfy the constraint. A mathematical model is developed, and Iterated Local Search (ILS) is proposed to solve the problem. Finally, the solution procedure is illustrated with a numerical example and the results are analyzed.
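
For orientation, the classical unconstrained EPQ baseline that the paper extends can be computed directly; the values below are hypothetical, and the minor-setup model itself is not reproduced:

```python
import math

# Classical (unconstrained) EPQ: Q* = sqrt(2*D*K / (h*(1 - d/p))), where D is annual
# demand, K the setup cost, h the holding cost, d the demand rate and p the production rate.
D, K, h = 12_000.0, 400.0, 2.5      # hypothetical demand, setup cost, holding cost
d, p = 48.0, 120.0                  # hypothetical daily demand and production rates

Q_star = math.sqrt(2 * D * K / (h * (1 - d / p)))
production_period = Q_star / p      # days the machine runs per cycle; the paper studies
                                    # what happens when this length exceeds a limit
print(round(Q_star, 1), round(production_period, 1))
```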

Keywords: EPQ, Inventory control, minor setup, ILS.

5851 Organization of the Purchasing Function for Innovation

Authors: Jasna Prester, Ivana Rašić Bakarić, Božidar Matijević

Abstract:

Innovations not only contribute to the competitiveness of the company but also have positive effects on revenues; on average, product innovations account for 14 percent of companies' sales. Innovation management has changed substantially during the last decade because of the growing reliance on external partners. As a consequence, a new task for purchasing arises, as firms need to understand which suppliers actually have high potential to contribute to the innovativeness of the firm and which do not. Proper organization of the purchasing function is important, since most manufacturing companies deal with substantial material costs that pass through the purchasing function. In the past, the purchasing function was largely seen as a transaction-oriented, clerical function, but today purchasing is the interface with supply chain partners contributing to innovations, be they product or process innovations. Therefore, the purchasing function has to be organized differently to enable the firm's innovation potential. However, innovations are inherently risky. There are behavioral risks (that one partner will take advantage of the other party), technological risks related to the complexity of products, manufacturing processes and incoming materials, and finally market risks, which ultimately determine the value of the innovation. These risks are investigated in this work; technological risks, which concern the complexity of products and processes, are investigated more thoroughly. Buying components or such cutting-edge technologies necessitates careful investigation of technical features and is therefore usually conducted by a team of experts. It is therefore hypothesized that the higher the technological risk, the higher the centralization of the purchasing function as an interface with other supply chain members. The main contribution of this research lies in the fact that the analysis was performed on a large data set of 1,493 companies from 25 countries, collected in the GMRG 4 survey. Most analyses of the purchasing function are based on case studies of innovative firms; this study therefore contributes empirical evaluations that can be generalized.

Keywords: Purchasing function organization, innovation, technological risk, GMRG 4 survey.

5850 Greywater Treatment Using Activated Biochar Produced from Agricultural Waste

Authors: Pascal Mwenge, Tumisang Seodigeng

Abstract:

The increase in urbanisation in South Africa has led to an increase in water demand and a decline in freshwater supply. Despite this, poor water usage is still a major challenge in South Africa; for instance, freshwater is still used for non-drinking applications. The freshwater shortage can be alleviated by using other sources of water for non-potable purposes, such as greywater treated with activated biochar produced from agricultural waste. The successful use of activated biochar produced from agricultural waste to treat greywater can be both economically and environmentally beneficial, and greywater treated in this way is considered a cost-effective wastewater treatment. This work was aimed at determining the ability of activated biochar to remove Total Suspended Solids (TSS), Ammonium (NH4-N), Nitrate (NO3-N), and Chemical Oxygen Demand (COD) from greywater. The experiments were carried out in 800 ml laboratory plastic cylinders used as filter columns. A 2.5 cm layer of gravel was used at the bottom and top of each column to sandwich the activated biochar material. Activated biochar (200 g and 400 g) was loaded in a column and used as a filter medium for greywater. Samples were collected after a week and sent for analysis. Four types of greywater were treated: kitchen, floor cleaning, shower and laundry water. The findings showed 95% removal of TSS, 76% of NO3-N and 63% of COD from kitchen greywater, and 85% removal of NH4-N from bathroom greywater, as the highest removal efficiencies of the studied pollutants. The results show that activated biochar produced from agricultural waste removes a substantial amount of pollutants from greywater and indicate its suitability for treating greywater for onsite non-potable reuse.

Keywords: Activated biochar produced from agriculture waste, ammonium (NH4-N), chemical oxygen demand (COD), greywater, nitrate (NO3-N), total suspended solids (TSS).

5849 An Electrically Modulatable Silicon Waveguide Grating Using an Implantation Technology

Authors: Qing Fang, Lianxi Jia, JunFeng Song, Xiaoguang Tu, Mingbin Yu, Andy Eu-jin Lim, Guo Qiang Lo

Abstract:

The first pn-type carrier-induced silicon Bragg-grating filter is demonstrated. The extinction-ratio modulations are 11.5 dB and 10 dB under reverse and forward bias, respectively. An 8-Gbps data rate is achieved with a reverse bias.

Keywords: Silicon photonics, Waveguide grating, Carrier-induced, Extinction-ratio modulation.
