Search results for: Transportation problem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3923

1013 Material Handling Equipment Selection using Hybrid Monte Carlo Simulation and Analytic Hierarchy Process

Authors: Amer M. Momani, Abdulaziz A. Ahmed

Abstract:

Equipment selection in materials handling is a complicated task because of the many feasible alternatives and conflicting objectives. This paper presents the use of Monte Carlo (MC) simulation combined with the Analytic Hierarchy Process (AHP) to evaluate and select the most appropriate Material Handling Equipment (MHE). The proposed hybrid model was built on the basis of the material handling equation to identify the main criteria and sub-criteria critical to MHE selection. The criteria describe the properties of the material to be moved, the characteristics of the move, and the means by which the materials will be moved. Using MC simulation alongside the AHP is powerful because it allows the decision maker to represent his/her possible preference judgments as random variables. This reduces the uncertainty of the single-point judgments used in conventional AHP and provides more confidence in the results of the decision problem. A small pharmaceutical business is used as an example to illustrate the development and application of the proposed model.
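
As a rough illustration of the hybrid idea (not the authors' exact model), the sketch below samples AHP pairwise judgments from triangular distributions and propagates them through the row geometric-mean weighting step; the criteria names and judgment ranges are hypothetical.

```python
# Monte Carlo sampling of AHP pairwise judgments (illustrative sketch only).
# Criteria names and judgment ranges are hypothetical, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
criteria = ["material properties", "move characteristics", "move method"]

# Triangular (low, mode, high) ranges for the upper-triangle judgments a12, a13, a23.
judgment_ranges = {(0, 1): (1, 2, 3), (0, 2): (2, 3, 5), (1, 2): (1, 2, 4)}

def sample_matrix():
    """Build one reciprocal pairwise comparison matrix from random judgments."""
    n = len(criteria)
    a = np.ones((n, n))
    for (i, j), (lo, mode, hi) in judgment_ranges.items():
        a[i, j] = rng.triangular(lo, mode, hi)
        a[j, i] = 1.0 / a[i, j]
    return a

def ahp_weights(a):
    """Approximate priority vector via the row geometric mean."""
    gm = np.prod(a, axis=1) ** (1.0 / a.shape[0])
    return gm / gm.sum()

runs = np.array([ahp_weights(sample_matrix()) for _ in range(5000)])
for name, mean, std in zip(criteria, runs.mean(axis=0), runs.std(axis=0)):
    print(f"{name}: weight {mean:.3f} +/- {std:.3f}")
print("P(criterion 1 ranks first) =", np.mean(runs.argmax(axis=1) == 0))
```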

Keywords: Analytic Hierarchy Process (AHP), Material handling equipment selection, Monte Carlo simulation, Multi-criteria decision making

1012 Performance Analysis of Cluster Based Dual Tiered Network Model with INTK Security Scheme in a Wireless Sensor Network

Authors: D. Satish Kumar, S. Karthik

Abstract:

A dual tiered network model is designed to address the problems of energy awareness and fault tolerance. The model minimizes delay time and overcomes link failures. This paper studies the performance of the dual tiered network model, comparing the CA and LS schemes with the DEO optimal. We then evaluate the Integrated Network Topological Control and Key Management (INTK) scheme, which was proposed to add security features to wireless sensor networks. Clustering efficiency, level of protection, and time complexity are some of the INTK scheme parameters analyzed. Finally, we evaluate the Cluster based Energy Competent n-coverage scheme (CEC n-coverage scheme), which ensures area coverage for wireless sensor networks.

Keywords: CEC n-coverage scheme, Clustering efficiency, Dual tiered network, Wireless sensor networks.

1011 Transient Free Laminar Convection in the Vicinity of a Thermal Conductive Vertical Plate

Authors: Anna Bykalyuk, Frédéric Kuznik, Kévyn Johannes

Abstract:

In this paper the influence of a vertical plate's thermal capacity is numerically investigated in order to evaluate the evolution of the thermal boundary layer structure, the convective heat transfer coefficient, and the velocity and temperature profiles. The heat flux of the heated vertical plate is evaluated under time-dependent boundary conditions. The most important feature of this problem is the unsteadiness of the physical phenomena. A 2D CFD model is developed in the Ansys Fluent 14.0 environment and is validated using unsteady data obtained for plasterboard studied under a dynamic temperature evolution. All the phenomena produced in the vicinity of the thermally conductive vertical plate (plasterboard) are analyzed and discussed. This work is the first stage of a holistic research effort on transient free convection that aims, in the future, to study natural convection in the vicinity of a vertical plate containing Phase Change Materials (PCM).

Keywords: CFD modeling, natural convection, thermal conductive plate, time-dependent boundary conditions.

1010 Single Event Transient Tolerance Analysis in 8051 Microprocessor Using Scan Chain

Authors: Jun Sung Go, Jong Kang Park, Jong Tae Kim

Abstract:

As semiconductor manufacturing technology evolves, the single event transient problem becomes a more significant issue. Single event transients have a critical impact on both combinational and sequential logic circuits, so it is important to evaluate the soft error tolerance of a circuit at the design stage. In this paper, we present a soft error detection simulation using a scan chain. The simulation model generates a single event transient at a random location in the circuit and detects the resulting soft error during the execution of the test patterns. We verified this model by inserting a scan chain into an 8051 microprocessor implemented in 65 nm CMOS technology. While the test patterns generated by an ATPG program pass through the scan chain, we inject a single event transient and count the number of soft errors per sub-module. The experiments show that the soft error rate per cell area of the SFR module is 277% larger than that of the other modules.
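
A toy fault-injection loop in the spirit of the described simulation is sketched below. The real work operates on a synthesized 8051 netlist and its scan chain; the circuit model, module map and pattern set here are invented for illustration only.

```python
# Toy single-event-transient injection: flip one random flip-flop bit during a run,
# then compare the scanned-out state with a golden run to count soft errors.
# The circuit model and module map are invented for illustration only.
import random

MODULES = {"ALU": range(0, 8), "SFR": range(8, 16), "CTRL": range(16, 24)}

def step(state, pattern):
    """Stand-in for one clock cycle of combinational logic feeding the flip-flops."""
    return [(s ^ p) & 1 for s, p in zip(state, pattern)]

def run(patterns, inject_at=None, inject_bit=None):
    state = [0] * 24
    for cycle, pattern in enumerate(patterns):
        state = step(state, pattern)
        if cycle == inject_at:                 # transient upsets one node this cycle
            state[inject_bit] ^= 1
    return state                               # "scan out" the final flip-flop contents

random.seed(1)
patterns = [[random.getrandbits(1) for _ in range(24)] for _ in range(50)]
golden = run(patterns)

errors = {m: 0 for m in MODULES}
trials = 2000
for _ in range(trials):
    cyc, bit = random.randrange(len(patterns)), random.randrange(24)
    faulty = run(patterns, inject_at=cyc, inject_bit=bit)
    if faulty != golden:                       # soft error observed at scan-out
        for m, bits in MODULES.items():
            if bit in bits:
                errors[m] += 1
print({m: e / trials for m, e in errors.items()})
```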

Keywords: Scan chain, single event transient, soft error, 8051 processor.

1009 Solution of Optimal Reactive Power Flow using Biogeography-Based Optimization

Authors: Aniruddha Bhattacharya, Pranab Kumar Chattopadhyay

Abstract:

Optimal reactive power flow is an optimization problem with one or more objectives, such as minimizing the active power losses for a fixed generation schedule. The control variables are generator bus voltages, transformer tap settings and the reactive power output of the compensating devices placed on different bus bars. The Biogeography-Based Optimization (BBO) technique has been applied to solve different kinds of optimal reactive power flow problems subject to operational constraints such as the power balance constraint and line flow and bus voltage limits. BBO searches for the global optimum mainly through two steps: migration and mutation. In the present work, BBO has been applied to the optimal reactive power flow problem on the IEEE 30-bus and standard IEEE 57-bus power systems for minimization of active power loss. The superiority of the proposed method has been demonstrated. Considering the quality of the solutions obtained, the proposed method seems to be a promising one for solving these problems.
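
A minimal BBO loop (migration plus mutation) is sketched below on a toy quadratic objective with box limits, standing in for the active-power-loss evaluation that the paper obtains from a load-flow of the IEEE 30/57-bus systems; rates and parameters are illustrative.

```python
# Minimal Biogeography-Based Optimization loop (migration + mutation) on a toy
# quadratic standing in for the active-power-loss objective.
import numpy as np

rng = np.random.default_rng(0)
dim, pop, gens = 6, 20, 100
lower, upper = np.zeros(dim), np.ones(dim)          # box limits on control variables

def loss(x):                                         # placeholder objective
    return np.sum((x - 0.3) ** 2)

habitats = rng.uniform(lower, upper, size=(pop, dim))
for _ in range(gens):
    fitness = np.array([loss(h) for h in habitats])
    order = np.argsort(fitness)                      # best (lowest loss) first
    habitats = habitats[order]
    mu = np.linspace(1.0, 0.0, pop)                  # emigration rate, high for good habitats
    lam = 1.0 - mu                                   # immigration rate, high for poor habitats
    new = habitats.copy()
    for i in range(pop):
        for d in range(dim):
            if rng.random() < lam[i]:                # migration: import a feature (SIV)
                donor = rng.choice(pop, p=mu / mu.sum())
                new[i, d] = habitats[donor, d]
            if rng.random() < 0.02:                  # mutation
                new[i, d] = rng.uniform(lower[d], upper[d])
    new[0] = habitats[0]                             # simple elitism
    habitats = np.clip(new, lower, upper)
print("best loss:", min(loss(h) for h in habitats))
```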

Keywords: Active Power Loss, Biogeography-Based Optimization, Migration, Mutation, Optimal Reactive Power Flow.

1008 A Model for Collaborative COTS Software Acquisition (COSA)

Authors: Torsti Rantapuska, Sariseelia Sore

Abstract:

Acquiring commercial off-the-shelf (COTS) software applications is becoming routine in organizations. However, eliciting user requirements, finding candidate COTS products and making the decision is a complex task, especially for SMEs that do not have the time and knowledge needed to do it properly. The existing models intended to help decision makers were originally designed for professional use, so SMEs are obliged to rely on the software vendor's ability to solve the problem with the systems provided. In this paper, we develop a model for SMEs for the acquisition of COTS software products. A leading idea of the model is that an ICT investment is basically a change initiative and should therefore also be treated as a process of organizational learning. The model is designed with three objectives in mind: 1) business orientation, 2) agility, and 3) learning and knowledge management orientation. The model can be applied to ICT investments in SMEs that have a professional team leader with basic business and IT knowledge.

Keywords: COTS acquisition, ICT investment, organizational learning, ICT adoption.

1007 Crystalline Structure of Starch Based Nano Composites

Authors: Farid Amidi Fazli, Afshin Babazadeh, Farnaz Amidi Fazli

Abstract:

In contrast with the literal meaning of "nano", researchers have achieved major advances in this area, and every day more nanomaterials are being introduced to the market. After the long-term use of fossil-based plastics, the accumulation of their waste has become a serious environmental problem. At the same time, people pay more attention to safety and the living environment. Replacing common plastic packaging materials with degradable ones that break down faster into harmless components such as water and carbon dioxide is therefore attractive; these new materials are based on renewable and inexpensive sources such as starch and cellulose. However, their functional properties are not yet suitable for packaging. At this point, nanotechnology has an important role: incorporating nanomaterials into the polymer structure improves its mechanical and physical properties, and nanocrystalline cellulose (NCC) has this ability. This work employed a chemical method to produce NCC and a starch bionanocomposite containing NCC. The obtained materials were characterized by X-ray diffraction. The results showed that the applied method is suitable and applicable to NCC production.

Keywords: Biofilm, cellulose, nanocomposite, starch.

1006 Fuzzy Logic Controller Based Shunt Active Filter with Different MFs for Current Harmonics Elimination

Authors: Shreyash Sinai Kunde, Siddhang Tendulkar, Shiv Prakash Gupta, Gaurav Kumar, Suresh Mikkili

Abstract:

One of the major power quality concerns in modern times is the problem of current harmonics. Current harmonics are caused by the increase in non-linear loads, which are largely dominated by power electronics devices. Shunt active filtering is one of the best solutions for mitigating current harmonics. This paper describes a fuzzy logic controller (FLC) based three-phase shunt active filter designed to achieve low total harmonic distortion (THD) of the current and reactive power compensation. The performance of the fuzzy logic controller is analysed under both balanced and unbalanced sinusoidal source conditions. The controller also serves the purpose of keeping the DC capacitor voltage constant. The proposed shunt active filter uses a hysteresis current controller for the current control of the IGBT-based PWM inverter. Simulation of the model in MATLAB/Simulink gives satisfactory results.
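
A simplified single-phase hysteresis current control loop is sketched below (one inverter leg driving an R-L branch), showing how the output toggles whenever the current error leaves the band; all parameter values are invented and the full three-phase FLC scheme of the paper is not reproduced.

```python
# Hysteresis current control of one R-L phase: the inverter output toggles between
# +Vdc and -Vdc whenever the current error leaves the hysteresis band.
# Parameter values are illustrative only.
import math

Vdc, R, L = 400.0, 1.0, 5e-3          # DC link voltage, branch resistance/inductance
band, dt, f = 0.5, 1e-6, 50.0         # hysteresis band (A), time step (s), mains frequency
i, u = 0.0, +Vdc                      # branch current and applied inverter voltage

switchings = 0
for k in range(int(0.02 / dt)):       # simulate one fundamental period
    t = k * dt
    i_ref = 10.0 * math.sin(2 * math.pi * f * t)   # reference (compensating) current
    err = i_ref - i
    if err > band and u != +Vdc:      # current too low -> apply positive voltage
        u, switchings = +Vdc, switchings + 1
    elif err < -band and u != -Vdc:   # current too high -> apply negative voltage
        u, switchings = -Vdc, switchings + 1
    i += dt * (u - R * i) / L         # forward-Euler update of di/dt = (u - R*i)/L
print("switchings per fundamental period:", switchings)
```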

Keywords: Shunt active filter, Current harmonics, Fuzzy logic controller, Hysteresis current controller.

1005 Visual Analytics in K-12 Education - Emerging Dimensions of Complexity

Authors: Linnea Stenliden

Abstract:

The aim of this paper is to understand the learning conditions that emerge when visual analytics is implemented and used in K-12 education. To date, little attention has been paid to the role visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can process actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13 years, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of metaphors within Actor-network theory (ANT). The learning conditions are found to be distinguished by broad complexity, characterized by four dimensions that emerge from the actors' deeply intertwined relations in the activities. In relation to these dimensions, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.

Keywords: Analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation.

1004 Methodology of Islamic Economics: Scope and Prospects

Authors: Ahmad Abdulkadir Ibrahim

Abstract:

Observing the methodology that Islamic economics lays down for its methods and instruments of analysis, and even some of its basic assumptions, in the modern world is a matter of paramount importance. There is a need to examine the implications of different suggested definitions of Islamic economics, to explore its scope and to attempt to outline its methodology. This paper attempts to deal with the definition of Islamic economics, its methodology, and its scope. It outlines the main methodological problem by addressing the question of whether Islamic economics calls for a methodology of its own or is simply an expanded economics. It also aims to draw the attention of economists in the modern world to the methodology of Islamic economics and the obligation to consider it. The methodology adopted in this research is library research through the consultation of relevant literature, focusing on a thematic study of the subject matter, followed by an analysis and discussion of the contents of the materials used. It is concluded that there is a certain degree of inconsistency in the way assumptions that are perhaps alien to Islamic economics are incorporated. The paper also observes that there is a difference between Islamic economists and other (conventional) economists in the profession. An important conclusion is that Islamic economists need to rethink what economics is all about and whether we really have to create an alternative to economics in the form of Islamic economics or simply adopt an Islamic perspective on the same discipline.

Keywords: Islamic economics, conventional economics, Muslim economists, modern economy.

1003 Control Improvement of a C Sugar Cane Crystallization Using an Auto-Tuning PID Controller Based on Linearization of a Neural Network

Authors: S. Beyou, B. Grondin-Perez, M. Benne, C. Damour, J.-P. Chabriat

Abstract:

The industrial process of sugar cane crystallization produces a residue that still contains a large amount of soluble sucrose, and the objective of the factory is to improve its extraction. There are therefore substantial losses that justify the search for an optimization of the process. The crystallization process studied on the industrial site is based on the "three massecuites" process. The third step of this process constitutes the final stage of exhaustion of the sucrose dissolved in the mother liquor. Within this third crystallization step (C-crystallization), the phase that is studied and whose control is to be improved is the growing phase (crystal growth phase). The study of this process on the industrial site is a problem in its own right. A control scheme is proposed to improve the standard PID control law used in the factory: an auto-tuning PID controller based on the instantaneous linearization of a neural network.

Keywords: Auto-tuning, PID, Instantaneous linearization, Neural network, Non linear process, C-crystallisation.

1002 A New Graphical Password: Combination of Recall & Recognition Based Approach

Authors: Md. Asraful Haque, Babbar Imam

Abstract:

Information security is one of the defining problems of present times. To cope with information security, passwords were introduced. Alphanumeric passwords are the most popular authentication method and are still in use today. However, text-based passwords suffer from various drawbacks: they are easy to crack through dictionary attacks, brute-force attacks, keyloggers, social engineering, etc. Graphical passwords are a good replacement for text passwords. Psychological studies show that humans can remember pictures better than text, so graphical passwords are easy to remember. At the same time, for this very reason, most graphical passwords are prone to shoulder surfing. In this paper, we suggest a shoulder-surfing resistant graphical password authentication method. The system is a combination of recognition-based and pure recall-based techniques. The proposed scheme can be useful for smart handheld devices (such as smartphones, PDAs, iPods and iPhones), which are handier and more convenient to use than traditional desktop computer systems.

Keywords: Authentication, Graphical Password, Text Password, Information Security, Shoulder-surfing.

1001 Modified Naïve Bayes Based Prediction Modeling for Crop Yield Prediction

Authors: Kefaya Qaddoum

Abstract:

Most greenhouse growers desire a predictable amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness: it retains redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction uses an L1 penalty and is capable of clearing out redundant predictors, and a modification of the LARS algorithm is devised to solve the resulting problem, making the method applicable to a wide range of data. In the experimental section, a study is conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG data set of tomato yields, where there are many more predictors than data points and the pressing need to predict weekly yield is the goal of the approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN), and is shown to perform fairly well.
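
The sketch below illustrates the pipeline idea only: an L1-penalised linear model fit by LARS drops redundant predictors, and naive Bayes is then fit on the survivors. It uses scikit-learn stand-ins (LassoLars, GaussianNB) on synthetic data rather than the paper's modified-LARS naive Bayes and WSG data set.

```python
# Stand-in sketch: L1/LARS predictor selection followed by Gaussian naive Bayes.
# Not the paper's algorithm; data are synthetic with many more predictors than samples.
import numpy as np
from sklearn.linear_model import LassoLars
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p = 120, 200
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2] + 0.3 * rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

selector = LassoLars(alpha=0.05).fit(X_tr, y_tr)   # L1 path computed by LARS
kept = np.flatnonzero(selector.coef_)              # predictors with non-zero coefficients
print("kept predictors:", kept)

clf = GaussianNB().fit(X_tr[:, kept], y_tr)        # naive Bayes on the reduced predictor set
print("test accuracy:", clf.score(X_te[:, kept], y_te))
```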

Keywords: Tomato yield prediction, naive Bayes, redundancy

1000 Analysis of Web User Identification Methods

Authors: Renáta Iváncsy, Sándor Juhász

Abstract:

Web usage mining has become a popular research area, as a huge amount of data is available online. These data can be used for several purposes, such as web personalization, web structure enhancement, web navigation prediction, etc. However, the raw log files are not directly usable; they have to be preprocessed in order to transform them into a suitable format for different data mining tasks. One of the key issues in the preprocessing phase is to identify web users. Identifying users based on web log files is not a straightforward problem, thus various methods have been developed. There are several difficulties that have to be overcome, such as client-side caching and changing and shared IP addresses. This paper presents three different methods for identifying web users. Two of them are the most commonly used methods in web log mining systems, whereas the third one is our novel approach that uses a complex cookie-based method to identify web users. Furthermore, we also take steps towards identifying the individuals behind the impersonal web users. To demonstrate the efficiency of the new method we developed an implementation called the Web Activity Tracking (WAT) system, which aims at a more precise distinction of web users based on log data. We present some statistical analysis created by the WAT on real data about the behavior of Hungarian web users, together with a comprehensive analysis and comparison of the three methods.
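
A minimal sketch contrasting two commonly used heuristics (IP only, IP plus user agent) with cookie-based identification on toy log records; the field names and records are invented and real systems parse raw server log files instead.

```python
# Toy comparison of web user identification heuristics on synthetic log records.
from collections import defaultdict

log = [
    {"ip": "10.0.0.1", "agent": "Firefox", "cookie": "u1", "url": "/a"},
    {"ip": "10.0.0.1", "agent": "Firefox", "cookie": "u2", "url": "/b"},  # shared IP and agent
    {"ip": "10.0.0.1", "agent": "Chrome",  "cookie": "u3", "url": "/c"},
    {"ip": "10.0.0.9", "agent": "Firefox", "cookie": "u1", "url": "/d"},  # same user, new IP
]

def identify(records, key):
    users = defaultdict(list)
    for r in records:
        users[key(r)].append(r["url"])
    return dict(users)

by_ip       = identify(log, key=lambda r: r["ip"])
by_ip_agent = identify(log, key=lambda r: (r["ip"], r["agent"]))
by_cookie   = identify(log, key=lambda r: r["cookie"])

print("IP only         ->", len(by_ip), "users")        # merges distinct users behind one IP
print("IP + user agent ->", len(by_ip_agent), "users")  # better, but still merges u1 and u2
print("cookie based    ->", len(by_cookie), "users")    # also tracks u1 across IP changes
```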

Keywords: Data preparation, Tracking individuals, Web user identification, Web usage mining

999 A New Hybrid K-Mean-Quick Reduct Algorithm for Gene Selection

Authors: E. N. Sathishkumar, K. Thangavel, T. Chandrasekhar

Abstract:

Feature selection is a process for selecting the features that are more informative, and it is one of the important steps in knowledge discovery. The problem is that not all genes in gene expression data are important: some genes may be redundant, and others may be irrelevant and noisy. Here, a novel Hybrid K-Mean-Quick Reduct (KMQR) algorithm is proposed for gene selection from gene expression data. In this study, the entire dataset is divided into clusters by applying the K-Means algorithm, so that each cluster contains similar genes. The highly class-discriminative genes are then selected based on their degree of dependence by applying the Quick Reduct algorithm to each cluster. The Average Correlation Value (ACV) is calculated for the selected genes. Clusters with an ACV of 1 are determined to be significant clusters, whose classification accuracy is equal to or higher than the accuracy of the entire dataset. The proposed algorithm is evaluated using WEKA classifiers and compared, and the results show that it achieves high classification accuracy.
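
The sketch below shows only the clustering stage plus an ACV-style coherence score for a selected gene subset; a simple variance filter stands in for the rough-set Quick Reduct step, the data are synthetic, and the ACV here is taken as the mean absolute pairwise correlation of the subset.

```python
# K-means grouping of genes plus an ACV-style score (illustrative stand-in only).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
genes = rng.normal(size=(100, 30))            # 100 genes x 30 samples (synthetic)

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(genes)

def acv(subset):
    """Mean absolute pairwise correlation of the selected genes (1.0 = fully coherent)."""
    r = np.corrcoef(subset)
    off_diagonal = r[~np.eye(len(subset), dtype=bool)]
    return float(np.mean(np.abs(off_diagonal)))

for c in range(5):
    members = genes[labels == c]
    # Placeholder for Quick Reduct: keep the three most variable genes of the cluster.
    top = members[np.argsort(members.var(axis=1))[-3:]]
    print(f"cluster {c}: {len(members)} genes, ACV of selected subset = {acv(top):.2f}")
```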

Keywords: Clustering, Gene Selection, K-Mean-Quick Reduct, Rough Sets.

998 An Efficient 3D Animation Data Reduction Using Frame Removal

Authors: Jinsuk Yang, Choongjae Joo, Kyoungsu Oh

Abstract:

Existing methods in which the animation data of all frames are stored and reproduced, as with vertex animation, cannot be used in mobile device environments because they consume large amounts of memory. 3D animation data reduction methods aimed at solving this problem have therefore been studied extensively, and we propose a new method as follows. First, we find and remove the frames in which motion changes are small and store only the animation data of the remaining frames (those involving large motion changes). When playing the animation, the removed frames are reconstructed by interpolating the remaining frames. Our key contribution is to calculate the accelerations of the joints in individual frames and the standard deviations of these accelerations, using the joint locations of the relevant 3D model, in order to find and delete the frames in which motion changes are small. Our method can reduce data sizes by approximately 50% or more while providing quality that is not much lower than that of the original animations. Therefore, our method is expected to be useful in mobile device environments or other environments in which memory sizes are limited.
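
A minimal numerical sketch of the idea: compute per-frame joint accelerations from positions, drop frames whose acceleration spread is small, and rebuild them by linear interpolation between the kept key frames. The animation data and the 50th-percentile threshold are invented for illustration.

```python
# Frame removal by joint-acceleration spread, then reconstruction by interpolation.
import numpy as np

rng = np.random.default_rng(0)
frames, joints = 120, 15
t = np.linspace(0, 4 * np.pi, frames)
anim = np.sin(t)[:, None, None] * rng.normal(1.0, 0.1, size=(1, joints, 3))  # (frame, joint, xyz)

vel = np.diff(anim, axis=0)                       # finite-difference velocity
acc = np.diff(vel, axis=0)                        # finite-difference acceleration
spread = np.linalg.norm(acc, axis=2).std(axis=1)  # per-frame std of joint acceleration magnitudes

keep = np.zeros(frames, dtype=bool)
keep[[0, -1]] = True                              # always keep the end frames
keep[1:-1] = spread > np.percentile(spread, 50)   # drop frames with small motion change
print(f"kept {keep.sum()} of {frames} frames")

# Reconstruction: interpolate each joint coordinate over the removed frames.
kept_idx = np.flatnonzero(keep)
rebuilt = np.empty_like(anim)
for j in range(joints):
    for d in range(3):
        rebuilt[:, j, d] = np.interp(np.arange(frames), kept_idx, anim[kept_idx, j, d])
print("max reconstruction error:", float(np.abs(rebuilt - anim).max()))
```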

Keywords: Data Reduction, Interpolation, Vertex Animation, 3D Animation.

997 Single Image Defogging Method Using Variational Approach for Edge-Preserving Regularization

Authors: Wan-Hyun Cho, In-Seop Na, Seong-Chae Seo, Sang-Kyoon Kim, Soon-Young Park

Abstract:

In this paper, we propose a variational approach to the single image defogging problem. To infer the atmospheric veil, we define a new functional for the veil that satisfies an edge-preserving regularization property. Using the fundamental lemma of the calculus of variations, we derive the Euler-Lagrange equation for the atmospheric veil, which characterizes the extrema of the given functional. This equation is solved using a gradient descent method with a time parameter. Having obtained the estimated atmospheric veil, we then restore the image using the inferred veil. Finally, we improve the contrast of the restored image with various histogram equalization methods. The experimental results show that the proposed method achieves rather good defogging results.

Keywords: Image defogging, Image restoration, Atmospheric veil, Transmission, Variational approach, Euler-Lagrange equation, Image enhancement.

996 3D Network-on-Chip with on-Chip DRAM: An Empirical Analysis for Future Chip Multiprocessor

Authors: Thomas Canhao Xu, Bo Yang, Alexander Wei Yin, Pasi Liljeberg, Hannu Tenhunen

Abstract:

With the increasing number of on-chip components and the critical requirement for processing power, the Chip Multiprocessor (CMP) has gained wide acceptance in both academia and industry during the last decade. However, conventional bus-based on-chip communication schemes suffer from very high communication delay and low scalability in large scale systems. The Network-on-Chip (NoC) has been proposed to remove the bottleneck of parallel on-chip communication by applying different network topologies which separate the communication phase from the computation phase. Observing that the memory bandwidth of the communication between on-chip components and off-chip memory has become a critical problem even in NoC-based systems, in this paper we propose a novel 3D NoC with on-chip Dynamic Random Access Memory (DRAM) in which different layers are dedicated to different functionalities such as processors, cache or memory. Results show that, by using the proposed architecture, average link utilization is reduced by 10.25% for SPLASH-2 workloads, and the proposed design requires 1.12% fewer execution cycles than the traditional design on average.

Keywords: 3D integration, network-on-chip, memory-on-chip, DRAM, chip multiprocessor.

995 Prediction of Rubberised Concrete Strength by Using Artificial Neural Networks

Authors: A. M. N. El-Khoja, A. F. Ashour, J. Abdalhmid, X. Dai, A. Khan

Abstract:

In recent years, the waste tyre problem has been considered one of the most crucial environmental pollution problems facing the world. Reusing waste rubber crumb from recycled tyres to develop highly damping concrete is therefore technically feasible and a viable alternative to landfill or incineration. The utilization of waste rubber in concrete generally enhances the ductility, toughness, thermal insulation, and impact resistance. However, the mechanical properties decrease with the amount of rubber used in the concrete. The aim of this paper is to develop artificial neural network (ANN) models to predict the compressive strength of rubberised concrete (RuC). An ANN was trained and tested using a comprehensive database collected from different sources in the literature. The ANN model developed uses five input parameters: coarse aggregate (CA), fine aggregate (FA), w/c ratio, fine rubber (Fr), and coarse rubber (Cr), whereas the ANN outputs are the corresponding compressive strengths. A parametric study was also conducted to study the effect of the various RuC constituents on the compressive strength of RuC.
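
As a hedged illustration of the surrogate-model setup (not the paper's network or database), the sketch below trains a small multilayer perceptron on the five stated inputs using synthetic data whose trend (strength falling with w/c ratio and rubber content) is only indicative.

```python
# ANN surrogate with inputs CA, FA, w/c, Fr, Cr predicting compressive strength (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.uniform(900, 1200, n),    # coarse aggregate, kg/m3
    rng.uniform(500, 800, n),     # fine aggregate, kg/m3
    rng.uniform(0.35, 0.6, n),    # w/c ratio
    rng.uniform(0, 150, n),       # fine rubber, kg/m3
    rng.uniform(0, 150, n),       # coarse rubber, kg/m3
])
# Synthetic target: strength drops with w/c ratio and rubber content (illustrative trend only).
y = 80 - 60 * X[:, 2] - 0.08 * X[:, 3] - 0.06 * X[:, 4] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", round(model.score(X_te, y_te), 3))
```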

Keywords: Rubberized concrete, compressive strength, artificial neural network, prediction.

994 OCR for Script Identification of Hindi (Devnagari) Numerals using Feature Sub Selection by Means of End-Point with Neuro-Memetic Model

Authors: Banashree N. P., R. Vasanta

Abstract:

Recognition of Indian language scripts is a challenging problem. In Optical Character Recognition (OCR), a character or symbol to be recognized can be a machine-printed or handwritten character/numeral. There are several approaches that deal with the problem of recognizing numerals/characters, depending on the type of features extracted and the different ways of extracting them. This paper proposes a recognition scheme for handwritten Hindi (Devnagari) numerals, among the most widely used in the Indian subcontinent. Our work focuses on a global feature extraction technique using end-point information extracted from images of isolated numerals. These feature vectors are fed to a neuro-memetic model [18] that has been trained to recognize Hindi numerals. The prototype system has been tested on a variety of numeral images. In the proposed scheme, data sets are fed to the neuro-memetic algorithm, which identifies the rule with the highest fitness value (nearly 100%), and the template associated with this rule is the identified numeral. Experimental results show a recognition rate of 92-97%, compared with other models.
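
A toy sketch of the end-point feature idea: on a skeletonised binary numeral, a foreground pixel with exactly one foreground neighbour in its 8-neighbourhood is an end point. The tiny bitmap and the resulting feature vector layout are illustrative, not the paper's data or exact feature definition.

```python
# End-point extraction from a skeletonised numeral image (illustrative bitmap).
import numpy as np

skeleton = np.array([          # a crude vertical stroke
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
], dtype=int)

def end_points(img):
    pts = []
    padded = np.pad(img, 1)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            if img[r, c]:
                neighbours = padded[r:r + 3, c:c + 3].sum() - 1   # 8-neighbourhood count
                if neighbours == 1:
                    pts.append((r, c))
    return pts

pts = end_points(skeleton)
print("end points:", pts)      # expect the top and bottom of the stroke
# A simple global feature vector: the count and normalised coordinates of the end points.
features = [len(pts)] + [coord / dim for p in pts for coord, dim in zip(p, skeleton.shape)]
print("feature vector:", features)
```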

Keywords: OCR, Global Feature, End-Points, Neuro-Memetic model.

993 Evaluation of Model Evaluation Criterion for Software Development Effort Estimation

Authors: S. K. Pillai, M. K. Jeyakumar

Abstract:

Estimation of model parameters is necessary to predict the behavior of a system. Model parameters are estimated using optimization criteria, and most algorithms use historical data to estimate them: the known target (actual) values are compared with the output produced by the model, and the differences between the two form the basis for estimating the parameters. In order to compare different models developed from the same data, different criteria are used. Data obtained from short-scale projects are used here. We consider the software effort estimation problem using a radial basis function network. The accuracy comparison is made using various existing criteria for one and two predictors. We then propose a new criterion based on linear least squares for evaluation and compare the results for one and two predictors. We have also considered another data set and evaluated prediction accuracy using the new criterion. The new criterion is easier to comprehend than a single statistic. Although software effort estimation is considered here, the method is applicable to any modeling and prediction task.
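
For orientation, the sketch below computes two common effort-estimation accuracy criteria (MMRE and PRED(25)) and a least-squares-style check that regresses actuals on predictions; the paper's exact proposed criterion is not reproduced and the numbers are toy data.

```python
# Common accuracy criteria plus a least-squares view of predicted vs actual effort (toy data).
import numpy as np

actual    = np.array([12.0, 30.0, 55.0, 80.0, 140.0])   # person-months (illustrative)
predicted = np.array([15.0, 28.0, 60.0, 70.0, 150.0])

mre = np.abs(actual - predicted) / actual
print("MMRE    :", round(mre.mean(), 3))                 # mean magnitude of relative error
print("PRED(25):", round(np.mean(mre <= 0.25), 3))       # fraction of estimates within 25%

# Least-squares view: fit actual ~ a * predicted + b; a near 1 and b near 0 indicate an
# unbiased model, and the residual RMS summarises accuracy in one number.
a, b = np.polyfit(predicted, actual, 1)
residual = actual - (a * predicted + b)
print("slope a :", round(a, 3), " intercept b:", round(b, 3))
print("residual RMS:", round(float(np.sqrt(np.mean(residual ** 2))), 3))
```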

Keywords: Software effort estimation, accuracy, Radial Basis Function, linear least squares.

992 Thailand and Procession of Trafficking Human Beings (Women and Children)

Authors: Kawinphat Lertpongmanee

Abstract:

The problem of trafficking in human beings remains persistently severe in Thailand. It arises from a variety of factors such as unemployment, the urban migration of agricultural workers, sex tourism, a materialistic attitude in society, divorced families, poorly enforced laws, and officials' ignorance. The purposes of this study were to examine the structure, connections, and number of trafficked human beings in Thailand. Qualitative and quantitative methods, together with the results of previous research, were used. Former procurers, interested persons, experienced people, humanitarian aid organizations, and women's and children's rights organizations were interviewed in depth, and fieldwork was conducted in a variety of regions. The findings show the structure and connections of trafficking in human beings, with a value of $8,750 million and some 240,000 people trafficked. The trend of trafficking in human beings grows continuously and changes according to economic circumstances, society and culture, and the law. The state must be aware of the problem, and the law should impose heavier penalties so as to act as a serious deterrent.

Keywords: Human Trade, Prostitution trafficking, trafficking in women and children.

991 A Black-box Approach for Response Quality Evaluation of Conversational Agent Systems

Authors: Ong Sing Goh, C. Ardil, Wilson Wong, Chun Che Fung

Abstract:

The evaluation of conversational agents, or chatterbot question answering systems, is a major research area that needs much attention. Before the rise of domain-oriented conversational agents based on natural language understanding and reasoning, evaluation was never a problem, as information retrieval-based metrics were readily available for use. However, when chatterbots began to become more domain-specific, evaluation became a real issue. This is especially true when understanding and reasoning are required to cater for a wider variety of questions and, at the same time, to achieve high-quality responses. This paper discusses the inappropriateness of the existing measures for response quality evaluation, and the call for new standard measures and related considerations is brought forward. As a short-term solution for evaluating the response quality of conversational agents, and to demonstrate the challenges in evaluating systems of different natures, this research proposes a black-box approach using observation, a classification scheme and a scoring mechanism to assess and rank three example systems: AnswerBus, START and AINI.

Keywords: Evaluation, conversational agents, Response Quality, chatterbots

990 Optimal Design of Multimachine Power System Stabilizers Using Improved Multi-Objective Particle Swarm Optimization Algorithm

Authors: Badr M. Alshammari, T. Guesmi

Abstract:

In this paper, the concept of non-dominated sorting multi-objective particle swarm optimization with local search (NSPSO-LS) is presented for the optimal design of multimachine power system stabilizers (PSSs). The controller design is formulated as an optimization problem in order to shift the system electromechanical modes into a pre-specified region of the s-plane. A composite set of objective functions comprising the damping factor and the damping ratio of the undamped and lightly damped electromechanical modes is considered. The performance of the proposed optimization algorithm is verified on the 3-machine, 9-bus system. Simulation results based on eigenvalue analysis and nonlinear time-domain simulation show the potential and superiority of the NSPSO-LS algorithm in tuning PSSs over a wide range of loading conditions and large disturbances compared to the classic PSO technique and genetic algorithms.
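
A small worked sketch of the kind of eigenvalue-based composite objective typically minimised in PSS tuning: modes whose real part or damping ratio falls outside a pre-specified region of the s-plane are penalised. The mode list, thresholds and weighting are illustrative, not the paper's exact formulation.

```python
# Composite damping-factor / damping-ratio objective for example electromechanical modes.
import numpy as np

modes = np.array([-0.20 + 6.0j, -0.50 + 8.5j, -1.50 + 7.0j])   # example eigenvalues
sigma0, zeta0 = -1.0, 0.10                                      # desired region in the s-plane

sigma = modes.real
zeta = -modes.real / np.abs(modes)                              # damping ratio of each mode

J1 = np.sum((sigma0 - sigma[sigma > sigma0]) ** 2)              # push real parts left of sigma0
J2 = np.sum((zeta0 - zeta[zeta < zeta0]) ** 2)                  # raise low damping ratios
J = J1 + 10.0 * J2                                              # weighted composite objective
print("damping ratios:", np.round(zeta, 3))
print("composite objective J =", round(float(J), 4))
```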

Keywords: Multi-objective optimization, particle swarm optimization, power system stabilizer, low frequency oscillations.

989 N-Grams: A Tool for Repairing Word Order Errors in Ill-formed Texts

Authors: Theologos Athanaselis, Stelios Bakamidis, Ioannis Dologlou, Konstantinos Mamouras

Abstract:

This paper presents an approach for repairing word order errors in English text by reordering the words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. One possible way of reordering the words is to use all permutations; the problem is that for a sentence of length N words the number of permutations is N!. The novelty of this method lies in the use of an efficient confusion matrix technique for reordering the words. The confusion matrix technique has been designed to reduce the search space among permuted sentences, and this limitation of the search space is achieved using the statistical inference of N-grams. The results of this technique are very interesting and show that the number of permuted sentences can be reduced by 98.16%. For experimental purposes a test set of TOEFL sentences was used, and the results show that more than 95% of the sentences can be repaired using the proposed method.
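
The core scoring step can be sketched as below: candidate word orders are ranked by the number of trigram hits in a (here, toy) language model. The paper additionally prunes the permutation space with its confusion-matrix technique rather than enumerating all N! orders as this short example does.

```python
# Rank candidate word orders by trigram hits against a toy language model.
from itertools import permutations

# Toy trigram "model": a set of trigrams assumed to have been seen in training text.
trigrams = {("the", "cat", "sat"), ("cat", "sat", "on"), ("sat", "on", "the"),
            ("on", "the", "mat")}

def trigram_hits(words):
    return sum((words[i], words[i + 1], words[i + 2]) in trigrams
               for i in range(len(words) - 2))

ill_formed = ["sat", "the", "cat", "on", "the", "mat"]       # sentence with word-order error
best = max(permutations(ill_formed), key=trigram_hits)
print("repaired:", " ".join(best), "| trigram hits:", trigram_hits(best))
```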

Keywords: Permutations filtering, Statistical language model N-grams, Word order errors, TOEFL

988 Structural Modelling of the LiCl Aqueous Solution: Using the Hybrid Reverse Monte Carlo (HRMC) Simulation

Authors: M. Habchi, S.M. Mesli, M. Kotbi

Abstract:

Reverse Monte Carlo (RMC) simulation is applied to the study of the aqueous electrolyte LiCl·6H2O. On the basis of the available experimental neutron scattering data, RMC computes pair radial distribution functions in order to explore the structural features of the system. The obtained results include some unrealistic features. To overcome this problem, we use Hybrid Reverse Monte Carlo (HRMC), which incorporates an energy constraint in addition to the commonly used constraints derived from experimental data. Our results show good agreement between the experimental and computed partial distribution functions (PDFs), as well as a significant improvement in the pair partial distribution curves. This kind of study can be considered a useful test of a given interaction model for conventional simulation techniques.
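
As a hedged sketch of how such a hybrid scheme can be organised (not the authors' exact acceptance rule), a trial particle move may be accepted with a Metropolis-like probability combining the fit-to-data cost (a chi-squared between computed and experimental g(r)) and an energy penalty; the weighting, temperature and example numbers below are placeholders.

```python
# Sketch of a hybrid (fit + energy) Metropolis-style acceptance rule.
import math, random

def chi2(g_model, g_exp, sigma=0.01):
    """Chi-squared misfit between computed and experimental pair distribution values."""
    return sum((m - e) ** 2 for m, e in zip(g_model, g_exp)) / sigma ** 2

def accept(chi2_old, chi2_new, e_old, e_new, kT=0.025, w=1.0):
    """Accept if the combined cost decreases, otherwise with a Boltzmann-like probability."""
    delta = 0.5 * (chi2_new - chi2_old) + w * (e_new - e_old) / kT
    return delta <= 0 or random.random() < math.exp(-delta)

# Illustrative call: a move that slightly worsens the fit but lowers the energy.
print(accept(chi2_old=105.0, chi2_new=106.0, e_old=-3.10, e_new=-3.12))
```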

Keywords: RMC simulation, HRMC simulation, energy constraint, screened potential, glassy state, liquid state, partial distribution function, pair partial distribution function.

987 Effect of Different Treatments on Heavy Metal Concentration in Sugar Cane Molasses

Authors: Gomaa N. Abdel-Rahman, Nadia R. A. Nassar, Yehia A. Heikal, Mahmoud A. M. Abou-Donia, Mohamed M. Naguib, Mohamed Fadel

Abstract:

Cane molasses is used as a raw material for the production of baker's yeast (Saccharomyces cerevisiae) in Egypt. High levels of heavy metals in molasses cause a critical problem during fermentation and lead to various kinds of technological difficulties (the yield and quality of the yeast become lower). The aim of the present study was to determine the concentrations of heavy metals (cadmium, nickel, lead, and copper) in crude and treated molasses obtained from the storage tanks of a baker's yeast factory across four seasons. In addition, the effect of treating crude molasses by different methods (at laboratory scale) on heavy metal reduction was assessed and compared with the factory-treated molasses. The molasses samples obtained in the autumn season had the highest values of all the studied heavy metals, and the molasses treated with cation exchange resin followed by sulfuric acid had the lowest concentrations of heavy metals compared with the other treatments.

Keywords: Molasses, baker’s yeast, heavy metals, treatment.

986 Aircraft Selection Problem Using Decision Uncertainty Distance in Fuzzy Multiple Criteria Decision Making Analysis

Authors: C. Ardil

Abstract:

Aircraft have different capabilities and specifications according to the required strategic goals and objectives in operations. With various types on the market, each with different characteristics, it becomes difficult to select a suitable aircraft for particular operations and requirements. The entropy weighting method (EWM) is a useful, highly consistent, and reliable method for obtaining the weights of the criteria, and it is worth integrating with the decision uncertainty distance (DUD) method, which is more applicable and requires less computation than other methods. An illustrative example is presented to demonstrate the validity and usability of the proposed methodology. The ranking results match those of the distance-based technique for order preference by similarity to ideal solution (TOPSIS), which shows the robustness of the entropy-DUD hybrid method. Validity analysis shows that the proposed hybrid multiple criteria decision-making analysis (MCDMA) methodology is quantitatively stable and reliable.
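
The sketch below works through the entropy weighting step and a TOPSIS ranking on a toy decision matrix (rows are candidate aircraft, columns are benefit criteria); the DUD method itself is not reproduced and the criteria values are invented.

```python
# Entropy weighting method (EWM) followed by a TOPSIS ranking on a toy decision matrix.
import numpy as np

X = np.array([[7.0, 9.0, 6.0],      # aircraft A: e.g. range, payload, reliability scores
              [8.0, 6.0, 7.0],      # aircraft B
              [6.0, 8.0, 9.0]])     # aircraft C

# Entropy weights: criteria whose values differ more across alternatives get more weight.
P = X / X.sum(axis=0)
E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1 - E) / (1 - E).sum()
print("entropy weights:", np.round(w, 3))

# TOPSIS on the weighted, vector-normalised matrix (all criteria treated as benefits).
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("closeness scores:", np.round(closeness, 3), "-> best:", "ABC"[int(np.argmax(closeness))])
```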

Keywords: aircraft selection, decision uncertainty distance (DUD), multiple criteria decision making analysis, MCDMA, TOPSIS

985 An Algorithm for Secure Visible Logo Embedding and Removing in Compression Domain

Authors: Hongyuan Li, Guang Liu, Yuewei Dai, Zhiquan Wang

Abstract:

Digital watermarking is the process of embedding information into a digital signal and can be used in DRM (digital rights management) systems. A visible watermark (often called a logo), frequently seen in TV programs, indicates the owner of the copyright and protects the copyright in an active way. However, most schemes do not consider the visible watermark removal process. To solve this problem, a visible watermarking scheme with embedding and removing processes is proposed under the control of a secure template. The template generates different versions of the watermark that look visually identical to different users. Users with the right key can completely remove the watermark and recover the original image, while unauthorized users are prevented from removing the watermark. Experimental results show that our watermarking algorithm obtains good visual quality and is hard for illegitimate users to remove, while authorized users can completely remove the visible watermark and recover the original image with good quality.
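
A toy pixel-domain version of a key-controlled visible watermark is sketched below: the logo is alpha-blended with a pseudo-random, key-derived strength map, so only a holder of the key can invert the blend exactly. This illustrates the embed/remove idea only, not the paper's compressed-domain (JPEG) scheme; all sizes and values are invented.

```python
# Key-controlled visible watermark: embed by alpha blending, remove by inverting with the key.
import numpy as np

secret_key = 1234
host = np.random.default_rng(0).uniform(0, 1, (8, 8))
logo = np.zeros((8, 8)); logo[2:6, 2:6] = 1.0       # a simple square "logo"

def alpha_map(key, shape):
    return np.random.default_rng(key).uniform(0.3, 0.7, shape)   # key-derived blend strengths

def embed(img, logo, key):
    a = alpha_map(key, img.shape)
    return (1 - a * logo) * img + a * logo * 0.9    # blend logo (grey value 0.9) into the image

def remove(marked, logo, key):
    a = alpha_map(key, marked.shape)
    return (marked - a * logo * 0.9) / (1 - a * logo)

marked = embed(host, logo, secret_key)
restored_ok  = remove(marked, logo, secret_key)      # authorised user: exact recovery
restored_bad = remove(marked, logo, secret_key + 1)  # wrong key: residual artefacts remain
print("error with correct key:", float(np.abs(restored_ok - host).max()))
print("error with wrong key  :", float(np.abs(restored_bad - host).max()))
```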

Keywords: digital watermarking, visible and removable watermark, secure template, JPEG compression

984 Data Rate Based Grouping Scheme for Cooperative Communications in Wireless LANs

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards provide multiple transmission rates, which can be changed dynamically according to the channel condition. Cooperative communications were introduced to improve the overall performance of wireless LANs with the help of relay nodes that have higher transmission rates. Cooperative communication is based on the fact that transmission is much faster when data packets are sent to a destination node through a relay node with a higher transmission rate, rather than directly to the destination node at a low transmission rate. To apply cooperative communications in wireless LANs, several MAC protocols have been proposed; some of them can result in collisions among relay nodes in a dense network. In order to solve this problem, we propose a new protocol in which relay nodes are grouped based on their transmission rates and only relay nodes in the highest group try to get channel access. Performance evaluation is conducted using simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and collision probability.
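
A quick back-of-the-envelope check of the fact the protocol relies on, relaying through a fast node beating a slow direct link, is sketched below; MAC and protocol overheads are ignored and the rates are illustrative 802.11b-style values.

```python
# Transmission time of an L-bit packet: slow direct link vs two fast relay hops.
L = 1500 * 8                      # packet size in bits
direct_rate = 1e6                 # slow direct link: 1 Mb/s
relay_rates = (11e6, 11e6)        # source->relay and relay->destination: 11 Mb/s each

t_direct = L / direct_rate
t_relay = sum(L / r for r in relay_rates)
print(f"direct: {t_direct*1e3:.2f} ms, via relay: {t_relay*1e3:.2f} ms")
# Roughly 12 ms vs 2.2 ms here, which is why nodes prefer relays in the highest-rate group.
```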

Keywords: Cooperative communications, MAC protocol, relay node, WLAN.
