175 New Technologies for Modeling of Gas Turbine Cooled Blades
Authors: A. Pashayev, D. Askerov, R.Sadiqov, A. Samedov, C. Ardil
Abstract:
In contrast to existing methods, which do not take multiconnectivity in the broad sense of the term into account, we develop mathematical models and highly effective combined (BIEM and FDM) numerical methods for calculating the stationary and quasi-stationary temperature field of the profile part of a blade with convective cooling, with a view to practical realization on a PC. The theoretical substantiation of these methods is given by appropriate theorems. To this end, converging quadrature processes have been developed and error estimates have been obtained in terms of A. Zygmund continuity moduli. For visualization of profiles, the least-squares method with automatic conjugation, splines, smooth replenishment and neural nets are used. Boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and from empirical relationships. The reliability of the designed methods is confirmed by computational and experimental investigations of the heat and hydraulic characteristics of a gas turbine first-stage nozzle blade.
Keywords: multiconnected systems, method of boundary integral equations, splines, neural networks.
174 Numerical Solution of Linear Ordinary Differential Equations in Quantum Chemistry by Clenshaw Method
Authors: M. Saravi, F. Ashrafi, S.R. Mirrajei
Abstract:
As we know, most differential equations describing physical phenomena cannot be solved analytically. Even when a series method applies, we sometimes need an appropriate change of variable, and even when a closed-form solution exists it may be so complicated that using it to obtain an image or to examine the structure of the system is impossible. For example, if we consider the Schrödinger equation, we arrive at a three-term recursion relation, and working with it takes at least some time to obtain a series solution [6]. For this reason we use a suitable change of variable, or, when we consider the orbital angular momentum [1], a further equation has to be solved; as we can observe, working with this equation is tedious. In this paper, after introducing the Clenshaw method, which is a kind of spectral method, we try to solve some such equations.
Keywords: Chebyshev polynomials, Clenshaw method, ODEs, Spectral methods
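As a pointer to what the Clenshaw recurrence looks like in practice, the following minimal Python sketch (an illustration, not the authors' code) evaluates a truncated Chebyshev series sum_k c_k T_k(x) without constructing the polynomials explicitly; the comparison with numpy.polynomial.chebyshev is only a sanity check.

```python
import numpy as np

def clenshaw_chebyshev(c, x):
    """Evaluate sum_k c[k] * T_k(x) with Clenshaw's backward recurrence."""
    b1, b2 = 0.0, 0.0            # b_{k+1}, b_{k+2}
    for ck in c[:0:-1]:          # c[N], ..., c[1]
        b1, b2 = 2.0 * x * b1 - b2 + ck, b1
    return c[0] + x * b1 - b2

# sanity check against NumPy's Chebyshev evaluator
c = np.array([1.0, -0.5, 0.25, 0.125])
x = 0.3
assert np.isclose(clenshaw_chebyshev(c, x), np.polynomial.chebyshev.chebval(x, c))
```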
173 Fast and Accurate Control Chart Pattern Recognition Using a New Cluster-k-Nearest Neighbor
Authors: Samir Brahim Belhaouari
Abstract:
By taking advantage of both k-NN, which is highly accurate, and the K-means cluster algorithm, which is able to reduce the classification time, we introduce Cluster-k-Nearest Neighbor as a "variable k"-NN that deals with the centroids (mean points) of the subclasses generated by a clustering algorithm. In general the K-means cluster algorithm is not stable in terms of accuracy; for that reason we develop another algorithm for clustering our space which gives higher accuracy than K-means, fewer subclasses, stability, and a classification time bounded with respect to the data size. We find between 96% and 99.7% accuracy in the classification of six different types of time series by using the K-means cluster algorithm, and 99.7% by using the new clustering algorithm.
Keywords: Pattern recognition, Time series, k-Nearest Neighbor, k-means cluster, Gaussian Mixture Model, Classification
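A minimal sketch of the centroid-based idea, assuming class-wise K-means (not the authors' improved clustering algorithm): each class is summarized by a handful of centroids, and a query is assigned the label of its nearest centroid, so the search cost depends on the number of centroids rather than on the number of training samples.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_centroids(X, y, k_per_class=3, seed=0):
    """Cluster each class separately and keep the centroids with their labels."""
    centroids, labels = [], []
    for cls in np.unique(y):
        km = KMeans(n_clusters=k_per_class, n_init=10, random_state=seed)
        km.fit(X[y == cls])
        centroids.append(km.cluster_centers_)
        labels.extend([cls] * k_per_class)
    return np.vstack(centroids), np.array(labels)

def predict(X_query, centroids, centroid_labels):
    """Nearest-centroid classification (the 'variable k'-NN over subclass means)."""
    d = np.linalg.norm(X_query[:, None, :] - centroids[None, :, :], axis=2)
    return centroid_labels[np.argmin(d, axis=1)]
```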
172 Acceptance Single Sampling Plan with Fuzzy Parameter Using the Poisson Distribution
Authors: Ezzatallah Baloui Jamkhaneh, Bahram Sadeghpour-Gildeh, Gholamhossein Yari
Abstract:
The purpose of this paper is to present the acceptance single sampling plan when the fraction of nonconforming items is a fuzzy number, modeled with the fuzzy Poisson distribution. We show that the operating characteristic (OC) curve of the plan is a band with upper and lower bounds whose width depends on the ambiguity of the proportion parameter in the lot when the sample size and acceptance number are fixed. Finally we complete the discussion with a numerical example, and we compare the OC bands obtained with the binomial distribution to those obtained with the Poisson distribution.
Keywords: Statistical quality control, acceptance single sampling, fuzzy number.
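To make the OC-band idea concrete, here is a small Python sketch (an illustration under assumed plan parameters n and c, not the paper's numerical example): the fuzzy fraction nonconforming is represented by an interval, and evaluating the crisp Poisson acceptance probability at its endpoints yields the lower and upper OC curves.

```python
import numpy as np
from scipy.stats import poisson

def oc_poisson(p, n, c):
    """Probability of acceptance P_a(p) = P(D <= c), with D ~ Poisson(n * p)."""
    return poisson.cdf(c, n * p)

n, c = 100, 2                       # assumed sample size and acceptance number
p_grid = np.linspace(0.0, 0.1, 50)  # nominal fraction nonconforming
ambiguity = 0.01                    # assumed spread of the fuzzy parameter

oc_low  = oc_poisson(p_grid + ambiguity, n, c)                      # lower OC bound
oc_high = oc_poisson(np.clip(p_grid - ambiguity, 0, None), n, c)    # upper OC bound
```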
171 The Application of Homotopy Method In Solving Electrical Circuit Design Problem
Authors: Talib Hashim Hasan
Abstract:
This paper describes a simple implementation of the homotopy (also called continuation) algorithm for determining the proper resistance of a resistor to dissipate energy at a specified rate in an electric circuit. The homotopy algorithm can be considered a development of classical methods in numerical computing such as the Newton-Raphson and fixed-point methods. In homotopy methods, an embedding parameter is used to control the convergence. The method proposed in this work utilizes a special homotopy called the Newton homotopy. A numerical example solved in MATLAB is given to show the effectiveness of the proposed method.
Keywords: electrical circuit, homotopy methods, MATLAB, Newton homotopy
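The abstract names the Newton homotopy; a generic Python sketch of that continuation idea (not the paper's MATLAB code, and with an arbitrary example equation) is: define H(x, t) = f(x) - (1 - t) f(x0), step t from 0 to 1, and apply a few Newton corrections at each step so the final x solves f(x) = 0.

```python
def newton_homotopy(f, df, x0, steps=20, newton_iters=3):
    """Trace H(x, t) = f(x) - (1 - t) * f(x0) from t = 0 to t = 1."""
    fx0 = f(x0)
    x = x0
    for k in range(1, steps + 1):
        t = k / steps
        for _ in range(newton_iters):          # Newton corrections at this t
            h = f(x) - (1.0 - t) * fx0
            x = x - h / df(x)
    return x

# assumed toy example: solve x**3 - 2*x - 5 = 0
root = newton_homotopy(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x0=1.0)
```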
170 Panoramic Sensor Based Blind Spot Accident Prevention System
Authors: Rajendra Prasad Mahapatra, K. Vimal Kumar
Abstract:
Many automotive accidents are due to blind spots and driver inattentiveness. The blind spot is the area that is invisible from the driver's viewpoint without head rotation. Several methods are available for assisting drivers. The simplest are rear-view mirrors and wide-angle lenses, but these have the disadvantage of requiring human attention, so their accuracy depends on the driver. Another, automated, approach makes use of sensors such as sonar or radar to gather range information, which is then processed and used to detect an impending collision. The disadvantages of such systems are low angular resolution and limited sensing volumes. This paper presents a panoramic sensor based automotive vehicle monitoring system.
Keywords: Panoramic sensors, Blind spot, Convex lens, Computer Vision, Sonar.
169 Detecting and Measuring Fabric Pills Using Digital Image Analysis
Authors: Dariush Semnani, Hossein Ghayoor
Abstract:
In this paper a novel method is presented for evaluating fabric pills using digital image processing techniques. The work provides a technique for detecting pills and also measuring their heights, surfaces and volumes. Measuring the intensity of defects by human vision is an inaccurate method for quality control; this problem motivated the use of digital image processing techniques for detecting defects on the fabric surface. Former works were limited to measuring the surface of defects, whereas in the presented method the height and the volume of defects are also measured, which leads to more accurate quality control. An algorithm was developed to first find pills and then measure their average intensity using the three criteria of height, surface and volume. The results showed a meaningful relation between the number of rotations and the quality of pilled fabrics.
Keywords: 3D analysis, computer vision, fabric, pile, surface evaluation
168 A Novel Computer Vision Method for Evaluating Deformations of Fibers Cross Section in False Twist Textured Yarns
Authors: Dariush Semnani, Mehdi Ahangareianabhari, Hossein Ghayoor
Abstract:
In the last five decades, textured yarns of polyester fiber produced by the false twist method have become the most important and mass-produced manmade fibers. Many parameters of the cross section affect the physical and mechanical properties of textured yarns: surface area, perimeter, equivalent diameter, large diameter, small diameter, convexity, stiffness, eccentricity, and hydraulic diameter. These parameters were evaluated by digital image processing techniques. To find trends between production criteria and the evaluated cross-section parameters, three criteria of the production line (temperature, drafting ratio, and D/Y ratio) were adjusted and different types of yarns were produced. Finally the relations between production criteria and cross-section parameters were considered. The results showed that the presented technique can recognize and measure the parameters of the fiber cross section with acceptable accuracy. Also, the optimum adjustment condition was estimated from the results of the image analysis evaluation.
Keywords: Computer Vision, Cross Section Analysis, Fibers Deformation, Textured Yarn
167 Time Series Forecasting Using Independent Component Analysis
Authors: Theodor D. Popescu
Abstract:
The paper presents a method for multivariate time series forecasting using Independent Component Analysis (ICA) as a preprocessing tool. The idea of this approach is to do the forecasting in the space of independent components (sources), and then to transform the results back to the original time series space. The forecasting can be done separately, and with a different method, for each component, depending on its time structure. The paper also gives a review of the main algorithms for independent component analysis in the case of instantaneous mixture models, using second- and higher-order statistics. The method has been applied in simulation to an artificial multivariate time series with five components, generated from three sources and a randomly generated mixing matrix.
Keywords: Independent Component Analysis, second order statistics, simulation, time series forecasting
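A compact Python sketch of this forecast-in-source-space pipeline, assuming FastICA and a deliberately simple per-source forecaster (persistence); the actual choice of per-component model is left open by the abstract.

```python
import numpy as np
from sklearn.decomposition import FastICA

def ica_forecast(X, horizon=5, n_sources=3, seed=0):
    """X: (n_samples, n_series). Forecast in source space, then mix back."""
    ica = FastICA(n_components=n_sources, random_state=seed)
    S = ica.fit_transform(X)                     # estimated independent sources

    # naive per-source forecaster (persistence); replace per component as needed
    S_future = np.tile(S[-1], (horizon, 1))

    # transform the source forecasts back to the observation space
    return ica.inverse_transform(S_future)
```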
166 FHOJ: A New Java Benchmark Framework
Authors: Vinh Quang La, Dirk Jansen
Abstract:
There are several existing Java benchmarks: application benchmarks, micro benchmarks, and mixtures of both, such as Java Grande, SPECjvm98, CaffeineMark, HBench, etc. But none of them deal with the behavior of multitasking operating systems; as a result, the outputs they produce are not satisfactory for performance evaluation engineers. The behavior of a multitasking operating system depends on the scheduling policy employed in the system: different processes can have different priorities when sharing the same resources, so the time measured from when an application starts until it finishes does not reflect the real time the system needs to run those programs. A new approach to this problem is needed. Accordingly, in this paper we present a new Java benchmark, named the FHOJ benchmark, which directly deals with the multitasking behavior of a system. Our study shows that in some cases results from the FHOJ benchmark are far more reliable than those from some existing Java benchmarks.
Keywords: Java Virtual Machine, Java benchmark, FHOJ framework.
165 Designing and Implementing an Innovative Course about World Wide Web, Based on the Conceptual Representations of Students
Authors: Andreanna K. Koufou, Dimitrios K. Tsolis, Marida I. Ergazaki, Vasilis I. Komis, Vasiliki P. Zogza
Abstract:
The Internet is nowadays included in all national curricula for elementary school. A comparative study of their goals leads to the conclusion that a complete curriculum should aim at students' acquisition of the abilities to navigate and search for information, and additionally should emphasize the evaluation of the information provided by the World Wide Web. In a constructivist knowledge framework, the design of a course has to take the conceptual representations of students into consideration. The present paper presents the conceptual representations of eleven-year-old students, attending the sixth grade of Greek elementary school, about the World Wide Web, and their use in the design and implementation of an innovative course.
Keywords: Conceptual representations, Constructivism, Internet Didactics, World Wide Web
164 Identifying New Sequence Features for Exon-Intron Discrimination by Rescaled-Range Frameshift Analysis
Authors: Sing-Wu Liou, Yin-Fu Huang
Abstract:
To identify discriminative sequence features between exons and introns, a new paradigm, rescaled-range frameshift analysis (RRFA), is proposed. By RRFA, two new sequence features, the frameshift sensitivity (FS) and the accumulative penta-mer complexity (APC), were discovered and further integrated into a new feature of larger scale, the persistency in anti-mutation (PAM). Feature-validation experiments were performed on six model organisms to test the discriminative power. All the experimental results strongly support that FS, APC and PAM are distinguishing features between exons and introns. These newly identified sequence features provide new insights into the sequence composition of genes and have great potential to form a new basis for recognizing exon-intron boundaries in gene sequences.
Keywords: Exon-Intron Discrimination, Rescaled-Range Frameshift Analysis, Frameshift Sensitivity, Accumulative Sequence Complexity
163 An AHP-Delphi Multi-Criteria Usage Cases Model with Application to Citrogypsum Decisions, Case Study: Kimia Gharb Gostar Industries Company
Authors: Mohsen Pirdashti, Masoomeh Omidi, Hemmatollah Pidashti
Abstract:
Today, the advantages of biotechnology, especially in environmental issues, over other technologies are indisputable. Kimia Gharb Gostar Industries Company, the largest producer of citric acid in the Middle East, applies biotechnology to this end. Citrogypsum is a by-product of citric acid production and is considered a significant residue of the company. This paper summarizes citric acid production and the conditions of citrogypsum production in the company, in addition to defining citrogypsum and its applications worldwide. Based on this information and an evaluation of present conditions regarding Iran's need for citrogypsum, the best priority was identified, with emphasis on strategy selection and proper programming for self-sufficiency. The Delphi technique was used to elicit expert opinions about criteria for evaluating the usages. The criteria identified by the experts were profitability, production capacity, degree of investment, marketability, ease of production and production time. The Analytical Hierarchy Process (AHP) and Expert Choice software were used to compare the alternatives on the criteria derived from the Delphi process.
Keywords: Analytical Hierarchy Process, AHP, Delphi, Multi-criteria decision making, Citrogypsum
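For readers unfamiliar with how AHP turns pairwise comparisons into criterion weights, here is a minimal Python sketch (illustrative numbers only, not the study's judgment matrix): the priority vector is the normalized principal eigenvector of the pairwise comparison matrix, and the consistency ratio flags contradictory judgments.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority vector and consistency ratio for an AHP pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                    # priority vector (weights)
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)            # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)  # Saaty's random index
    return w, ci / ri                               # weights, consistency ratio

# hypothetical 3x3 comparison of profitability, capacity and investment
w, cr = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
```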
162 The Impact of Germination and In Vitro Digestion on the Formation of Angiotensin Converting Enzyme (ACE) Inhibitory Peptides from Lentil Proteins Compared to Whey Proteins
Authors: F. Bamdad, Sh. Dokhani, J. Keramat, R. Zareie
Abstract:
Biologically active peptides are of particular interest in food science and human nutrition because they have been shown to play several physiological roles. In vitro gastrointestinal digestion of lentil and whey proteins in this study produced high angiotensin-I converting enzyme inhibitory activity, with 75.5±1.9% and 91.4±2.3% inhibition, respectively. High ACE inhibitory activity was observed in lentil after 5 days of germination (84.3±1.2%). Fractionation by reverse phase chromatography gave inhibitory activities as high as 86.3±2.0% for lentil, 94.8±1.8% for whey and 93.7±1.7% at the 5th day of germination. Further purification by HPLC resulted in several inhibitory peptides with IC50 values ranging from 0.064 to 0.164 mg/ml. These results demonstrate that lentil proteins are a good source of peptides with ACE inhibitory activity that can be released by germination or gastrointestinal digestion. Despite the lower bioactivity in comparison with whey proteins, incorporation of lentil proteins in functional food formulations and natural drugs looks promising.
Keywords: ACE inhibitory peptides, digestion, germination, lentil proteins, whey proteins
161 DD Models for Reports Building
Authors: Ljerka Hrženjak-Šego, Željko Polić, Zdravka Aljinović
Abstract:
In general, reports are a form of representing data in such a way that the user gets the information he needs. They can be built in various ways, from the simplest ("select from") to the most complex (results derived from different sources/tables with complex formulas applied). Furthermore, rules of calculation can be written as hard-coded program logic or built into the database to be used by dynamic code. This paper introduces two types of reports defined in the database structure. The main goal is to manage calculations in an optimal way, keeping the maintenance of reports as simple and smooth as possible.
Keywords: Data Definition diagram, Server Model Diagram, system modelling, reports.
160 Trust Enhanced Dynamic Source Routing Protocol for Adhoc Networks
Authors: N. Bhalaji, A. R. Sivaramkrishnan, Sinchan Banerjee, V. Sundar, A. Shanmugam
Abstract:
Nodes in a mobile ad hoc network (MANET) do not rely on a central infrastructure but relay packets originated by other nodes. Mobile ad hoc networks can work properly only if the participating nodes collaborate in routing and forwarding, although for individual nodes it might be advantageous not to collaborate. In this conceptual paper we propose a new approach based on relationships among the nodes which makes them cooperate in an ad hoc environment. A trust unit is used to calculate the trust values of each node in the network, and the calculated trust values are used by the relationship estimator to determine the relationship status of nodes. The proposed enhanced protocol was compared with the standard DSR protocol and the results are analyzed using the network simulator ns-2.
Keywords: Reliable Routing, DSR, Grudger, Adhoc network.
159 Optimal Solution of Constraint Satisfaction Problems
Authors: Jeffrey L. Duffany
Abstract:
An optimal solution for a large number of constraint satisfaction problems can be found using the technique of substitution and elimination of variables, analogous to the technique used to solve systems of equations. A decision function f(A) = max(A^2) is used to determine which variables to eliminate. The algorithm can be expressed in six lines and is remarkable in both its simplicity and its ability to find an optimal solution. However, it is inefficient in that it needs to square the updated A matrix after each variable elimination. To overcome this inefficiency the algorithm is analyzed, and it is shown that the A matrix only needs to be squared once at the first step of the algorithm and then incrementally updated in subsequent steps, resulting in significant improvement and an algorithm complexity of O(n^3).
Keywords: Algorithm, complexity, constraint, np-complete.
158 Mobile Robot Path Planning Utilizing Probability Recursive Function
Authors: Ethar H. Khalil, Bahaa I. Kazem
Abstract:
In this work a software simulation model is proposed for path planning of a two-driven-wheel mobile robot that can navigate in a dynamic environment with statically distributed obstacles. The work utilizes the Bezier curve method, in a proposed N-order matrix form, for engineering the mobile robot path. The drawbacks of the Bezier curve in this field are diagnosed, and a two-direction (up and right) Probability Recursive Function (PRF) is proposed to overcome them. The PRF functionality is developed through a proposed obstacle detection function, an optimization function capable of predicting the optimum path without comparing all feasible paths, and an N-order Bezier curve function that ensures the drawing of the obtained path. The simulation results show that the mobile robot travels successfully from the starting point to its goal point, avoiding all obstacles located on its way. This navigation is done successfully using the proposed PRF techniques.
Keywords: Mobile robot, path planning, Bezier curve.
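Since the path is ultimately drawn as an N-order Bezier curve through the selected waypoints, a short Python sketch of the standard Bernstein-form evaluation is given below (a generic illustration with made-up control points; the PRF that chooses the control points is not reproduced here).

```python
import numpy as np
from math import comb

def bezier_curve(control_points, n_samples=100):
    """Evaluate an N-order Bezier curve from (N+1) control points."""
    P = np.asarray(control_points, dtype=float)    # shape (N+1, 2)
    n = len(P) - 1
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    curve = np.zeros((n_samples, P.shape[1]))
    for i in range(n + 1):                         # Bernstein basis expansion
        curve += comb(n, i) * (1 - t) ** (n - i) * t ** i * P[i]
    return curve

# hypothetical waypoints selected by a planner
path = bezier_curve([(0, 0), (1, 3), (4, 3), (5, 0)])
```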
157 A Pilot Study for the Optimization of Routes for Waste Collection Vehicles for the Göçmenköy District of Lefkoşa
Authors: Nergiz Fırıncı, Aysun Çelik, Ertan Akün, Md. Atif Khan
Abstract:
A pilot project was carried out in 2007 by senior students of Cyprus International University, aiming to minimize the total cost of waste collection in Northern Cyprus. Many developed and developing countries have cut their transportation costs, which lie between 30% and 40% of the total, by about 40% by implementing network models for their route assignments. Accordingly, a network model was implemented in the Göçmenköy district to optimize and standardize waste collection work. The work environment of the employees was also redesigned to provide maximum ergonomy and to increase productivity, efficiency and safety. Following the collection of the required data, including waste densities, road lengths and population, a model was constructed to allocate the optimal route assignment for the waste collection trucks in the Göçmenköy district.
Keywords: Minimization, waste collection, operations cost, transportation, ergonomy, productivity.
156 Developing New Processes and Optimizing Performance Using Response Surface Methodology
Authors: S. Raissi
Abstract:
Response surface methodology (RSM) is a very efficient tool for gaining practical insight into developing new processes and optimizing them. The methodology helps engineers raise a mathematical model that represents the behavior of the system as a convincing function of the process parameters. In this paper, the sequential nature of RSM is surveyed for process engineers, and its relationship to design of experiments (DOE), regression analysis and robust design is reviewed. The proposed four-step procedure, in two different phases, can help a system analyst resolve the parameter design problem involving responses. In order to check the accuracy of the designed model, residual analysis and the prediction error sum of squares (PRESS) are described. It is believed that the proposed procedure can resolve a complex parameter design problem with one or more responses. It can be applied to areas where there are large data sets and a number of responses are to be optimized simultaneously. In addition, the proposed procedure is relatively simple and can be implemented easily using ready-made standard statistical packages.
Keywords: Response Surface Methodology (RSM), Design of Experiments (DOE), Process modeling, Process setting, Process optimization.
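As a concrete illustration of the modeling step in RSM (not the paper's four-step procedure itself), the Python sketch below fits a second-order response surface in two factors by ordinary least squares; the central-composite design points and responses are hypothetical.

```python
import numpy as np

def fit_quadratic_rsm(x1, x2, y):
    """Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)       # residual sum of squares (not full PRESS)
    return beta, rss

# hypothetical central composite design (factorial, axial and center points)
x1 = np.array([-1, -1, 1, 1, -1.414, 1.414, 0, 0, 0, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, -1.414, 1.414, 0, 0, 0])
y  = np.array([7.2, 8.1, 9.0, 9.9, 8.3, 10.1, 8.0, 9.7, 10.4, 10.2, 10.5])
beta, rss = fit_quadratic_rsm(x1, x2, y)
```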
155 Using Structural Equation Modeling in Causal Relationship Design for Balanced-Scorecards' Strategic Map
Authors: A. Saghaei, R. Ghasemi
Abstract:
Through the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is a critical business tool for many organizations; it is a performance measurement system which translates mission and strategy into objectives. The strategy map approach is a development of the BSC in which the necessary causal relations must be established. To recognize these relations, experts usually rely on experience; it is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, together with a test to measure the importance of the relations. In SEM, factor analysis and hypothesis testing are done in the same analysis, and SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework which permits experts to design the strategy map by applying a comprehensive and scientific method together with their experience, and is therefore a more reliable method in comparison with previously established methods.
Keywords: BSC, SEM, Strategy map.
154 Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets
Authors: Mohammad Ali Baradaran Ghahfarokhi, Parvin Baradaran Ghahfarokhi
Abstract:
In this paper, we first introduce the stable distribution, stable processes and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data which are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the alpha-stable distribution in telecommunications, computer science (such as network delays and signal processing) and financial markets. At the end, we focus on using the stable distribution to estimate a measure of risk in stock markets, and show simulated data with statistical software.
Keywords: stable distribution, SaS, infinite variance, heavy tail networks, VaR.
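To illustrate the risk-estimation use mentioned at the end of the abstract, here is a small Python sketch, assuming SciPy's levy_stable distribution and made-up parameter values: returns are simulated from an alpha-stable law and Value-at-Risk is read off as an empirical lower quantile.

```python
import numpy as np
from scipy.stats import levy_stable

# assumed alpha-stable parameters for daily returns (illustrative only)
alpha, beta, loc, scale = 1.7, 0.0, 0.0005, 0.01

returns = levy_stable.rvs(alpha, beta, loc=loc, scale=scale,
                          size=100_000, random_state=0)

# 99% Value-at-Risk: loss threshold exceeded on 1% of days
var_99 = -np.percentile(returns, 1)
print(f"99% VaR of the simulated position: {var_99:.4f}")
```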
153 Fuel Cell/DC-DC Convertor Control by Sliding Mode Method
Authors: Farzad Abdous
Abstract:
A fuel cell system requires a regulating circuit for voltage and current in order to control power when connecting to other generating devices or loads. In this paper the fuel cell system and converter, which form a multi-variable system, are controlled using the sliding mode method. The use of a weighting matrix in the design procedure makes it possible to regulate the speed of control. Simulation results show the robustness and accuracy of the proposed controller in controlling the desired outputs.
Keywords: DC-DC converter, Fuel cell, PEM, Sliding mode control.
152 The Recreation Technique Model from the Perspective of Environmental Quality Elements
Authors: G. Gradinaru, S. Olteanu
Abstract:
Improvements in the quality of environmental elements can increase the recreational opportunities in a certain area (destination). The need-for-recreation technique focuses on the choice of certain destinations for recreational purposes. The basic exchange taken into consideration is the one between the satisfaction gained after staying in an area and the value expressed in money and time allocated. The number of tourists in the respective area, the duration of their stay and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of the area (such as the quality of the environmental elements). For the statistical analysis of the environmental benefits offered by an area through the need-for-recreation technique, the following stages are suggested: characterization of the reference area based on the statistical variables considered; and estimation of the environmental benefit by comparing the reference area with other similar areas (having the same environmental characteristics) from the perspective of the statistical variables considered. The comparison model in the recreation technique faces a series of difficulties which concern the reference area and the correct transformation of time into money.
Keywords: Comparison in recreation technique, the quality of the environmental elements, statistical analysis model.
151 Low Dimensional Representation of Dorsal Hand Vein Features Using Principal Component Analysis (PCA)
Authors: M.Heenaye-Mamode Khan, R.K. Subramanian, N. A. Mamode Khan
Abstract:
The quest to provide more secure identification systems has led to a rise in the development of biometric systems. The dorsal hand vein pattern is an emerging biometric which has lately attracted the attention of many researchers. Different approaches have been used to extract vein patterns and match them. In this work, Principal Component Analysis (PCA), a method that has been successfully applied to human faces and hand geometry, is applied to the dorsal hand vein pattern. PCA is used to obtain eigenveins, a low-dimensional representation of vein pattern features. Low-cost CCD cameras were used to obtain the vein images, the vein pattern was extracted by applying morphology, and noise reduction filters were applied to enhance the vein patterns. The system has been successfully tested on a database of 200 images using a threshold value of 0.9. The results obtained are encouraging.
Keywords: Biometric, Dorsal vein pattern, PCA.
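A minimal Python sketch of the eigenvein idea, analogous to eigenfaces (the enrollment data, image size and matching rule here are assumptions, not the paper's setup): training vein images are flattened, PCA yields a low-dimensional basis, and a query is accepted if its projection is close enough to an enrolled template.

```python
import numpy as np

def train_eigenveins(images, n_components=20):
    """images: (n_samples, h*w) flattened vein images."""
    mean = images.mean(axis=0)
    X = images - mean
    # principal directions from the SVD of the centered data
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    basis = vt[:n_components]                 # the 'eigenveins'
    return mean, basis

def project(image, mean, basis):
    return basis @ (image - mean)

def match(query, template, mean, basis, threshold=0.9):
    """Accept if the normalized similarity between projections exceeds the threshold."""
    q, t = project(query, mean, basis), project(template, mean, basis)
    similarity = q @ t / (np.linalg.norm(q) * np.linalg.norm(t) + 1e-12)
    return similarity >= threshold
```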
150 An Advanced Method for Speech Recognition
Authors: Meysam Mohamad pour, Fardad Farokhi
Abstract:
In this paper, in view of the deficiencies of the available techniques for speech recognition, an advanced method is presented that is able to classify speech signals with high accuracy (98%) in minimal time. In the presented method, the recorded signal is first preprocessed; this stage includes denoising with Mel-frequency cepstral analysis and feature extraction using discrete wavelet transform (DWT) coefficients. These features are then fed to a multilayer perceptron (MLP) network for classification. Finally, after training of the neural network, effective features are selected with the UTA algorithm.
Keywords: Multilayer perceptron (MLP) neural network, Discrete Wavelet Transform (DWT), Mel scale frequency filter, UTA algorithm.
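A compact Python sketch of that preprocessing-plus-MLP pipeline, assuming PyWavelets for the DWT and scikit-learn for the perceptron (the library choices, feature statistics and placeholder data are ours, not the paper's): wavelet sub-band coefficients are summarized into a fixed-length feature vector that feeds a small MLP classifier.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def dwt_features(signal, wavelet="db4", level=4):
    """Energy and standard deviation of each DWT sub-band as a feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats.extend([np.sum(c ** 2), np.std(c)])
    return np.array(feats)

# hypothetical training data: list of 1-D signals and their word labels
signals = [np.random.randn(1024) for _ in range(40)]
labels = np.repeat([0, 1, 2, 3], 10)
X = np.vstack([dwt_features(s) for s in signals])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
clf.fit(X, labels)
```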
149 Detection and Pose Estimation of People in Images
Authors: Mousa Mojarrad, Amir Masoud Rahmani, Mehrab Mohebi
Abstract:
Detection, feature extraction and pose estimation of people in images and video are made challenging by the variability of human appearance, the complexity of natural scenes and the high dimensionality of articulated body models, and have been an important field in image, signal and vision computing in recent years. In this paper, a method is proposed and tested on four types of people in 2D images. The system extracts the size of a person from the image and classifies the body type (tall fat, short fat, tall thin or short thin) according to the measurements of the human body extracted from the image. The system also extracts every dimension of the human body, such as length and width, and presents them in the output.
Keywords: Analysis of Image Processing, Canny Edge Detection, Human Body Recognition, Measurement, Pose Estimation, 2D Human Dimension.
148 The Approximate Solution of Linear Fuzzy Fredholm Integral Equations of the Second Kind by Using Iterative Interpolation
Authors: N. Parandin, M. A. Fariborzi Araghi
Abstract:
In this paper, we propose a numerical method for the approximate solution of fuzzy Fredholm functional integral equations of the second kind using iterative interpolation. For this purpose, we convert the linear fuzzy Fredholm integral equations to a crisp linear system of integral equations. The proposed method is illustrated on some fuzzy integral equations in numerical examples.
Keywords: Fuzzy function integral equations, Iterative method, Linear systems, Parametric form of fuzzy number.
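The crisp sub-problems obtained after the parametric conversion are ordinary Fredholm equations of the second kind, x(t) = f(t) + λ∫K(t,s)x(s)ds; the Python sketch below shows the standard successive-approximation iteration on a quadrature grid (a generic illustration with an arbitrary kernel, not the paper's examples).

```python
import numpy as np

def fredholm2_iterative(f, K, lam, a=0.0, b=1.0, n=101, iters=50):
    """Successive approximations for x(t) = f(t) + lam * int_a^b K(t,s) x(s) ds."""
    t = np.linspace(a, b, n)
    w = np.full(n, (b - a) / (n - 1))      # trapezoidal quadrature weights
    w[0] *= 0.5
    w[-1] *= 0.5
    Kmat = K(t[:, None], t[None, :])       # kernel sampled on the grid
    x = f(t).copy()
    for _ in range(iters):
        x = f(t) + lam * Kmat @ (w * x)
    return t, x

# arbitrary example: K(t,s) = t*s, f(t) = t, lambda = 0.5
t, x = fredholm2_iterative(lambda t: t, lambda t, s: t * s, lam=0.5)
```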
147 Vision Based Hand Gesture Recognition
Authors: Pragati Garg, Naveen Aggarwal, Sanjeev Sofat
Abstract:
With the development of ubiquitous computing, current user interaction approaches with keyboard, mouse and pen are not sufficient, and the limitations of these devices also limit the usable command set. Direct use of the hands as an input device is an attractive method for providing natural human-computer interaction, which has evolved from text-based interfaces through 2D graphical interfaces and multimedia-supported interfaces to fully fledged multi-participant virtual environment (VE) systems. Imagine the human-computer interaction of the future: a 3D application where you can move and rotate objects simply by moving and rotating your hand, all without touching any input device. In this paper a review of vision based hand gesture recognition is presented. The existing approaches are categorized into 3D model based approaches and appearance based approaches, highlighting their advantages and shortcomings and identifying the open issues.
Keywords: Computer Vision, Hand Gesture, Hand Posture, Human Computer Interface.
146 New Feed-Forward/Feedback Generalized Minimum Variance Self-tuning Pole-placement Controller
Authors: S. A. Mohamed, A. S. Zayed, O. A. Abolaeha
Abstract:
A new feed-forward/feedback generalized minimum variance pole-placement controller, which incorporates the robustness of classical pole placement into the flexibility of the generalized minimum variance self-tuning controller for single-input single-output (SISO) systems, is proposed in this paper. The design provides the user with an adaptive mechanism which ensures that the closed-loop poles are located at their pre-specified positions. In addition, the feed-forward/feedback structure of the controller overcomes certain limitations of similar pole-placement control designs while retaining the simplicity of the adaptation mechanisms used in other designs. It tracks set-point changes with the desired speed of response, penalizes excessive control action, and can be applied to non-minimum-phase systems. Moreover, at steady state the controller has the ability to regulate constant load disturbances to zero. Simulation results using both simulated and real plant models demonstrate the effectiveness of the proposed controller.
Keywords: Pole-placement, Minimum variance control, Self-tuning control, Feedforward control.
145 Implementation of an On-Line PD Measurement System Using HFCT
Authors: F. Haghjoo, M. Sarlak, S.M. Shahrtash
Abstract:
In order to perform on-line measurement and detection of partial discharge (PD) signals, a complete solution comprising an HFCT, an A/D converter and a complete software package is proposed. The software package includes compensation of the HFCT contribution, filtering and noise reduction using the wavelet transform, and soft calibration routines. The results have shown good performance and high accuracy.
Keywords: Partial Discharge, Measurement, On-line, HFCT
144 Effect of Azospirillum Bacteria in Reducing Nitrogen Fertilizer (Urea) and Its Interaction with Streptomyces sp. for Biological Control in Sustainable Wheat (Triticum aestivum) Culture
Authors: Omid Alizadeh, Ali Parsaeimehr, Barmak.jaefary Hagheghy
Abstract:
An experiment was conducted in October 2008 on the ability of plant-associated biofertilizers to replace chemical fertilizers, on the appropriate rate of chemical N fertilizer when these biofertilizers are used, and on the interaction of the biofertilizers with each other. The field experiment was carried out at Persepolis (Throne of Jamshid) and arranged as a factorial on the basis of a randomized complete block design with three replications. Azospirillum sp. bacteria were mixed at a concentration of 10^8 cfu/g and inoculated onto wheat seeds, and Streptomyces sp. was applied at a rate of 550 g/ha, carried on clay. For the rate of chemical fertilizer, four levels of N from a urea source (N0 = 0, N1 = 60, N2 = 120, N3 = 180) were used. The results indicated significant differences between the levels of nitrogen fertilizer for all measured characteristics. The Azospirillum sp. inoculation showed significant differences between its levels for characteristics such as the number of fertile ears, number of grains per ear, grain yield, grain protein percentage, leaf area index and agronomic fertilizer use efficiency. In its interaction with Azospirillum sp., the Streptomyces actinomycete did not show any statistically significant differences between its levels.
Keywords: Azotobacter sp., Azospirillum sp., Streptomyces sp.
143 Power Generation Scheduling of Thermal Units Considering Gas Pipelines Constraints
Authors: Sara Mohtashami, Habib Rajabi Mashhadi
Abstract:
With the growth of electricity generation from gas, gas pipeline reliability can substantially impact electric generation. A physical disruption to a pipeline or to a compressor station can interrupt the flow of gas or reduce the pressure and lead to the loss of multiple gas-fired electric generators, which could dramatically reduce the supplied power and threaten power system security. Gas pressure drop during peak loading times is a common problem in networks without enough transportation capacity; it limits gas transportation and causes many problems for thermal-dominated power systems in supplying their demand. For feasible generation scheduling in networks with insufficient gas transportation capacity, it is necessary to consider the gas pipeline constraints in the optimization problem and to evaluate the impact of gas consumption in power plants on the operating condition of the gas pipelines. This paper studies the operation of gas-fired power plants in critical conditions when the demands for gas and electricity peak together. An integrated gas and electricity model is used to include the gas pipeline constraints in the economic dispatch problem of gas-fueled thermal generating units.
142 Interoperable CNC System for Turning Operations
Authors: Yusri Yusof, Stephen Newman, Aydin Nassehi, Keith Case
Abstract:
The changing economic climate has made global manufacturing a growing reality over the last decade, forcing companies from east and west and all over the world to collaborate beyond geographic boundaries in the design, manufacture and assembly of products. The ISO 10303 and ISO 14649 standards (STEP and STEP-NC) have been developed to introduce interoperability into manufacturing enterprises so as to meet the challenge of responding to production on demand. This paper describes and illustrates a STEP compliant CAD/CAPP/CAM system for the manufacture of rotational parts on CNC turning centers. The information models to support the proposed system, together with the data models defined in the ISO 14649 standard used to create the NC programs, are also described. A structured view of a STEP compliant CAD/CAPP/CAM system framework supporting the next generation of intelligent CNC controllers for turn/mill component manufacture is provided. Finally a proposed computational environment for a STEP-NC compliant system for turning operations (SCSTO) is described. SCSTO is the experimental part of the research, supported by the specification of information models and constructed using a structured methodology and object-oriented methods. SCSTO was developed to generate a Part 21 file based on machining features to support the interactive generation of process plans utilizing feature extraction. A case study component has been developed to prove the concept of using the milling and turning parts of ISO 14649 to provide a turn-mill CAD/CAPP/CAM environment.
141 STEP-NC-Compliant Systems for the Manufacturing Environment
Authors: Yusri Yusof
Abstract:
This paper provides a literature review of STEP-NC compliant research around the world. The first part focuses on projects based on STEP compliance, followed by research and development in this area based on machining operations and a review of the literature relating to the relevant STEP standards and their application in the area of turning centers. The review covers the various research works carried out during the evolution of STEP-NC for CNC manufacturing activities. The paper concludes with a discussion of the applications in this particular area.
Keywords: STEP-NC, CNC, Machining and Turning.
140 Performance Assessment and Optimization of the After-Sale Networks
Authors: H. Izadbakhsh, M.Hour Ali, A. Amirkhani, A. Montazeri, M. Saberi
Abstract:
After-sales activities are nowadays acknowledged as a relevant source of revenue, profit and competitive advantage in most manufacturing industries. Top and middle management, therefore, should focus on the definition of a structured business performance measurement system for the after-sales business. This paper aims at filling this gap: it presents an integrated methodology for after-sales network performance measurement and provides an empirical application to automotive case companies and their official service networks. This is the first study that presents an integrated multivariate approach for total assessment and improvement of after-sale services.
Keywords: Data Envelopment Analysis (DEA), Principal Component Analysis (PCA), Automotive companies, After-sale services.
139 Neutral to Earth Voltage Analysis in Harmonic Polluted Distribution Networks with Multi-Grounded Neutrals
Authors: G. Ahmadi, S.M. Shahrtash
Abstract:
A multiphase harmonic load flow algorithm based on the backward/forward sweep is developed to examine the effects of various factors on the neutral-to-earth voltage (NEV), including unsymmetrical system configuration, load unbalance and harmonic injection. The proposed algorithm combines fundamental-frequency and harmonic-frequency power flows. The algorithm and the associated models are tested on the IEEE 13-bus system. The magnitude of the NEV is investigated under various conditions of the number of grounding rods per feeder length, the grounding rod resistance and the grounding resistance of the infeeding source. Additionally, the harmonic injection of nonlinear loads has been considered and its influence on the NEV under different conditions is shown.
Keywords: NEV, Distribution System, Multi-grounded, Backward/Forward Sweep, Harmonic Analysis
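For readers unfamiliar with the backward/forward sweep used here, the Python sketch below runs the fundamental-frequency version on a toy radial feeder (arbitrary per-unit impedances and loads; the multiphase, multi-grounded and harmonic extensions of the paper are not reproduced): the backward sweep accumulates branch currents from the loads toward the source, and the forward sweep updates node voltages from the source toward the loads.

```python
import numpy as np

def backward_forward_sweep(v_source, z_branch, s_load, iters=20):
    """Radial chain feeder: node k is fed through branch k (complex Z and S per node)."""
    n = len(s_load)
    v = np.full(n, v_source, dtype=complex)        # initial voltage guess
    for _ in range(iters):
        i_load = np.conj(s_load / v)               # load currents from S = V * conj(I)
        # backward sweep: branch current = sum of downstream load currents
        i_branch = np.cumsum(i_load[::-1])[::-1]
        # forward sweep: voltage drops accumulate from the source
        v_up = v_source
        for k in range(n):
            v[k] = v_up - z_branch[k] * i_branch[k]
            v_up = v[k]
    return v

# toy 3-node feeder (per-unit values chosen arbitrarily)
v = backward_forward_sweep(1.0 + 0j,
                           z_branch=np.array([0.01 + 0.02j] * 3),
                           s_load=np.array([0.3 + 0.1j, 0.2 + 0.05j, 0.1 + 0.05j]))
```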
138 Defect Detection of Tiles Using 2D-Wavelet Transform and Statistical Features
Authors: M.Ghazvini, S. A. Monadjemi, N. Movahhedinia, K. Jamshidi
Abstract:
In this article, a method is offered to classify normal and defective tiles using the wavelet transform and artificial neural networks. The proposed algorithm calculates the maximum and minimum medians as well as the standard deviation and average of the detail images obtained from wavelet filters, forms feature vectors from them, and attempts to classify the given tile using a perceptron neural network with a single hidden layer. In this study, along with the proposal of using the median of optimum points as the basic feature and its comparison with the rest of the statistical features in the wavelet domain, the relative advantages of the Haar wavelet are investigated. The method has been tested on a number of various tile designs and, on average, has been valid for over 90% of the cases. Among its other advantages, high speed and low computational load are prominent.
Keywords: Defect detection, tile and ceramic quality inspection, wavelet transform, classification, neural networks, statistical features.
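A small Python sketch of the feature-extraction step, assuming PyWavelets and the Haar wavelet (the statistics follow the abstract; the perceptron training and the tile image data are omitted): each detail sub-band of a 2D DWT contributes a few statistics to the feature vector.

```python
import numpy as np
import pywt

def tile_features(image, wavelet="haar"):
    """Statistical features of the 2D-DWT detail sub-bands of a grayscale tile image."""
    _, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
    feats = []
    for band in (cH, cV, cD):
        feats.extend([np.max(np.abs(band)),   # strongest detail response
                      np.median(band),
                      np.std(band),
                      np.mean(band)])
    return np.array(feats)

# hypothetical 64x64 tile patch
features = tile_features(np.random.rand(64, 64))
```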
137 Anomaly Detection using Neuro Fuzzy System
Authors: Fatemeh Amiri, Caro Lucas, Nasser Yazdani
Abstract:
As network-based technologies become omnipresent, the demand to secure networks and systems against threats increases. One of the effective ways to achieve higher security is through the use of intrusion detection systems (IDS), which are software tools to detect anomalous behavior in a computer or network. In this paper, an IDS has been developed using an improved machine learning based algorithm, the Locally Linear Neuro Fuzzy Model (LLNF), for classification, whereas this model was originally used for system identification. A key technical challenge in IDS and LLNF learning is the curse of high dimensionality; therefore a feature selection phase is proposed which is applicable to any IDS. Investigating the use of three feature selection algorithms in this model, it is shown that adding a feature selection phase reduces the computational complexity of our model. Feature selection algorithms require a feature goodness measure; the use of both a linear measure (the linear correlation coefficient) and a non-linear measure (mutual information) is investigated.
Keywords: anomaly detection, feature selection, Locally Linear Neuro Fuzzy (LLNF), Mutual Information (MI), linear correlation coefficient.
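To illustrate the two feature-goodness measures mentioned above (a generic sketch with random placeholder data, not the intrusion dataset used in the paper), the Python snippet below ranks features by mutual information with the class label and by absolute linear correlation.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))          # placeholder connection features
y = (X[:, 2] + 0.5 * X[:, 7] + rng.normal(scale=0.5, size=500) > 0).astype(int)

mi = mutual_info_classif(X, y, random_state=0)                       # non-linear measure
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])  # linear measure

top_by_mi = np.argsort(mi)[::-1][:5]      # indices of the 5 most informative features
top_by_corr = np.argsort(corr)[::-1][:5]
```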
136 User's Hand Effect on TIS of Different GSM900/1800 Mobile Phone Models Using FDTD Method
Authors: Salah I. Al-Mously, Marai M. Abousetta
Abstract:
This paper predicts the effect of the user's hand-hold position on the Total Isotropic Sensitivity (TIS) of GSM900/1800 mobile phone antennas under realistic in-use conditions, where different semi-realistic mobile phone models (candy-bar and clamshell) as well as different antenna types (external and internal) are simulated using an FDTD-based platform. A semi-realistic hand model consisting of three tissues and the SAM head are used in the simulations. The results show a considerable impact on the TIS of the adopted mobile phone models owing to the presence of the user's hand at different positions, where the maximum TIS level is obtained while grasping the upper part of the mobile phone against the head. Maximum TIS levels are recorded in the talk position for mobile phones with an external antenna, and the maximum differences in TIS levels due to hand-hold alteration are recorded for clamshell-type phones.
Keywords: FDTD, mobile phone, phantoms, TIS.
135 A Computer Model of Language Acquisition – Syllable Learning – Based on Hebbian Cell Assemblies and Reinforcement Learning
Authors: Sepideh Fazeli, Fariba Bahrami
Abstract:
Investigating language acquisition is one of the most challenging problems in the study of language. Syllable learning, as a level of language acquisition, has considerable significance since it plays an important role in language acquisition. Because it is impossible to study language acquisition directly with children, especially in its developmental phases, computer models are useful for examining language acquisition. In this paper a computer model of early language learning for syllable learning is proposed. It is guided by a conceptual model of syllable learning named the Directions Into Velocities of Articulators (DIVA) model. The computer model uses simple associational and reinforcement learning rules within a neural network architecture inspired by neuroscience. Our simulation results verify the ability of the proposed computer model to produce phonemes during babbling and early speech. It also provides a framework for examining the neural basis of language learning and communication disorders.
Keywords: Brain modeling, computer models, language acquisition, reinforcement learning.
134 Detection of Power Quality Disturbances using Wavelet Transform
Authors: Sudipta Nath, Arindam Dey, Abhijit Chakrabarti
Abstract:
This paper presents features that characterize power quality disturbances from recorded voltage waveforms using the wavelet transform. The discrete wavelet transform has been used to detect and analyze power quality disturbances. The disturbances of interest include sag, swell, outage and transient. A power system network has been simulated with the Electromagnetic Transients Program, and voltage waveforms at strategic points, which include different power quality disturbances, have been obtained for analysis. A wavelet has then been chosen to perform feature extraction, whose outputs are the wavelet coefficients representing the power quality disturbance signal. Wavelet coefficients at different levels reveal time-localized information about the variation of the signal.
Keywords: Power quality, detection of disturbance, wavelet transform, multiresolution signal decomposition.
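A minimal Python sketch of this kind of detection, assuming PyWavelets and a synthetic 50 Hz waveform with an inserted sag (the simulated EMTP waveforms of the paper are not available here): the finest detail coefficients of a multiresolution decomposition spike where the disturbance starts and ends.

```python
import numpy as np
import pywt

fs = 6400                                   # samples per second
t = np.arange(0, 0.2, 1 / fs)
v = np.sin(2 * np.pi * 50 * t)
v[(t >= 0.08) & (t < 0.14)] *= 0.6          # synthetic 60% voltage sag

coeffs = pywt.wavedec(v, "db4", level=4)    # multiresolution decomposition
d1 = coeffs[-1]                             # finest (level-1) detail coefficients

# indices where the detail magnitude jumps mark the disturbance boundaries
threshold = 5 * np.median(np.abs(d1)) + 1e-12
event_times = np.nonzero(np.abs(d1) > threshold)[0] * 2 / fs   # approx. seconds
```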
133 Secure Protocol for Short Message Service
Authors: Shubat S. Ahmeda, Ashraf M. Ali Edwila
Abstract:
Short Message Service (SMS) has grown in popularity over the years and has become a common way of communication; it is a service provided through the Global System for Mobile Communications (GSM) that allows users to send text messages to others. SMS is usually used to transport unclassified information, but with the rise of mobile commerce it has become a popular tool for transmitting sensitive information between businesses and their clients. By default SMS does not guarantee confidentiality and integrity of the message content. In mobile communication systems, the security (encryption) offered by the network operator only applies on the wireless link, and data delivered through the mobile core network may not be protected. Existing end-to-end security mechanisms are provided at the application level and are typically based on public key cryptosystems. The main concern in a public-key setting is the authenticity of the public key; this issue can be resolved by identity-based (ID-based) cryptography, where the public key of a user can be derived from public information that uniquely identifies the user. This paper presents an encryption mechanism based on the ID-based scheme using elliptic curves to provide end-to-end security for SMS. This mechanism has been implemented over the standard SMS network architecture, and the encryption overhead has been estimated and compared with the RSA scheme. This study indicates that the ID-based mechanism has advantages over the RSA mechanism in key distribution and in the scalability of increasing the security level for mobile services.
Keywords: Elliptic Curve Cryptography (ECC), End-to-end Security, Identity-based Cryptography, Public Key, RSA, SMS Protocol.
132 Discovery of Quantified Hierarchical Production Rules from Large Set of Discovered Rules
Authors: Tamanna Siddiqui, M. Afshar Alam
Abstract:
Automated discovery of rules is, due to its applicability, one of the most fundamental and important methods in KDD, and it has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rule (HPR) system. A HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality <generality information> Specificity <specificity information>.
Keywords: Knowledge discovery in database, quantification, dempster shafer theory, genetic programming, hierarchy, subsumption matrix.
131 Discovery of Fuzzy Censored Production Rules from Large Set of Discovered Fuzzy if then Rules
Authors: Tamanna Siddiqui, M. Afshar Alam
Abstract:
A Censored Production Rule (CPR) is an extension of the standard production rule, concerned with the problems of reasoning with incomplete information subject to resource constraints and of reasoning efficiently with exceptions. A CPR has the form: IF A (Condition) THEN B (Action) UNLESS C (Censor), where C is the exception condition. Fuzzy CPRs are obtained by augmenting an ordinary fuzzy production rule "If X is A then Y is B" with an exception condition, and are written in the form "If X is A then Y is B Unless Z is C". Such rules are employed in situations in which the fuzzy conditional statement "If X is A then Y is B" holds frequently and the exception condition "Z is C" holds rarely. Thus the "If X is A then Y is B" part of the fuzzy CPR expresses important information, while the unless part acts only as a switch that changes the polarity of "Y is B" to "Y is not B" when the assertion "Z is C" holds. The proposed approach is an attempt to discover fuzzy censored production rules from a set of discovered fuzzy if-then rules of the form: A(X) ⇒ B(Y) || C(Z).
Keywords: Uncertainty Quantification, Fuzzy if then rules, Fuzzy Censored Production Rules, Learning algorithm.
130 Auto Classification for Search Intelligence
Authors: Lilac A. E. Al-Safadi
Abstract:
This paper proposes an auto-classification algorithm for Web pages using data mining techniques. We consider the problem of discovering association rules between terms in a set of Web pages belonging to a category in a search engine database, and present an auto-classification algorithm for solving this problem that is fundamentally based on the Apriori algorithm. The proposed technique has two phases. The first is a training phase, where human experts determine the categories of different Web pages and the supervised data mining algorithm combines these categories with appropriately weighted index terms according to the most strongly supported rules among the most frequent words. The second is the categorization phase, where a web crawler crawls through the World Wide Web to build a database categorized according to the result of the data mining approach; this database contains URLs and their categories.
Keywords: Information Processing on the Web, Data Mining, Document Classification.
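A tiny Python sketch of the Apriori-style counting behind such term association rules (toy documents; the support and confidence thresholds are chosen arbitrarily): frequent single terms are found first, then frequent term pairs, and a rule head -> body is kept when its confidence is high enough.

```python
from itertools import combinations
from collections import Counter

docs = [{"laptop", "battery", "review"},      # toy pages from one category
        {"laptop", "battery", "price"},
        {"laptop", "review", "screen"},
        {"battery", "price", "review"}]
min_support, min_conf = 0.5, 0.7

n = len(docs)
term_count = Counter(t for d in docs for t in d)
frequent_terms = {t for t, c in term_count.items() if c / n >= min_support}

pair_count = Counter()
for d in docs:
    for a, b in combinations(sorted(d & frequent_terms), 2):
        pair_count[(a, b)] += 1

rules = []
for (a, b), c in pair_count.items():
    if c / n >= min_support:
        for head, body in ((a, b), (b, a)):           # candidate rule head -> body
            conf = c / term_count[head]
            if conf >= min_conf:
                rules.append((head, body, c / n, conf))
```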
129 Some Issues on Integrating Telepresence Technology into Industrial Robotic Assembly
Authors: Gunther Reinhart, Marwan Radi
Abstract:
Since the 1940s, many promising telepresence research results have been obtained. However, telepresence technology still has not reached industrial use. Human intelligence is necessary for the successful execution of most manual assembly tasks, but human ability is hindered in some cases, such as the assembly of heavy parts in small or medium lots or of prototypes. In such cases of manual assembly, the help of industrial robots is mandatory. Telepresence technology can be considered a solution for performing assembly tasks where human intelligence and the haptic sense are needed to identify and minimize errors during the assembly process while a robot is needed to carry heavy parts. In this paper, preliminary steps to integrate telepresence technology into industrial robot systems are introduced. The system described here combines the human haptic sense and the industrial robot's capability to perform a manual assembly task remotely using a force feedback joystick. The mapping between the joystick's degrees of freedom (DOF) and the robot's is introduced. Simulation and experimental results are shown and future work is discussed.
Keywords: Assembly, Force Feedback, Industrial Robot, Teleassembly, Telepresence.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1243128 Position Control of an AC Servo Motor Using VHDL and FPGA
Authors: Kariyappa B. S., Hariprasad S. A., R. Nagaraj
Abstract:
In this paper, a new method of controlling the position of an AC servomotor using a Field Programmable Gate Array (FPGA) is presented. The FPGA controller is used to generate the direction signal and the number of pulses required to rotate through a given angle. Pulses are sent as a square wave; the number of pulses determines the angle of rotation and the frequency of the square wave determines the speed of rotation. The proposed control scheme has been realized using a XILINX FPGA SPARTAN XC3S400 and tested using a MUMA012PIS model Alternating Current (AC) servomotor. Experimental results show that the position of the AC servomotor can be controlled effectively.
Keywords: Alternating Current (AC), Field Programmable Gate Array (FPGA), Liquid Crystal Display (LCD).
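For illustration only, the arithmetic behind the pulse-train command can be sketched as follows; the pulses-per-revolution figure is a hypothetical drive parameter, not a value from the paper.

```python
# Back-of-the-envelope sketch of the pulse-train arithmetic described above:
# the number of pulses sets the rotation angle and the pulse frequency sets
# the speed.  PULSES_PER_REV is an assumed electronic-gear setting.

PULSES_PER_REV = 10000

def pulse_command(angle_deg, speed_rpm):
    n_pulses = round(angle_deg / 360.0 * PULSES_PER_REV)
    pulse_freq_hz = speed_rpm / 60.0 * PULSES_PER_REV
    return n_pulses, pulse_freq_hz

pulses, freq = pulse_command(angle_deg=90.0, speed_rpm=120.0)
print(f"send {pulses} pulses at {freq:.0f} Hz plus a direction bit")
```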
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 5159127 Effectiveness of Contourlet vs Wavelet Transform on Medical Image Compression: a Comparative Study
Authors: Negar Riazifar, Mehran Yazdi
Abstract:
The Discrete Wavelet Transform (DWT) has demonstrated performance far superior to that of the earlier Discrete Cosine Transform (DCT) and standard JPEG in natural as well as medical image compression. Due to its localization properties in both the spatial and transform domains, the quantization error introduced in DWT does not propagate globally as in DCT. Moreover, DWT is a global approach that avoids block artifacts, unlike JPEG. However, recent reports on natural image compression have shown the superior performance of the contourlet transform, a new extension of the wavelet transform in two dimensions using nonseparable and directional filter banks, compared to DWT. This is mostly due to the optimality of the contourlet in representing edges when they are smooth curves. In this work, we investigate this for medical images, especially CT images, which has not been reported yet. To do that, we propose a compression scheme in the transform domain and compare the performance of both DWT and the contourlet transform in terms of PSNR for different compression ratios (CR) using this scheme. The results obtained using different types of computed tomography images show that DWT still performs well at lower CR, but the contourlet transform performs better at higher CR. Keywords: Computed Tomography (CT), DWT, Discrete Contourlet Transform, Image Compression.
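A minimal sketch of a transform-domain compression test of the kind described, assuming the PyWavelets (pywt) and NumPy packages: keep only the largest DWT coefficients for a given compression ratio and measure the PSNR of the reconstruction. The wavelet, CR and the random stand-in image are illustrative, and the contourlet branch is omitted.

```python
# Keep the largest 1/CR fraction of DWT coefficients and report PSNR.
import numpy as np
import pywt

def dwt_compress_psnr(img, cr=16, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    keep = arr.size // cr                          # number of coefficients retained
    thresh = np.sort(np.abs(arr.ravel()))[-keep]
    arr_c = np.where(np.abs(arr) >= thresh, arr, 0.0)
    rec = pywt.waverec2(pywt.array_to_coeffs(arr_c, slices, output_format="wavedec2"),
                        wavelet)[:img.shape[0], :img.shape[1]]
    mse = np.mean((img.astype(float) - rec) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

ct_slice = np.random.randint(0, 256, (256, 256))   # stand-in for a CT image
print(f"PSNR at CR=16: {dwt_compress_psnr(ct_slice):.1f} dB")
```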
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2797126 Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm
Authors: Jehad A. H. Hammad, Nur'Aini binti Abdul Rashid
Abstract:
With the rapid development in the field of life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. One of the approaches that has been investigated is indexing. Indexing methods have been categorized into three classes: length-based index algorithms, transformation-based algorithms and mixed-technique algorithms. In this research, we focused on the transformation-based methods. We embedded the N-gram method into the transformation-based method to build an inverted index table. We then applied parallel methods to speed up the index building time and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the use of the N-gram transformation algorithm is an economical solution; it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results of the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further. Keywords: Biological sequence, Database index, N-gram indexing, Parallel computing, Sequence retrieval.
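The core of the N-gram transformation can be sketched in a few lines: build an inverted index from N-grams to record identifiers and use it to shortlist candidates for a query. The sequences and the choice N = 5 are illustrative, and the parallel build step is omitted.

```python
# Minimal N-gram inverted index over protein sequences.
from collections import defaultdict

def ngrams(seq, n=5):
    return {seq[i:i + n] for i in range(len(seq) - n + 1)}

def build_index(records, n=5):
    index = defaultdict(set)                 # n-gram -> set of record ids
    for rec_id, seq in records.items():
        for g in ngrams(seq, n):
            index[g].add(rec_id)
    return index

records = {1: "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
           2: "MSLLTEVETYVLSIIPSGPLKAEIAQRLEDVFA"}
index = build_index(records)

query = "KQRQISFVK"
hits = set.union(*(index.get(g, set()) for g in ngrams(query)))
print("candidate records:", hits)
```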
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2135125 Analysis and Classification of Hiv-1 Sub- Type Viruses by AR Model through Artificial Neural Networks
Authors: O. Yavuz, L. Ozyilmaz
Abstract:
The HIV-1 genome is highly heterogeneous. Due to this variation, the features of the HIV-1 genome lie in a wide range. For this reason, the infectivity of the virus changes depending on different chemokine receptors. From this point of view, R5 HIV viruses use the CCR5 coreceptor, X4 viruses use CXCR4, and R5X4 viruses can utilize both coreceptors. Recently, in bioinformatics, R5X4 viruses have been studied for classification using experiments on the HIV-1 genome. In this study, R5X4-type HIV viruses were classified using an Auto Regressive (AR) model through Artificial Neural Networks (ANNs). The statistical data of R5X4, R5 and X4 viruses were analyzed using signal processing methods and ANNs. Accessible residues of these virus sequences were obtained and modeled by the AR model, since the dimension of the residues is large and differs between sequences. Finally, the pre-processed data were used to evolve various ANN structures for determining R5X4 viruses. Furthermore, ROC analysis was applied to the ANNs to show their real performance. The results indicate that R5X4 viruses were successfully classified, with high sensitivity and specificity values in the training and testing ROC analysis for RBF, which gives the best performance among the ANN structures. Keywords: Auto-Regressive Model, HIV, Neural Networks, ROC Analysis.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1179124 A Decision Boundary based Discretization Technique using Resampling
Authors: Taimur Qureshi, Djamel A Zighed
Abstract:
Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy and understandability of the induction models. Usually, discretization and other statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason we argue that a discretization performed on a sample of the population is only an estimate for the entire population. Most of the existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points and thus improve the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is therefore to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees. Keywords: Bootstrap, discretization, resampling, soft decision trees.
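A minimal sketch of the resampling idea, with synthetic data and a deliberately simplified cut-point criterion (a class-boundary midpoint instead of an entropy-based split): compute a candidate cut on each bootstrap replicate and aggregate.

```python
# Bootstrap candidate cut points for discretizing one continuous attribute.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
y = np.array([0] * 100 + [1] * 100)

def boundary_cut(xs, ys):
    order = np.argsort(xs)
    xs, ys = xs[order], ys[order]
    # midpoints between consecutive points belonging to different classes
    mids = [(a + b) / 2 for a, b, ca, cb in zip(xs[:-1], xs[1:], ys[:-1], ys[1:]) if ca != cb]
    return np.median(mids)

cuts = []
for _ in range(200):                         # 200 bootstrap replicates
    idx = rng.integers(0, len(x), len(x))
    cuts.append(boundary_cut(x[idx], y[idx]))

print(f"aggregated cut point: {np.mean(cuts):.3f} (+/- {np.std(cuts):.3f})")
```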
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1433123 Web Service Security Method To SOA Development
Authors: Nafise Fareghzadeh
Abstract:
Web services provide significant new benefits for SOA-based applications, but they also expose significant new security risks. There is a huge number of WS security standards and processes. At present, there is still a lack of a comprehensive approach offering a methodical development path for the construction of secure WS-based SOA. The main objective of this paper is to address this need by presenting a comprehensive method for guaranteeing Web Services security in SOA. The proposed method defines three stages: Initial Security Analysis, Architectural Security Guaranty and WS Security Standards Identification. These facilitate, respectively, the definition and analysis of WS-specific security requirements, the development of a WS-based security architecture and the identification of the related WS security standards that the security architecture must articulate in order to implement the security services. Keywords: Kernel, Repository, Security Standards, WS Security Policy, WS specification.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1426122 Web Pages Aesthetic Evaluation Using Low-Level Visual Features
Authors: Maryam Mirdehghani, S. Amirhassan Monadjemi
Abstract:
Web sites are rapidly becoming the preferred media choice for our daily tasks such as information search, company presentation, shopping, and so on. At the same time, we live in a period where visual appearance plays an increasingly important role in our daily life. In spite of designers' efforts to develop web sites that are both user-friendly and attractive, it is difficult to ensure the outcome's aesthetic quality, since visual appearance is a matter of individual self-perception and opinion. In this study, we attempt to develop an automatic system for the aesthetic evaluation of web pages, which are the building blocks of web sites. Based on image processing techniques and artificial neural networks, the proposed method is able to categorize the input web page according to its visual appearance and aesthetic quality. The employed features are multiscale/multidirectional textural and perceptual color properties of the web pages, fed to a perceptron ANN which has been trained as the evaluator. The method is tested using university web sites and the results suggest that it performs well in web page aesthetic evaluation tasks, with around 90% correct categorization. Keywords: Web Page Design, Web Page Aesthetic, Color Spaces, Texture, Neural Networks
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1633121 Toward An Agreement on Semantic Web Architecture
Authors: Haytham Al-Feel, M.A.Koutb, Hoda Suoror
Abstract:
There are many problems associated with the World Wide Web: getting lost in hyperspace, web content that is still accessible only to humans, and difficulties of web administration. The solution to these problems is the Semantic Web, which is considered to be the extension of the current web that presents information in both human-readable and machine-processable form. The aim of this study is to reach a new generic foundation architecture for the Semantic Web, because there is no clear architecture for it: there are four versions, but up to now there is no agreement on any one of them, nor is there a clear picture of the relations between the different layers and technologies inside this architecture. This can be done based on the ideas of the previous versions as well as Gerber's evaluation method, as a step toward agreement on one Semantic Web architecture. Keywords: Semantic Web Architecture, XML, RDF and Ontology.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1701120 Incremental Mining of Shocking Association Patterns
Authors: Eiad Yafi, Ahmed Sultan Al-Hegami, M. A. Alam, Ranjit Biswas
Abstract:
Association rule mining is an important problem in data mining. The massively increasing volume of data in real-life databases has motivated researchers to design novel and incremental algorithms for association rule mining. In this paper, we propose an incremental association rule mining algorithm that integrates a shocking interestingness criterion during the process of building the model. A new interestingness measure called the shocking measure is introduced. One of the main features of the proposed approach is to capture the user's background knowledge, which is monotonically augmented. The incremental model that reflects the changing data and the user's beliefs is attractive in order to make the overall KDD process more effective and efficient. We implemented the proposed approach, experimented with it on some public datasets and found the results quite promising. Keywords: Knowledge discovery in databases (KDD), Data mining, Incremental Association rules, Domain knowledge, Interestingness, Shocking rules (SHR).
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1866119 Model-Based Small Area Estimation with Application to Unemployment Estimates
Authors: Hichem Omrani, Philippe Gerber, Patrick Bousch
Abstract:
The problem of Small Area Estimation (SAE) is complex because of various information sources and insufficient data. In this paper, an approach for SAE is presented for decision-making at national, regional and local levels. We propose an Empirical Best Linear Unbiased Predictor (EBLUP) as an estimator in order to combine several information sources to evaluate various indicators. First, we present the urban audit project and its environmental, social and economic indicators. Secondly, we propose an approach for decision making in order to estimate the indicators. An application is used to validate the theoretical proposal. Finally, a decision support system based on an open-source environment is presented.
Keywords: Small area estimation, statistical method, sampling, empirical best linear unbiased predictor (EBLUP), decision-making.
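For illustration, an area-level EBLUP of the standard Fay–Herriot form can be sketched as below; the data, regression coefficients and model variance are made up, and the authors' actual specification may differ.

```python
# Area-level EBLUP sketch: shrink each direct survey estimate towards a
# regression-synthetic estimate, weighted by its sampling variance.
import numpy as np

direct = np.array([8.2, 11.5, 6.9])      # direct unemployment-rate estimates (%)
D = np.array([1.50, 0.60, 2.10])         # known sampling variances of the direct estimates
X = np.column_stack([np.ones(3), [0.4, 0.7, 0.3]])   # intercept + one auxiliary covariate
beta = np.array([5.0, 7.5])              # assumed regression coefficients
sigma2_v = 0.8                           # assumed area-level model variance (normally estimated)

synthetic = X @ beta
gamma = sigma2_v / (sigma2_v + D)        # shrinkage weight per area
eblup = gamma * direct + (1 - gamma) * synthetic
print(np.round(eblup, 2))
```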
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1710118 A Decision Support Tool for Evaluating Mobility Projects
Abstract:
Success is a European project that will implement several clean transport offers in three European cities and evaluate their environmental impacts. The goal of these measures is to improve urban mobility, i.e. the movement of residents within cities, for example through park and ride, electric vehicles, hybrid buses and bike sharing. A list of 28 criteria and 60 measures has been established for the evaluation of these transport projects. The evaluation criteria can be grouped into transport, environment, social, economic and fuel consumption. This article proposes a decision support system that encapsulates a hybrid approach based on fuzzy logic, multicriteria analysis and belief theory for the evaluation of the impacts of urban mobility solutions. A web-based tool called DeSSIA (Decision Support System for Impacts Assessment) has been developed that treats complex data. The tool has several functionalities, starting from data integration (import of data), through evaluation of projects, and finishing with graphical display of results. The tool development is based on the concept of MVC (Model, View, Controller). MVC is a design model adapted to the creation of software which imposes separation between data, their treatment and their presentation. Emphasis is laid on the ergonomic aspects of the application. It has code compatible with the latest standards (XHTML, CSS) and has been validated by the W3C (World Wide Web Consortium). The main ergonomic aspect focuses on the usability of the application, ease of learning and adoption. Through the usage of technologies such as AJAX (Asynchronous JavaScript and XML), the application is faster and more user-friendly. The positive point of our approach is that it treats heterogeneous data (qualitative, quantitative) from various information sources (human experts, surveys, sensors, models, etc.).
Keywords: Decision support tool, hybrid approach, urban mobility.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1992117 Some Discrete Propositions in IVSs
Authors: A. Pouhassani
Abstract:
The aim of this paper is to exhibit some properties of the local topologies of an IVS. We also introduce the ISG structure as an interesting structure of semigroups in IVSs. Keywords: IVS, ISG, Local topology, Lebesgue number, Lindelof theorem
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1031116 Impact of Viscous and Heat Relaxation Loss on the Critical Temperature Gradients of Thermoacoustic Stacks
Authors: Zhibin Yu, Artur J. Jaworski, Abdulrahman S. Abduljalil
Abstract:
A stack with a small critical temperature gradient is desirable for a standing-wave thermoacoustic engine to obtain a low onset temperature difference (the minimum temperature difference needed to start the engine's self-oscillation). The viscous and heat relaxation losses in the stack determine the critical temperature gradient. In this work, a dimensionless critical temperature gradient factor is obtained based on linear thermoacoustic theory. It is indicated that the impedance determines the proportion between the viscous loss, the heat relaxation losses and the power production from the heat energy. This reveals the effects of the channel dimensions, geometrical configuration and local acoustic impedance on the critical temperature gradient in stacks. The numerical analysis shows that there exists a possible optimum combination of these parameters which leads to the lowest critical temperature gradient. Furthermore, several different geometries have been tested and compared numerically. Keywords: Critical temperature gradient, heat relaxation, stack, viscous effect.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1805115 Vortex Shedding at the End of Parallel-plate Thermoacoustic Stack in the Oscillatory Flow Conditions
Authors: Lei Shi, Zhibin Yu, Artur J. Jaworski, Abdulrahman S. Abduljalil
Abstract:
This paper investigates vortex shedding processes occurring at the end of a stack of parallel plates, due to an oscillating flow induced by an acoustic standing wave within an acoustic resonator. Here, Particle Image Velocimetry (PIV) is used to quantify the vortex shedding processes within an acoustic cycle phase-by-phase, in particular during the “ejection" of the fluid out of the stack. Standard hot-wire anemometry measurement is also applied to detect the velocity fluctuations near the end of the stack. Combination of these two measurement techniques allowed a detailed analysis of the vortex shedding phenomena. The results obtained show that, as the Reynolds number varies (by varying the plate thickness and drive ratio), different flow patterns of vortex shedding are observed by the PIV measurement. On the other hand, the time-dependent hot-wire measurements allow obtaining detailed frequency spectra of the velocity signal, used for calculating characteristic Strouhal numbers. The impact of the plate thickness and the Reynolds number on the vortex shedding pattern has been discussed. Furthermore, a detailed map of the relationship between the Strouhal number and Reynolds number has been obtained and discussed.Keywords: Oscillatory flow, Parallel-plate thermoacoustic stack, Strouhal numbers, Vortex shedding.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1882114 A Survey on Performance Tools for OpenMP
Authors: Mubarak S. Mohsen, Rosni Abdullah, Yong M. Teo
Abstract:
Advances in processor architecture, such as multi-core, increase the complexity of parallel computer systems. With multi-core architecture there are different parallel languages that can be used to run parallel programs. One of these languages is OpenMP, which is embedded in C/C++ or FORTRAN. Because of this new architecture and its complexity, it is very important to evaluate the performance of OpenMP constructs, kernels, and application programs on multi-core systems. Performance analysis is the activity of collecting information about the execution characteristics of a program. Performance tools consist of at least three interfacing software layers: instrumentation, measurement, and analysis. The instrumentation layer defines the measured performance events. The measurement layer determines which performance events are actually captured and how they are measured by the tool. The analysis layer processes the performance data and summarizes it into a form that can be displayed by performance tools. In this paper, a number of OpenMP performance tools are surveyed, explaining how each is used to collect, analyse, and display performance data. Keywords: Parallel performance tools, OpenMP, multi-core.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1921113 Optimal DG Placement in Distribution systems Using Cost/Worth Analysis
Authors: M Ahmadigorji, A. Abbaspour, A Rajabi-Ghahnavieh, M. Fotuhi- Firuzabad
Abstract:
DG application has received increasing attention during recent years. The impact of DG on various aspects of distribution system operation, such as reliability and energy loss, depends highly on the DG location in the distribution feeder. Optimal DG placement is an important subject which has not been fully discussed yet. This paper presents an optimization method to determine the optimal DG placement, based on a cost/worth analysis approach. This method considers technical and economic factors such as energy loss, load point reliability indices and DG costs, and, particularly, the portability of DG. The proposed method is applied to a test system and the impacts of different parameters such as load growth rate and load forecast uncertainty (LFU) on the optimum DG location are studied. Keywords: Distributed generation, optimal placement, cost/worth analysis, customer interruption cost, Dynamic programming
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2974112 Reliability Analysis in Electrical Distribution System Considering Preventive Maintenance Applications on Circuit Breakers
Authors: Mahmud Fotuhi-Firuzabad, Saeed Afshar
Abstract:
This paper presents the results of a study of preventive maintenance applications and the modeling of failure rates in the breakers of electrical distribution systems. This is a critical issue in the reliability assessment of a system. In the analysis conducted in this paper, the impacts of failure rate variations caused by preventive maintenance are examined. This is considered as part of a Reliability Centered Maintenance (RCM) application program. A number of load point reliability indices are derived using the mathematical model of the failure rate, which is established using data observed in a distribution system.
Keywords: Reliability-Centered Maintenance (RCM), failure rate, preventive maintenance (PM), Distribution System Reliability.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2493111 Reliability-based Selection of Wind Turbines for Large-Scale Wind Farms
Authors: M. Fotuhi-Firuzabad, A. Salehi Dobakhshari
Abstract:
This paper presents a reliability-based approach to select appropriate wind turbine types for a wind farm, considering site-specific wind speed patterns. An actual wind farm in the northern region of Iran, with wind speed registration over one year, is studied in this paper. An analytic approach based on the total probability theorem is utilized to model the probabilistic behavior of both turbine availability and wind speed. Well-known probabilistic reliability indices such as loss of load expectation (LOLE), expected energy not supplied (EENS) and incremental peak load carrying capability (IPLCC) for wind power integration in the Roy Billinton Test System (RBTS) are examined. The most appropriate turbine type, achieving the highest reliability level, is chosen for the studied wind farm.
Keywords: Wind Turbine Generator, Wind Farm, Power System Reliability, Wind Turbine Type Selection
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1775110 Tree-on-DAG for Data Aggregation in Sensor Networks
Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik
Abstract:
Computing and maintaining network structures for efficient data aggregation incurs high overhead for dynamic events where the set of nodes sensing an event changes with time. Moreover, structured approaches are sensitive to the waiting time that nodes use to wait for packets from their children before forwarding the packet to the sink. An optimal routing and data aggregation scheme for wireless sensor networks is proposed in this paper. We propose Tree on DAG (ToD), a semi-structured approach that uses dynamic forwarding on an implicitly constructed structure composed of multiple shortest-path trees to support network scalability. The key principle behind ToD is that adjacent nodes in the graph will have low stretch in one of the trees in ToD, thus resulting in early aggregation of packets. Based on simulations of a 2,000-node Mica2-based network, we conclude that efficient aggregation in large-scale networks can be achieved by our semi-structured approach. Keywords: Aggregation, Packet Merging, Query Processing.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1930109 An Enhanced Slicing Algorithm Using Nearest Distance Analysis for Layer Manufacturing
Authors: M. Vatani, A. R. Rahimi, F. Brazandeh, A. Sanati nezhad
Abstract:
Although the STL (stereolithography) file format is widely used as a de facto industry standard in the rapid prototyping industry due to its simplicity and ability to tessellate almost all surfaces, there are always some defects and shortcomings in its usage, many of which are difficult to correct manually. In processing complex models, the size of the file and the number of its defects grow considerably, making correction of STL files difficult. In this paper, by optimizing the existing algorithms, the size of the files and the memory usage of the computers processing them are reduced. Regardless of the type and extent of the errors in the STL files, a tail-to-head searching method and analysis of the nearest distance between tails and heads are used. As a result, STL models are sliced rapidly, and fully closed contours are produced effectively and without errors. Keywords: Layer manufacturing, STL files, slicing algorithm, nearest distance analysis.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 4157108 Manufacture of Electroless Nickel/YSZ Composite Coatings
Authors: N. Bahiyah Baba, W. Waugh, A.M. Davidson
Abstract:
The paper discusses optimising work on a method of processing ceramic/metal composite coatings for various applications and is based on preliminary work on processing anodes for solid oxide fuel cells (SOFCs). The composite coating is manufactured by the electroless co-deposition of nickel and yttria-stabilised zirconia (YSZ) simultaneously onto a ceramic substrate. The effect of substrate surface treatments and electroless nickel bath parameters, such as pH and agitation method, on coating characteristics is also investigated. Characterisation of the resulting deposit by scanning electron microscopy (SEM) and energy dispersive X-ray analysis (EDXA) is also discussed.
Keywords: Electroless deposition, nickel, YSZ, composite
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2568107 Ageing Assessment of Insulation Systems by Absorption/Resorption Currents
Authors: Petru V. Notingher, Stefan Busoi, Laurentiu M. Dumitran, Cristina Stancu, Gabriel Tanasescu, Emanuel Balescu
Abstract:
Degradation of the polymeric insulation systems of electrical equipment increases the space charge density and the concentration of electrical dipoles. As a consequence, the maximum values and the slopes of absorption/resorption (A/R) currents can change with insulation system ageing. In this paper, an analysis of the nature of the A/R currents and the importance of their components, especially the polarization current and the current given by the space charge, is presented. The experimental study concerns A/R current measurements on plane samples (made from CALMICAGLAS tapes), both virgin and thermally accelerated-aged. The obtained results show that the ageing process produces an increase of the values and a decrease of the slopes of the A/R currents. Finally, the possibility of estimating the insulation's ageing state and lifetime from A/R current measurements is discussed. Keywords: Insulation Systems, Absorption/Resorption Currents, Ageing, Lifetime.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1996106 A Model Predicting the Microbiological Qualityof Aquacultured Sea Bream (Sparus aurata) According to Physicochemical Data: An Application in Western Greece Fish Aquaculture
Authors: Joan Iliopoulou-Georgudaki, Chris Theodoropoulos, Danae Venieri, Maria Lagkadinou
Abstract:
Monitoring of the microbial flora in aquacultured sea bream, in relation to the physicochemical parameters of the rearing seawater, led to a model describing the influence of the latter on the quality of the fisheries. Fish were sampled over eight months from four aqua farms in Western Greece and analyzed for psychrotrophic and H2S-producing bacteria, Salmonella sp. and heterotrophic plate count (PCA), with simultaneous physical evaluation. Temperature, dissolved oxygen, pH, conductivity, TDS, salinity, NO3- and NH4+ ions were recorded. Temperature, dissolved oxygen and conductivity were correlated, respectively, to PCA, Pseudomonas sp. and Shewanella sp. counts. These parameters were the inputs of the model, which produced, as outputs, predictions of PCA, Vibrio sp., Pseudomonas sp. and Shewanella sp. counts and fish microbiological quality. The present study provides, for the first time, a ready-to-use predictive model of fisheries hygiene, leading to an effective management system for the optimization of aquaculture fisheries quality.
Keywords: Microbiological, model, physicochemical, Seabream.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1748105 Academic Program Administration via Semantic Web – A Case Study
Authors: Qurban A Memon, Shakeel A. Khoja
Abstract:
Generally, administrative systems in an academic environment are disjoint and support independent queries. The objective in this work is to semantically connect these independent systems to provide support to queries run on the integrated platform. The proposed framework, by enriching educational material in the legacy systems, provides a value-added semantics layer where activities such as annotation, query and reasoning can be carried out to support management requirements. We discuss the development of this ontology framework with a case study of UAE University program administration to show how semantic web technologies can be used by administration to develop student profiles for better academic program management.Keywords: Academic Program Administration, Semantic Web, Web Technology
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1618104 Packaging and Interconnection Technologies of Power Devices, Challenges and Future Trends
Authors: Raed A. Amro
Abstract:
Standard packaging and interconnection technologies of power devices have difficulties meeting the increasing thermal demands of new application fields of power electronics devices. The main restrictions are the decreasing reliability of bond wires and solder layers with increasing junction temperature. In the last few years intensive efforts have been invested in developing new packaging and interconnection solutions which may open a path to future applications of power devices. In this paper, the main failure mechanisms of power devices are described, and the principles of new packaging and interconnection concepts and their power cycling reliability are presented. Keywords: Power electronics devices, Reliability, Power Cycling, Low-temperature joining technique (LTJT)
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2594103 Blood Cell Dynamics in a Simple Shear Flow using an Implicit Fluid-Structure Interaction Method Based on the ALE Approach
Authors: Choeng-Ryul Choi, Chang-Nyung Kim, Tae-Hyub Hong
Abstract:
A numerical method is developed for simulating the motion of particles with arbitrary shapes in an effectively infinite or bounded viscous flow. The particle translational and angular motions are numerically investigated using a fluid-structure interaction (FSI) method based on the Arbitrary-Lagrangian-Eulerian (ALE) approach and the dynamic mesh method (smoothing and remeshing) in FLUENT ( ANSYS Inc., USA). Also, the effects of arbitrary shapes on the dynamics are studied using the FSI method which could be applied to the motions and deformations of a single blood cell and multiple blood cells, and the primary thrombogenesis caused by platelet aggregation. It is expected that, combined with a sophisticated large-scale computational technique, the simulation method will be useful for understanding the overall properties of blood flow from blood cellular level (microscopic) to the resulting rheological properties of blood as a mass (macroscopic).Keywords: Blood Flow, Fluid-Structure Interaction (FSI), Micro-Channels, Arbitrary Shapes, Red Blood Cells (RBCs)
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2309102 Characteristics of Hemodynamics in a Bileaflet Mechanical Heart Valve using an Implicit FSI Method
Authors: Tae-Hyub Hong, Choeng-Ryul Choi, Chang-Nyung Kim
Abstract:
Human heart valves diseased by congenital heart defects, rheumatic fever, bacterial infection, cancer may cause stenosis or insufficiency in the valves. Treatment may be with medication but often involves valve repair or replacement (insertion of an artificial heart valve). Bileaflet mechanical heart valves (BMHVs) are widely implanted to replace the diseased heart valves, but still suffer from complications such as hemolysis, platelet activation, tissue overgrowth and device failure. These complications are closely related to both flow characteristics through the valves and leaflet dynamics. In this study, the physiological flow interacting with the moving leaflets in a bileaflet mechanical heart valve (BMHV) is simulated with a strongly coupled implicit fluid-structure interaction (FSI) method which is newly organized based on the Arbitrary-Lagrangian-Eulerian (ALE) approach and the dynamic mesh method (remeshing) of FLUENT. The simulated results are in good agreement with previous experimental studies. This study shows the applicability of the present FSI model to the complicated physics interacting between fluid flow and moving boundary.Keywords: Bileaflet Mechanical Heart Valve, Fluid- Structure Interaction.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2034101 Multi-Enterprise Tie and Co-Operation Mechanism in Mexican Agro Industry SME's
Authors: Tania Elena González Alvarado, Ma. Antonieta Martín Granados
Abstract:
The aim of this paper is to explain what a multi-enterprise tie is, what evidence its analysis provides, and how the cooperation mechanism influences the establishment of a multi-enterprise tie. The study focuses on businesses of smaller dimension, geographically dispersed, whose businessmen are learning to cooperate in an international environment. The empirical evidence obtained so far permits the following conclusions: the tie is not long-lasting, it has an end; opportunism is an opportunity to learn; the multi-enterprise tie is a space in which to learn about the cooperation mechanism; the local tie permits a businessman to alternate between competition and cooperation strategies; the disappearance of a tie is a learning experience for a businessman, diminishing the possibility of failure in the next tie; the cooperation mechanism tends to eliminate hierarchical relations; the multi-enterprise tie diminishes asymmetries and puts SMEs in a better position when they negotiate with large companies; and the multi-enterprise tie impacts positively on the local system. The collection of empirical evidence was done through the following instruments: direct observation at a business encounter attended by the businesses in 2003 (202 Mexican agro-industry SMEs), a survey applied in 2004 (129), a questionnaire applied in 2005 (86 businesses), field visits to the businesses during the period 2006-2008, and a survey applied by telephone in 2008 (55 Mexican agro-industry SMEs).
Keywords: Cooperation, multi-enterprise tie, links, networks.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1273100 Salient Points Reduction for Content-Based Image Retrieval
Authors: Yao-Hong Tsai
Abstract:
Salient points are frequently used to represent local properties of an image in content-based image retrieval. In this paper, we present a reduction algorithm that extracts the locally most salient points such that they not only give a satisfying representation of an image, but also make the image retrieval process efficient. The algorithm recursively reduces the continuous point set according to the corresponding saliency values under a top-down approach. The resulting salient points are evaluated with an image retrieval system using the Hausdorff distance. The experiments show that our method is robust and that the extracted salient points provide better retrieval performance compared with other point detectors. Keywords: Barnard detector, Content-based image retrieval, Points reduction, Salient point.
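A rough sketch of the two ingredients mentioned above, under assumed saliency values and spacing radius: a greedy top-down reduction that keeps the most salient points subject to a minimum spacing, and a Hausdorff-distance comparison of the original and reduced point sets (using SciPy).

```python
# Greedy salient-point reduction plus Hausdorff-distance comparison.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def reduce_salient(points, saliency, min_dist=5.0):
    order = np.argsort(-saliency)                    # most salient first
    kept = []
    for i in order:
        p = points[i]
        if all(np.linalg.norm(p - points[j]) >= min_dist for j in kept):
            kept.append(i)
    return points[kept]

rng = np.random.default_rng(1)
pts = rng.uniform(0, 100, (200, 2))                  # synthetic point locations
sal = rng.random(200)                                # synthetic saliency values
reduced = reduce_salient(pts, sal)

h = max(directed_hausdorff(pts, reduced)[0], directed_hausdorff(reduced, pts)[0])
print(f"kept {len(reduced)} of {len(pts)} points, Hausdorff distance = {h:.2f}")
```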
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 146899 Design and Simulation of a Concentrated Luneberg Antenna
Authors: Z. Briqech, M. Abousetta
Abstract:
The Luneberg lens is a new generation of antennas that has been developed in the last few years and has established itself strongly in the microwave, communications and telescope areas. The idea of this research is to improve the radiation pattern by decreasing the side lobes and increasing the main lobe. The new design is proposed to work in the X-band. The simulated results and analysis are presented. Keywords: Communications, Microwaves, Lens Antenna, Luneberg Lens Antenna.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 282298 Haptics Enabled Offline AFM Image Analysis
Authors: Bhatti A., Nahavandi S., Hossny M.
Abstract:
Current advancements in nanotechnology depend on capabilities that enable nano-scientists to extend their eyes and hands into the nano-world. For this purpose, a haptics-based system (devices capable of recreating tactile or force sensations) for AFM (Atomic Force Microscope) is proposed. The system enables nano-scientists to touch and feel the sample surfaces viewed through AFM, in order to provide them with a better understanding of the physical properties of the surface, such as roughness, stiffness and the shape of the molecular architecture. At this stage, the proposed work uses offline images produced using AFM and performs image analysis to create virtual surfaces suitable for haptic force analysis. The research work is in the process of being extended from offline to online processing, where interaction will be done directly on the material surface for realistic analysis. Keywords: Haptics, AFM, force feedback, image analysis.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 150697 The Variable Step-Size Gauss-Seidel Pseudo Affine Projection Algorithm
Authors: F. Albu, C. Paleologu
Abstract:
In this paper, a new pseudo affine projection (AP) algorithm based on Gauss-Seidel (GS) iterations is proposed for acoustic echo cancellation (AEC). It is shown that the algorithm is robust against near-end signal variations (including double-talk).Keywords: pseudo affine projection algorithm, acoustic echo cancellation, double-talk.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 142596 Empirical Study of Real Retail Trade Turnover
Authors: J. Arneric, E. Jurun, L. Kordic
Abstract:
This paper deals with the econometric analysis of real retail trade turnover. It is part of an extensive scientific research project on modern trends in the Croatian national economy. At the end of the period of transition economy, Croatia confronts the challenges and problems of a high-consumption society. In such an environment, real retail trade turnover, average monthly real wages and household loans are chosen as the crucial economic variables for the analysis. For the purpose of a complete multiple econometric analysis, an adjustment of the database has been carried out. Namely, it has been necessary to deflate the original national statistics data on retail trade turnover using consumer price indices, as well as to seasonally adjust its contemporary behavior. In establishing the model it has been necessary to apply procedures for overcoming the autocorrelation and collinearity problems. Moreover, for the case of a time-series shift, a specific appropriate econometric instrument has been applied. It should be emphasized that the whole methodological procedure is based on real Croatian national economy time series. Keywords: Consumption society, multiple econometric model, real retail trade turnover, second order autocorrelation.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 146795 Numerical Optimization within Vector of Parameters Estimation in Volatility Models
Authors: J. Arneric, A. Rozga
Abstract:
In this paper the usefulness of the quasi-Newton iteration procedure for parameter estimation of the conditional variance equation within the BHHH algorithm is presented. The analytical solution of maximizing the likelihood function using first and second derivatives is too complex when the variance is time-varying. The advantage of the BHHH algorithm in comparison to other optimization algorithms is that it requires no third derivatives and has assured convergence. To simplify the optimization procedure, the BHHH algorithm uses an approximation of the matrix of second derivatives according to the information identity. However, parameter estimation in an a/symmetric GARCH(1,1) model assuming a normal distribution of returns is not that simple, i.e. it is difficult to solve analytically. The maximum of the likelihood function can be found by an iteration procedure that continues until no further increase can be found. Because the solutions of the numerical optimization are very sensitive to the initial values, GARCH(1,1) model starting parameters are defined. The number of iterations can be reduced using starting values close to the global maximum. The optimization procedure is illustrated in the framework of modeling volatility on a daily basis for the most liquid stocks on the Croatian capital market: Podravka stocks (food industry), Petrokemija stocks (fertilizer industry) and Ericsson Nikola Tesla stocks (information and communications industry). Keywords: Heteroscedasticity, Log-likelihood Maximization, Quasi-Newton iteration procedure, Volatility.
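A minimal sketch of the estimation problem, assuming simulated returns and a generic SciPy optimiser as a stand-in for the BHHH iteration: the GARCH(1,1) variance recursion and the Gaussian negative log-likelihood, started from values near the usual region as the abstract recommends.

```python
# GARCH(1,1) maximum likelihood sketch on simulated returns.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
r = rng.normal(0, 0.01, 1500)                # stand-in for daily returns

def neg_loglik(params, r):
    omega, alpha, beta = params
    h = np.empty_like(r)
    h[0] = np.var(r)                         # initialise conditional variance
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(h) + r ** 2 / h)

start = np.array([1e-6, 0.05, 0.90])         # starting values near the usual region
res = minimize(neg_loglik, start, args=(r,), method="L-BFGS-B",
               bounds=[(1e-12, None), (0.0, 1.0), (0.0, 1.0)])
print("omega, alpha, beta =", np.round(res.x, 6))
```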
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 264994 E-Business Security: Methodological Considerations
Authors: Ja'far Alqatawna, Jawed Siddiqi, Babak Akhgar, Mohammad Hjouj Btoush
Abstract:
A great deal of research work in the field of information systems security has been based on a positivist paradigm. Applying the reductionism concept of the positivist paradigm to information security means missing the bigger picture; this lack of holism could be one of the reasons why security is still overlooked, comes as an afterthought or is perceived from a purely technical dimension. We need to reshape our thinking and attitudes towards security, especially in a complex and dynamic environment such as e-Business, to develop a holistic understanding of e-Business security in relation to its context as well as considering all the stakeholders in the problem area. In this paper we argue the suitability of, and need for, a more inductive interpretive approach and qualitative research methods to investigate e-Business security. Our discussion is based on a holistic framework of enquiry, the nature of the research problem, the underlying theoretical lens and the complexity of the e-Business environment. At the end we present a research strategy for developing a holistic framework for understanding e-Business security problems in the context of developing countries, based on an interdisciplinary inquiry which considers their needs and requirements. Keywords: e-Business Security, Complexity, Methodological considerations, interpretive qualitative research and Case study method.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 150693 New Identity Management Scheme and its Formal Analysis
Authors: Jeonghoon Han, Hanjae Jeong, Dongho Won, Seungjoo Kim
Abstract:
As Internet technology has developed rapidly, the number of identities (IDs) managed by each individual person has increased, and various ID management technologies have been developed to assist users. However, most of these technologies are vulnerable to existing hacking methods such as phishing attacks and key-logging. If the administrator's password is exposed, an attacker can access the entire contents of the compromised user's data files on other devices. To solve these problems, we propose here a new ID management scheme based on a Single Password Protocol. The paper presents the details of the new scheme as well as a formal analysis of the method using BAN Logic. Keywords: Anti-phishing, BAN Logic, ID management.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 152592 Measuring Pressure Wave Velocity in a Hydraulic System
Authors: Lari Kela, Pekka Vähäoja
Abstract:
Pressure wave velocity in a hydraulic system was determined using piezo pressure sensors without removing fluid from the system. The measurements were carried out in a low pressure range (0.2 – 6 bar) and the results were compared with the results of other studies. This method is not as accurate as measurement with separate measurement equipment, but the fluid remains in the actual machine the whole time and the effect of air is taken into consideration if air is present in the system. The amount of air is estimated by calculations and comparisons with other studies. This measurement equipment can also be installed in an existing machine and it can be programmed to measure in real time. Thus, it could be used, for example, to control dampers. Keywords: Bulk modulus, pressure wave, sound velocity.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 430191 Optimizing Spatial Trend Detection By Artificial Immune Systems
Authors: M. Derakhshanfar, B. Minaei-Bidgoli
Abstract:
Spatial trends are one of the valuable patterns in geo databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when moving spatially away from a start object. Spatial trend detection is a graph search problem, so heuristic methods can be a good solution. The artificial immune system (AIS) is a special method for searching and optimizing. AIS is a novel evolutionary paradigm inspired by the biological immune system. Models based on immune system principles, such as the clonal selection theory, the immune network model or the negative selection algorithm, have been finding increasing applications in fields of science and engineering. In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, then select spatial trends whose affinity for the antibody is high. In an evolutionary process with the artificial immune algorithm, the affinity of low-affinity trends is increased by mutation until the stop condition is satisfied. Keywords: Spatial Data Mining, Spatial Trend Detection, Heuristic Methods, Artificial Immune System, Clonal Selection Algorithm (CSA)
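A generic clonal selection (CLONALG-style) sketch of the evolutionary loop described above; the toy affinity function stands in for the spatial-trend affinity, so the cloning rate, mutation scale and population size are illustrative assumptions.

```python
# Clonal selection loop: select high-affinity antibodies, clone them,
# mutate the clones inversely to affinity and reselect.
import numpy as np

rng = np.random.default_rng(2)

def affinity(x):
    return -np.sum((x - 0.7) ** 2)            # toy objective with optimum at 0.7

pop = rng.random((20, 5))
for gen in range(50):
    aff = np.array([affinity(p) for p in pop])
    best = pop[np.argsort(aff)[-5:]]          # 5 highest-affinity antibodies (ascending)
    clones = np.repeat(best, 4, axis=0)       # 4 clones per selected antibody
    rank = np.repeat(np.arange(5, 0, -1), 4)  # better antibodies mutate less
    clones += rng.normal(0, 0.05, clones.shape) * rank[:, None] / 5.0
    pool = np.vstack([pop, np.clip(clones, 0, 1)])
    pool_aff = np.array([affinity(p) for p in pool])
    pop = pool[np.argsort(pool_aff)[-20:]]    # reselection keeps population size fixed

print("best antibody:", np.round(pop[-1], 3))
```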
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 204590 Compensation–Based Current Decomposition
Authors: Mihaela Popescu, Alexandru Bitoleanu, Mircea Dobriceanu
Abstract:
This paper deals with current space-vector decomposition in three-phase, three-wire systems on the basis of several case studies. We propose four components of the current space-vector in terms of the DC and AC components of the instantaneous active and reactive powers. The notion of a supplementary useless current vector is also pointed out. The analysis shows that the current decomposition which respects the definition of the instantaneous apparent power vector is useful for compensation purposes only if the supply voltages are sinusoidal. A modified definition of the components of the current is proposed for operation under nonsinusoidal voltage conditions. Keywords: Active current, Active filtering, p–q theory, Reactive current.
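A numerical sketch of a p–q (instantaneous power) style decomposition for a three-phase, three-wire system, using one common sign convention; conventions differ between formulations, and the four-component decomposition proposed in the paper is not reproduced here.

```python
# Clarke transform, instantaneous powers, and active/reactive current parts.
import numpy as np

t = np.linspace(0, 0.04, 2000)                         # two 50 Hz cycles
w = 2 * np.pi * 50
v = np.array([np.cos(w * t), np.cos(w * t - 2 * np.pi / 3), np.cos(w * t + 2 * np.pi / 3)]) * 325
i = np.array([np.cos(w * t - 0.5), np.cos(w * t - 2 * np.pi / 3 - 0.5), np.cos(w * t + 2 * np.pi / 3 - 0.5)]) * 10

C = np.sqrt(2 / 3) * np.array([[1, -0.5, -0.5],
                               [0, np.sqrt(3) / 2, -np.sqrt(3) / 2]])
v_ab, i_ab = C @ v, C @ i                              # alpha-beta components
p = v_ab[0] * i_ab[0] + v_ab[1] * i_ab[1]              # instantaneous active power
q = v_ab[1] * i_ab[0] - v_ab[0] * i_ab[1]              # instantaneous reactive power

v2 = v_ab[0] ** 2 + v_ab[1] ** 2
i_active = v_ab * p / v2                               # active current component
i_reactive = np.array([v_ab[1], -v_ab[0]]) * q / v2    # reactive current component
print("mean p =", p.mean().round(1), "W;  mean q =", q.mean().round(1), "var")
```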
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 151589 Shape Optimization of Permanent Magnet Motors Using the Reduced Basis Technique
Authors: A. Jabbari, M. Shakeri, A. Nabavi
Abstract:
In this paper, a tooth shape optimization method for cogging torque reduction in Permanent Magnet (PM) motors is developed by using the Reduced Basis Technique (RBT) coupled with Finite Element Analysis (FEA) and Design of Experiments (DOE) methods. The primary objective of the method is to reduce the enormous number of design variables required to define the tooth shape. RBT represents the tooth shape as a weighted combination of several basis shapes, and the aim of the method is to find the best combination using the weights of each basis shape as the design variables. A multi-level design process is developed to find suitable basis shapes, or trial shapes, at each level that can be used in the reduced basis technique. Each level is treated as a separate optimization problem until the required objective – minimum cogging torque – is achieved. The process starts with geometrically simple basis shapes that are defined by their shape co-ordinates. The experimental design of the Taguchi method is used to build the approximation model and to perform the optimization. This method is demonstrated on the tooth shape optimization of an 8-pole/12-slot PM motor. Keywords: PM motor, cogging torque, tooth shape optimization, RBT, FEA, DOE.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 250288 Effect of Recycle Gas on Activity and Selectivity of Co-Ru/Al2O3 Catalyst in Fischer- Tropsch Synthesis
Authors: A.A.Rohani, B.Hatami, L.Jokar, F.khorasheh, A.A.Safekordi
Abstract:
In the industrial-scale Gas to Liquid (GTL) process based on Fischer-Tropsch (FT) synthesis, a part of the reactor outlet gases, such as CO2 and CH4 formed as side reaction products, is usually recycled. In this study, the influence of CO2 and CH4 on the performance and selectivity of a Co-Ru/Al2O3 catalyst is investigated by injection of these gases (0-20 vol. % of the feed) into the feed stream. The effects of temperature and feed flow rate are also inspected. The results show that low amounts of CO2 in the feed stream do not change the catalyst activity significantly, but increasing the amount of CO2 (more than 10 vol. %) causes the CO conversion to decrease and the selectivity of heavy components to increase. Methane acts as an inert gas and does not affect the catalyst performance. Increasing the feed flow rate has a negative effect on both CO conversion and heavy component selectivity. By raising the temperature, CO conversion increases but there are more volatile components in the product. The effect of CO2 on catalyst deactivation is also investigated carefully and a mechanism is suggested to explain the negative influence of CO2 on catalyst deactivation. Keywords: Alumina, Carbon dioxide, Cobalt catalyst, Conversion, Fischer Tropsch, Selectivity
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 197587 New Product-Type Estimators for the Population Mean Using Quartiles of the Auxiliary Variable
Authors: Amer Ibrahim Falah Al-Omari
Abstract:
In this paper, we suggest new product-type estimators for the population mean of the variable of interest exploiting the first or the third quartile of the auxiliary variable. We obtain mean square error equations and the bias for the estimators. We study the properties of these estimators using simple random sampling (SRS) and ranked set sampling (RSS) methods. It is found that, SRS and RSS produce approximately unbiased estimators of the population mean. However, the RSS estimators are more efficient than those obtained using SRS based on the same number of measured units for all values of the correlation coefficient.
Keywords: Product estimator, auxiliary variable, simple random sampling, extreme ranked set sampling
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 153086 Evidence of the Long-run Equilibrium between Money Demand Determinants in Croatia
Authors: B. Skrabic, N. Tomic-Plazibat
Abstract:
In this paper the real money demand function is analyzed within a multivariate time-series framework. The cointegration approach (Johansen procedure) is used, assuming interdependence between the money demand determinants, which are nonstationary variables. This helps us to understand the behavior of money demand in Croatia, revealing the significant influences between the endogenous variables in the vector autoregression (VAR) system, i.e. the vector error correction model (VECM). Exogeneity of the explanatory variables is tested. The long-run money demand function is estimated, indicating a slow speed of adjustment in removing the disequilibrium. Empirical results provide evidence that real industrial production and the exchange rate explain most of the variation in money demand in the long run, while the interest rate is significant only in the short run. Keywords: Cointegration, Long-run equilibrium, Money demand function, Vector error correction model.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 215385 An Expectation of the Rate of Inflation According to Inflation-Unemployment Interaction in Croatia
Authors: Zdravka Aljinović, Snježana Pivac, Boško Šego
Abstract:
According to the interaction of inflation and unemployment, an expectation of the rate of inflation in Croatia is estimated. The interaction between inflation and unemployment is described by a model based on three first-order differential (i.e. difference) equations: the Phillips relation, the adaptive expectations equation and the monetary-policy equation. The resulting equation is a second-order differential (difference) equation which describes the time path of inflation. Data on the rate of inflation and the rate of unemployment are used for parameter estimation. On the basis of the estimated time paths, a stability and convergence analysis is carried out for the rate of inflation. Keywords: Differencing, inflation, time path, unemployment.
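A small sketch of the stability check implied above, with illustrative coefficients rather than the Croatian estimates: once the three relations are combined into a second-order difference equation pi_t = a1*pi_(t-1) + a2*pi_(t-2) + c, the time path converges if and only if both characteristic roots lie inside the unit circle.

```python
# Time path and convergence check for a second-order inflation difference equation.
import numpy as np

a1, a2, c = 0.9, -0.2, 0.6                 # assumed reduced-form coefficients
roots = np.roots([1, -a1, -a2])
print("characteristic roots:", np.round(roots, 3),
      "-> convergent" if np.all(np.abs(roots) < 1) else "-> divergent")

pi = [8.0, 7.0]                            # two initial observations of inflation (%)
for t in range(2, 30):
    pi.append(a1 * pi[-1] + a2 * pi[-2] + c)
print("long-run level approx:", round(c / (1 - a1 - a2), 2), "%, path tail:", np.round(pi[-3:], 2))
```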
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 161384 Comparative Analysis of the Stochastic and Parsimonious Interest Rates Models on Croatian Government Market
Authors: Zdravka Aljinović, Branka Marasović, Blanka Škrabić
Abstract:
The paper provides a discussion of the most relevant aspects of yield curve modeling. Two classes of models are considered: stochastic and parsimonious function-based, through the approaches developed by Vasicek (1977) and Nelson and Siegel (1987). Yield curve estimates for Croatia are presented and their dynamics analyzed; finally, a comparative analysis of the two models is conducted. Keywords: the term structure of interest rates, Vasicek model, Nelson-Siegel model, Croatian Government market.
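A brief sketch of the parsimonious side of the comparison, assuming made-up maturities and yields rather than Croatian government-market data: fit the standard three-factor Nelson–Siegel curve by nonlinear least squares (the Vasicek branch is omitted).

```python
# Nelson-Siegel yield curve fit by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def nelson_siegel(t, b0, b1, b2, tau):
    x = t / tau
    return b0 + b1 * (1 - np.exp(-x)) / x + b2 * ((1 - np.exp(-x)) / x - np.exp(-x))

maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])        # years
yields = np.array([4.1, 4.3, 4.6, 5.0, 5.2, 5.5, 5.6, 5.7])  # per cent (illustrative)

params, _ = curve_fit(nelson_siegel, maturities, yields, p0=[5.5, -1.5, 0.5, 1.5])
print("beta0, beta1, beta2, tau =", np.round(params, 3))
```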
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 150083 Determining Optimal Demand Rate and Production Decisions: A Geometric Programming Approach
Authors: Farnaz G. Nezami, Mir B. Aryanezhad, Seyed J. Sadjadi
Abstract:
In this paper a nonlinear model is presented to demonstrate the relation between the production and marketing departments. By introducing functions such as pricing cost and market-share loss functions, we try to show some aspects of market modelling which have not been regarded before. The proposed model is a constrained signomial geometric programming model. For model solving, after variable modifications, an iterative technique based on the concept of the geometric mean is introduced to solve the resulting non-standard posynomial model, which can be applied to a wide variety of models in non-standard posynomial geometric programming form. Finally, a numerical analysis is presented to accredit the validity of the proposed model. Keywords: Geometric programming, marketing, nonlinear optimization, production.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 143482 Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: a Data Mining Approach
Authors: Hamid R. S. Mojaveri, Seyed S. Mousavi, Mojtaba Heydar, Ahmad Aminian
Abstract:
The aim of this paper is to present a three-step methodology for forecasting supply chain demand. In the first step, various data mining techniques are applied in order to prepare the data for entering the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented after defining the Mean Absolute Percentage Error index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast of the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data, and this forecast is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network forecasts more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed using the raw data, and the effectiveness of clustering analysis is also measured. Keywords: Artificial Neural Networks (ANN), bullwhip effect, demand forecasting, Support Vector Machine (SVM).
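The error index and two of the classical benchmarks mentioned above can be sketched as follows; the demand series, window length and smoothing constant are illustrative assumptions, and the sketch does not reproduce the paper's neural network or support vector machine models.

```python
# MAPE plus two classical benchmarks: moving average and simple exponential smoothing.
import numpy as np

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100 * np.mean(np.abs((actual - forecast) / actual))

def moving_average(series, window=3):
    # forecast for period t is the mean of the previous `window` observations
    return [np.mean(series[t - window:t]) for t in range(window, len(series))]

def exp_smoothing(series, alpha=0.3):
    f = [series[0]]
    for y in series[:-1]:
        f.append(alpha * y + (1 - alpha) * f[-1])
    return f

demand = [102, 98, 110, 120, 115, 125, 140, 138, 150, 160]   # illustrative demand data
print("MA  MAPE:", round(mape(demand[3:], moving_average(demand)), 2), "%")
print("SES MAPE:", round(mape(demand[1:], exp_smoothing(demand)[1:]), 2), "%")
```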
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 200981 Integrating the Theory of Constraints and Six Sigma in Manufacturing Process Improvement
Authors: Kai Jin, Hyder Abdul-Razzak, Yousri Elkassabgi, Hong Zhou, Aaron Herrera
Abstract:
Six Sigma is a well-known discipline that reduces variation using complex statistical tools and the DMAIC model. By integrating Goldratt's Theory of Constraints, the Five Focusing Points and Systems Thinking tools, Six Sigma projects can be selected where they will have the greatest impact on the company. This research defines an integrated model of Six Sigma and constraint management that provides a step-by-step guide using the original methodologies from each discipline; it is evaluated in a case study from the production line of an automobile V8 engine monoblock, resulting in an increase in line capacity from 18.7 pieces per hour to 22.4 pieces per hour, a 60% reduction in Work-In-Process and a variation decrease of 0.73%. Keywords: Constraint Management, Manufacturing Process Improvement, Six Sigma, System Thinking.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 176280 Tracing Quality Cost in a Luggage Manufacturing Industry
Authors: S. B. Jaju, R. R. Lakhe
Abstract:
Quality costs are the costs associated with preventing, finding, and correcting defective work. Since the main language of corporate management is money, quality-related costs act as a means of communication between the staff of quality engineering departments and company managers. The objective of quality engineering is to minimize the total quality cost over the life of a product. Quality costs provide a benchmark against which improvement can be measured over time. They provide a rupee-based report on quality improvement efforts and are an effective tool to identify, prioritize and select quality improvement projects. A review of the literature showed that a simplified methodology for collecting quality cost data in a manufacturing industry was required. A quantified standard methodology is therefore proposed for collecting data on the various elements of the quality cost categories for a manufacturing industry. In the light of the research carried out so far, it is also felt necessary to standardise the cost elements in each of the prevention, appraisal, internal failure and external failure cost categories. Here an attempt is made to standardise the various cost elements applicable to a manufacturing industry, and data are collected using the proposed quantified methodology. This paper discusses the case study carried out in a luggage manufacturing industry. Keywords: Quality Costs, PAF model, quantified methodology, Case study.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 225279 On the Variability of Tool Wear and Life at Disparate Operating Parameters
Authors: S. E. Oraby, A.M. Alaskari
Abstract:
The stochastic nature of tool life, using conventional discrete-wear data from experimental tests, usually arises from many individual and interacting parameters. It is a common practice in batch production to continually use the same tool to machine different parts, using disparate machining parameters. In such an environment, the optimal points at which tools have to be changed, while achieving minimum production cost and maximum production rate within the surface roughness specifications, have not been adequately studied. In the current study, two relevant aspects are investigated using coated and uncoated inserts in turning operations: (i) the accuracy of using machinability information from fixed-parameter testing procedures when variable-parameter situations emerge, and (ii) the credibility of tool life machinability data from prior discrete testing procedures in non-stop machining. A novel technique is proposed and verified to normalize the conventional fixed-parameter machinability data to suit cases where parameters have to be changed for the same tool. Also, an experimental investigation has been established to evaluate the error in the tool life assessment when machinability from discrete testing procedures is employed in uninterrupted practical machining.
Keywords: Machinability, tool life, tool wear, wear variability
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 179678 Production Planning and Measuring Method for Non Patterned Production System Using Stock Cutting Model
Authors: S. Homrossukon, D. Aromstain
Abstract:
Simple methods for planning and measuring a non-patterned production system are developed from the basic definition of working efficiency. Processing time is assigned as the variable and used to write the equation of production efficiency. This equation is then used to develop the planning method for the production of interest using the one-dimensional stock cutting problem. The application of the developed method shows that production efficiency and production planning can be determined effectively. Keywords: Production Planning, Parallel Machine, Production Measurement, Cutting and Packing.
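A simple way to see the one-dimensional stock cutting view of the planning problem is the first-fit-decreasing heuristic sketched below; the job times, stock capacity and the efficiency measure shown are illustrative assumptions, not the paper's actual planning equations.

```python
# First-fit-decreasing for 1D stock cutting: pack job processing times into
# stocks of fixed capacity (e.g. a machine shift). Data are illustrative.
def first_fit_decreasing(times, capacity):
    stocks = []                               # each stock is a list of job times
    for t in sorted(times, reverse=True):
        for s in stocks:
            if sum(s) + t <= capacity:        # first stock with enough remaining time
                s.append(t)
                break
        else:
            stocks.append([t])                # open a new stock (shift / machine)
    return stocks

jobs = [120, 90, 75, 60, 45, 45, 30, 30, 20]  # processing times in minutes
plan = first_fit_decreasing(jobs, capacity=240)
efficiency = sum(jobs) / (len(plan) * 240)    # working-efficiency measure
print(plan, round(100 * efficiency, 1), "%")
```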
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 119977 A New Approach for Prioritization of Failure Modes in Design FMEA using ANOVA
Authors: Sellappan Narayanagounder, Karuppusami Gurusami
Abstract:
The traditional Failure Mode and Effects Analysis (FMEA) uses Risk Priority Number (RPN) to evaluate the risk level of a component or process. The RPN index is determined by calculating the product of severity, occurrence and detection indexes. The most critically debated disadvantage of this approach is that various sets of these three indexes may produce an identical value of RPN. This research paper seeks to address the drawbacks in traditional FMEA and to propose a new approach to overcome these shortcomings. The Risk Priority Code (RPC) is used to prioritize failure modes, when two or more failure modes have the same RPN. A new method is proposed to prioritize failure modes, when there is a disagreement in ranking scale for severity, occurrence and detection. An Analysis of Variance (ANOVA) is used to compare means of RPN values. SPSS (Statistical Package for the Social Sciences) statistical analysis package is used to analyze the data. The results presented are based on two case studies. It is found that the proposed new methodology/approach resolves the limitations of traditional FMEA approach.Keywords: Failure mode and effects analysis, Risk priority code, Critical failure mode, Analysis of variance.
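The RPN calculation criticized above, and a simple tie-breaking rule in the spirit of a risk priority code, can be illustrated as follows; the failure modes, index values and the particular tie-breaking order are hypothetical examples, not the paper's case-study data.

```python
# RPN = severity * occurrence * detection; different triples can give the same RPN.
failure_modes = {
    "FM1": (9, 2, 4),   # (severity, occurrence, detection)
    "FM2": (4, 9, 2),
    "FM3": (5, 3, 4),
}
rpn = {fm: s * o * d for fm, (s, o, d) in failure_modes.items()}
print(rpn)   # FM1 and FM2 both give 72 despite very different risk profiles

# Illustrative tie-break: when RPNs are equal, rank by severity, then occurrence,
# then detection (a stand-in for a risk priority code, not the paper's exact rule).
ranking = sorted(failure_modes, key=lambda fm: (rpn[fm],) + failure_modes[fm], reverse=True)
print(ranking)
```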
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 543776 Case on Manufacturing Cell Formation Using Production Flow Analysis
Authors: Vladimír Modrák
Abstract:
This paper offers a case study in which methodological aspects of cell design for the transformation of the production process are applied. The cell redesign in this work is tightly focused on optimizing material flows under real manufacturing conditions. Accordingly, several individual techniques were aggregated into a compact methodical procedure with the aim of building one-piece flow production. The case study concentrated on a relatively typical situation of transformation from batch production to cellular manufacturing. Keywords: Product/Quantity analysis, layout, design, manufacturing process.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 334075 A Goal Programming Approach for Plastic Recycling System in Thailand
Authors: Wuthichai Wongthatsanekorn
Abstract:
Plastic waste is a big issue in Thailand, but the amount of recycled plastic is still low due to the high investment and operating costs. Hence, the remaining plastic waste is incinerated or sent to landfills. In order to be financially viable, an effective reverse logistics infrastructure is required to support the product recovery activities. However, there is a conflict between reducing the cost and raising the level of environmental protection. The purpose of this study is to build a goal programming (GP) model that can be used to help analyze the proper planning of Thailand's plastic recycling system, which involves multiple objectives. This study considers three objectives: reducing total cost, increasing the amount of plastic recovered, and raising the share of desired plastic materials in the recycling process. The results from two priority structures show that it is necessary to raise the total cost budget in order to achieve the targets on the amount of recycled plastic and desired plastic materials.
Keywords: Goal Programming, Plastic Recycling, Thailand.
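A small weighted goal program of the kind described above can be sketched with a generic LP solver; the decision variables, goal levels, cost coefficients and priority weights below are illustrative assumptions, not the study's actual Thai recycling data or its two priority structures.

```python
# Simplified one-sided-deviation goal program solved as a weighted LP with scipy.
from scipy.optimize import linprog

# Decision vector: [x1, x2, d_cost, d_rec, d_mat], where x1, x2 are amounts of two
# plastic streams sent to recycling and the d's are unwanted deviations (>= 0).
# Goal 1: 5*x1 + 8*x2 - d_cost <= 400   (total cost should not exceed the budget 400)
# Goal 2: x1 + x2 + d_rec      >= 70    (recover at least 70 tonnes)
# Goal 3: x1 + d_mat           >= 45    (at least 45 tonnes of the desired material)
c = [0, 0, 3, 2, 1]                              # penalize deviations by priority weights
A_ub = [[ 5,  8, -1,  0,  0],                    # cost - d_cost <= 400
        [-1, -1,  0, -1,  0],                    # -(recovered) - d_rec <= -70
        [-1,  0,  0,  0, -1]]                    # -x1 - d_mat <= -45
b_ub = [400, -70, -45]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 5)
print(res.x, res.fun)   # zero deviations mean all goals are met within the budget
```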
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 265574 Deep Web Content Mining
Authors: Shohreh Ajoudanian, Mohammad Davarpanah Jazi
Abstract:
The rapid expansion of the web is causing constant growth of information, leading to several problems such as the increased difficulty of extracting potentially useful knowledge. Web content mining confronts this problem by gathering explicit information from different web sites for access and knowledge discovery. Query interfaces of web databases share common building blocks. After extracting information with a parsing approach, we use a new data mining algorithm to match a large number of database schemas at a time. Using this algorithm increases the speed of information matching. In addition, instead of simple 1:1 matching, it performs complex (m:n) matching between query interfaces. In this paper we present a novel correlation mining algorithm that matches correlated attributes at a smaller cost. The algorithm uses the Jaccard measure to distinguish positively and negatively correlated attributes. The system then matches the user query against the different query interfaces in a specific domain and finally chooses the query interface closest to the user query to answer it. Keywords: Content mining, complex matching, correlation mining, information extraction.
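The Jaccard measure mentioned above can be sketched as follows for judging whether two query-interface attributes are positively or negatively correlated; the example interfaces and attribute names are illustrative, and the full correlation mining algorithm is not reproduced.

```python
# Jaccard measure over attribute co-occurrence across query interfaces.
def jaccard(a, b, interfaces):
    has_a = {i for i, attrs in enumerate(interfaces) if a in attrs}
    has_b = {i for i, attrs in enumerate(interfaces) if b in attrs}
    union = has_a | has_b
    return len(has_a & has_b) / len(union) if union else 0.0

interfaces = [
    {"title", "author", "isbn"},
    {"title", "author", "publisher"},
    {"title", "first name", "last name"},
    {"first name", "last name", "isbn"},
]
print(jaccard("title", "author", interfaces))        # co-occur often: positively correlated
print(jaccard("author", "first name", interfaces))   # never co-occur: candidate synonyms (m:n match)
```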
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 227773 An Efficient Approach to Mining Frequent Itemsets on Data Streams
Authors: Sara Ansari, Mohammad Hadi Sadreddini
Abstract:
The increasing importance of data streams arising in a wide range of advanced applications has led to the extensive study of mining frequent patterns. Mining data streams poses many new challenges, amongst which are the one-scan nature, the unbounded memory requirement and the high arrival rate of data streams. In this paper, we propose a new approach for mining itemsets on data streams. Our approach, SFIDS, has been developed based on the FIDS algorithm. The main aims were to keep some advantages of the previous approach and resolve some of its drawbacks, and consequently to improve run time and memory consumption. Our approach has the following advantages: it uses a lattice-like data structure for keeping frequent itemsets, and it separates regions from each other by deleting common nodes, which results in a decrease in search space, memory consumption and run time. Finally, considering the CPU constraint, when an increasing arrival rate of data would overload the system, SFIDS automatically detects this situation and discards some of the unprocessed data. We guarantee that the error of the results is bounded by a user pre-specified threshold, based on a probabilistic technique. Final results show that the SFIDS algorithm achieves about a 50% run time improvement over the FIDS approach. Keywords: Data stream, frequent itemset, stream mining.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 141872 Small and Silly? or Private Pitfall of Small and Medium-Sized Enterprises
Authors: A. Bencsik, V. Lőre, I. Marosi
Abstract:
Knowledge and related notions have become more and more important, and today we speak about a knowledge-based society. Many small and large companies have reacted to these new challenges, but there is a deep gap between the conception of knowledge held by professional researchers and its practice in company life. The question of this research was: How can small and medium-sized companies meet the demands of the new economy? Questionnaires were used in this research, and a particular segment of the domestic knowledge-based economy was the focus. The researchers wanted to know what the sources of success are and how they relate to questions of knowledge acquisition, knowledge transfer and knowledge utilization in small and medium-sized companies. These companies know that they have to change their behaviour and thinking, but they are not yet at a level at which they can compete with bigger or multinational companies. Keywords: Knowledge, management, small and medium-sized companies, study.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 134871 A Modified Fuzzy C-Means Algorithm for Natural Data Exploration
Authors: Binu Thomas, Raju G., Sonam Wangmo
Abstract:
In data mining, fuzzy clustering algorithms have demonstrated an advantage over crisp clustering algorithms in dealing with the challenges posed by large collections of vague and uncertain natural data. This paper reviews the concepts of fuzzy logic and fuzzy clustering. The classical fuzzy c-means algorithm is presented and its limitations are highlighted. Based on the study of the fuzzy c-means algorithm and its extensions, we propose a modification to the c-means algorithm to overcome its limitations in calculating the new cluster centers and in finding the membership values with natural data. The efficiency of the new modified method is demonstrated on real data collected for Bhutan's Gross National Happiness (GNH) program. Keywords: Adaptive fuzzy clustering, clustering, fuzzy logic, fuzzy clustering, c-means.
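For context, the classical fuzzy c-means updates that the paper sets out to modify can be sketched as follows; the data are synthetic and this is the standard algorithm, not the proposed modification.

```python
# Classical fuzzy c-means: alternate the membership and cluster-centre updates.
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                        # memberships sum to 1 per point
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]         # weighted-mean centres
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
        p = 2.0 / (m - 1.0)
        U = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=1, keepdims=True))
    return centers, U

X = np.vstack([np.random.default_rng(1).normal(loc, 0.3, (30, 2)) for loc in (0, 3, 6)])
centers, U = fuzzy_c_means(X)
print(np.round(centers, 2))
```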
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 198970 Design Analysis of a Slotted Microstrip Antenna for Wireless Communication
Authors: Norbahiah Misran, Mohammed N. Shakib, Mohammad T. Islam, Baharudin Yatim
Abstract:
In this paper, a new bandwidth-enhancement design technique that improves the performance of a conventional microstrip patch antenna is proposed. The paper presents a novel wideband probe-fed inverted slotted microstrip patch antenna. The design adopts contemporary techniques: coaxial probe feeding, an inverted patch structure and a slotted patch. The composite effect of integrating these techniques and introducing the proposed patch offers a low profile, broad bandwidth, high gain, and a low cross-polarization level. Results for the VSWR, gain and co- and cross-polarization patterns are presented. The antenna, operating in the 1.80-2.36 GHz band, shows an impedance bandwidth (2:1 VSWR) of 27% and a gain of 10.18 dBi with a gain variation of 1.12 dBi. Good radiation characteristics, including a cross-polarization level in the xz-plane of less than -42 dB, have been obtained. Keywords: Slotted antenna, microstrip patch antenna, wideband, coaxial probe fed.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 289969 Digital Learning Environments for Joint Master in Science Programmes in Building and Construction in Europe: Experimenting with Tools and Technologies
Authors: E. Dado, R. Beheshti
Abstract:
Recent developments in information and communication technologies (ICT) have created excellent conditions for profoundly enhancing traditional learning and teaching practices. New modes of teaching in higher education can profoundly enhance one's ability to proactively construct his or her personal learning universe. These developments have contributed to digital learning environments becoming widely available and accessible. In addition, there is a trend towards enlargement and specialization in higher education in Europe, with the result that existing Master of Science (MSc) programmes are merged or new programmes are established and offered to students as joint MSc programmes. In these joint MSc programmes, the need for (common) digital learning environments capable of surmounting the barriers of time and location has become evident. This paper discusses the past and ongoing efforts to establish such common digital learning environments in two joint MSc programmes in Europe and discusses the way technology-based learning environments affect the traditional way of learning. Keywords: education, engineering, learning environments, ICT.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 154968 Using Weblog to Promote Critical Thinking – An Exploratory Study
Authors: Huay Lit Woo, Qiyun Wang
Abstract:
The weblog is an Internet tool that is believed to possess great potential to facilitate learning in education. This study asks whether weblogs can be used to promote students' critical thinking. A group of secondary two students from a Singapore school wrote weblogs as a substitute for their traditional handwritten assignments. The topics for the weblogging were taken from the History syllabus but modified to suit the purpose of this study. Weblogs from the students were collected and analysed using a known coding system for measuring critical thinking. Results show that the topic for blogging is crucial in determining the types of critical thinking employed by the students. Students displayed critical thinking traits in the areas of information sourcing, linking information to arguments and justifying viewpoints. Students' critical thinking is more evident when the information for writing on a topic is readily available; otherwise, they tend to be less critical and more subjective. The study also found that students lack the ability to source external information, suggesting that they may need to be taught information literacy in order to widen their use of critical thinking skills. Keywords: Affordance, blog, critical thinking, perception, weblog.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 217067 The Experiences of South-African High-School Girls in a Fab Lab Environment
Authors: Nomusa Dlodlo, Ronald Noel Beyers
Abstract:
This paper reports on an effort to address the issue of inequality in girls' and women's access to science, engineering and technology (SET) education and careers by raising awareness of SET among secondary school girls in South Africa. The girls participated in the hands-on, high-tech rapid prototyping environment of a fabrication laboratory aimed at stimulating creativity and innovation as part of a Fab Kids initiative. The Fab Kids intervention is about creating a SET pipeline as part of the Young Engineers and Scientists of Africa Initiative. The methodology was based on a real-world situation and a hands-on approach. In the process, participants acquired a number of skills including computer-aided design, research, communication, teamwork, technical drawing, writing and problem-solving skills. Exposure to technology enhanced the girls' confidence in being able to handle technology-related tasks. Keywords: Girls, design engineering, gender, science, women.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 253766 Explorative Data Mining of Constructivist Learning Experiences and Activities with Multiple Dimensions
Authors: Patrick Wessa, Bart Baesens
Abstract:
This paper discusses the use of explorative data mining tools that allow the educator to explore new relationships between reported learning experiences and actual activities, even if there are multiple dimensions with a large number of measured items. The underlying technology is based on the so-called Compendium Platform for Reproducible Computing (http://www.freestatistics.org), which was built on top of the computational R Framework (http://www.wessa.net). Keywords: Reproducible computing, data mining, explorative data analysis, compendium technology, computer assisted education
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 125265 Social Influence in the Adoption Process and Usage of Innovation: Gender Differences
Authors: S. Güzin Mazman, Yasemin Koçak Usluel, Vildan Çevik
Abstract:
The purpose of this study is to determine in what ways prospective elementary education teachers are informed about innovations and to explain the role of social influence in the usage process of a technological innovation in terms of gender. The study group consisted of 300 prospective teachers, including 234 females and 66 males. Data were collected by a questionnaire developed by the researchers. The results of the study showed that, while prospective teachers are informed about innovations most frequently by mass media, they rarely seek expert advice. In addition, analysis of the results showed that the social influence on females was significantly higher than on males in the usage process of a technological innovation. Keywords: Gender differences, social influence, adoption, innovation.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 279964 The Usage of Social Networks in Educational Context
Authors: Sacide Güzin Mazman, Yasemin Koçak Usluel
Abstract:
The possible advantages of technology in the educational context require defining the boundaries of formal and informal learning. The increasing opportunity for ubiquitous learning through technological support has raised the question of how to discover the potential of individuals in spontaneous environments such as social networks. This is related to the question of the purposes for which social networks are being used. Social networks provide various advantages in the educational context, such as collaboration, knowledge sharing, common interests, active participation and reflective thinking. Consequently, the purpose of this study is to propose a new model that determines the factors which affect the adoption of social network applications for use in the educational context. While developing the model proposal, the existing adoption and diffusion models were reviewed; an original perspective was adopted rather than using other diffusion or acceptance models directly, because the nature of education differs from that of other organizations. In the proposed model, social factors, perceived ease of use, perceived usefulness and innovativeness are the four direct constructs that affect the adoption process. Facilitating conditions, image, subjective norms and community identity are incorporated into the model as antecedents of these four direct constructs. Keywords: Adoption of innovation, educational context, social networks.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 387563 Instructional Design Using the Virtual Ecological Pond for Science Education in Elementary Schools
Authors: Wernhuar Tarng, Wen-Shin Tsai, Yu-Si Lin, Chen-Kai Shiu
Abstract:
Ecological ponds can be a good teaching tool for science teachers, but they must be built and maintained properly to provide students with a safe and suitable learning environment; hence, many schools do not have the ability to build an ecological pond. This study used virtual reality technology to develop a web-based virtual ecological pond. Supported by situated learning theory and the instructional design of the "Aquatic Life" learning unit, elementary school students can actively explore the virtual ecological pond to observe aquatic animals and plants and learn about the concept of ecological conservation. A teaching experiment was conducted to investigate the learning effectiveness and practicability of this instructional design, and the results showed that students improved a great deal in learning about aquatic life. They found the virtual ecological pond interesting, easy to operate and helpful for understanding the aquatic ecological system. Therefore, it is useful in elementary science education. Keywords: Virtual reality, virtual ecological ponds, situated learning, instructional design, science education.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 205762 IDEL - A simple Instructional Design Tool for E-Learning
Authors: A. Zimnas, D. Kleftouris, N. Valkanos
Abstract:
Today's Information and Knowledge Society has placed new demands on education, and a new paradigm of education is required. Learning, facilitated by educational systems and the pedagogic process, is globally undergoing dramatic changes. The aim of this paper is the development of a simple Instructional Design tool for E-Learning, named IDEL (Instructional Design for Electronic Learning), that provides educators with facilities to create their own courses with the essential educational material and to manage communication with students. It offers flexibility in the way of learning and provides ease of use and reusability of resources. IDEL is a web-based instructional system designed to facilitate the course design process in accordance with the ADDIE model and instructional design principles, with emphasis placed on the use of technology-enhanced learning. An example case of using the ADDIE model to systematically develop a course, and its implementation with the aid of IDEL, is given, and some results from student evaluation of the tool and the course are reported. Keywords: Education, E-learning, Instructional Design.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 206161 Database Development and Discrimination Algorithms for Membrane Protein Functions
Authors: M. Michael Gromiha, Y. Yabuki, K. Imai, P. Horton, K. Fukui
Abstract:
We have developed a database for membrane protein functions, which has more than 3000 experimental data on functionally important amino acid residues in membrane proteins along with sequence, structure and literature information. Further, we have proposed different methods for identifying membrane proteins based on their functions: (i) discrimination of membrane transport proteins from other globular and membrane proteins and classifying them into channels/pores, electrochemical and active transporters, and (ii) β-signal for the insertion of mitochondrial β-barrel outer membrane proteins and potential targets. Our method showed an accuracy of 82% in discriminating transport proteins and 68% to classify them into three different transporters. In addition, we have identified a motif for targeting β-signal and potential candidates for mitochondrial β-barrel membrane proteins. Our methods can be used as effective tools for genome-wide annotations.
Keywords: Membrane proteins, database, transporters, discrimination, β-signal.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 156760 Optimization by Ant Colony Hybryde for the Bin-Packing Problem
Authors: Ben Mohamed Ahemed Mohamed, Yassine Adnan
Abstract:
The two-dimensional bin-packing problem (2BP) consists in placing a given set of rectangular items in a minimum number of identical rectangular containers, called bins. This article treats the case of objects that may be freely rotated by 90°. We propose a resolution approach combining ant colony optimization (ACO) and the heuristic method IMA to solve this NP-hard problem.
Keywords: Ant colony algorithm, bin-packing problem, heuristics methods.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 184259 BIDENS: Iterative Density Based Biclustering Algorithm With Application to Gene Expression Analysis
Authors: Mohamed A. Mahfouz, M. A. Ismail
Abstract:
Biclustering is a very useful data mining technique for identifying patterns in which different genes are correlated over a subset of conditions in gene expression analysis. Association rule mining is an efficient approach to achieve biclustering, as in the BIMODULE algorithm, but it is sensitive to the values given to its input parameters and to the discretization procedure used in the preprocessing step; also, when noise is present, classical association rule miners discover multiple small fragments of the true bicluster but miss the true bicluster itself. This paper formally presents a generalized noise-tolerant bicluster model, termed μBicluster. An iterative algorithm termed BIDENS, based on the proposed model, is introduced that can discover a set of k possibly overlapping biclusters simultaneously. Our model uses a more flexible method to partition the dimensions so as to preserve meaningful and significant biclusters. The proposed algorithm allows the discovery of biclusters that are hard to discover with BIMODULE. An experimental study on yeast and human gene expression data and several artificial datasets shows that our algorithm offers substantial improvements over several previously proposed biclustering algorithms. Keywords: Machine learning, biclustering, bi-dimensional clustering, gene expression analysis, data mining.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 196258 Fuzzy Relatives of the CLARANS Algorithm With Application to Text Clustering
Authors: Mohamed A. Mahfouz, M. A. Ismail
Abstract:
This paper introduces new algorithms (a fuzzy relative of the CLARANS algorithm, FCLARANS, and Fuzzy c-Medoids based on randomized search, FCMRANS) for fuzzy clustering of relational data. Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized in each iteration by recomputing new medoids given the current memberships, FCLARANS minimizes the same objective function minimized by FCMdd by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise because outliers may join the computation of medoids, while the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization procedure used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results with different samples of the Reuters-21578 and Newsgroups (20NG) collections and generated datasets with noise show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient, and their outputs are almost the same as that of RFCMdd in terms of classification rate. Keywords: Data Mining, Fuzzy Clustering, Relational Clustering, Medoid-Based Clustering, Cluster Analysis, Unsupervised Learning.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 240157 A Metametadata Architecture for Pedagogic Data Description
Authors: A. Ismail, M. S. Joy, J. E. Sinclair, M. I. Hamzah
Abstract:
This paper focuses on a novel method for semantic searching and retrieval of information about learning materials. Metametadata encapsulate metadata instances by using the properties and attributes provided by ontologies rather than describing learning objects. A novel metametadata taxonomy has been developed which provides the basis for a semantic search engine to extract, match and map queries to retrieve relevant results. The use of ontological views is a foundation for viewing the pedagogical content of metadata extracted from learning objects by using the pedagogical attributes from the metametadata taxonomy. Using the ontological approach and metametadata (based on the metametadata taxonomy) we present a novel semantic searching mechanism. These three strands (the taxonomy, the ontological views, and the search algorithm) are incorporated into a novel architecture (OMESCOD) which has been implemented. Keywords: Metadata, metametadata, semantic, ontologies.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 151356 Construction and Performance Characterization of the Looped-Tube Travelling-Wave Thermoacoustic Engine with Ceramic Regenerator
Authors: Abdulrahman S. Abduljalil, Zhibin Yu, Artur J. Jaworski, Lei Shi
Abstract:
In a travelling wave thermoacoustic device, the regenerator sandwiched between a pair of (hot and cold) heat exchangers constitutes the so-called thermoacoustic core, where the thermoacoustic energy conversion from heat to acoustic power takes place. The temperature gradient along the regenerator caused by the two heat exchangers excites and maintains the acoustic wave in the resonator. The devices are called travelling wave thermoacoustic systems because the phase angle difference between the pressure and velocity oscillation is close to zero in the regenerator. This paper presents the construction and testing of a thermoacoustic engine equipped with a ceramic regenerator, made from a ceramic material that is usually used as catalyst substrate in vehicles' exhaust systems, with fine square channels (900 cells per square inch). The testing includes the onset temperature difference (minimum temperature difference required to start the acoustic oscillation in an engine), the acoustic power output, thermal efficiency and the temperature profile along the regenerator. Keywords: Regenerator, Temperature gradient, Thermoacoustic, Travelling-wave.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 226555 Heat Transfer and Frictional Characteristics in Rectangular Channel with Inclined Perforated Baffles
Authors: Se Kyung Oh, Ary Bachtiar Krishna Putra, Soo Whan Ahn
Abstract:
A numerical study of the turbulent flow and heat transfer characteristics in a rectangular channel with different types of baffles is carried out. The inclined baffles have a width of 19.8 cm, a square diamond-type hole with a side length of 2.55 cm, and an inclination angle of 5°. The Reynolds number is varied between 23,000 and 57,000. The SST turbulence model is applied in the calculation, and the validity of the numerical results is examined against the experimental data. The numerical results for the flow field show that the flow patterns around the different baffle types are entirely different, and these significantly affect the local heat transfer characteristics. The heat transfer and friction factor characteristics are significantly affected by the perforation density of the baffle plate. It is found that baffle type II (the 3-hole baffle) gives the best heat transfer enhancement. Keywords: Turbulent flow, rectangular channel, inclined baffle, heat transfer, friction factor.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 233354 Bitrate Reduction Using FMO for Video Streaming over Packet Networks
Authors: Le Thanh Ha, Hye-Soo Kim, Chun-Su Park, Seung-Won Jung, Sung-Jea Ko
Abstract:
Flexible macroblock ordering (FMO), adopted in the H.264 standard, allows to partition all macroblocks (MBs) in a frame into separate groups of MBs called Slice Groups (SGs). FMO can not only support error-resilience, but also control the size of video packets for different network types. However, it is well-known that the number of bits required for encoding the frame is increased by adopting FMO. In this paper, we propose a novel algorithm that can reduce the bitrate overhead caused by utilizing FMO. In the proposed algorithm, all MBs are grouped in SGs based on the similarity of the transform coefficients. Experimental results show that our algorithm can reduce the bitrate as compared with conventional FMO.Keywords: Data Partition, Entropy Coding, Greedy Algorithm, H.264/AVC, Slice Group.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 130453 A New Method for Multiobjective Optimization Based on Learning Automata
Authors: M. R. Aghaebrahimi, S. H. Zahiri, M. Amiri
Abstract:
The need to solve complicated multidimensional scientific problems, together with the need to optimize several objective functions, is among the main motivations behind artificial intelligence and heuristic methods. In this paper, we introduce a new method for multiobjective optimization based on learning automata. In the proposed method, the search space is divided into separate hyper-cubes and each cube is considered as an action. After combining all objective functions with separate weights, the cumulative function is taken as the fitness function. By applying all the cubes to the cumulative function, we calculate the amount of amplification of each action, and the algorithm continues in this way to find the best solutions. In this method, a lateral memory is used to gather the significant points of each iteration of the algorithm. Finally, by considering the domination factor, the Pareto front is estimated. The results of several experiments show the effectiveness of this method in comparison with a genetic algorithm based method. Keywords: Function optimization, Multiobjective optimization, Learning automata.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 167752 Data Mining Using Learning Automata
Authors: M. R. Aghaebrahimi, S. H. Zahiri, M. Amiri
Abstract:
In this paper a data miner based on the learning automata is proposed and is called LA-miner. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm is established based on the function optimization using learning automata. The experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (sometimes better than) the Ant-miner (a data miner algorithm based on the Ant Colony optimization algorithm) and CNZ (a well-known data mining algorithm for classification).Keywords: Data mining, Learning automata, Classification rules, Knowledge discovery.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 193451 Position Based Routing Protocol with More Reliability in Mobile Ad Hoc Network
Authors: Mahboobeh Abdoos, Karim Faez, Masoud Sabaei
Abstract:
Position-based routing protocols are routing protocols that use node location information, instead of link information, for routing. In position-based routing protocols, it is assumed that the packet source node has position information about itself, its neighbors and the packet destination node. Greedy is a very important position-based routing protocol. In one of its variants, named MFR (Most Forward within Radius), the source node or packet-forwarding node sends the packet to the neighbor with the most forward progress towards the destination node (the neighbor closest to the destination). Using distance as the deciding metric in Greedy for forwarding a packet to a neighbor node is not suitable for all conditions. If the neighbor closest to the destination node is moving at high speed in comparison with the source or intermediate forwarding node, or has very low remaining battery power, then the packet loss probability is increased. The proposed strategy uses a combination of the metrics distance, velocity similarity and power to decide which neighbor the packet should be given to. Simulation results show that the proposed strategy has a lower average of lost packets than Greedy, so it is more reliable. Keywords: Mobile Ad Hoc Network, Position Based, Reliability, Routing.
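The contrast between plain MFR forwarding and a combined distance, velocity-similarity and power metric can be sketched as follows; the node attributes, weights and scoring formula are illustrative assumptions rather than the paper's exact definition.

```python
# Next-hop selection: plain MFR vs. a combined distance/velocity-similarity/power score.
import math

def progress(node, forwarder, dest):
    """Forward progress of a neighbour towards the destination."""
    return math.dist(forwarder, dest) - math.dist(node["pos"], dest)

def mfr_next_hop(neighbors, forwarder, dest):
    return max(neighbors, key=lambda n: progress(n, forwarder, dest))

def combined_next_hop(neighbors, forwarder, dest, fwd_speed, w=(0.5, 0.3, 0.2)):
    def score(n):
        prog = progress(n, forwarder, dest)
        vel_sim = 1.0 / (1.0 + abs(n["speed"] - fwd_speed))   # similar speed -> stabler link
        return w[0] * prog + w[1] * vel_sim * 10 + w[2] * n["battery"] * 10
    return max(neighbors, key=score)

neighbors = [
    {"id": "A", "pos": (8, 1), "speed": 25.0, "battery": 0.15},  # closest to dest, fast, weak battery
    {"id": "B", "pos": (6, 0), "speed": 2.0,  "battery": 0.90},
]
print(mfr_next_hop(neighbors, (0, 0), (10, 0))["id"])                      # picks A
print(combined_next_hop(neighbors, (0, 0), (10, 0), fwd_speed=1.5)["id"])  # prefers B
```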
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 176250 Anomaly Detection and Characterization to Classify Traffic Anomalies Case Study: TOT Public Company Limited Network
Authors: O. Siriporn, S. Benjawan
Abstract:
This paper presents four unsupervised clustering algorithms, namely sIB, RandomFlatClustering, FarthestFirst, and FilteredClusterer, which previous works have not used for network traffic classification. The methodology, the results, the products of the clustering and the evaluation of these algorithms are shown, with the efficiency of each algorithm assessed in terms of accuracy. The efficiency of the algorithms is also considered in terms of the time each one takes to generate the clusters quickly and correctly. Our work studies and tests the algorithms by classifying traffic anomalies in network traffic with attributes that have not been used before. We analyse the algorithm with the best efficiency or the best learning and compare it to the previously used one (K-Means). Our research will be used to develop a more efficient anomaly detection system in the future.
Keywords: Unsupervised, clustering, anomaly, machine learning.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 211149 A Novel Deinterlacing Algorithm Based on Adaptive Polynomial Interpolation
Authors: Seung-Won Jung, Hye-Soo Kim, Le Thanh Ha, Seung-Jin Baek, Sung-Jea Ko
Abstract:
In this paper, a novel deinterlacing algorithm is proposed. The proposed algorithm approximates the luminance distribution with a polynomial function. Instead of using one polynomial function for all pixels, different polynomial functions are used for the uniform, texture, and directional edge regions. The function coefficients for each region are computed by matrix multiplications. Experimental results demonstrate that the proposed method performs better than conventional algorithms. Keywords: Deinterlacing, polynomial interpolation.
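The basic idea of estimating a missing field line from a vertical polynomial fit can be sketched as follows; a real implementation would switch the polynomial per region (uniform, texture, directional edge) as described above, whereas this sketch uses a single quadratic fit and synthetic data.

```python
# Estimate a missing (interlaced-out) line by fitting a polynomial per column
# to the luminance of the surrounding field lines.
import numpy as np

def interpolate_missing_line(field_above, field_below, extra_above, extra_below):
    """Estimate the missing line from two lines above and two below it."""
    rows = np.array([-3, -1, 1, 3], dtype=float)             # vertical positions of known lines
    known = np.vstack([extra_above, field_above, field_below, extra_below])
    out = np.empty(known.shape[1])
    for col in range(known.shape[1]):
        coeffs = np.polyfit(rows, known[:, col], deg=2)      # quadratic luminance model
        out[col] = np.polyval(coeffs, 0.0)                   # evaluate at the missing row
    return out

rng = np.random.default_rng(0)
lines = [rng.integers(0, 256, 8).astype(float) for _ in range(4)]
print(np.round(interpolate_missing_line(lines[1], lines[2], lines[0], lines[3])))
```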
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 138148 Surface Topography Assessment Techniques based on an In-process Monitoring Approach of Tool Wear and Cutting Force Signature
Authors: A. M. Alaskari, S. E. Oraby
Abstract:
The quality of a machined surface is becoming more and more important to justify the increasing demands of sophisticated component performance, longevity, and reliability. Usually, any machining operation leaves its own characteristic evidence on the machined surface in the form of finely spaced micro-irregularities (surface roughness) left by the associated indeterministic characteristics of the different elements of the system: tool, machine, workpart and cutting parameters. However, one of the most influential sources affecting surface roughness in machining is the instantaneous state of the tool edge. The main objective of the current work is to relate the in-process immeasurable cutting edge deformation and surface roughness to more reliable, easy-to-measure force signals using robust non-linear time-dependent modeling regression techniques. Time-dependent modeling is beneficial when modern machining systems, such as adaptive control techniques, are considered, where the state of the machined surface and the health of the cutting edge are monitored, assessed and controlled online using real-time information provided by the variability encountered in the measured force signals. Correlation between wear propagation and roughness variation is developed throughout the different edge lifetimes. The surface roughness is further evaluated in the light of the variation in both the static and the dynamic force signals. Consistent correlation is found between surface roughness variation and tool wear progress within its initial and constant regions. In the first few seconds of cutting, the expected and well-known trend of the effect of the cutting parameters is observed: surface roughness is positively influenced by the level of the feed rate and negatively by the cutting speed. As cutting continues, roughness is affected, to different extents, by the rather localized wear modes either on the tool nose or on its flank areas. Moreover, it seems that roughness varies as the wear attitude transfers from one mode to another and, in general, it is shown that roughness improves as wear increases, but with possible corresponding workpart dimensional inaccuracy. The dynamic force signals are found to be reasonably sensitive to both the progressive and the random modes of tool edge deformation. While the frictional force components, feeding and radial, are found informative regarding progressive wear modes, the vertical (power) component is found to be a more representative carrier of system instability resulting from the edge's random deformation.
Keywords: Dynamic force signals, surface roughness (finish), tool wear and deformation, tool wear modes (nose, flank)
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 134847 Using Linear Quadratic Gaussian Optimal Control for Lateral Motion of Aircraft
Authors: A. Maddi, A. Guessoum, D. Berkani
Abstract:
The purpose of this paper is to provide a practical example of the Linear Quadratic Gaussian (LQG) controller. The method includes a description and some discussion of the discrete Kalman state estimator. One aspect of this optimality is that the estimator incorporates all information that can be provided to it. It processes all available measurements, regardless of their precision, to estimate the current value of the variables of interest, using knowledge of the system and measurement device dynamics, the statistical description of the system noises, measurement errors, and uncertainty in the dynamics models. Since the time of its introduction, the Kalman filter has been the subject of extensive research and application, particularly in the area of autonomous or assisted navigation. For example, to determine the velocity of an aircraft or its sideslip angle, one could use a Doppler radar, the velocity indications of an inertial navigation system, or the relative wind information in the air data system. Rather than ignore any of these outputs, a Kalman filter could be built to combine all of this data and knowledge of the various systems' dynamics to generate an overall best estimate of velocity and sideslip angle. Keywords: Aircraft motion, Kalman filter, LQG control, Lateral stability, State estimator.
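A minimal sketch of the discrete Kalman filter cycle described above, fusing two noisy measurements of the same quantity, is given below; the scalar model, matrices and noise levels are illustrative assumptions and not an aircraft lateral-dynamics model.

```python
# Discrete Kalman filter fusing two sensors of different precision on one state.
import numpy as np

F = np.array([[1.0]])              # state transition (scalar random-walk state)
H = np.array([[1.0], [1.0]])       # two sensors observe the same state
Q = np.array([[0.01]])             # process noise covariance
R = np.diag([0.5, 2.0])            # sensor noise covariances (sensor 1 is more precise)

x, P = np.array([[0.0]]), np.array([[1.0]])
rng = np.random.default_rng(2)
truth = 0.0
for _ in range(50):
    truth += rng.normal(0, 0.1)                                   # simulate the true state
    z = H @ np.array([[truth]]) + rng.normal(0, np.sqrt(np.diag(R)))[:, None]
    # Predict
    x, P = F @ x, F @ P @ F.T + Q
    # Update: the Kalman gain weights each measurement by its precision
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(1) - K @ H) @ P
print("estimate:", float(x[0, 0]), " truth:", round(truth, 3))
```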
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 246946 Grid Learning; Computer Grid Joins to e- Learning
Authors: A. Nassiry, A. Kardan
Abstract:
With the development of communications and web-based technologies in recent years, e-Learning has become very important for everyone and is seen as one of the most dynamic teaching methods. Grid computing is a pattern for increasing the computing power and storage capacity of a system, based on hardware and software resources in a network with a common purpose. In this article we study grid architecture and describe its different layers, analyzing the layered grid architecture. We then introduce a new architecture suitable for e-Learning which is based on a grid network, and for this reason we call it the Grid Learning Architecture. The various sections and layers of the suggested architecture are analyzed, especially the grid middleware layer, which has a key role. This layer is the heart of the grid learning architecture and, in fact, without this layer, e-Learning based on grid architecture would not be feasible. Keywords: Distributed learning, Grid Learning, Grid network, SCORM standard.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 172645 A Matlab / Simulink Based Tool for Power Electronic Circuits
Authors: Abdulatif A. M. Shaban
Abstract:
Transient simulation of power electronic circuits is of considerable interest to the designer. The switching nature of the devices used permits the development of specialized algorithms which allow a considerable reduction in simulation time compared to general-purpose simulation algorithms. This paper describes a method used to simulate power electronic circuits using the SIMULINK toolbox within the MATLAB software. Theoretical results are presented that provide the basis for transient analysis of power electronic circuits. Keywords: Modelling, Simulation.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 554144 Interfacing C and TMS320C6713 Assembly Language (Part-I)
Authors: Abdullah A. Wardak
Abstract:
This paper describes the interfacing of C with the TMS320C6713 assembly language, which is crucially important for many real-time applications. Similarly, the interfacing of C with the assembly language of a conventional microprocessor such as the MC68000 is presented for comparison. However, it should be noted that the way the C compiler passes arguments among functions in the TMS320C6713-based environment is totally different from the way the C compiler passes arguments on a conventional microprocessor such as the MC68000. Therefore, it is very important for a user of a TMS320C6713-based system to properly understand and follow the register conventions when interfacing C with a TMS320C6713 assembly language subroutine. It should also be noted that in some cases (examples 6-9) the endian mode of the board needs to be taken into consideration. In this paper, one method is presented in great detail; other methods will be presented in the future. Keywords: Assembly language, high level language, interfacing, stack, arguments.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 252043 Cosastudio: A Software Architecture Modeling Tool
Authors: Adel Smeda, Adel Alti, Mourad Oussalah, Abdallah Boukerram
Abstract:
A key aspect of the design of any software system is its architecture. An architecture description provides a formal model of the architecture in terms of components and connectors and how they are composed together. COSA (Component-Object based Software Structures) is based on object-oriented modeling and component-based modeling. The model improves reusability by increasing the extensibility, evolvability, and compositionality of software systems. This paper presents the COSA modelling tool, which gives architects the possibility to verify the structural coherence of a given system and to validate its semantics with the COSA approach. Keywords: Software Architecture, Architecture Description Languages, UML, Components, Connectors.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 168042 Information Resource Management Maturity Model
Authors: Afshari H., Khosravi Sh.
Abstract:
Nowadays there are more than thirty maturity models in different knowledge areas. A maturity model is an instrument that helps organizations to find out where they stand in a specific knowledge area and how to improve it. As Information Resource Management (IRM) is the concept that information is a major corporate resource and must be managed using the same basic principles used to manage other assets, assessing the current IRM status and revealing the improvement points can play a critical role in developing an appropriate information structure in organizations. In this paper we propose a framework for an information resource management maturity model (IRM3) that includes ten best practices for the maturity assessment of an organization's IRM. Keywords: Information resource management (IRM), information resource management maturity model (IRM3), maturity model, best practice.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 238341 Analysis of a Secondary Autothermal Reformer Using a Thermodynamic POX Model
Authors: Akbar Zamaniyan, Alireza Behroozsarand, Hadi Ebrahimi
Abstract:
Partial oxidation (POX) of light hydrocarbons (e.g. methane) occurs in the first part of the autothermal reformer (ATR). The results of detailed modeling of the reformer, based on a thermodynamic model of the POX section and a 1D heterogeneous catalytic model for the fixed bed section, are considered here. According to the results, the overall performance of the ATR can be improved by changing the important feed parameters. Keywords: Autothermal Reformer, Partial Oxidation, Mathematical Modeling, Process Simulation, Syngas.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 220740 Design for Manufacturability and Concurrent Engineering for Product Development
Authors: Alemu Moges Belay
Abstract:
In the 1980s, companies began to feel the effect of three major influences on their product development: newer and innovative technologies, increasing product complexity and larger organizations. Companies were therefore forced to look for new product development methods. This paper focuses on two of these new product development methods, Design for Manufacturability (DFM) and Concurrent Engineering (CE). The aim of this paper is to examine and analyze different product development methods, specifically Design for Manufacturability and Concurrent Engineering. Companies can benefit by minimizing the product life cycle and cost and by meeting the delivery schedule. This paper also presents simplified models that can be modified and used by different companies based on their objectives and requirements. The methodology followed in this research is the case study. Two companies were taken and analysed with respect to their product development process. Historical data were gathered and interviews were conducted at these companies; in addition, a survey of the literature and of previous research on similar topics was carried out during this research. This paper also presents an implementation cost-benefit analysis and an estimate of the implementation time. From this research, it was found that the two companies did not achieve the delivery time to the customer. Some of the most frequently produced products were analyzed, and 50% to 80% of their products were not delivered on time to the customers. The companies follow the traditional way of product development, that is, the sequential design and production method, which strongly affects time to market. In the case study it was found that by implementing these new methods and by forming multidisciplinary teams in design and quality inspection, the company can reduce the workflow steps from 40 to 30.
Keywords: Design for manufacturability, Concurrent Engineering, Time-to-Market, Product development
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 558539 Auto Regressive Tree Modeling for Parametric Optimization in Fuzzy Logic Control System
Authors: Arshia Azam, J. Amarnath, Ch. D. V. Paradesi Rao
Abstract:
The advantage of solving complex nonlinear problems by utilizing fuzzy logic methodologies is that the experience or expert knowledge described as a fuzzy rule base can be directly embedded into the systems dealing with the problems. This paper focuses on the current limitations of appropriate and automated design of fuzzy controllers. The structure discovery and parameter adjustment of the branched T-S fuzzy model are addressed by a hybrid technique of type-constrained sparse tree algorithms. The simulation results for different system models are evaluated, and the identification error is observed to be minimal. Keywords: Fuzzy logic, branch T-S fuzzy model, tree modeling, complex nonlinear system.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 138838 Modeling, Simulation and Monitoring of Nuclear Reactor Using Directed Graph and Bond Graph
Authors: A. Badoud, M. Khemliche, S. Latreche
Abstract:
The main objective of this paper is to find a graphical technique for the modeling, simulation and diagnosis of industrial systems. This is especially important for a complex system such as a pressurized water nuclear reactor, which exhibits several nonlinearities and time scales. In such a case the analytical approach is heavy and does not give a quick picture of the evolution of the system. The Bond Graph tool enabled us to transform the analytical model into a graphical model, and the simulation software SYMBOLS 2000, specific to Bond Graphs, made it possible to validate the model and obtain the results given by the technical specifications. We introduce the analysis of the problems involved in fault localization and identification in complex industrial processes. We propose a fault detection method applied to diagnosis and to determining the severity of a detected fault, and we show the possibilities of applying the new diagnosis approaches to complex system control. Industrial systems have become increasingly complex, and fault diagnosis procedures for physical systems become very complex as soon as the systems considered are no longer elementary. Faced with this complexity, we chose to resort to the Fault Detection and Isolation (FDI) method through the analysis of the control problem, and to design a reliable diagnosis system capable of handling complex, spatially distributed dynamic systems, applied to a standard pressurized water nuclear reactor. Keywords: Bond Graph, Modeling, Simulation, Monitoring, Analytical Redundancy Relations, Pressurized Water Reactor, Directed Graph.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 197737 An Effective Approach for Distribution System Power Flow Solution
Authors: A. Alsaadi, B. Gholami
Abstract:
An effective approach for unbalanced three-phase distribution power flow solutions is proposed in this paper. The special topological characteristics of distribution networks are fully exploited to make a direct solution possible. Two matrices, the bus-injection to branch-current matrix and the branch-current to bus-voltage matrix, together with a simple matrix multiplication are used to obtain the power flow solution. Owing to this distinctive solution technique, the time-consuming LU decomposition and forward/backward substitution of the Jacobian or admittance matrix required in traditional power flow methods are no longer necessary; the proposed method is therefore robust and time-efficient. Test results demonstrate the validity of the proposed method, which shows great potential for use in distribution automation applications.
Keywords: Distribution power flow, distribution automation system, radial network, unbalanced networks.
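A minimal sketch of this direct BIBC/BCBV solution scheme, applied to a hypothetical single-phase 4-bus radial feeder, is given below. The branch impedances, loads and fixed-point iteration details are assumptions made only for illustration; they are not data from the paper.

```python
import numpy as np

Z = np.array([0.03 + 0.04j, 0.02 + 0.03j, 0.025 + 0.035j])   # branch impedances (pu, assumed)
S = np.array([0.5 + 0.2j, 0.3 + 0.1j, 0.4 + 0.15j])          # loads at buses 1-3 (pu, assumed)
V0 = 1.0 + 0.0j                                               # substation voltage (pu)

# Bus-Injection to Branch-Current matrix for the radial path 0-1-2-3:
# branch k carries the injection currents of every bus downstream of it.
BIBC = np.array([[1, 1, 1],
                 [0, 1, 1],
                 [0, 0, 1]], dtype=complex)

# Branch-Current to Bus-Voltage matrix: the drop at bus i sums Z over the
# branches on the path from the substation to bus i.
BCBV = BIBC.T @ np.diag(Z)

V = np.full(3, V0, dtype=complex)          # flat-voltage start
for _ in range(20):                        # simple fixed-point iteration
    I = np.conj(S / V)                     # constant-power load injections
    B = BIBC @ I                           # branch currents
    V_new = V0 - BCBV @ B                  # bus voltages, no LU factorization needed
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

print("Bus voltage magnitudes (pu):", np.abs(V))
```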
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 423836 The Potential Use of Nanofilters to Supply Potable Water in Persian Gulf and Oman Sea Watershed Basin
Authors: Sara Zamani, Mojtaba Fazeli, Abdollah Rashidi Mehrabadi
Abstract:
In a world worried about water resources, with the shadow of drought and famine looming all around, the quality of water is as important as its quantity, and the source of all concerns is the steady reduction of per capita quality water available for different uses. With an average annual precipitation of 250 mm, compared with the world average of 800 mm, Iran is considered a water-scarce country; the uneven distribution of rainfall, the limitations of renewable resources and the concentration of population on the margins of deserts and water-scarce areas have intensified the problem. The shortage of per capita renewable freshwater and its poor quality in large areas of the country, which have saline, brackish or hard water resources, together with the profusion of natural and artificial pollutants, have caused water quality to deteriorate. One of the methods for treating and using these waters is the application of membrane technologies, which have come into focus in recent years because of their great advantages. Nanofiltration is quite efficient in eliminating multivalent ions; since it can be built at different capacities, applied as a point-of-use treatment process, and requires less energy than reverse osmosis, it could revolutionize the water and wastewater sector in the years to come. This article studies the water resources of the Persian Gulf and Oman Sea watershed basins and assesses the possibility of using the nanofiltration process to treat brackish and non-conventional waters in these basins.
Keywords: Membrane processes, saline waters, brackish waters, hard waters, zoning water quality in the Persian Gulf and the Oman Sea watershed area, nanofiltration.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 195335 Modeling the Fischer-Tropsch Reaction In a Slurry Bubble Column Reactor
Authors: F. Gholami, M. Torabi Angaji, Z. Gholami
Abstract:
Fischer-Tropsch synthesis is one of the most important catalytic reactions for converting synthesis gas to light and heavy hydrocarbons. One of the main issues is selecting the type of reactor. The slurry bubble column reactor is a suitable choice for Fischer-Tropsch synthesis because of its good heat and mass transfer, long catalyst life, and low maintenance and repair costs. The most common catalysts for Fischer-Tropsch synthesis are iron-based and cobalt-based; the advantage of one over the other depends on which type of hydrocarbon is to be produced. In this study, Fischer-Tropsch synthesis is modeled with iron and cobalt catalysts in a slurry bubble column reactor, considering the mass and momentum balances and the effect of the hydrodynamic relations on reactor behavior. Profiles of reactant conversion and of reactant concentration in the gas and liquid phases were determined as functions of residence time in the reactor. The effects of temperature, pressure, liquid velocity, reactor diameter, catalyst diameter, gas-liquid and liquid-solid mass transfer coefficients and kinetic coefficients on reactant conversion were studied. With a 5% increase in liquid velocity (iron catalyst), H2 conversion increases by about 6% and CO conversion by about 4%; with an 8% increase in liquid velocity (cobalt catalyst), H2 conversion increases by about 26% and CO conversion by about 4%. With a 20% increase in the gas-liquid mass transfer coefficient, H2 conversion increases by about 12% and CO conversion by about 10% with the iron catalyst, and by about 10% and 6% respectively with the cobalt catalyst. The results show that the process is sensitive to the gas-liquid mass transfer coefficient and that the optimum operating condition occurs at the maximum possible liquid velocity; this velocity must be above the minimum fluidization velocity and below the terminal velocity, so that catalyst particles do not leave the fluidized bed.
Keywords: Modeling, Fischer-Tropsch synthesis, slurry bubble column reactor.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 301934 Reactive Absorption of Hydrogen Sulfide in Aqueous Ferric Sulfate Solution
Authors: Z. Gholami, M. Torabi Angaji, F. Gholami, S. A. Razavi Alavi
Abstract:
Many commercial processes are available for the removal of H2S from gaseous streams. The desulfurization of gas streams using aqueous ferric sulfate solution as the washing liquor is studied here. Apart from sulfur, only H2O is generated in the process, so no waste treatment facilities are required. A distinct advantage of the process is that the reaction of H2S with ferric iron is so rapid and complete that there is no danger of discharging toxic waste gas. In this study, the reactive absorption of hydrogen sulfide into aqueous ferric sulfate solution was investigated, design calculations for the equipment were carried out, and the operating parameters that affect the process were considered. The results show that high temperature and low pressure are favorable for the absorption reaction. The variation of hydrogen sulfide and Fe3+ concentrations with time shows that the reaction of ferric sulfate with hydrogen sulfide is first order with respect to both reactants. At low Fe2(SO4)3 concentrations the absorption rate of H2S increases with increasing Fe2(SO4)3 concentration, while at higher concentrations a decrease in the absorption rate is found: as the Fe2(SO4)3 concentration rises, the ionic strength and viscosity of the solution increase markedly, reducing solubility, diffusivity and hence the absorption rate.
Keywords: Absorption, Fe2(SO4)3, H2S, reactive absorption.
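A short sketch of the rate law stated above (first order in each reactant) is given below; the rate constant, the initial concentrations and the assumed overall stoichiometry H2S + 2 Fe3+ -> S + 2 Fe2+ + 2 H+ are illustrative values chosen only to show how the concentration-time profiles would be computed, not quantities reported in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

k = 0.5            # L/(mol.s), assumed second-order rate constant
C0 = [0.02, 0.10]  # mol/L: initial [H2S], [Fe3+] (assumed)

def rates(t, C):
    h2s, fe3 = C
    r = k * h2s * fe3                 # r = k [H2S][Fe3+], first order in each
    return [-r, -2.0 * r]             # assumed stoichiometry consumes 2 Fe3+ per H2S

sol = solve_ivp(rates, (0.0, 600.0), C0, dense_output=True)
for t in (0, 60, 300, 600):
    h2s, fe3 = sol.sol(t)
    print(f"t = {t:4d} s: [H2S] = {h2s:.4f} M, [Fe3+] = {fe3:.4f} M")
```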
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 394233 Twin-Screw Extruder and Effective Parameters on the HDPE Extrusion Process
Authors: S. A. Razavi Alavi, M. Torabi Angaji, Z. Gholami
Abstract:
In the polyethylene extrusion process, polymer material in powder or granule form undergoes compression, melting and conveying operations and is shaped into an extrudate of the required form. Twin-screw extruders are widely used in industry because of their high capacity. The powder is mixed with chemical additives and melted by thermal and mechanical energy in three zones (feed, compression and metering zones) and, driven by the pressure of the screws and a gear pump, is converted to the final product at the die plate. Extruders with twin screws and a short distance between the screws are preferred to other types because of their high capacity and good thermal and mechanical stress behavior. In this paper the polyethylene extrusion process and the various types of extruders are studied. Exact process control is necessary to produce high-quality products with safe operation and optimum energy consumption. The granule size depends on the granulator motor speed: the results show that, at constant feed rate, the granule size decreases as the motor speed increases. The relationships between the HDPE feed flow rate x and the speeds of the main motor, gear pump and granulator motor were fitted as follows:
Main motor speed: yM = (-3.6076e-3)x^4 + (0.24597)x^3 + (-5.49003)x^2 + (64.22092)x + 61.66786 (1)
Gear pump speed: yG = (-2.4996e-3)x^4 + (0.18018)x^3 + (-4.22794)x^2 + (48.45536)x + 18.78880 (2)
Granulator motor speed, 10th-degree polynomial fit y = a + bx + cx^2 + dx^3 + ... : a = 1.2751, b = 282.4655, c = -165.2098, d = 48.3106, e = -8.18715, f = 0.84997, g = -0.056094, h = 0.002358, i = -6.11816e-5, j = 8.919726e-7, k = -5.59050e-9 (3)
Keywords: Extrusion, extruder, granule, HDPE, polymer, twin-screw extruder.
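The fitted polynomials quoted in Eqs. (1)-(3) can be evaluated directly; the sketch below does so with the coefficients copied from the text. The feed-rate value used is a placeholder, since the abstract does not state units or the fitted range of x.

```python
import numpy as np

# np.polyval expects coefficients from the highest power down to the constant.
main_motor = [-3.6076e-3, 0.24597, -5.49003, 64.22092, 61.66786]   # Eq. (1)
gear_pump  = [-2.4996e-3, 0.18018, -4.22794, 48.45536, 18.78880]   # Eq. (2)

# 10th-degree granulator fit, Eq. (3): y = a + b*x + ... + k*x^10 (low-to-high order)
granulator_low_to_high = [1.2751, 282.4655, -165.2098, 48.3106, -8.18715,
                          0.84997, -0.056094, 0.002358, -6.11816e-5,
                          8.919726e-7, -5.59050e-9]
granulator = granulator_low_to_high[::-1]    # reverse for np.polyval

x = 10.0   # hypothetical HDPE feed flow rate within the fitted range
print("main motor speed:", np.polyval(main_motor, x))
print("gear pump speed:", np.polyval(gear_pump, x))
print("granulator motor speed:", np.polyval(granulator, x))
```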
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 497832 Removal of CO2 and H2S Using Aqueous Alkanolamine Solutions
Authors: Zare Aliabad, H., Mirzaei, S.
Abstract:
This work presents a theoretical investigation of the simultaneous absorption of CO2 and H2S into aqueous solutions of MDEA and DEA. In this process the acid components react with the basic alkanolamine solution via an exothermic, reversible reaction in a gas/liquid absorber. The use of amine solvents for gas sweetening was investigated with the process simulators HYSYS and ASPEN, using the Electrolyte NRTL model and the Amine Package with the Amines (experimental) equation of state. The effects of temperature, circulation rate, amine concentration, packing height and Murphree efficiency on the rate of absorption were studied. When the lean amine flow and concentration increase, CO2 and H2S absorption increase as well. When the inlet amine temperature to the absorber rises, CO2 and H2S penetrate to the upper stages of the absorber and the absorption of acid gases decreases. The CO2 concentration in the clean gas is strongly influenced by the packing height, whereas for the H2S concentration the packing height plays a minor role. HYSYS cannot estimate the Murphree efficiency correctly and applies the same value to all stages. As the Murphree efficiency improves, the maximum temperature in the absorber decreases, the reaction zone moves toward the bottom stages of the absorber, and the absorption of acid gases increases.
Keywords: Absorber, DEA, MDEA, simulation.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1731331 Thermodynamic Modeling of the High Temperature Shift Converter Reactor Using Minimization of Gibbs Free Energy
Authors: H. Zare Aliabadi
Abstract:
The equilibrium chemical reactions taking place in a converter reactor of the Khorasan Petrochemical ammonia plant were studied using the minimization of Gibbs free energy. In minimizing the Gibbs free energy function, the Davidon-Fletcher-Powell (DFP) optimization procedure was used with penalty terms added to a well-defined objective function. In the DFP procedure with the corresponding penalty terms, the Hessian matrices for the composition of the constituents in the converter reactor can be excluded; this can be considered the main advantage of the DFP procedure. The effect of temperature and pressure on the equilibrium composition of the constituents was also investigated. The results obtained in this work were compared with data collected from the converter reactor of the Khorasan Petrochemical ammonia plant and were found to be in good agreement with the industrial data. Notably, the algorithm developed here, in spite of its simplicity, offers short computation and convergence times.
Keywords: Gibbs free energy, converter reactors, Chemical equilibrium
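A minimal sketch of equilibrium by Gibbs free-energy minimization is given below for the water-gas shift reaction CO + H2O <-> CO2 + H2, the core reaction of a high temperature shift converter. It substitutes scipy's SLSQP with explicit element-balance constraints for the DFP-with-penalty scheme used in the paper, and the 298 K formation energies are textbook values used purely for illustration, not plant data.

```python
import numpy as np
from scipy.optimize import minimize

R = 8.314          # J/(mol K)
T = 298.15         # K (illustrative; a real HTS converter runs much hotter)
P = 1.0            # bar, ideal-gas mixture at reference pressure

species = ["CO", "H2O", "CO2", "H2"]
dG_f = np.array([-137.2e3, -228.6e3, -394.4e3, 0.0])   # J/mol, 298 K textbook values
# element matrix rows: C, O, H ; columns follow `species`
A = np.array([[1, 0, 1, 0],
              [1, 1, 2, 0],
              [0, 2, 0, 2]])
n0 = np.array([1.0, 1.0, 0.0, 0.0])   # feed: 1 mol CO + 1 mol H2O (assumed)
b = A @ n0                             # total moles of each element

def gibbs(n):
    n = np.clip(n, 1e-10, None)        # keep the logarithms defined
    ntot = n.sum()
    return np.sum(n * (dG_f / (R * T) + np.log(n * P / ntot)))

cons = {"type": "eq", "fun": lambda n: A @ n - b}   # element balances
bounds = [(1e-10, None)] * len(species)
res = minimize(gibbs, n0 + 0.1, bounds=bounds, constraints=cons, method="SLSQP")

for name, moles in zip(species, res.x):
    print(f"{name}: {moles:.4f} mol at equilibrium")
```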
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 256030 Low Temperature Ethanol Gas Sensor based on SnO2/MWNTs Nanocomposite
Authors: O. Alizadeh Sahraei, A. Khodadadi, Y. Mortazavi, M. Vesali Naseh, S. Mosadegh
Abstract:
A composite of plasma-functionalized multiwall carbon nanotubes (MWNTs) coated with SnO2 was synthesized by a sonochemical precipitation method. A thick layer of this nanocomposite material was used as an ethanol sensor at low temperatures. The sensitivity of the composite to ethanol increased by a factor of 2 at room temperature and by a factor of 13 at 250°C in comparison with pure SnO2. SEM images of the nanocomposite showed that the MWNTs were embedded in the SnO2 matrix, and a higher surface area was observed in the presence of the functionalized MWNTs. The greatly improved sensitivity of the composite to ethanol can be attributed to new gas-access paths through the MWNTs and to the higher specific surface area.
Keywords: Carbon nanotube, functionalized, gas sensor, low temperature, nanocomposite, tin oxide.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 233729 Sonochemically Prepared SnO2 Quantum Dots as a Selective and Low Temperature CO Sensor
Authors: S. Mosadegh Sedghi, Y. Mortazavi, A. Khodadadi, O. Alizadeh Sahraei, M. Vesali Naseh
Abstract:
In this study, a low-temperature sensor highly selective to CO in the presence of methane was fabricated using 4 nm SnO2 quantum dots (QDs) prepared by sonication-assisted precipitation. An SnCl4 aqueous solution was precipitated with ammonia under sonication, which continued for 2 h. Part of the sample was then dried, calcined at 400°C for 1.5 h and characterized by XRD and BET. The average particle size and specific surface area of the SnO2 QDs, as well as their sensing properties, were compared with those of SnO2 nanoparticles prepared by a conventional sol-gel method. The BET surface areas of the as-prepared sonochemical product and of the sample calcined at 400°C for 1.5 h are 257 m2/g and 212 m2/g respectively, while the specific surface area of the SnO2 nanoparticles prepared by the conventional sol-gel method is about 80 m2/g. XRD spectra revealed that a pure crystalline SnO2 phase formed in both the as-prepared and calcined QD samples, whereas for the sol-gel sample calcined at 400°C SnO crystals were detected along with SnO2. The SnO2 quantum dots show exceedingly high sensitivity to CO at concentrations of 100, 300 and 1000 ppm over the whole temperature range (25-350°C). At 50°C a sensitivity of 27 was obtained for 1000 ppm CO, rising to a maximum of 147 at 225°C and then dropping off, while the maximum sensitivity of the sol-gel SnO2 sample, obtained at 300°C, was 47.2. At the same time, no sensitivity to methane was observed for the SnO2 QDs over the whole temperature range. The response and recovery times of the sensor decrease sharply with temperature, while the high selectivity to CO does not deteriorate.
Keywords: Sonochemical, SnO2 QDs, SnO2 gas sensor
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 224728 Functionalization of Carbon Nanotubes Using Nitric Acid Oxidation and DBD Plasma
Authors: M. Vesali Naseh, A. A. Khodadadi, Y. Mortazavi, O. Alizadeh Sahraei, F. Pourfayaz, S. Mosadegh Sedghi
Abstract:
In this study, multiwall carbon nanotubes (MWNTs) were modified with nitric acid chemically and by dielectric barrier discharge (DBD) plasma in an oxygen-based atmosphere. The carbon nanotubes (CNTs) used were prepared by the floating-catalyst chemical vapour deposition (CVD) method. To remove amorphous carbon and metal catalyst, the MWNTs were exposed to dry air and washed with hydrochloric acid. Heating the purified CNTs under a helium atmosphere eliminated acidic functional groups. Fourier transform infrared spectroscopy (FTIR) shows the formation of oxygen-containing groups such as C=O and COOH. Brunauer-Emmett-Teller (BET) analysis revealed that functionalization generates defects on the sidewalls and opens the ends of the CNTs. Results of temperature-programmed desorption (TPD) and gas chromatography (GC) indicate that the nitric acid treatment creates more acidic groups than the plasma treatment.
Keywords: Carbon nanotubes (CNTs), chemical treatment, functionalization, plasma.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 577327 Mass Transfer Modeling in a Packed Bed of Palm Kernels under Supercritical Conditions
Authors: I. Norhuda, A. K. Mohd Omar
Abstract:
Gas-solid mass transfer using supercritical CO2 (SC-CO2) in a packed bed of palm kernels was investigated at temperatures of 50°C and 70°C and pressures of 27.6 MPa, 34.5 MPa, 41.4 MPa and 48.3 MPa. The development of mass transfer models requires knowledge of three properties: the diffusion coefficient of the solute and the viscosity and density of the supercritical fluid (SCF). A mathematical model in terms of the dimensionless Sherwood (Sh), Schmidt (Sc) and Reynolds (Re) numbers was developed and was found to be in good agreement with the experimental data within the system studied.
Keywords: Mass Transfer, Palm Kernel, Supercritical fluid.
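The abstract names a Sherwood-Schmidt-Reynolds model without giving its form; the sketch below shows the usual structure of such a packed-bed correlation and how a film mass-transfer coefficient would follow from it. The correlation constants and the SC-CO2 property values are placeholders for illustration, not values from the paper.

```python
def sherwood(Re, Sc, a=2.0, b=0.6):
    # generic packed-bed form Sh = a * Re^b * Sc^(1/3); a and b are assumed here
    return a * Re**b * Sc**(1.0 / 3.0)

rho = 700.0       # kg/m^3, SC-CO2 density (assumed)
mu = 6.0e-5       # Pa.s, SC-CO2 viscosity (assumed)
D_AB = 8.0e-9     # m^2/s, solute diffusivity in SC-CO2 (assumed)
u = 1.0e-3        # m/s, superficial velocity (assumed)
d_p = 4.0e-3      # m, particle diameter (assumed)

Re = rho * u * d_p / mu           # Reynolds number
Sc = mu / (rho * D_AB)            # Schmidt number
Sh = sherwood(Re, Sc)             # Sherwood number from the correlation
k_f = Sh * D_AB / d_p             # film mass-transfer coefficient, m/s
print(f"Re={Re:.1f}, Sc={Sc:.1f}, Sh={Sh:.1f}, k_f={k_f:.2e} m/s")
```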
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 181426 Mass Transfer of Palm Kernel Oil under Supercritical Conditions
Authors: I. Norhuda, A. K. Mohd Omar
Abstract:
The purpose of the study was to determine the amount of palm kernel oil (PKO) extracted from a packed bed of palm kernels in a supercritical fluid extractor using supercritical carbon dioxide (SC-CO2) as an environmentally friendly solvent. The study also sought to determine the overall mass transfer coefficient (K) of PKO through a mass transfer model at constant temperatures of 50°C, 60°C and 70°C and pressures of 27.6 MPa, 34.5 MPa, 41.4 MPa and 48.3 MPa, and to examine how the coefficient varies with temperature and pressure. The overall mass transfer coefficient was found to depend on pressure at each constant temperature. It ranged from 1.21 x 10-4 m min-1 to 1.72 x 10-4 m min-1 at a constant temperature of 50°C and from 2.02 x 10-4 m min-1 to 2.43 x 10-4 m min-1 at 60°C; a similar increasing trend, from 1.77 x 10-4 m min-1 to 3.64 x 10-4 m min-1, was observed at 70°C over the same pressure range of 27.6 MPa to 48.3 MPa.
Keywords: Overall Mass Transfer Coefficient (K), Supercritical Carbon Dioxide (SC-CO2), Palm Kernel Oil (PKO).
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 174825 The Development of Decision Support System for Waste Management; a Review
Authors: M. S. Bani, Z. A. Rashid, K. H. K. Hamid, M. E. Harbawi, A.B.Alias, M. J. Aris
Abstract:
Most decision support systems (DSS) constructed for waste management (WM) are not widely marketed and lack practical application, owing to the number of variables and the complexity of the mathematical models, including the assumptions and constraints required in decision making. The approach taken by many researchers in DSS modelling is to isolate a few key factors that have a significant influence on the DSS; this segmented approach does not provide a thorough understanding of the complex relationships among the many elements involved. The various elements in constructing a DSS must be integrated and optimized in order to produce a viable model that is marketable and has practical application. A DSS model used to assist decision makers should be integrated with GIS, be able to give robust predictions despite the inherent uncertainties of waste generation and the plethora of waste characteristics, and give the optimal allocation of waste streams to recycling, incineration, landfill and composting.
Keywords: Review, decision support system, GIS and waste management.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 374424 Adhesion Properties of Bifidobacterium Pseudocatenulatum G4 and Bifidobacterium Longum BB536 on HT-29 Human Epithelium Cell Line at Different Times and pH
Authors: Ali Q. S., Farid A. J., Kabeir B. M., Zamberi S., Shuhaimi M., Ghazali H. M., Yazid A. M.
Abstract:
Adhesion to human intestinal cells is considered one of the main selection criteria of lactic acid bacteria for probiotic use. The adhesion ability of two Bifidobacterium strains, Bifidobacterium longum BB536 and Bifidobacterium pseudocatenulatum G4, was assessed in vitro using the HT-29 human epithelium cell line. Four pH levels (5.6, 5.7, 6.6 and 6.8) and four incubation times (15, 30, 60 and 120 min) were used. Adhesion was quantified by counting the adhering bacteria after Gram staining. The adhesion of B. longum BB536 was higher than that of B. pseudocatenulatum G4, and both species showed significant differences in adhesion across the factors tested. For both strains the highest adhesion was observed at 120 min and the lowest at 15 min. The findings of this study will contribute to the introduction of new effective probiotic strains for future utilization.
Keywords: Bifidobacterium, adhesion, HT-29 human epithelium cells.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 184923 Are PEG Molecules a Universal Protein Repellent?
Authors: Norzita Ngadi, John Abrahamson, Conan Fee, Ken Morison
Abstract:
Poly(ethylene glycol) (PEG) molecules attached to surfaces have shown high potential as protein repellents owing to their flexibility and high water solubility. A quartz crystal microbalance recording frequency and dissipation changes (QCM-D) was used to study the adsorption, from aqueous solutions, of lysozyme and α-lactalbumin (the latter with and without calcium) onto modified stainless steel surfaces. Surfaces were coated with poly(ethylene imine) (PEI) or silicate before PEG molecules were grafted on. Protein adsorption onto the bare stainless steel surface served as a control. All adsorption experiments were conducted at 23°C and pH 7.2. The results showed that the presence of PEG molecules significantly reduced the adsorption of lysozyme and of α-lactalbumin (with calcium) onto the stainless steel surface. By contrast, and unexpectedly, PEG molecules enhanced the adsorption of α-lactalbumin without calcium; it is suggested that a PEG to α-lactalbumin hydrophobic interaction plays a dominant role here, leading to protein aggregation at the surface. The findings lead to the general conclusion that PEG molecules are not a universal protein repellent. PEG-on-PEI surfaces were better at inhibiting the adsorption of lysozyme and α-lactalbumin (with calcium) than PEG-on-silicate surfaces.
Keywords: Stainless steel, PEG, QCM-D, protein, PEI layer, silicate layer.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 228622 Bioethanol Production from Enzymatically Saccharified Sunflower Stalks Using Steam Explosion as Pretreatment
Authors: Pilanee Vaithanomsat, Sinsupha Chuichulcherm, Waraporn Apiwatanapiwat
Abstract:
Sunflower stalks were analysed for chemical composition: pentosan 15.84%, holocellulose 70.69%, alpha-cellulose 45.74%, glucose 27.10% and xylose 7.69%, based on the dry weight of 100 g of raw material. The optimum steam explosion pretreatment was as follows: the stalks were cut into small pieces, soaked in 0.02 M H2SO4 overnight, and then steam-exploded at 207°C and 21 kg/cm2 for 3 minutes to fractionate cellulose, hemicellulose and lignin. The resulting hemicellulose-containing hydrolysate and the cellulose pulp contained xylose at 2.53% and 7.00%, respectively. The pulp was further subjected to enzymatic saccharification at 50°C (pH 4.8 citrate buffer) with a pulp/buffer ratio of 6% (w/w) and Celluclast 1.5L/pulp of 2.67% (w/w) to obtain glucose at a maximum yield of 11.97%. Fixed-bed fermentation under optimum conditions using a conventional yeast mixture gave a maximum ethanol yield of 0.028 g per 100 g of sunflower stalk.
Keywords: Enzymatic, steam explosion, sunflower stalk, ethanol production.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 240621 Removal of Cationic Heavy Metal and HOC from Soil-Washed Water Using Activated Carbon
Authors: Chi Kyu Ahn, Young Mi Kim, Seung Han Woo, Jong Moon Park
Abstract:
Soil washing with a surfactant solution is a potential technology for the rapid removal of hydrophobic organic compounds (HOC) from soil. However, a large amount of washed water is produced during operation, and this must be treated effectively by appropriate methods. The washed water from a site co-contaminated with HOC and heavy metals may contain high levels of both pollutants as well as the spent surfactant. The heavy metals in the washed water are toxic to microbial activity and should therefore be removed before the water proceeds to a biological wastewater treatment system; in addition, the spent surfactant solution should be recovered to reduce the operating cost of soil washing. In the present study, activated carbon (AC) was used to remove heavy metals and HOC from soil-washed water simultaneously. In an anionic-nonionic mixed surfactant solution, Cd(II) and phenanthrene (PHE) were effectively removed by adsorption onto activated carbon. The removal efficiency for Cd(II) increased from 0.027 mmol-Cd/g-AC to 0.142 mmol-Cd/g-AC as the mole ratio of SDS increased in the presence of PHE. The adsorptive capacity for PHE also increased with the SDS mole ratio, owing to the decrease in the molar solubilization ratio (MSR) of PHE in the anionic-nonionic surfactant mixture. Simultaneous adsorption of HOC and cationic heavy metals onto activated carbon could thus be a useful method for surfactant recovery and for reducing heavy-metal toxicity in a surfactant-enhanced soil washing process.
Keywords: Activated carbon, Anionic-nonionic surfactant mixture, Cationic heavy metal, HOC, Soil washing
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 173020 Decolorization of Reactive Black 5 and Reactive Red 198 using Nanoscale Zerovalent Iron
Authors: C. Chompuchan, T. Satapanajaru, P. Suntornchot, P. Pengthamkeerati
Abstract:
Residual dyes in textile dyeing wastewater have complex aromatic structures that resist degradation in biological wastewater treatment. The objectives of this study were to determine the effectiveness of nanoscale zerovalent iron (NZVI) in decolorizing Reactive Black 5 (RB5) and Reactive Red 198 (RR198) in synthetic wastewater and to investigate the effects of iron particle size, iron dosage and solution pH on the destruction of RB5 and RR198. The synthesized NZVI was characterized by transmission electron microscopy (TEM), X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS). The removal rate constants (kobs) of RB5 (0.0109 min-1) and RR198 (0.0111 min-1) with 0.5% NZVI were many times higher than those obtained with microscale zerovalent iron (ZVI) (0.0007 min-1 and 0.0008 min-1, respectively). Increasing the iron dosage exponentially increased the removal efficiencies for both RB5 and RR198, and lowering the pH from 9 to 5 increased the decolorization rate constants of both dyes with NZVI. Destruction of the azo bond (N=N) in the chromophore of both reactive dyes led to decolorization of the dye solutions.
Keywords: decolorization, nanoscale zerovalent iron, Reactive Black 5, Reactive Red 198.
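The reported kobs values imply a pseudo-first-order decay of dye concentration, C(t) = C0 exp(-kobs t). The sketch below uses the rate constants quoted in the abstract to compare half-lives and residual dye levels; the initial dye concentration and the 60 min reference time are assumptions for illustration.

```python
import numpy as np

k_obs = {"RB5 + NZVI": 0.0109,
         "RR198 + NZVI": 0.0111,
         "RB5 + microscale ZVI": 0.0007}   # 1/min, from the abstract
C0 = 100.0                                  # mg/L, assumed initial dye concentration

for label, k in k_obs.items():
    t_half = np.log(2) / k                  # time to 50 % colour removal
    C_60 = C0 * np.exp(-k * 60.0)           # dye remaining after 60 min
    print(f"{label}: half-life {t_half:.0f} min, {C_60:.1f} mg/L left at 60 min")
```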
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 230419 Some Biological and Molecular Characterization of Bean Common Mosaic Necrosis Virus Isolated from Soybean in Tehran Province, Iran
Authors: F. S. Abtahi, M. Koohi Hbibi, M. Khodaei Motlagh
Abstract:
Bean common mosaic necrosis virus (BCMNV) is a potyvirus with a worldwide distribution; it causes serious economic losses in many legumes in Iran. During 2008, samples were collected from soybean fields in Tehran Province. Four isolates (S1, S2 and S3) were inoculated onto 15 species of Cucurbitaceae, Chenopodiaceae, Solanaceae and Leguminosae. Chenopodium quinoa and C. amaranticolor did not develop any symptoms, while all isolates caused mosaic symptoms on Phaseolus vulgaris cv. Red Kidney and P. vulgaris cv. Bountiful. The molecular weight of the coat protein, estimated by SDS-PAGE and western blotting, was 33 kDa. Reverse transcription polymerase chain reaction (RT-PCR) was performed using a primer pair designed by L. Xu et al., and a fragment of approximately 920 bp was amplified with the specific primers.
Keywords: ELISA, RT-PCR, SDS-PAGE, BCMNV.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 180618 Molecular Characterization of Free Radicals Decomposing Genes on Plant Developmental Stages
Authors: R. Haddad, K. Morris, V. Buchanan-Wollaston
Abstract:
Biochemical and molecular analysis of several antioxidant enzyme genes revealed different levels of gene expression in oilseed rape (Brassica napus). For the molecular and biochemical analyses, leaf tissues were harvested from plants at eight developmental stages, from young to senescent. Total protein and chlorophyll levels increased as the plants matured and decreased during the last stages of growth. Structural analysis (nucleotide and deduced amino acid sequences, and a phylogenetic tree) of a complementary DNA revealed a high level of similarity to a family of catalase genes. The expression of the different catalase isoforms was assessed during the different growth phases; no significant difference between samples was observed when catalase activity was statistically analysed across developmental stages. EST analysis showed different transcript levels for a number of other relevant antioxidant genes (different isoforms of SOD and glutathione-related genes), and the high transcription of these genes at the senescence stages indicated that they are senescence-induced genes.
Keywords: Biochemical analysis, oilseed, expression pattern, growth phases.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 154917 Study on the Effect of Sulphur, Glucose, Nitrogen and Plant Residues on the Immobilization of Sulphate-S in Soil
Authors: S. Shahsavani, A. Gholami
Abstract:
In order to evaluate the relationship between sulphur (S), glucose (G), nitrogen (N) and plant residues (wheat straw), sulphur immobilization and microbial transformation were monitored in five soil samples taken from 0-30 cm depth in farmers' fields of the Bastam area of Shahrood. Eleven treatments with different levels of S, glucose, N and plant residues were arranged in a randomized block design with three replications and incubated for 20, 45 and 60 days. The immobilization of SO4-2-S, expressed as a percentage of that added, was inversely related to its addition rate. The effect of glucose and plant residue additions increased with the C-to-S ratio of the added amendments, irrespective of their origin. In the presence of a C source (glucose or plant residues), N significantly increased the immobilization of SO4-2-S, whereas the effect of N was insignificant in the absence of a C amendment. In the first few days, the amounts of added SO4-2-S immobilized were linearly correlated with the amounts of added S recovered in the soil microbial biomass; with further incubation, the proportion of immobilized SO4-2-S remaining as biomass-S decreased, which is thought to reflect the conversion of biomass-S into soil organic S. Glucose addition increased the immobilization (microbial utilization and incorporation into soil organic matter) of native soil SO4-2-S, whereas N addition enhanced the mineralization of soil organic S, increasing the concentration of SO4-2-S in the soil.
Keywords: Immobilization, microbial biomass, sulphur, nitrogen, glucose.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 148016 Effects of Skim Milk Powder Supplementation to Soy Yogurts on Biotransformation of Isoflavone Glycosides to Biologically Active Forms during Storage
Authors: T. T. Pham, N. P. Shah
Abstract:
Three batches of yogurt were made from soy protein isolate (SPI) supplemented with 2% (S2), 4% (S4) or 6% (S6) skim milk powder (SMP); a fourth batch (control, S0) was prepared from SPI without SMP supplementation. Lactobacillus delbrueckii ssp. bulgaricus ATCC 11842 (Lb 11842) and Streptococcus thermophilus ST 1342 (ST 1342) were used as the starter culture. Biotransformation of the biologically inactive isoflavone glycosides (IG) to the biologically active isoflavone aglycones (IA) was determined during 28 d of storage. The viability of both microorganisms was significantly higher (P < 0.05) in S2, S4 and S6 than in S0. The ratio of lactic acid to acetic acid in S0 was in the range 15.53-22.31, compared with 7.24-12.81 in S2, S4 and S6. The biotransformation of IG to IA in S2, S4 and S6 was enhanced by 9.9-13.3% compared with S0.
Keywords: Isoflavone aglycones, isoflavone glycosides, skim milk powder, soy yogurt.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 194715 Evaluating the Response of Rainfed-Chickpea to Population Density in Iran, Using Simulation
Authors: Manoochehr Gholipoor
Abstract:
The response of growth and yield of rainfed chickpea to population density should be evaluated on the basis of long-term experiments that capture climate variability, which is achievable only by simulation. In this simulation study, the evaluation was carried out by running the CYRUS model on long-term daily weather data for five locations in Iran. The tested population densities were 7 to 59 stands per square meter, at intervals of 2. Several functions, including quadratic, segmented, beta, broken-linear and dent-like functions, were tested. Considering the root mean square of deviations and the linear regression statistics [intercept (a), slope (b) and correlation coefficient (r)] for predicted versus observed values, the quadratic function appeared appropriate for describing the changes in biomass and grain yield, and the broken-linear function for the harvest index. Results indicated that, in all locations, grain yield tends to increase with crowding of the population and subsequently decreases; the same was true for biomass in all five locations. The harvest index showed a plateau at low population densities and a decreasing trend as the density increased further. The turning point (optimum population density) for grain yield was 30.68 stands per square meter in Isfahan, 30.54 in Shiraz, 31.47 in Kermanshah, 34.85 in Tabriz and 32.00 in Mashhad. The optimum population density for biomass ranged from 24.6 (Tabriz) to 35.3 stands per square meter (Mashhad), and for harvest index it varied between 35.87 and 40.12 stands per square meter.
Keywords: Rainfed chickpea, biomass, harvest index, grain yield, simulation.
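A brief sketch of the quadratic yield-density fit and turning-point calculation described above is given below. The yield data are synthetic, generated only to demonstrate the fitting procedure; they are not the CYRUS simulation output.

```python
import numpy as np

rng = np.random.default_rng(0)
density = np.arange(7, 60, 2, dtype=float)                 # stands per m^2, as tested
true = -0.8 * (density - 31.0) ** 2 + 900.0                # hypothetical response curve
yield_obs = true + rng.normal(0.0, 15.0, density.size)     # synthetic "observations"

# fit y = a + b*x + c*x^2 ; np.polyfit returns the highest power first
c, b, a = np.polyfit(density, yield_obs, 2)
x_opt = -b / (2.0 * c)                                     # turning point of the quadratic
print(f"fitted optimum density: {x_opt:.1f} stands per m^2")
```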
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 133314 Effects of Some Natural Antioxidants Mixtures on Margarine Stability
Authors: Maryam Azizkhani, Parvin Zandi
Abstract:
The application of synthetic antioxidants such as tert-butylhydroquinone (TBHQ), despite their efficiency, is questioned because of their possible carcinogenic effects. The purpose of this study was to identify mixtures of natural antioxidants that provide the best oxidative stability for margarine. The antioxidant treatments comprised 10 mixtures (F1-F10) containing 100-500 ppm tocopherol mixture (Toc), 100-200 ppm ascorbyl palmitate (AP), 100-200 ppm rosemary extract (Ros) and 1000 ppm lecithin (Lec), along with a control, F0 (no antioxidant), and F11 with 120 ppm TBHQ. The effect of the antioxidant mixtures on the stability of margarine samples was evaluated during an oven test (60°C), a Rancimat test at 110°C and storage at 4°C. The final ranking of the natural antioxidant mixtures was: F2, F10 > F5, F9 > F8 > F1, F3, F4 > F6, F7. Based on these results and the ranking criteria, F2 (200 ppm AP + 200 ppm Ros) and F10 (200 ppm Ros + 200 ppm Toc + 1000 ppm Lec) are recommended as substitutes for TBHQ to maintain the quality and increase the shelf-life of margarine.
Keywords: Margarine, natural antioxidant, oxidative stability, shelf-life.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 313213 Effect of Phosphate Solubilization Microorganisms (PSM) and Plant Growth Promoting Rhizobacteria (PGPR) on Yield and Yield Components of Corn (Zea mays L.)
Authors: Mohammad Yazdani, Mohammad Ali Bahmanyar, Hemmatollah Pirdashti, Mohammad Ali Esmaili
Abstract:
In order to study the effect of phosphate-solubilizing microorganisms (PSM) and plant growth promoting rhizobacteria (PGPR) on the yield and yield components of corn (Zea mays L. cv. SC604), an experiment was conducted at the research farm of Sari Agricultural Sciences and Natural Resources University, Iran, during 2007. The experiment was laid out as a split plot based on a randomized complete block design with three replications. Three levels of manure (20 Mg ha-1 farmyard manure, 15 Mg ha-1 green manure, and a check without any manure) formed the main plots, and eight fertilizer levels (1: NPK or conventional fertilizer application; 2: NPK+PSM+PGPR; 3: NP50%K+PSM+PGPR; 4: N50%PK+PSM+PGPR; 5: N50%P50%K+PSM+PGPR; 6: PK+PGPR; 7: NK+PSM; 8: PSM+PGPR) formed the sub plots. Results showed that farmyard manure application increased row number, ear weight, grain number per ear, grain yield, biological yield and harvest index compared with the check. Furthermore, using PSM and PGPR in addition to conventional fertilizer (NPK) improved ear weight, row number and grain number per row, and ultimately increased grain yield, in the green manure and check plots. According to the results, in all fertilizer treatments the application of PSM and PGPR together could reduce P application by 50% without any significant reduction in grain yield; however, this treatment could not compensate for a 50% reduction in N application.
Keywords: Biofertilizers, corn, PSM, PGPR, grain yield.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 668312 Effects of Discharge Fan on the Drying Efficiency in Flat-bed type Dryer
Authors: Jafar Hashemi, Reza Tabatabaekoloor, Toshinori Kimura
Abstract:
The study of the interaction among the grain, moisture and the surrounding air is key to understanding the grain-drying process. In Iran, rice (mostly Indica type) is dried in flat-bed dryers until the final moisture content (MC) reaches 6 to 8%. Experiments were conducted to examine the effect of a discharge fan, at different paddy depths, on drying efficiency. Two drying configurations, with and without a discharge fan, were tested at three paddy depths of 5, 10 and 15 cm. With the discharge fan, the humid heated air is drawn out immediately by suction. The drying time was defined as the time needed to reach an average final MC of about 8%. To save energy and reduce drying time, the temperature distribution between layers should become uniform quickly and with minimum differences; otherwise the MC gradient between layers will be high and will cause grain breakage. The difference in final MC between layers for the two methods was 48-73%. Reaching a steady temperature saved 10-20% of drying time, and the efficiency of temperature distribution increased by 17-26% with the use of the discharge fan.
Keywords: FBT dryer, final MC, discharge fan.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 176211 Economic effects and Energy Use Efficiency of Incorporating Alfalfa and Fertilizer into Grass- Based Pasture Systems
Authors: M. Khakbazan, S. L. Scott, H. C. Block, C. D. Robins, W. P. McCaughey
Abstract:
A ten-year grazing study was conducted at the Agriculture and Agri-Food Canada Brandon Research Centre in Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P, K, and S) addition on economics and efficiency of non-renewable energy use in meadow brome grass-based pasture systems for beef production. Fertilizing grass-only or alfalfa-grass pastures to full soil test recommendations improved pasture productivity, but did not improve profitability compared to unfertilized pastures. Fertilizing grass-only pastures resulted in the highest net loss of any pasture management strategy in this study. Adding alfalfa at the time of seeding, with no added fertilizer, was economically the best pasture improvement strategy in this study. Because of moisture limitations, adding commercial fertilizer to full soil test recommendations is probably not economically justifiable in most years, especially with the rising cost of fertilizer. Improving grass-only pastures by adding fertilizer and/or alfalfa required additional non-renewable energy inputs; however, the additional energy required for unfertilized alfalfa-grass pastures was minimal compared to the fertilized pastures. Of the four pasture management strategies, adding alfalfa to grass pastures without adding fertilizer had the highest efficiency of energy use. Based on energy use and economic performance, the unfertilized alfalfa-grass pasture was the most efficient and sustainable pasture system.Keywords: Alfalfa, grass, fertilizer, pasture systems, economics, energy.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 167510 Efficiency of Floristic and Molecular Markers to Determine Diversity in Iranian Populations of T. boeoticum
Authors: M. R. Naghavi, M. Maleki, S. F. Tabatabaei
Abstract:
In order to compare floristic and molecular classifications of common wild wheat (Triticum boeoticum Boiss.), an analysis was conducted on populations of T. boeoticum collected from different regions of Iran. Considering all floristic compositions of the habitats, six floristic groups (syntaxa) were identified within the populations. A high level of variation in T. boeoticum was also detected using SSR markers. Our results showed that the molecular method confirmed the grouping obtained by the floristic method. In other words, floristic classification remains a useful, efficient and economical tool for characterizing the amount and distribution of genetic variation in natural populations of T. boeoticum; nevertheless, molecular markers appear to be a useful complementary technique for identification and for evaluating genetic diversity in the studied populations.
Keywords: T. boeoticum, diversity, floristic, SSRs.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 13499 Genetic Variation of Durum Wheat Landraces and Cultivars Using Morphological and Protein Markers
Authors: M. R. Naghavi, S. Rashidi Monfared, A. H. Ahkami, M. A. Ombidbakhsh
Abstract:
Knowledge of patterns of genetic diversity enhances the efficiency of germplasm conservation and improvement. In this study, 96 Iranian landraces of Triticum turgidum originating from different geographical areas of Iran, along with 18 durum cultivars from ten countries, were evaluated for variation in morphological traits and high molecular weight glutenin subunit (HMW-GS) composition. The first two principal components clearly separated the Iranian landraces from the cultivars. Three alleles were present at the Glu-A1 locus and 11 at Glu-B1. In both cultivars and landraces of durum wheat, the null allele (Glu-A1c) was observed more frequently than the Glu-A1a and Glu-A1b alleles, while Glu-B1a (subunit 7) and Glu-B1e (subunit 20) were the more frequent alleles at the Glu-B1 locus. The results showed that the evaluated Iranian landraces form an interesting source of favourable glutenin subunits that may be very desirable in breeding for improved pasta-making quality.
Keywords: Triticum turgidum var. durum, glutenin subunits, morphological characters.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 19968 Estimation of Critical Period for Weed Control in Corn in Iran
Authors: Sohrab Mahmoodi, Ali Rahimi
Abstract:
The critical period for weed control (CPWC) is the period in the crop growth cycle during which weeds must be controlled to prevent unacceptable yield losses. Field studies were conducted in 2005 and 2006 at the University of Birjand, in the south east of Iran, to determine the CPWC of corn using a randomized complete block design with 14 treatments and four replications. The treatments consisted of two series of weed interference periods, a critical weed-free period and a critical time of weed removal, imposed at the V3, V6, V9, V12, V15 and R1 phenological stages of corn development, together with a weedy check and a weed-free check. The CPWC was determined for acceptable yield loss levels of 2.5, 5, 10, 15 and 20% by non-linear regression, fitting logistic and Gompertz equations to the relative yield data. The CPWC of corn was from the 5- to the 15-leaf stage (19-55 DAE) to prevent yield losses of 5%; to limit losses to 2.5, 10 and 20%, the corresponding periods were the 4- to 17-leaf stage (14-59 DAE), the 6- to 12-leaf stage (25-47 DAE) and the 8- to 9-leaf stage (31-36 DAE), respectively. The height and leaf area index of corn were significantly decreased by weed competition in both the weed-free and the weed-infested treatments (P<0.01). There was also a significant positive correlation between yield and the LAI of corn at the silk stage when competing with weeds (r = 0.97).
Keywords: Corn, Critical period, Gompertz, Logistic, Weed control.
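A sketch of the curve-fitting step used to locate the critical period is given below: a logistic curve describes relative yield versus duration of weed interference, a Gompertz curve describes relative yield versus length of the weed-free period, and the crossings with the acceptable-yield-loss level bound the critical period. The data points and parameter forms are synthetic, used only to show the mechanics, not the field data reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_decline(t, upper, lower, k, t0):
    # relative yield (%) falls from `upper` toward `lower` as weedy duration t grows
    return lower + (upper - lower) / (1.0 + np.exp(k * (t - t0)))

def gompertz_rise(t, upper, b, k):
    # relative yield (%) rises toward `upper` as the weed-free period t lengthens
    return upper * np.exp(-b * np.exp(-k * t))

days = np.array([0, 14, 19, 25, 31, 36, 47, 55, 59, 70], dtype=float)
weedy = np.array([100, 98, 95, 88, 78, 70, 55, 48, 45, 40], dtype=float)     # synthetic
weedfree = np.array([40, 55, 65, 78, 86, 90, 95, 97, 98, 99], dtype=float)   # synthetic

p_w, _ = curve_fit(logistic_decline, days, weedy, p0=[100, 40, 0.1, 35], maxfev=10000)
p_f, _ = curve_fit(gompertz_rise, days, weedfree, p0=[100, 1.0, 0.05], maxfev=10000)

loss = 5.0                                   # acceptable yield loss (%)
grid = np.linspace(0, 70, 7001)
start = grid[np.argmax(logistic_decline(grid, *p_w) < 100 - loss)]
end = grid[np.argmax(gompertz_rise(grid, *p_f) >= 100 - loss)]
print(f"critical period for {loss}% loss: day {start:.0f} to day {end:.0f} after emergence")
```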
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 20297 Multi-Criteria Decision-Making Selection Model with Application to Chemical Engineering Management Decisions
Authors: Mohsen Pirdashti, Arezou Ghadi, Mehrdad Mohammadi, Gholamreza Shojatalab
Abstract:
Chemical industry project management involves complex decision-making situations that require discerning abilities and sound methods. Project managers face complex decision environments and problems in their projects. The case study in this work is Research and Development (R&D) project selection. R&D is an ongoing process for forward-thinking, technology-based chemical industries, and R&D project selection is an important task for organizations managing such projects; it is a multi-criteria problem that includes both tangible and intangible factors, and the ability to make sound decisions is very important to the success of R&D projects. Multiple-criteria decision making (MCDM) approaches are a major part of decision theory and analysis. This paper reviews MCDM approaches for use in R&D project selection, in the hope that it will provide a ready reference on MCDM and encourage its application by chemical engineering management.
Keywords: Chemical engineering, R&D project, MCDM, selection.
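Since the paper surveys MCDM approaches in general, a small worked example of one widely used method (TOPSIS) applied to a toy R&D project-selection problem may help fix ideas. The projects, criteria, scores and weights below are invented for illustration; the paper does not prescribe this particular method or data.

```python
import numpy as np

# rows = candidate R&D projects, columns = criteria:
# expected return, technical feasibility, strategic fit, risk (a cost-type criterion)
X = np.array([[7.0, 8.0, 6.0, 4.0],
              [9.0, 6.0, 7.0, 6.0],
              [5.0, 9.0, 8.0, 3.0]])
weights = np.array([0.4, 0.2, 0.2, 0.2])
benefit = np.array([True, True, True, False])     # larger is better except for risk

R = X / np.linalg.norm(X, axis=0)                 # vector-normalise each criterion
V = R * weights                                   # weighted normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus  = np.linalg.norm(V - ideal, axis=1)       # distance to the ideal solution
d_minus = np.linalg.norm(V - anti, axis=1)        # distance to the anti-ideal solution
closeness = d_minus / (d_plus + d_minus)          # higher closeness = better project

ranking = np.argsort(-closeness) + 1
print("project ranking (best first):", ranking, "closeness:", closeness.round(3))
```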
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 40876 Optimization of Some Process Parameters to Produce Raisin Concentrate in Khorasan Region of Iran
Authors: Peiman Ariaii, Hamid Tavakolipour, Mohsen Pirdashti, Rabehe Izadi Amoli
Abstract:
Raisin concentrate (RC) is one of the most important products of the raisin processing industry; it is used in syrups, drinks and confectionery and is promoted as a natural substitute for sugar in food applications. Iran is one of the biggest raisin exporters in the world, but despite good raw material no serious effort has been made to produce RC there. In this paper, therefore, we determine and analyze the parameters affecting the RC extraction process and optimize them in order to design the process for the two types of raisin (round and long) produced in the Khorasan region. The treatments were two solvent ratios (1:1 and 2:1), three extraction temperatures (60°C, 70°C and 80°C) and three concentration temperatures (50°C, 60°C and 70°C). The physicochemical characteristics of the resulting concentrate, such as color, viscosity, reducing sugar content and acidity, were measured, and microbial tests (mould and yeast) were carried out. The analysis was performed as a factorial experiment in a completely randomized design (CRD), and Duncan's multiple range test (DMRT) was used for comparison of the means. Statistical analysis showed that the optimal conditions for concentrate production are obtained with round raisins at a solvent ratio of 2:1, an extraction temperature of 60°C and a concentration temperature of 50°C. Round raisins are cheaper than long raisins, making concentrate production more economical; they also have more aroma and lose less color as the concentration and extraction temperatures increase. Based on these factors, concentrate made from round raisins is recommended.
Keywords: Raisin concentrate, optimization, process parameters, round raisin, Iran.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 15995 Wheat Yield Prediction through Agro Meteorological Indices for Ardebil District
Authors: Fariba Esfandiary, Ghafoor Aghaie, Ali Dolati Mehr
Abstract:
Wheat yield prediction was carried out using different meteorological variables together with agro-meteorological indices in Ardebil district for the years 2004-2005 and 2005-2006. On the basis of correlation coefficients, the standard error of estimate and the relative deviation of predicted from actual yield for different statistical models, the best subset of agro-meteorological indices was selected: daily minimum temperature (Tmin), accumulated difference between maximum and minimum temperatures (TD), growing degree days (GDD), accumulated water vapor pressure deficit (VPD), sunshine hours (SH) and potential evapotranspiration (PET). Yield prediction was made two months before harvest, coinciding with the commencement of the reproductive stage of wheat (5 June). In the final statistical models, 83% of the variability in wheat yield was accounted for by variation in the above agro-meteorological indices.
Keywords: Wheat yield prediction, agro-meteorological indices, statistical models
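The statistical models referred to above are regressions of yield on the selected indices. The sketch below shows an ordinary-least-squares fit of that kind, together with the observed-versus-predicted correlation used as a selection criterion; the index matrix and yields are synthetic placeholders, not the Ardebil data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12                                              # hypothetical season-location cases
X = rng.normal(size=(n, 6))                         # standardised values of Tmin, TD, GDD, VPD, SH, PET
true_coef = np.array([0.3, -0.2, 0.5, -0.4, 0.25, 0.1])
y = 2.5 + X @ true_coef + rng.normal(0, 0.1, n)     # synthetic yield (t/ha)

A = np.column_stack([np.ones(n), X])                # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)        # ordinary least squares
y_hat = A @ coef
r = np.corrcoef(y, y_hat)[0, 1]                     # observed vs predicted correlation
print("fitted coefficients:", coef.round(3))
print("correlation between observed and predicted yield:", round(r, 3))
```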
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 21434 Disinfestation of Wheat Using Liquid Nitrogen Aeration
Authors: Haiyan. Li, Jitendra. Paliwal, Digvir S. Jayas, Noel D. G. White
Abstract:
A study was undertaken to investigate the effect of liquid nitrogen aeration on the mortality of adult Cryptolestes ferrugineus, the rusty grain beetle, in a prototype cardboard grain bin equipped with an aeration system. The bin was filled with Hard Red Spring wheat and liquid nitrogen was introduced from the bottom. The survival of both cold-acclimated and unacclimated C. ferrugineus was tested. The study revealed that cold-acclimated insects had higher survival than unacclimated insects under similar cooling conditions. In most trials, mortality as high as 100% was achieved for unacclimated insects in the bottom 100 cm of the grain bin, and insect survival increased with distance from the bottom of the bin. Liquid nitrogen aeration had no adverse effect on wheat germination.
Keywords: Cold acclimation, liquid nitrogen aeration, mortality, rusty grain beetle.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 15203 The Effect of Plant Growth Promoting Rhizobacteria (PGPR) on Germination, Seedling Growth and Yield of Maize
Authors: A. Gholami, S. Shahsavani, S. Nezarat
Abstract:
The effects of plant growth-promoting rhizobacteria (PGPR) on seed germination, seedling growth and yield of field-grown maize were evaluated in three experiments using six bacterial strains: P. putida strain R-168, P. fluorescens strain R-93, P. fluorescens DSM 50090, P. putida DSM 291, A. lipoferum DSM 1691 and A. brasilense DSM 1690. The first experiment showed that seed inoculation significantly enhanced seed germination and seedling vigour of maize. In the second experiment, leaf and shoot dry weight and leaf surface area were significantly increased by bacterial inoculation in both sterile and non-sterile soil, with a stronger stimulating effect on plant growth and development in non-sterile than in sterile soil. In the third experiment, inoculation of maize seeds with all bacterial strains significantly increased plant height, 100-seed weight, number of seeds per ear and leaf area, and also significantly increased ear and shoot dry weight.
Keywords: Azospirillum, biofertilizer, maize, PGPR, Pseudomonas.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 84062 Cold Hardiness in Near Isogenic Lines of Bread Wheat (Triticum Aestivum L. em. Thell.)
Authors: Abolfazl Rashidi Asl, Siroos Mahfoozi, Mohammad Reza Bihamta
Abstract:
Low temperature (LT) is one of the most important abiotic stresses causing yield loss in wheat (T. aestivum). Four major genes in wheat (Triticum aestivum L.), with the dominant alleles designated Vrn-A1, Vrn-B1, Vrn-D1 and Vrn4, are known to have large effects on the vernalization response, but their effects on cold hardiness are ambiguous. Poor cold tolerance has restricted winter wheat production in regions of high winter stress [9]. Nearly all wheat chromosomes [5], or at least 10 of the 21 chromosome pairs, are known to be important in winter hardiness [15]. The objective of the present study was to clarify the role of each chromosome in cold tolerance. For this purpose we used 20 isogenic lines of wheat, in each of which a single chromosome of the winter-habit variety 'Bezostaya' was substituted into the variety 'Cappelle Desprez'. The plant material was grown under controlled conditions at 20°C with a 16 h day length at the Karaj Agricultural Research Station, a moderately cold area of Iran, in 2006-07, and acclimation was completed over about 4 weeks in a cold room at 4°C. Cold hardiness was measured as LT50 (the temperature at which 50% of the plants are killed by freezing stress). The experimental design was a randomized complete block design (RCBD) with three replicates. The results showed that chromosome 5A has a major effect on freezing tolerance, followed by smaller effects of chromosomes 1A and 4A. Further studies are essential to understand the importance of each chromosome in controlling cold hardiness in wheat.
Keywords: Cold hardiness, isogenic lines, LT50, Triticum.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 14061 Blood Lymphocyte and Neutrophil Response of Cultured Rainbow Trout, Oncorhynchus mykiss, Administered Varying Dosages of an Oral Immunomodulator – ‘Fin-Immune™’
Authors: Duane Barker, John Holliday
Abstract:
In a 10-week (May-August 2008) Phase I trial, 840 one-year-old (1+) rainbow trout, Oncorhynchus mykiss, received the commercial oral immunomodulator Fin-Immune™ at four dosages (0, 10, 20 and 30 mg g-1) to evaluate immune response and growth. The overall objective was to determine the optimal dosage of this product for rainbow trout, providing enhanced immunity with maximal growth and health. Biweekly blood samples were taken from 10 randomly selected fish in each tank (30 samples per treatment) to evaluate the duration of the enhanced immunity conferred by Fin-Immune™. The immunological assessment included serum white blood cell (lymphocyte and neutrophil) densities and blood hematocrit (packed cell volume, %). Of these three variables, only lymphocyte density increased significantly, in trout fed Fin-Immune™ at 20 and 30 mg g-1, peaking at week 6. At week 7 all trout were switched to regular feed (lacking Fin-Immune™), and by week 10 lymphocyte levels had decreased in all groups but were still greater than at week 0. Growth was impaired at the highest dose tested (30 mg g-1), which may reflect a physiological compensatory mechanism beyond a dose-specific threshold. Thus the main objective of this Phase I study was achieved: the 20 mg g-1 dose of Fin-Immune™ should be the most efficacious of those tested and will be used in a Phase II disease challenge trial.
Keywords: Blood lymphocyte, neutrophil response, cultured rainbow trout, Oncorhynchus mykiss, oral immunomodulator, Fin-Immune™.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1515