Search results for: Process Models.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7444

4024 Implementation of a Paraconsistent-Fuzzy Digital PID Controller in a Level Control Process

Authors: H. M. Côrtes, J. I. Da Silva Filho, M. F. Blos, B. S. Zanon

Abstract:

In modern society, the demand for higher quality in industrial production calls for new techniques of control and machinery automation. In this context, this work presents the implementation of a Paraconsistent-Fuzzy Digital PID controller. The controller is based on the treatment of inconsistencies in both Paraconsistent Logic and Fuzzy Logic. Paraconsistent analysis is performed on the signals applied to the system inputs using concepts from the Paraconsistent Annotated Logic with annotation of two values (PAL2v). The signals resulting from the paraconsistent analysis are two values, Dc (Degree of Certainty) and Dct (Degree of Contradiction), which are then treated according to Fuzzy Logic theory; the resulting output of the logic actions is a single value, called the crisp value, which is used to control the dynamic system. The application of the proposed model is demonstrated through an example. Initially, the Paraconsistent-Fuzzy Digital PID controller was built and tested in an isolated MATLAB environment and then compared to the equivalent Digital PID function of this software for standard step excitation. After this step, a level control plant was modeled so that the controller could be run on a physical model, making the tests closer to actual conditions. For this, the control parameters (proportional, integral, and derivative) were determined for the configuration of the conventional Digital PID controller and of the Paraconsistent-Fuzzy Digital PID, and the control loops were assembled in MATLAB with the respective transfer function of the plant. Finally, the results of the level control process are presented, comparing the Paraconsistent-Fuzzy Digital PID controller with the conventional Digital PID controller.
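
As an illustration of the paraconsistent analysis stage described above, the sketch below computes the two output values in Python, assuming the usual PAL2v definitions Dc = mu - lambda and Dct = mu + lambda - 1 for favorable and unfavorable evidence degrees (the abstract does not state the formulas); the fuzzy and PID stages are omitted.

```python
# Minimal sketch of the PAL2v analysis stage (assumed definitions:
# Dc = mu - lam, Dct = mu + lam - 1 for evidence degrees in [0, 1]).
def pal2v_analysis(mu, lam):
    """Return (Dc, Dct) for favorable evidence mu and unfavorable evidence lam."""
    dc = mu - lam          # Degree of Certainty, in [-1, 1]
    dct = mu + lam - 1.0   # Degree of Contradiction, in [-1, 1]
    return dc, dct

if __name__ == "__main__":
    # Example: strong favorable evidence with little contradiction.
    print(pal2v_analysis(0.9, 0.2))  # roughly (0.7, 0.1)
```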

Keywords: Fuzzy logic, paraconsistent annotated logic, level control, digital PID.

PDF Downloads: 1237
4023 An Adaptive Memetic Algorithm With Dynamic Population Management for Designing HIV Multidrug Therapies

Authors: Hassan Zarei, Ali Vahidian Kamyad, Sohrab Effati

Abstract:

In this paper, a mathematical model of human immunodeficiency virus (HIV) is utilized and an optimization problem is proposed, with the final goal of implementing an optimal 900-day structured treatment interruption (STI) protocol. Two types of drugs commonly used in highly active antiretroviral therapy (HAART), reverse transcriptase inhibitors (RTI) and protease inhibitors (PI), are considered. In order to solve the proposed optimization problem, an adaptive memetic algorithm with population management (AMAPM) is proposed. The AMAPM uses a distance measure to control the diversity of the population in genotype space, thus preventing stagnation and premature convergence. Moreover, the AMAPM uses a diversity parameter in phenotype space to dynamically set the population size and the number of crossovers during the search process. Three crossover operators diversify the population simultaneously, and the progress of each crossover operator is used to set the number of times it is applied per generation. In order to escape local optima and introduce new search directions toward the global optimum, two local searchers assist the evolutionary process. In contrast to traditional memetic algorithms, the activation of these local searchers is not random and depends on the diversity parameters in both genotype space and phenotype space. The capability of the AMAPM in finding optimal solutions is demonstrated through comparison with three popular metaheuristics.
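
The following is a minimal, generic sketch of diversity-driven population management, not the authors' AMAPM: it measures genotype diversity as a mean pairwise Hamming distance and grows or shrinks a binary-coded population accordingly (the fitness function is a placeholder).

```python
import random

def genotype_diversity(population):
    """Mean pairwise Hamming distance, used as the genotype-space diversity measure."""
    n = len(population)
    total = sum(sum(a != b for a, b in zip(p, q))
                for i, p in enumerate(population) for q in population[i + 1:])
    return total / (n * (n - 1) / 2)

def fitness(individual):
    # Placeholder fitness: count of ones (a real model would simulate the HIV dynamics).
    return sum(individual)

def adapt_population_size(population, low=0.1, high=0.4, step=4, min_size=20, max_size=100):
    """Grow the population when diversity collapses, shrink it when diversity is high."""
    d = genotype_diversity(population) / len(population[0])  # normalise to [0, 1]
    if d < low and len(population) < max_size:
        population += [[random.randint(0, 1) for _ in population[0]] for _ in range(step)]
    elif d > high and len(population) > min_size:
        population = sorted(population, key=fitness, reverse=True)[:len(population) - step]
    return population

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(30)] for _ in range(30)]
pop = adapt_population_size(pop)
print(len(pop), round(genotype_diversity(pop), 2))
```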

Keywords: HIV therapy design, memetic algorithms, adaptive algorithms, nonlinear integer programming.

PDF Downloads: 1628
4022 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: G. Candel, D. Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely adopted. It supports two main tasks: displaying results by coloring items according to their class or a feature value, and forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, in which all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, in which a cluster's area is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from the high- to the low-dimensional space is computed but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it becomes infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to exactly the same position, making them indistinguishable, and such a model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology to reuse an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing the birth, evolution, and death of clusters to be observed. The proposed approach facilitates identifying significant trends and changes, which supports monitoring the dynamics of high-dimensional datasets.
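
A minimal sketch of the idea of anchoring a new embedding to a support embedding is given below; the two-term objective (t-SNE's KL divergence plus a quadratic penalty on the displacement of support points, weighted by an assumed parameter lam) is illustrative and not the paper's exact cost.

```python
import numpy as np

def tsne_q(Y):
    """Low-dimensional affinities with the Student-t kernel used by t-SNE."""
    d2 = np.square(Y[:, None, :] - Y[None, :, :]).sum(-1)
    w = 1.0 / (1.0 + d2)
    np.fill_diagonal(w, 0.0)
    return w / w.sum()

def anchored_cost(P, Y, Y_support_ref, support_idx, lam=1.0):
    """KL(P || Q) plus a penalty keeping support points near their previous positions."""
    Q = np.maximum(tsne_q(Y), 1e-12)
    P = np.maximum(P, 1e-12)
    kl = np.sum(P * np.log(P / Q))
    anchor = np.mean(np.square(Y[support_idx] - Y_support_ref))
    return kl + lam * anchor

rng = np.random.default_rng(0)
n = 50
P = rng.random((n, n)); P = (P + P.T) / 2; np.fill_diagonal(P, 0); P /= P.sum()
Y = rng.normal(size=(n, 2))
support_idx = np.arange(10)                 # points shared with the previous embedding
Y_support_ref = Y[support_idx] + 0.1        # their positions in the reference embedding
print(round(anchored_cost(P, Y, Y_support_ref, support_idx), 3))
```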

Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.

PDF Downloads: 489
4021 Learning User Keystroke Patterns for Authentication

Authors: Ying Zhao

Abstract:

Keystroke authentication is a new access control approach that identifies legitimate users via their typing behavior. In this paper, machine learning techniques are adapted for keystroke authentication. Seven learning methods are used to build models that differentiate user keystroke patterns. The selected classification methods are Decision Tree, Naive Bayesian, Instance-Based Learning, Decision Table, One Rule, Random Tree, and K-star; three of them are studied in more detail. The results show that machine learning is a feasible alternative for keystroke authentication. Compared to the conventional Nearest Neighbour method used in recent research, learning methods, especially the Decision Tree, can be more accurate. In addition, the experimental results reveal that 3-grams are more accurate than 2-grams and 4-grams for feature extraction, and that combinations of attributes tend to result in higher accuracy.
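
A hedged sketch of the classification setup follows, using a scikit-learn Decision Tree on synthetic n-gram timing features; the feature layout and data are assumptions, not the paper's dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_users, samples_per_user, n_features = 5, 40, 20   # e.g. 20 digraph (2-gram) latencies
X = np.vstack([rng.normal(loc=u * 10, scale=3.0, size=(samples_per_user, n_features))
               for u in range(n_users)])            # synthetic per-user timing profiles
y = np.repeat(np.arange(n_users), samples_per_user)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```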

Keywords: Keystroke authentication, pattern recognition, machine learning, instance-based learning, Bayesian, decision tree.

PDF Downloads: 2822
4020 Investigating Real Ship Accidents with Descriptive Analysis in Turkey

Authors: İsmail Karaca, Ömer Söner

Abstract:

The use of advanced methods has been increasing day by day in the maritime sector, one of the sectors least affected by the COVID-19 pandemic. Advanced methods are applied in marine accident investigation with the aim of minimizing accidents. This research conducts an exploratory statistical analysis of particular ship accidents in the database of the Transport Safety Investigation Center of Turkey. 46 ship accidents, which occurred between 2010 and 2018, were selected from the database. In addition to the availability of a reliable and comprehensive database, taking advantage of robust statistical models in the investigation is critical to improving the safety of ships. Thus, descriptive analysis is used in this research to identify causes and conditional factors related to different types of ship accidents. The research outcomes underline the fact that environmental factors and the day/night ratio have a great influence on ship safety.
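
A small pandas sketch of the kind of descriptive analysis described above is shown below; the record fields and values are illustrative, not the actual database entries.

```python
import pandas as pd

# Toy records standing in for the 46 accidents (fields are assumptions).
accidents = pd.DataFrame({
    "year":        [2010, 2012, 2015, 2018, 2016, 2013],
    "type":        ["collision", "grounding", "collision", "fire", "grounding", "collision"],
    "time_of_day": ["night", "day", "night", "day", "night", "night"],
    "weather":     ["rough", "calm", "rough", "calm", "rough", "calm"],
})

# Frequency of accident types and the day/night split per type.
print(accidents["type"].value_counts())
print(pd.crosstab(accidents["type"], accidents["time_of_day"], normalize="index"))
```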

Keywords: Descriptive analysis, maritime industry, maritime safety, marine accident analysis.

PDF Downloads: 712
4019 Effect of Nanoparticle Diameter of Nano-Fluid on Average Nusselt Number in the Chamber

Authors: A. Ghafouri, N. Pourmahmoud, I. Mirzaee

Abstract:

In this numerical study, the effects of using an Al2O3-water nanofluid on the rate of heat transfer have been investigated. The physical model is a square enclosure with insulated top and bottom horizontal walls, while the vertical walls are kept at different constant temperatures. Two appropriate models are used to evaluate the viscosity and thermal conductivity of the nanofluid. The governing stream-vorticity equations are solved using a second-order central finite difference scheme, coupled to the conservation of mass and energy. The study has been carried out for nanoparticle diameters of 30, 60, and 90 nm and solid volume fractions from 0 to 0.04. Results are presented as the average Nusselt number and the normalized Nusselt number over different ranges of φ and D for the mixed-convection-dominated regime. It is found that different heat transfer rates are predicted when the effect of nanoparticle diameter is taken into account.

Keywords: Nano-fluid, nanoparticle diameter, heat transfer enhancement, square enclosure, Nusselt number.

PDF Downloads: 1694
4018 Studying the Value-Added Chain for the Fish Distribution Process at Quang Binh Fishing Port in Vietnam

Authors: Van Chung Nguyen

Abstract:

The purpose of this research is to study the current status of the value chain for fish distribution at Quang Binh Fishing Port, based on 360 research samples in which the research subjects are fishermen, traders, retailers, and businesses. The research applies the value chain theoretical framework of Kaplinsky and Morris to quantify and describe the market channels and the actors participating in the value chain, and to analyze the value-added process of these actors along the market channels. The analysis results show that fishermen catch fish directly with high economic efficiency, but processing enterprises and especially retailers are the agents that obtain higher added value. The role of processing enterprises is not really clear due to outdated processing technology; in contrast, retailers have the highest added value. This shows that the added value of the fish supply chain at Quang Binh fishing port is still limited, leading to low output quality. Therefore, the selling price of fish to the market is still high compared to the abundant fish resources, leading to low consumption and limiting exports because of the quality of processing enterprises; this reduces demand and fishing capacity, and productivity is lower than its potential. To improve the fish value chain at fishing ports, it is necessary to focus on improving product quality, strengthening linkages between actors, building brands and product consumption markets, and improving the capacity of export processing enterprises.

Keywords: Quang Binh fishing port, value chain, fish market, distribution channels.

PDF Downloads: 73
4017 Effect of Column Stiffness and Number of Floors on the Accuracy of the Tributary Area Method

Authors: Anas M. Fares

Abstract:

The use of finite element programs for analyzing and designing buildings is becoming very popular, but many engineers still use the tributary area method (TAM) for designing structural members such as columns. This study investigates the accuracy of TAM results under different load conditions (gravity and lateral loads), different numbers of floors, and different column stiffnesses. To conduct this study, linear elastic analysis in the ETABS program is used, and the results from the finite element method are compared to those obtained from the TAM. According to the analysis of the data obtained, there is a significant difference between the real load carried by the columns and the load calculated using the TAM. Thus, 3-D models are the best choice for calculating the real load acting on columns and designing these columns according to this load.
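
For reference, a minimal sketch of the tributary area method for a gravity-load column check is given below; the spans, loads, and number of floors are illustrative values, not those of the study.

```python
# Hedged sketch of the tributary area method for a gravity-load column check.
def tam_axial_load(trib_area_m2, floor_load_kpa, n_floors):
    """Column axial load (kN) = tributary area x uniform floor load x number of floors."""
    return trib_area_m2 * floor_load_kpa * n_floors

# Interior column supporting a 6 m x 6 m tributary area, 10 kPa factored load, 8 floors.
load_kn = tam_axial_load(6.0 * 6.0, 10.0, 8)
print(load_kn, "kN")  # 2880 kN; a 3-D FE model may give a noticeably different value
```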

Keywords: Tributary area method, finite element method, ETABS, lateral load, axial loads, reinforced concrete, stiffness, multi-floor buildings.

PDF Downloads: 1121
4016 Characterization of Complex Electromagnetic Environment Created by Multiple Sources of Electromagnetic Radiation

Authors: C. Temaneh-Nyah, J. Makiche, J. Nujoma

Abstract:

This paper considers the characterization of a complex electromagnetic environment due to multiple sources of electromagnetic radiation as a five-dimensional surface which can be described by a set of surface sections, including instantaneous EM field intensity distribution maps at a given frequency and altitude, the instantaneous spectrum at a given location in space, and the time evolution of the electromagnetic field spectrum at a given point in space. If carried out over time, this characterization enables the radio frequency radiation (RFR) exposure level at every point in the analysis area to be determined, and the results to be interpreted by comparing the determined RFR exposure level with the guidelines for safe general public exposure issued by recognized bodies such as the International Commission on Non-Ionizing Radiation Protection (ICNIRP), the Institute of Electrical and Electronics Engineers (IEEE), and the National Radiation Protection Authority (NRPA).
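
As a hedged illustration of aggregating exposure from multiple sources, the sketch below sums far-field power densities S = PG/(4*pi*d^2) and compares the total with an assumed public-exposure reference level; the source parameters and the limit are not from the paper.

```python
import math

def power_density(p_tx_w, gain_lin, distance_m):
    """Far-field power density S = P*G / (4*pi*d^2) in W/m^2 for one source."""
    return p_tx_w * gain_lin / (4.0 * math.pi * distance_m ** 2)

sources = [  # (transmit power W, linear gain, distance m) -- illustrative values
    (20.0, 15.0, 40.0),
    (10.0, 12.0, 25.0),
]
total_s = sum(power_density(p, g, d) for p, g, d in sources)
reference_level = 4.5  # W/m^2, an assumed public-exposure reference level for the band
print(f"total S = {total_s:.3f} W/m^2, within limit: {total_s <= reference_level}")
```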

Keywords: Electromagnetic Environment, Electric Field Strength, Mathematical Models.

PDF Downloads: 2575
4015 Developing a Model for the Relation between Heritage and Place Identity

Authors: A. Arjomand Kermani, N. Charbgoo, M. Alalhesabi

Abstract:

In a situation of greatly accelerating change and the need for new developments in cities on the one hand, and conservation and regeneration approaches on the other, place identity and its relation to the heritage context have taken on new importance. This relation is generally a mutual and complex one. The significant point is that the process of identifying something as heritage, rather than merely a historical phenomenon, brings that which may be inherited into the realm of identity. Place identity and its attributes and components have been studied and discussed in planning and urban design as well as in environmental psychology and phenomenology; however, the relation between the physical environment (especially heritage) and identity has been neglected in the planning literature. This article aims to review the knowledge in this field and develop a model of the relation between these two major concepts (heritage and identity). To build this conceptual model, we draw on the available literature on place identity and the heritage environment in environmental psychology as well as planning, using a descriptive-analytical methodology, to understand how they can inform planning strategies and governance policies. A cross-disciplinary analysis is essential to understand the nature of place identity and the heritage context and to develop a more holistic model of their relationship, so that it can be employed in the planning process and in decision making. Moreover, this broader and more holistic perspective would enable both social scientists and planners to learn from one another's expertise for a fuller understanding of community dynamics. The result indicates that a combination of these perspectives can provide a richer understanding, not only of how planning impacts our experience of place, but also of how place identity can impact community planning and development.

Keywords: Heritage, inter-disciplinary study, place identity, planning.

PDF Downloads: 1912
4014 Modeling of Gas Turbine Cooled Blades

Authors: A. Pashayev, D. Askerov, R. Sadiqov, A. Samedov, C. Ardil

Abstract:

In contrast to existing methods, which do not take into account multi-connectivity in the broad sense of this term, we develop mathematical models and a highly effective combined (BIEM and FDM) numerical method for calculating the stationary and quasi-stationary temperature field of the profile part of a blade with convective cooling, from the point of view of implementation on a PC. The theoretical substantiation of these methods is proved by appropriate theorems: converging quadrature processes have been developed, and error estimates have been obtained in terms of the Zygmund moduli of continuity. For visualization of the profiles, the least-squares method with automatic conjecture, spline devices, smooth replenishment, and neural nets are used. The boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and from empirical relationships. The reliability of the designed methods is proved by computational and experimental investigations of the heat and hydraulic characteristics of the first-stage nozzle blade of a gas turbine.
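
A minimal FDM sketch in the spirit of the temperature-field calculation is given below: steady 2-D conduction (Laplace equation) solved by Jacobi iteration on a rectangular stand-in for the blade section, with illustrative boundary temperatures; the BIEM coupling and the real blade geometry are omitted.

```python
import numpy as np

# Steady 2-D conduction on a rectangular stand-in for the blade cross-section:
# hot-gas side on top, coolant side at the bottom (temperatures are illustrative, K).
nx, ny, n_iter = 40, 20, 5000
T = np.zeros((ny, nx))
T[0, :] = 1200.0
T[-1, :] = 600.0
T[:, 0] = T[:, -1] = np.linspace(1200.0, 600.0, ny)

for _ in range(n_iter):  # Jacobi iteration for the interior nodes
    T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:])

print("mid-section temperature:", round(T[ny // 2, nx // 2], 1), "K")
```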

Keywords: Gas turbine, cooled blade, nozzle blade, temperature field.

PDF Downloads: 616
4013 Exit Strategies from the Global Crisis

Authors: Petr Teply

Abstract:

While the form of crises may change, their essence remains the same: a cycle of abundant liquidity, rapid credit growth, and a low-inflation environment followed by an asset-price bubble. The current market turbulence began in the mid-2000s, when the US economy shifted to imbalanced internal and external macroeconomic positions. We see two key causes of these problems: loose US monetary policy in the early 2000s, and US government guarantees on securities issued by government-sponsored enterprises, which was further fueled by financial innovations such as structured credit products. We have drawn both negative and positive lessons from this crisis and divided the negative lessons into three groups: financial products and valuation, processes and business models, and strategic issues. Moreover, we address key risk management lessons and exit strategies derived from the current crisis and recommend policies that should help diminish the negative impact of future potential crises.

Keywords: Exit strategy, global crisis, risk management, corporate governance.

PDF Downloads: 2085
4012 A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition

Authors: Hazem M. El-Bakry

Abstract:

Neural processors have shown good results for detecting a certain character in a given input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross correlation in the frequency domain between the input matrix and the weights of the neural networks. This approach is developed to reduce the computation steps required by these faster neural networks for the searching process. The principle of the divide-and-conquer strategy is applied through image decomposition: each image is divided into small sub-images, and each one is then tested separately using a single faster neural processor. Furthermore, faster character detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time, using the same number of faster neural networks. In contrast to using faster neural processors alone, the speed-up ratio increases with the size of the input image when faster neural processors and image decomposition are used together. Moreover, the problem of local sub-image normalization in the frequency domain is solved, and the effect of image normalization on the speed-up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain, and the overall speed-up ratio of the detection process increases when the normalization of the weights is done offline.
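
The frequency-domain cross correlation that underlies the speed-up can be sketched as follows; the image and template are synthetic, and the sub-image decomposition and neural weighting are omitted.

```python
import numpy as np

def cross_correlate_fft(image, template):
    """2-D cross correlation computed via the frequency domain."""
    H, W = image.shape
    F_img = np.fft.rfft2(image, s=(H, W))
    F_tmp = np.fft.rfft2(template, s=(H, W))
    return np.fft.irfft2(F_img * np.conj(F_tmp), s=(H, W))

rng = np.random.default_rng(1)
image = rng.random((64, 64))
template = image[20:28, 30:38].copy()          # plant the "character" we search for
corr = cross_correlate_fft(image - image.mean(), template - template.mean())
print("detected at:", np.unravel_index(np.argmax(corr), corr.shape))  # expect (20, 30)
```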

Keywords: Fast Character Detection, Neural Processors, Cross Correlation, Image Normalization, Parallel Processing.

PDF Downloads: 1537
4011 A Numerical Investigation on the Dynamic Stall of a Wind Turbine Section Using Different Turbulent Models

Authors: S. A. Ahmadi, S. Sharif, R. Jamshidi

Abstract:

In this article, the flow behavior around a NACA 0012 airfoil oscillating at different Reynolds numbers and with various amplitudes has been investigated numerically. Numerical simulations have been performed with the ANSYS software. First, the 2-D geometry was studied at different Reynolds numbers and angles of attack with various numerical methods in its static condition. This analysis was carried out to choose the best turbulence model and to compare grids so as to obtain the optimum one for the dynamic simulations. Because the analysis was intended to study the blades of wind turbines, the Reynolds numbers were not arbitrary: they were in the range of 9.71e5 to 22.65e5, and the angle of attack was in the range of -41.81° to 41.81°. By choosing the forward wind speed as the independent parameter, the other parameters, such as the Reynolds number and the oscillation amplitude, are known automatically. The results show that the SST turbulence model is the best choice, as it leads to the least numerical error with respect to the experimental results. Also, the dynamic stall phenomenon is more probable at lower wind speeds, at which the lift force is smaller.

Keywords: Dynamic stall, numerical simulation, wind turbine, turbulence model.

PDF Downloads: 2007
4010 A Parametric Study on the Backwater Level Due to a Bridge Constriction

Authors: S. Atabay, T. A. Ali, Md. M. Mortula

Abstract:

This paper presents the results and findings of a parametric study on the water surface elevation upstream of a bridge constriction for subcritical flow. The influence of the Manning roughness coefficients of the main channel (nmc) and floodplain (nfp), the bridge opening (b), the flow rate (Q), and the contraction (kcon) and expansion (kexp) coefficients on the backwater level was investigated. Deck bridge models with different span widths and without any pier were investigated within a two-stage channel with various roughness conditions. One of the most commonly used one-dimensional models, HEC-RAS, was used in this parametric study. The study showed that the effects of the main channel roughness (nmc) and flow rate (Q) on the backwater level are much higher than those of the floodplain roughness (nfp), while the bridge opening (b) and the contraction (kcon) and expansion (kexp) coefficients have very little effect on the backwater level within this range of parameters.

Keywords: Bridge backwater, parametric study and waterways.

PDF Downloads: 2505
4009 Prediction-Based Midterm Operation Planning for Energy Management of Exhibition Hall

Authors: Doseong Eom, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

Large exhibition halls require a great deal of energy to maintain a comfortable atmosphere for the visitors inside. One way of reducing the energy cost is to install thermal energy storage systems, so that thermal energy can be stored in the middle of the night when the energy price is low and then used later when the price is high. To minimize the overall energy cost, however, we must be able to decide how much energy to store and during which time periods. If we can foresee the future energy load and the corresponding cost, we will be able to make such decisions reasonably. In this paper, we use machine learning techniques to obtain models for predicting weather conditions and the number of visitors on an hourly basis for the next day. Based on the energy load thus predicted, we build a cost-optimal daily operation plan for the thermal energy storage systems and the cooling and heating facilities through simulation-based optimization.
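
A hedged sketch of the prediction step follows: a regressor trained on weather and visitor features produces next-day hourly loads, and a trivial rule stands in for the simulation-based optimizer; the features, tariff, and data are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# Synthetic history: [outdoor temperature (C), visitors per hour] -> cooling load (kWh).
X = np.column_stack([rng.uniform(10, 35, 500), rng.integers(0, 2000, 500)])
y = 50 + 8 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 10, 500)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# Forecast features for the next day (24 hourly rows), then pick cheap hours to charge storage.
tomorrow = np.column_stack([np.linspace(18, 30, 24), rng.integers(0, 2000, 24)])
load = model.predict(tomorrow)                          # predicted hourly load, kWh
price = np.where(np.arange(24) < 6, 0.05, 0.15)         # assumed tariff: cheaper before 06:00
charge_hours = np.argsort(price, kind="stable")[:6]     # simple stand-in for the optimiser
print("peak-load hour:", int(np.argmax(load)), "| charge during hours:", charge_hours.tolist())
```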

Keywords: Building energy management, machine learning, simulation-based optimization, operation planning.

PDF Downloads: 989
4008 Estimation of Asphalt Pavement Surfaces Using Image Analysis Technique

Authors: Mohammad A. Khasawneh

Abstract:

Asphalt concrete pavements gradually lose their skid resistance, causing safety problems especially under wet conditions and at high driving speeds. In order to replicate the actual field polishing and wearing process of asphalt pavement surfaces in a laboratory setting, several laboratory-scale accelerated polishing devices were developed by different agencies. To mimic the actual process, friction and texture measuring devices are needed to quantify surface deterioration at different polishing intervals that reflect different stages of the pavement life. The test could still be considered lengthy and, to some extent, labor-intensive. Therefore, there is a need for another method that can assist in investigating bituminous pavement surface characteristics in a practical and time-efficient test procedure.

The purpose of this paper is to utilize a well-developed image analysis technique to characterize asphalt pavement surfaces without the need to use conventional friction and texture measuring devices in an attempt to shorten and simplify the polishing procedure in the lab.

Promising findings showed the possibility of using image analysis in lieu of friction and texture measurements, which are labor-sensitive and variable in nature. The exposed aggregate surface area of asphalt specimens made from limestone and gravel aggregates provided solid evidence of the validity of this method in describing asphalt pavement surfaces, and the image analysis results correlated well with the British Pendulum Number (BPN), Polish Value (PV), and Mean Texture Depth (MTD) values.
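
As a hedged illustration of the image-analysis idea, the sketch below estimates an exposed-aggregate ratio as the fraction of bright pixels after a simple global threshold; the threshold choice and the synthetic image are assumptions, not the paper's procedure.

```python
import numpy as np

def exposed_aggregate_ratio(gray_image, threshold=None):
    """Fraction of pixels brighter than a threshold (mean split used as a simple stand-in)."""
    img = np.asarray(gray_image, dtype=float)
    if threshold is None:
        threshold = img.mean()
    return float((img > threshold).mean())

rng = np.random.default_rng(0)
specimen = rng.normal(90, 15, (200, 200))        # dark asphalt binder background
specimen[60:140, 60:140] += 80                   # brighter exposed-aggregate patch
print(f"exposed aggregate ratio: {exposed_aggregate_ratio(specimen):.2%}")
```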

Keywords: Friction, Image Analysis, Polishing, Statistical Analysis, Texture.

PDF Downloads: 2561
4007 Reliability Analysis of Tubular Joints of Offshore Platforms in Malaysia

Authors: Nelson J. Cossa, Narayanan S. Potty, Mohd Shahir Liew, Arazi B. Idrus

Abstract:

The oil and gas industry has moved towards Load and Resistance Factor Design through API RP2A-LRFD and the recently published international standard ISO 19902 for the design of fixed steel offshore structures. ISO 19902 is intended to provide a harmonized design practice that offers balanced structural fitness for purpose, economy, and safety. As part of ongoing work, a reliability analysis of the tubular joints of the jacket structure has been carried out to calibrate the load and resistance factors proposed in the ISO standard for the design of offshore platforms in Malaysia. Probabilistic models have been established for the load effects (wave, wind, and current) and for the tubular joint strengths. In this study, the First Order Reliability Method (FORM), coded in MATLAB, has been employed to evaluate the reliability index of typical joints designed using API RP2A-WSD and ISO 19902.
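
For orientation, the sketch below computes a reliability index for the simplest FORM case, a linear limit state g = R - S with independent normal variables; the distributions and numbers are illustrative, not the calibrated joint models of the study.

```python
import math

def reliability_index(mu_R, sigma_R, mu_S, sigma_S):
    """Beta for g = R - S with independent normal resistance R and load effect S."""
    return (mu_R - mu_S) / math.sqrt(sigma_R ** 2 + sigma_S ** 2)

def failure_probability(beta):
    # Standard normal tail via the complementary error function: Pf = Phi(-beta).
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

beta = reliability_index(mu_R=1500.0, sigma_R=150.0, mu_S=900.0, sigma_S=180.0)
print(f"beta = {beta:.2f}, Pf = {failure_probability(beta):.2e}")
```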

Keywords: FORM, reliability analysis, tubular joints.

PDF Downloads: 3493
4006 A Decision Boundary based Discretization Technique using Resampling

Authors: Taimur Qureshi, Djamel A Zighed

Abstract:

Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy, and understandability of induction models. Usually, discretization and other statistical processes are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate of that of the entire population. Most existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thereby improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is thus to observe whether the resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
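
A minimal sketch of the resampling idea follows: candidate cut points are generated from bootstrap samples of a two-class continuous attribute and summarized by their mean and spread; the single-sample cut-point rule used here is a simple stand-in, not the authors' decision-boundary criterion.

```python
import numpy as np

def boundary_cut(x, y):
    """One cut point for a sample: the boundary that best separates the class proportions."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    csum = np.cumsum(ys)
    n, total = len(ys), csum[-1]
    i = np.arange(1, n)
    left_pos = csum[:-1] / i                    # fraction of class 1 on the left
    right_pos = (total - csum[:-1]) / (n - i)   # fraction of class 1 on the right
    k = int(np.argmax(np.abs(left_pos - right_pos)))
    return (xs[k] + xs[k + 1]) / 2

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)])

cuts = []
for _ in range(200):                            # bootstrap resamples
    idx = rng.integers(0, len(x), len(x))
    cuts.append(boundary_cut(x[idx], y[idx]))
print(f"estimated cut point: {np.mean(cuts):.2f} +/- {np.std(cuts):.2f}")
```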

Keywords: Bootstrap, discretization, resampling, soft decision trees.

PDF Downloads: 1434
4005 Packet Forwarding with Multiprotocol Label Switching

Authors: R. N. Pise, S. A. Kulkarni, R. V. Pawar

Abstract:

Multiprotocol Label Switching (MPLS) is an emerging technology that aims to address many of the existing issues associated with packet forwarding in today's internetworking environment. It provides a method of forwarding packets at a high rate of speed by combining the speed and performance of Layer 2 with the scalability and IP intelligence of Layer 3. In a traditional IP (Internet Protocol) routing network, a router analyzes the destination IP address contained in the packet header and independently determines the next hop for the packet using the destination IP address and the interior gateway protocol. This process is repeated at each hop to deliver the packet to its final destination. In contrast, in the MPLS forwarding paradigm, routers on the edge of the network (label edge routers) attach labels to packets based on their forwarding equivalence class (FEC). Packets are then forwarded through the MPLS domain based on their associated FECs, with the labels being swapped by routers in the core of the network, called label switch routers. Simply swapping the label instead of referencing the IP header of the packet in the routing table at each hop provides a more efficient manner of forwarding packets, which in turn allows traffic to be forwarded at tremendous speeds and offers granular control over the path taken by a packet. This paper deals with the MPLS forwarding mechanism, the implementation of the MPLS datapath, and test results showing the performance comparison of MPLS and IP routing. The discussion focuses primarily on MPLS IP packet networks, by far the most common application of MPLS today.
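
The label-swap forwarding described above can be sketched with an incoming label map (ILM) and next-hop label forwarding entries (NHLFE); the labels and interface names below are illustrative.

```python
# Minimal sketch of label-swap forwarding: an incoming label map (ILM) points to
# next-hop label forwarding entries (NHLFE); values are illustrative only.
NHLFE = {
    "A": {"out_if": "ge-0/0/1", "out_label": 201},
    "B": {"out_if": "ge-0/0/2", "out_label": None},  # None = pop the label at the egress LER
}
ILM = {100: "A", 101: "B"}                            # incoming label -> NHLFE entry

def forward(packet):
    entry = NHLFE[ILM[packet["label"]]]
    if entry["out_label"] is None:
        packet.pop("label")                           # pop the label stack entry
    else:
        packet["label"] = entry["out_label"]          # swap the label, no IP lookup needed
    packet["out_if"] = entry["out_if"]
    return packet

print(forward({"label": 100, "payload": "ip-packet"}))
print(forward({"label": 101, "payload": "ip-packet"}))
```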

Keywords: Forwarding equivalence class, incoming label map, label, next hop label forwarding entry.

PDF Downloads: 2693
4004 Mechanical Design and Theoretical Analysis of a Skip-Cycle Mechanism for an Internal Combustion Engine

Authors: Ismail Gerzeli, Cemal Baykara, Osman Akin Kutlar

Abstract:

Skip cycle is an operating strategy for spark ignition engines that allows the effective stroke of an engine to be changed by skipping some of the four-stroke cycles. This study proposes a new mechanism to achieve the desired skip-cycle strategy for internal combustion engines. Air and fuel leakage occurring during the gas exchange negatively affects the efficiency of the engine at high speeds and loads. Absolute sealing is assured by the direct use of poppet valves, which are kept in the fully closed position during the skipped mode. All components of the mechanism were designed according to the real dimensions of the Anadolu Motor gasoline engine and modeled in 3D by means of CAD software. As the mechanism operates in two modes, two dynamically equivalent models are established to carry out the force and strength analysis of the critical components.

Keywords: Dynamic model, mechanical design, skip cycle system (SCS), valve disabling mechanism.

PDF Downloads: 2007
4003 Impact of Revenue Gap on Budget Deficit, Debt Burden and Economic Growth: Evidence from Pakistan

Authors: M. W. Siddiqi, M. Ilyas

Abstract:

The availability and mobilization of revenue are the main essentials with which an economy is managed and run. When planning or preparing budgets, nations set revenue targets to be achieved, but later, when the accounts are closed, the actual collections of tax and even non-tax revenue invariably differ from the initial estimates and targets. This revenue gap distorts the whole system and the economy, disturbing all the major macroeconomic indicators. This study aims to find the short-run and long-run impact of the revenue gap on the budget deficit, debt burden, and economic growth of Pakistan. For this purpose, the study uses the autoregressive distributed lag approach to cointegration and an error correction mechanism on three different models for the period 1980 to 2009. The empirical results show that the revenue gap has a short-run and long-run relationship with economic growth and the budget deficit; however, the revenue gap has no impact on the debt burden.

Keywords: Revenue gap, economic growth, budget deficit, debt burden.

PDF Downloads: 2820
4002 Boundary Segmentation of Microcalcification using Parametric Active Contours

Authors: Abdul Kadir Jumaat, Siti Salmah Yasiran, Wan Eny Zarina Wan Abd Rahman, Aminah Abdul Malek

Abstract:

A mammography image is composed of low-contrast areas in which the breast tissues and breast abnormalities such as microcalcifications can hardly be differentiated by the medical practitioner. This paper presents the application of active contour models (snakes) for the segmentation of microcalcifications in mammography images. The microcalcification areas segmented by the Balloon Snake, Gradient Vector Flow (GVF) Snake, and Distance Snake are compared against the true value of the microcalcification area, where the true area is the average microcalcification area in the original mammography image traced by expert radiologists. From the fifty images tested, the results show that the accuracies of the Balloon Snake, GVF Snake, and Distance Snake in segmenting the boundaries of microcalcifications are 96.01%, 95.74%, and 95.70%, respectively. This implies that the Balloon Snake is the better segmentation method for locating the exact boundary of a microcalcification region.
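
A hedged sketch using scikit-image's generic parametric active contour is shown below; it is not the Balloon, GVF, or Distance snake compared in the paper, and the synthetic image stands in for a mammogram region.

```python
import numpy as np
from skimage.draw import disk
from skimage.filters import gaussian
from skimage.segmentation import active_contour

image = np.zeros((128, 128))
rr, cc = disk((64, 64), 15)
image[rr, cc] = 1.0                             # bright blob standing in for a microcalcification
image = gaussian(image, sigma=2, preserve_range=True)

theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([64 + 30 * np.sin(theta), 64 + 30 * np.cos(theta)])  # circle around the blob
snake = active_contour(image, init, alpha=0.015, beta=10, gamma=0.001)

# Rough area enclosed by the converged contour via the shoelace formula.
x, y = snake[:, 1], snake[:, 0]
area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
print(f"segmented area ~ {area:.0f} px (true disk ~ {np.pi * 15**2:.0f} px)")
```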

Keywords: Balloon Snake, GVF Snake, Distance Snake, mammogram, microcalcifications, segmentation.

PDF Downloads: 1726
4001 Forecasting Models for Steel Demand Uncertainty Using Bayesian Methods

Authors: Watcharin Sangma, Onsiri Chanmuang, Pitsanu Tongkhow

Abstract:

A forecasting model for steel demand uncertainty in Thailand is proposed. It consists of trend, autocorrelation, and outliers in a hierarchical Bayesian framework. The proposed model uses a cumulative Weibull distribution function, latent first-order autocorrelation, and binary selection to account for trend, time-varying autocorrelation, and outliers, respectively. Gibbs-sampling Markov chain Monte Carlo (MCMC) is used for parameter estimation. The proposed model is applied to steel demand index data in Thailand, and the root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE) criteria are used for model comparison. The study reveals that the proposed model is more appropriate than the exponential smoothing method.
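
The three comparison criteria can be computed as follows; the demand values and forecasts below are illustrative, not the Thai steel demand index data.

```python
import numpy as np

def rmse(actual, pred):
    return float(np.sqrt(np.mean((actual - pred) ** 2)))

def mape(actual, pred):
    return float(np.mean(np.abs((actual - pred) / actual)) * 100)

def mae(actual, pred):
    return float(np.mean(np.abs(actual - pred)))

actual = np.array([102.0, 98.0, 110.0, 95.0, 120.0])     # illustrative demand index values
bayes = np.array([100.0, 99.0, 108.0, 97.0, 118.0])      # stand-in for the Bayesian forecasts
smooth = np.array([97.0, 103.0, 104.0, 101.0, 112.0])    # stand-in for exponential smoothing
for name, pred in [("Bayesian", bayes), ("Exp. smoothing", smooth)]:
    print(f"{name:15s} RMSE={rmse(actual, pred):.2f} "
          f"MAPE={mape(actual, pred):.2f}% MAE={mae(actual, pred):.2f}")
```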

Keywords: Forecasting model, Steel demand uncertainty, Hierarchical Bayesian framework, Exponential smoothing method.

PDF Downloads: 2535
4000 One-Class Support Vector Machine for Sentiment Analysis of Movie Review Documents

Authors: Chothmal, Basant Agarwal

Abstract:

Sentiment analysis classifies a given review document as a positive or a negative polar document. Research in sentiment analysis has increased tremendously in recent times due to its large number of applications in industry and academia. Sentiment analysis models can be used to determine the opinion of a user towards any entity or product; e-commerce companies, for example, can use a sentiment analysis model to improve their products on the basis of users' opinions. In this paper, we propose a new One-class Support Vector Machine (one-class SVM) based sentiment analysis model for movie review documents. In the proposed approach, we initially extract features from one class of documents, and then use the one-class SVM model to test whether a given new document lies within the model or is an outlier. Experimental results show the effectiveness of the proposed sentiment analysis model.
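
A minimal sketch of this setup, under assumed TF-IDF features and toy reviews, is given below: the one-class SVM is fitted on one class of documents and then flags new documents as inside the class or as outliers.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import OneClassSVM

positive_reviews = [
    "a wonderful, moving film with brilliant performances",
    "great direction and a touching story, highly recommended",
    "superb acting, beautiful score, loved every minute",
    "an excellent and memorable movie experience",
]
vectorizer = TfidfVectorizer()
X_pos = vectorizer.fit_transform(positive_reviews)     # features from one class only
model = OneClassSVM(kernel="linear", nu=0.1).fit(X_pos)

tests = ["a brilliant and touching film", "dull plot, terrible acting, a waste of time"]
print(model.predict(vectorizer.transform(tests)))      # +1 = inside the class, -1 = outlier
```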

Keywords: Feature selection methods, Machine learning, NB, One-class SVM, Sentiment Analysis, Support Vector Machine.

PDF Downloads: 3303
3999 Sizing the Protection Devices to Control Water Hammer Damage

Authors: I. Abuiziah, A. Oulhaj, K. Sebari, D. Ouazar

Abstract:

The primary objectives of transient analysis are to determine the values of the transient pressures that can result from flow control operations and to establish the design criteria for system equipment and devices (such as control devices and pipe wall thickness) so as to provide an acceptable level of protection against system failure due to pipe collapse or bursting. Because of the complexity of the equations needed to describe transients, numerical computer models are used to analyze transient flow hydraulics. An effective numerical model allows the hydraulic engineer to analyze potential transient events and to identify and evaluate alternative solutions for controlling hydraulic transients, thereby protecting the integrity of the hydraulic system. This paper presents the influence of using protection devices to control the adverse effects due to the excessive and low pressures that occur during the transient.
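
For scale, the classic Joukowsky estimate of the maximum surge from an instantaneous velocity change can be sketched as follows; the fluid and pipe parameters are illustrative.

```python
# Hedged sketch of the Joukowsky estimate for the transient pressure rise caused
# by an instantaneous valve closure; numbers are illustrative, not from the paper.
def joukowsky_surge(rho, wave_speed, delta_v):
    """Pressure rise dP = rho * a * dV, in Pa."""
    return rho * wave_speed * delta_v

rho = 1000.0   # water density, kg/m^3
a = 1000.0     # pressure wave speed in the pipe, m/s
dv = 2.0       # velocity change on closure, m/s
print(f"surge = {joukowsky_surge(rho, a, dv) / 1e5:.1f} bar")  # 20 bar; a surge tank would reduce this
```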

Keywords: Flow Transient, Water hammer, Pipeline System, Surge Tank, Simulation Model, Protection Devices.

PDF Downloads: 9494
3998 Evaluation of the Quality of Education Offered to Students with Special Needs in Public Schools in the City of Bauru, Brazil

Authors: V. L. M. F. Capellini, A. P. P. M. Maturana, N. C. M. Brondino, M. B. C. L. B. M. Peixoto, A. J. Broughton

Abstract:

A paradigm shift is a process. The process of implementing inclusive education, a system constructed to support all learners, requires planning, identification, experimentation, and evaluation. In this vein, the purpose of the present study was to evaluate the capacity of one Brazilian state school system to provide special education students with a quality inclusive education. This study originated at the behest of concerned families of students with special needs who filed complaints with the Municipality of Bauru, São Paulo. These families claimed that 1) children with learning differences and educational needs had not been identified for services, and 2) those who had been identified had not received sufficient specialized educational assistance (SEA) in schools across the City of Bauru. Hence, the Office of Civil Rights for the state of São Paulo (Ministério Público de São Paulo) summoned the local higher education institution, UNESP, to design a research study to investigate these allegations. In this exploratory study, descriptive data were gathered from all elementary and middle schools, including 58 state schools and 17 city schools, for a total of 75 schools. Data collection consisted of each school's annual strategic action plan, together with surveys of and interviews with all school stakeholders, to determine their perceptions of the inclusive education available to students with Special Education Needs (SEN). The data were collected as one of four stages in a larger study, which also included field observations of a focal student's experience and a continuing education course for all teachers and administrators in both state and city schools. For the purposes of this study, the researchers were interested in understanding the perceptions of school staff, parents, and students across all schools; therefore, documents and surveys from the 75 schools were analyzed for adherence to federal legislation guaranteeing students with SEN the right to special education assistance within the regular school setting. The results show that while some schools recognized the legal rights of SEN students to receive special education, plans to actually deliver services were absent. In conclusion, this study revealed that both school staff and families have insufficient planning and accessibility resources and that the schools have inadequate infrastructure for full-time support of SEN students, i.e., structures and systems to support the identification of SEN and the delivery of services within the schools of Bauru, SP. Having identified the areas of need, the city is now prepared to take the next steps in the process toward preparing all schools to be inclusive.

Keywords: Inclusive education, special education, special needs.

PDF Downloads: 1019
3997 Ghost Frequency Noise Reduction through Displacement Deviation Analysis

Authors: Paua Ketan, Bhagate Rajkumar, Adiga Ganesh, M. Kiran

Abstract:

Low gear noise is an important sound quality feature in modern passenger cars. Annoying gear noise from the gearbox is influenced by the gear design, the gearbox shaft layout, manufacturing deviations in the components, assembly errors, and the mounting arrangement of the complete gearbox. Geometrical deviations in the form of profile and lead errors are often present on the flanks of the inspected gears. Ghost frequencies of a gear are very challenging to identify in the standard gear measurement and analysis process due to the small wavelengths involved. In this paper, gear whine noise occurring at non-integral multiples of the gear mesh frequency of a passenger car gearbox is investigated and the root cause is identified using the displacement deviation analysis (DDA) method. The DDA method is applied to identify ghost frequency excitations on the flanks of gears arising from generation grinding. The frequency identified through DDA correlated with the frequency of vibration and noise on the end-of-line machine as well as in vehicle-level measurements. By applying the DDA method along with standard lead and profile measurement, gears with ghost frequency geometry deviations were identified on the production line, so that defective parts, and thereby ghost frequency noise in the vehicle, could be eliminated. Furthermore, displacement deviation analysis can be used in conjunction with manufacturing process simulation to arrive at suitable countermeasures for arresting the ghost frequency.
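
A hedged sketch of the spectral side of such an investigation is given below: peaks in a synthetic vibration spectrum are expressed as orders of the gear mesh frequency, and non-integer orders are flagged as ghost candidates; the shaft speed, tooth count, and signal are assumptions.

```python
import numpy as np

fs, shaft_hz, teeth = 10000, 30.0, 23
mesh_hz = shaft_hz * teeth                  # 690 Hz gear mesh frequency
ghost_hz = 945.5                            # injected non-integer-order component (order ~1.37)
t = np.arange(0, 2.0, 1 / fs)
signal = (np.sin(2 * np.pi * mesh_hz * t)           # mesh excitation and its 2nd harmonic
          + 0.4 * np.sin(2 * np.pi * 2 * mesh_hz * t)
          + 0.3 * np.sin(2 * np.pi * ghost_hz * t))  # ghost-frequency component

spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
for f in freqs[spec > 0.2 * spec.max()]:             # significant spectral peaks
    order = f / mesh_hz
    tag = "ghost?" if abs(order - round(order)) > 0.05 else "mesh harmonic"
    print(f"{f:7.1f} Hz  order {order:4.2f}  {tag}")
```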

Keywords: Displacement deviation analysis, gear whine, ghost frequency, sound quality.

PDF Downloads: 803
3996 Evaluating Efficiency of Nina Distribution Company Using Window Data Envelopment Analysis and Malmquist Index

Authors: Hossein Taherian Far, Ali Bazaee

Abstract:

Achieving continuous, sustained economic growth and the economic development that follows is a target for all countries that pursue it. In this regard, the distribution industry plays an important role in the growth and development of any nation, so estimating the efficiency and productivity of this industry and identifying the factors influencing them is very necessary. The objective of the present study is to measure the efficiency and productivity of seven branches of the Nina Distribution Company using window data envelopment analysis and the Malmquist productivity index from spring 2013 to summer 2015. In this study, fixed assets, payroll personnel, operating costs, and the duration of collection of receivables were selected as inputs, and net sales, gross profit, and the percentage of coverage to customers were selected as outputs. Window data envelopment analysis was then carried out, and productivity change was measured using the Malmquist index. The results indicate that the average technical efficiency in the window DEA model follows a fluctuating but sustainable trend, whereas the average management efficiency in the window DEA model shows negative growth (a decline) of about 13%. The mean scale efficiency in all windows, except the second one, which faces a decline of 8%, shows growth of 18% compared to the first window. On the other hand, the mean change in total factor productivity in all branches of the industry shows an average negative growth (decrease) of 12%, which is the result of a negative change in technology.
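
For reference, the sketch below computes input-oriented CCR DEA efficiency scores with a linear program; window DEA and the Malmquist index repeat such scores over overlapping windows and adjacent periods. The branch data, input set, and output set are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# rows = branches; inputs: [fixed assets, staff, operating cost]; outputs: [net sales, gross profit]
X = np.array([[120, 30, 50], [100, 25, 45], [140, 40, 70], [90, 20, 35], [110, 28, 55]], float)
Y = np.array([[500, 80], [420, 70], [610, 85], [300, 55], [480, 75]], float)
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(k):
    """Input-oriented CCR score of branch k: min theta s.t. X'lam <= theta*x_k, Y'lam >= y_k."""
    c = np.r_[1.0, np.zeros(n)]                       # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-X[k], X.T]                          # sum_j lam_j x_ij - theta x_ik <= 0
    A_out = np.c_[np.zeros(s), -Y.T]                  # -sum_j lam_j y_rj <= -y_rk
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for k in range(n):
    print(f"branch {k + 1}: efficiency = {ccr_efficiency(k):.3f}")
```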

Keywords: Nina Distribution Company branches, window data envelopment analysis, Malmquist productivity index.

PDF Downloads: 1162
3995 Waste-Based Surface Modification to Enhance Corrosion Resistance of Aluminium Bronze Alloy

Authors: Wilson Handoko, Farshid Pahlevani, Isha Singla, Himanish Kumar, Veena Sahajwalla

Abstract:

Aluminium bronze alloys are well known for their superior abrasion resistance, tensile strength, and non-magnetic properties, owing to the co-presence of iron (Fe) and aluminium (Al) as alloying elements, and they have been commonly used in many industrial applications. However, continuous exposure to the marine environment accelerates the risk of failure of aluminium bronze alloy parts. Although a higher level of corrosion resistance can be achieved by modifying the elemental composition, this comes at a price through a complex manufacturing process and increases the risk of reduced ductility of the alloy. In this research, ironmaking slag and waste plastic were used as the input sources for the surface modification of an aluminium bronze alloy. Microstructural analysis was conducted using polarised light microscopy and scanning electron microscopy (SEM) equipped with energy dispersive spectroscopy (EDS). An electrochemical corrosion test was carried out using the Tafel polarisation method, and the protection efficiency with respect to the base material was calculated. The results indicate that the uniform modified surface, which is the result of a selective diffusion process, enhances the corrosion resistance by up to 12.67%. This approach opens a new opportunity for various industrial applications at commercial scale, minimising the dependency on natural resources by transforming waste sources into protective coatings in an environmentally friendly and cost-effective way.
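
The protection-efficiency calculation from Tafel results can be sketched as follows, assuming the usual definition PE = (1 - i_corr,coated / i_corr,bare) x 100; the current densities are assumed values chosen only to illustrate the order of the reported improvement.

```python
# Hedged sketch of the protection-efficiency calculation from Tafel test results.
def protection_efficiency(i_corr_bare, i_corr_coated):
    """PE (%) = (1 - i_corr,coated / i_corr,bare) * 100."""
    return (1.0 - i_corr_coated / i_corr_bare) * 100.0

i_bare = 4.10e-6     # A/cm^2, untreated aluminium bronze (assumed value)
i_coated = 3.58e-6   # A/cm^2, waste-based modified surface (assumed value)
print(f"protection efficiency = {protection_efficiency(i_bare, i_coated):.2f}%")
```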

Keywords: Aluminium bronze, waste-based surface modification, Tafel polarisation, corrosion resistance.

PDF Downloads: 1054