Search results for: Emergence procedure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1041

231 Fighter Aircraft Evaluation and Selection Process Based on Triangular Fuzzy Numbers in Multiple Criteria Decision Making Analysis Using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS)

Authors: C. Ardil

Abstract:

This article presents a multiple criteria evaluation approach to uncertainty, vagueness, and imprecision analysis for ranking alternatives with fuzzy data for decision making using the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The fighter aircraft evaluation and selection decision making problem is modeled in a fuzzy environment with triangular fuzzy numbers. The fuzzy decision information related to the fighter aircraft selection problem is taken into account in ordering the alternatives and selecting the best candidate. The basic fuzzy TOPSIS procedure steps transform fuzzy decision matrices into matrices of alternatives evaluated according to all decision criteria. A practical numerical example illustrates the proposed approach to the fighter aircraft selection problem.

Keywords: triangular fuzzy number (TFN), multiple criteria decision making analysis, decision making, aircraft selection, MCDMA, fuzzy TOPSIS
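
The abstract summarizes the fuzzy TOPSIS steps without listing them explicitly; the sketch below illustrates the commonly used Chen-style procedure with triangular fuzzy numbers (TFNs), the vertex distance, and the closeness coefficient. The ratings, weights, and ideal solutions are illustrative assumptions, not the paper's data.

```python
# Minimal fuzzy TOPSIS sketch with triangular fuzzy numbers (l, m, u).
import numpy as np

# ratings[i][j] = TFN rating of alternative i on benefit criterion j (illustrative)
ratings = np.array([
    [(7, 9, 10), (5, 7, 9),  (3, 5, 7)],
    [(5, 7, 9),  (7, 9, 10), (5, 7, 9)],
    [(3, 5, 7),  (5, 7, 9),  (7, 9, 10)],
], dtype=float)
weights = np.array([(0.3, 0.4, 0.5), (0.2, 0.3, 0.4), (0.2, 0.3, 0.4)])

# 1) Normalise each benefit criterion by its largest upper bound u*.
u_star = ratings[:, :, 2].max(axis=0)
norm = ratings / u_star[None, :, None]

# 2) Weight the normalised TFNs component-wise.
weighted = norm * weights[None, :, :]

# 3) Fuzzy ideal solutions per criterion: FPIS = (1,1,1), FNIS = (0,0,0).
fpis, fnis = np.ones(3), np.zeros(3)

def vertex_distance(a, b):
    """d(A, B) = sqrt((1/3) * sum of squared differences of (l, m, u))."""
    return np.sqrt(((a - b) ** 2).mean())

# 4) Distances to FPIS/FNIS and closeness coefficients rank the alternatives.
d_plus = np.array([sum(vertex_distance(tfn, fpis) for tfn in alt) for alt in weighted])
d_minus = np.array([sum(vertex_distance(tfn, fnis) for tfn in alt) for alt in weighted])
cc = d_minus / (d_plus + d_minus)
print("closeness coefficients:", np.round(cc, 3), "best alternative:", int(cc.argmax()))
```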

230 PCR based Detection of Food Borne Pathogens

Authors: Archana Panchapakesan Iyer, Taha Abdullah Kumosani

Abstract:

Many high-risk pathogens that cause disease in humans are transmitted through various food items. Food-borne disease constitutes a major public health problem, and assessment of the quality and safety of foods is important for human health. Rapid and easy detection of pathogenic organisms facilitates precautionary measures to maintain healthy food. The Polymerase Chain Reaction (PCR) is a handy tool for rapid detection of low numbers of bacteria. We designed gene-specific primers for the most common food-borne pathogens, such as Staphylococci, Salmonella, and E. coli. Bacteria were isolated from food samples from various food outlets and identified using gene-specific PCRs. We identified Staphylococci, Salmonella, and E. coli O157 in various food samples by rapid, direct PCR using gene-specific primers. This study helps to build a complete picture of the pathogens that threaten to cause and spread food-borne disease, and it would also enable the establishment of a routine procedure and methodology for rapid identification of food-borne bacteria by direct PCR. It will also enable us to judge the efficiency of the food safety measures currently taken by food manufacturers and exporters.

Keywords: food borne pathogens, PCR, food safety, rapid detection.

229 Monitoring Blood Pressure Using Regression Techniques

Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim

Abstract:

Blood pressure gives physicians deep insight into the cardiovascular system, and determining an individual's blood pressure is a standard clinical procedure for assessing cardiovascular problems. Conventional techniques for measuring blood pressure (e.g. the cuff method) allow only a limited number of readings over a given period (e.g. every 5-10 minutes). Additionally, these systems disturb blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or critically ill persons. In this paper, the most important statistical features of the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features with the highest correlation with blood pressure. The results show that the mean stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model representing both features for Systolic Blood Pressure (SBP) and Diastolic Blood Pressure (DBP) was obtained using a statistical regression technique. Surface fitting was used to best fit the series of data, and the results show that the error in estimating SBP is 4.95% and in estimating DBP is 3.99%.

Keywords: Blood pressure, noninvasive optical system, PCA, continuous monitoring.
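
The pipeline described (PPG-derived features, PCA, then regression models for SBP and DBP) can be sketched as below. The feature matrix and targets are synthetic placeholders, and the percentage-error metric is a stand-in for the paper's surface-fitting evaluation.

```python
# Sketch: PPG features -> PCA -> regression for SBP/DBP (synthetic data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 12))            # 40 subjects x 12 PPG features (placeholder)
sbp = 110 + 10 * X[:, 0] - 5 * X[:, 1] + rng.normal(scale=2, size=40)
dbp = 75 + 6 * X[:, 0] - 3 * X[:, 2] + rng.normal(scale=2, size=40)

X_tr, X_te, sbp_tr, sbp_te, dbp_tr, dbp_te = train_test_split(
    X, sbp, dbp, test_size=0.25, random_state=0)

# PCA keeps the components carrying most of the feature variance; the paper
# instead keeps the features most correlated with blood pressure.
pca = PCA(n_components=4).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

for name, y_tr, y_te in [("SBP", sbp_tr, sbp_te), ("DBP", dbp_tr, dbp_te)]:
    model = LinearRegression().fit(Z_tr, y_tr)
    err = mean_absolute_percentage_error(y_te, model.predict(Z_te)) * 100
    print(f"{name}: mean absolute percentage error = {err:.2f}%")
```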

228 Characterization of the Microbial Induced Carbonate Precipitation Technique as a Biological Cementing Agent for Sand Deposits

Authors: Sameh Abu El-Soud, Zahra Zayed, Safwan Khedr, Adel M. Belal

Abstract:

Population growth in Egypt is driving a need for horizontal land development in order to benefit from different natural resources and expand beyond the narrow Nile valley. However, this development faces challenges that prevent land and agricultural development. Desertification and moving sand dunes in the western sector of Egypt are considered the major obstacles blocking ideal land use and development. In the proposed research, sandy soil is treated biologically using Bacillus pasteurii bacteria, as these bacteria have the ability to bond sand particles, changing the soil from loose sand to cemented sand and thereby reducing the mobility of the sand dunes. The procedure for implementing the Microbial Induced Carbonate Precipitation (MICP) technique is examined, and the different factors affecting this process, such as the medium used for bacteria sample preparation, the optical density (OD600), the reactant concentration, and the injection rates and intervals, are highlighted. Based on the findings of the MICP treatment of sandy soil, conclusions and future recommendations are presented.

Keywords: Soil stabilization, biological treatment, MICP, sand cementation.

227 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivation for this work is the question of how many devote themselves to discovery in a world of science where much has been discerned and revealed but, at the same time, much remains unknown. The building blocks of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a + 3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. According to the given key, the string is divided into several substrings, each of length k characters. The next step encodes each substring in the list of substrings. Encoding is based on the Caesar cipher, i.e. shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b + 1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list has been processed. Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than other methods from the point of view of execution time and storage space.

Keywords: Ciphering and deciphering, Authentic Algorithm, Polyalphabetic Cipher, Random Key, methods comparison.
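
A minimal sketch of the shifting scheme as described in the abstract is given below. The lowercase-only alphabet handling, the fixed substring length, and the treatment of the key (passed in explicitly rather than drawn at random between a and b) are assumptions of this sketch, not details confirmed by the paper.

```python
# Sketch of the described polyalphabetic scheme: split the text into
# substrings of length key, Caesar-shift each substring by the current k,
# increment k per substring, and reset k once it exceeds b + 1.
import string

ALPHABET = string.ascii_lowercase

def shift_char(ch, k):
    if ch not in ALPHABET:
        return ch                         # non-alphabet characters pass through unchanged
    return ALPHABET[(ALPHABET.index(ch) + k) % 26]

def latin_djokovic(text, key, a, encipher=True):
    """key is assumed to have been drawn once from [a, a + 3] and then shared."""
    b = a + 3
    k = key
    chunks = [text[i:i + key] for i in range(0, len(text), key)]  # fixed-length substrings
    sign = 1 if encipher else -1
    out = []
    for chunk in chunks:
        out.append("".join(shift_char(c, sign * k) for c in chunk))
        k += 1                            # next substring uses a shift larger by one
        if k > b + 1:                     # reset once the shift exceeds b + 1
            k = key
    return "".join(out)

cipher = latin_djokovic("attackatdawn", key=3, a=2)
print(cipher, latin_djokovic(cipher, key=3, a=2, encipher=False))
```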

226 Optimization of Hemp Fiber Reinforced Concrete for Mix Design Method

Authors: Zoe Chang, Max Williams, Gautham Das

Abstract:

The purpose of this study is to evaluate the incorporation of hemp fibers (HF) in concrete. Hemp fiber reinforced concrete (HFRC) is becoming more popular as an alternative to regular mix designs. This study evaluated the compressive strength of HFRC with respect to the mixing procedure. HF were obtained from the manufacturer and hand processed to ensure uniformity in width and length. The fibers were added to the concrete in both wet and dry mixes to investigate and optimize the mix design process. Results indicated that the dry mix had a compressive strength of 1157 psi compared to 985 psi for the wet mix. The dry-mix compressive strength was within range of the standard mix compressive strength of 1533 psi. The statistical analysis revealed that the mix design process needs further optimization and uniformity concerning the addition of HF. Regression analysis showed that the standard mix design had a coefficient of 0.9 compared to 0.375 for the dry mix, indicating variation in the mixing process. During the dry mix, the addition of plain HF caused the fibers to intertwine, creating lumps and inconsistency. However, during the wet mixing process, combining water and HF before incorporation allowed the fibers to disperse uniformly within the mix; hence the regression analysis indicated a better coefficient of 0.55. This study concludes that HFRC is a viable alternative to regular mixes; however, more research on its characteristics needs to be conducted.

Keywords: hemp fibers, hemp reinforced concrete, wet and dry, freeze thaw testing, compressive strength

225 Structured Phospholipids from Commercial Soybean Lecithin Containing Omega-3 Fatty Acids Reduce Atherosclerosis Risk in Male Sprague Dawley Rats Fed an Atherogenic Diet

Authors: Jaya Mahar Maligan, Teti Estiasih, Joni Kusnadi

Abstract:

Producing structured phospholipids from commercial soybean lecithin and omega-3-enriched oil from a by-product of tuna canning is an alternative procedure that stabilizes the omega-3 fatty acid structure and increases its bioactive function in metabolism. The best treatment condition was obtained with an 18-hour acidolysis reaction and a 30% enzyme concentration, for which the EPA-DHA incorporation level was 127.47 mg/g and the EPA-DHA incorporation percentage in the phospholipid structure was 51.04%. These structured phospholipids could reduce atherosclerosis risk in male Sprague Dawley rats. Provision of structured phospholipids had a significant effect (α = 0.05) on changes in the lipid profile and aortic intima-media thickness of male Sprague Dawley rats fed an atherogenic diet. Structured phospholipid intake lowered total cholesterol by 78.36 mg/dL, total triglycerides by 94.57 mg/dL, and LDL by 87.08 mg/dL, and increased HDL by as much as 12.64 mg/dL over 10 weeks of treatment. Structured phospholipid intake can also prevent thickening of the intima-media layer of the aorta.

Keywords: Structured phospholipids, commercial soybean lecithin, omega-3 fatty acid, atherosclerosis risk.

224 Automated Optic Disc Detection in Retinal Images of Patients with Diabetic Retinopathy and Risk of Macular Edema

Authors: Arturo Aquino, Manuel Emilio Gegundez, Diego Marin

Abstract:

In this paper, a new automated methodology to detect the optic disc (OD) in retinal images from patients at risk of Diabetic Retinopathy (DR) and Macular Edema (ME) is presented. The detection procedure comprises two independent methodologies. On one hand, a location methodology obtains a pixel that belongs to the OD using image contrast analysis and structure filtering techniques; on the other hand, a boundary segmentation methodology estimates a circular approximation of the OD boundary by applying mathematical morphology, edge detection techniques and the Circular Hough Transform. The methodologies were tested on a set of 1200 images composed of 229 retinographies from patients affected by DR with risk of ME, 431 with DR and no risk of ME, and 540 images of healthy retinas. The location methodology obtained a 98.83% success rate, whereas the OD boundary segmentation methodology obtained a good circular OD boundary approximation in 94.58% of cases. The average computational time measured over the total set was 1.67 seconds for OD location and 5.78 seconds for OD boundary segmentation.

Keywords: Diabetic retinopathy, macular edema, optic disc, automated detection, automated segmentation.
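
The boundary-segmentation idea (morphological suppression of vessels followed by a circular Hough transform) can be sketched with OpenCV as below. The file name, kernel size, and Hough parameters are placeholders, and the paper's full two-stage location/segmentation pipeline is not reproduced.

```python
# Rough sketch of circular OD boundary approximation (not the paper's exact pipeline).
import cv2
import numpy as np

img = cv2.imread("retinography.png")            # hypothetical input image
green = img[:, :, 1]                            # green channel usually gives best OD contrast
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
closed = cv2.morphologyEx(green, cv2.MORPH_CLOSE, kernel)   # remove vessels crossing the OD
smoothed = cv2.medianBlur(closed, 7)

# Circular Hough transform; radius bounds depend on image resolution.
circles = cv2.HoughCircles(
    smoothed, cv2.HOUGH_GRADIENT, dp=2, minDist=200,
    param1=100, param2=40, minRadius=30, maxRadius=90)

if circles is not None:
    x, y, r = (int(v) for v in np.round(circles[0, 0]))
    cv2.circle(img, (x, y), r, (0, 255, 0), 2)  # draw the circular OD approximation
    cv2.imwrite("od_boundary.png", img)
```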

223 An Optimal Algorithm for Finding (r, Q) Policy in a Price-Dependent Order Quantity Inventory System with Soft Budget Constraint

Authors: S. Hamid Mirmohammadi, Shahrazad Tamjidzad

Abstract:

This paper is concerned with a single-item continuous review inventory system in which demand is stochastic and discrete. The budget consumed for purchasing the ordered items is not restricted, but an extra cost is incurred when it exceeds a specific value. The unit purchasing price depends on the quantity ordered under an all-units discount cost structure. In many actual systems, the budget, as a resource occupied by the purchased items, is limited, and the system can confront the resource shortage by incurring additional costs. Thus, considering resource shortage costs as part of the system costs, especially when the amount of resource occupied by the purchased item is influenced by quantity discounts, is well motivated by practical concerns. In this paper, an optimization problem is formulated for finding the optimal (r, Q) policy when the system is subject to both a budget limitation and discount pricing. Properties of the cost function are investigated, and an algorithm based on a one-dimensional search procedure is then proposed for finding an optimal (r, Q) policy that minimizes the expected system costs.

Keywords: (r, Q) policy, Stochastic demand, backorders, limited resource, quantity discounts.
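
To illustrate the structure of a one-dimensional search over Q under an all-units discount and a soft budget, a deliberately simplified, deterministic stand-in for the cost function is sketched below. It is not the paper's stochastic (r, Q) model, and all data are illustrative.

```python
# Toy one-dimensional search over Q with an all-units discount and a soft
# budget penalty (EOQ-style deterministic stand-in, illustrative data only).
D, K, h = 1200.0, 50.0, 0.2              # annual demand, fixed order cost, holding cost rate
budget, penalty = 900.0, 0.05            # soft budget per order, overrun penalty rate
breaks = [(0, 1.00), (200, 0.95), (500, 0.90)]   # all-units discount: (minimum Q, unit price)

def unit_price(Q):
    # all-units discount: apply the deepest break whose minimum quantity is met
    return [p for q, p in breaks if Q >= q][-1]

def annual_cost(Q):
    c = unit_price(Q)
    ordering = K * D / Q                 # setup cost per year
    holding = h * c * Q / 2.0            # average inventory carrying cost
    overrun = max(0.0, c * Q - budget)   # budget occupied by one order beyond the soft limit
    return c * D + ordering + holding + penalty * overrun * D / Q

best_Q = min(range(1, 1001), key=annual_cost)
print(best_Q, unit_price(best_Q), round(annual_cost(best_Q), 2))
```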

222 An Angioplasty Intervention Simulator with a Specific Virtual Environment

Authors: G. Aloisio, L. T. De Paolis, A. De Mauro, A. Mongelli

Abstract:

One of the essential requirements of a realistic surgical simulator is to reproduce the haptic sensations caused by interactions in the virtual environment. However, the interaction needs to be performed in real time, since a delay between the user action and the system reaction reduces the sense of immersion. In this paper, a prototype of a coronary stent implant simulator is presented; this system allows real-time interactions with an artery by means of a specific haptic device. To improve the realism of the simulation, the virtual environment is built from real patients' images, and a Web Portal is used to search geographically remote medical centres for a virtual environment with specific features in terms of pathology or anatomy. The functional architecture of the system defines several Medical Centres in which virtual environments built from real patients' images, together with related metadata describing pathology or anatomy, are stored. The selected data are downloaded from the Medical Centre to the Training Centre, which is provided with a specific haptic device and with the software necessary to manage the interaction in the virtual environment. After the integration of the virtual environment into the simulation system, it is possible to perform training on the specific surgical procedure.

Keywords: Medical Simulation, Web Portal, Virtual Reality.

221 Nonlinear Effects in Stiffness Modeling of Robotic Manipulators

Authors: A. Pashkevich, A. Klimchik, D. Chablat

Abstract:

The paper focuses on enhanced stiffness modeling of robotic manipulators that takes into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example that deals with the stiffness analysis of a parallel manipulator of the Orthoglide family.

Keywords: Robotic manipulators, Stiffness model, Loaded mode, Nonlinear effects, Buckling, Orthoglide manipulator
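
For reference, the conventional virtual-joint stiffness formulation that the paper contrasts against (valid for the unloaded mode and small displacements) can be written as below, where δθ are the virtual-joint deflections, K_θ the matrix of virtual-spring stiffnesses, J_θ the Jacobian with respect to the virtual joints, F the external wrench, and δt the end-effector displacement. The paper's loaded formulation additionally updates the posture, Jacobian, and Hessian terms under load.

```latex
% Conventional (unloaded, small-displacement) VJM stiffness model the paper extends
\delta t = J_{\theta}\,\delta\theta, \qquad
\tau_{\theta} = J_{\theta}^{T} F, \qquad
\tau_{\theta} = K_{\theta}\,\delta\theta
\;\;\Longrightarrow\;\;
\delta t = \bigl(J_{\theta} K_{\theta}^{-1} J_{\theta}^{T}\bigr) F, \qquad
K_C = \bigl(J_{\theta} K_{\theta}^{-1} J_{\theta}^{T}\bigr)^{-1}.
```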

220 Identity Management in Virtual Worlds Based on Biometrics Watermarking

Authors: S. Bader, N. Essoukri Ben Amara

Abstract:

With technological development and the rise of virtual worlds, these spaces are becoming more and more attractive to cybercriminals hidden behind avatars and fictitious identities. Since access to these spaces is not restricted or controlled, some impostors take advantage of this to gain unauthorized access and engage in cybercrime. This paper proposes an identity management approach for securing access to virtual worlds. The major purpose of the suggested solution is to install a strong security mechanism to protect virtual identities represented by avatars. Thus, only legitimate users, through their corresponding avatars, are allowed to access the platform resources. Access is controlled by integrating an authentication process based on biometrics. During registration, a user fingerprint is enrolled and then encrypted into a watermark using a cancelable and non-invertible algorithm for its protection. After a user personalizes their representative character, the biometric mark is embedded into the avatar through a watermarking procedure. The authenticity of the avatar identity is verified when it requests authorization for access. We have evaluated the proposed approach on a dataset of avatars from various virtual worlds, and we have obtained promising performance results in terms of authentication accuracy and acceptance and rejection rates.

Keywords: Identity management, security, biometrics authentication and authorization, avatar, virtual world.

219 Experimental and Theoretical Study on Hygrothermal Aging Effect on Mechanical Behavior of Fiber Reinforced Plastic Laminates

Authors: S. Larbi, R. Bensaada, S. Djebali, A. Bilek

Abstract:

The manufacture of composite parts is a major issue in many industrial domains. Polymer composite materials are ideal for structural applications where high strength-to-weight and stiffness-to-weight ratios are required. However, exposure to extreme environmental conditions (temperature, humidity) affects the mechanical properties of organic composite materials and leads to undesirable degradation. Aging mechanisms in organic matrices are very diverse and vary according to the polymer and the aging conditions, such as temperature and humidity. This paper studies the effect of hygrothermal aging at 40 °C in different exposure environments on the mechanical properties of fiber reinforced plastic laminates. Two composite materials are used in the study (carbon fiber/epoxy and glass fiber/vinyl ester, each with two stacking sequences, [90₄/0₄] and [45₄/0₄]). The experimental procedure includes mechanical characterization of the materials in the virgin state and exposure of specimens to two environments (seawater and demineralized water). Absorption kinetics are determined for both materials and both stacking sequences. Three-point bending tests are performed on the aged materials in order to determine the hygrothermal effect on their mechanical properties.

Keywords: FRP laminates, hygrothermal aging, mechanical properties, theory of laminates.

218 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates

Authors: Abeer Amayri, Akif A. Bulgak

Abstract:

Due to today’s globalization and companies’ outsourcing practices, Supply Chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed places, where there is more chance for disruptions. One such disruption is the quality and delivery uncertainty of outsourcing. These uncertainties could lead to unsafe products and, as a number of recent examples show, companies may end up recalling their products. As a result of these problems, there is a need to develop a methodology for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, as well as a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, our target is the study of global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers’ local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers to illustrate how the proposed model works in practice.

Keywords: Global supply chains, quality, stochastic programming, supplier selection.

217 The Comparison of Nuclear Factor Kappa Beta (NFKB) Activation in Rattus norvegicus Strain Wistar Induced by a High Fat Diet (HFD) of Various Durations

Authors: Titin Andri Wihastuti, Djanggan Sargowo

Abstract:

NFκB is a transcription factor regulating many functions of the vessel wall. Under normal conditions, NFκB shows diffuse cytoplasmic expression, suggesting that the system is inactive. Activation of NFκB provides a potential pathway for the rapid transcription of a variety of genes encoding cytokines, growth factors, adhesion molecules and procoagulatory factors, and it is likely to play an important role in chronic inflammatory diseases, including atherosclerosis. There are many stimuli with the potential to activate NFκB, including hyperlipidemia. We used 24 rats divided into 6 groups. The HFD was given ad libitum for 2, 4, and 6 months. The parameters in this study were the amount of NFκB activation, H2O2 as a marker of ROS, and VCAM-1 as a product of NFκB activation. The H2O2 colorimetric assay was performed directly using an Anti Rat H2O2 ELISA Kit. NFκB and VCAM-1 were detected in the aorta and measured by ELISA kit and immunohistochemistry. There were significant differences in H2O2, NFκB activation and VCAM-1 levels after 2, 4 and 6 months of HFD. This suggests that HFD induces ROS formation and increases the activation of NFκB as an atherosclerosis marker caused by hyperlipidemia, a classical atherosclerosis risk factor.

Keywords: High Fat Diet, NFKB, H2O2, atherosclerosis

216 Dynamic Variational Multiscale LES of Bluff Body Flows on Unstructured Grids

Authors: Carine Moussaed, Stephen Wornom, Bruno Koobus, Maria Vittoria Salvetti, Alain Dervieux

Abstract:

The effects of dynamic subgrid scale (SGS) models are investigated in variational multiscale (VMS) LES simulations of bluff body flows. The spatial discretization is based on a mixed finite element/finite volume formulation on unstructured grids. In the VMS approach used in this work, the separation between the largest and the smallest resolved scales is obtained through a variational projection operator and finite volume cell agglomeration. The dynamic versions of the Smagorinsky and WALE SGS models are used to account for the effects of the unresolved scales; in the VMS approach, these effects are modeled only in the smallest resolved scales. The dynamic VMS-LES approach is applied to the simulation of the flow around a circular cylinder at Reynolds numbers 3900 and 20000 and of the flow around a square cylinder at Reynolds numbers 22000 and 175000. As in previous studies, it is observed that the dynamic SGS procedure has a smaller impact on the results within the VMS approach than in classical LES, but improvements are demonstrated for important features such as the recirculating part of the flow. The global prediction is improved for a small extra computational cost.

Keywords: variational multiscale LES, dynamic SGS model, unstructured grids, circular cylinder, square cylinder.

215 Multi-matrix Real-coded Genetic Algorithm for Minimising Total Costs in Logistics Chain Network

Authors: Pupong Pongcharoen, Aphirak Khadwilard, Anothai Klakankhai

Abstract:

The importance of supply chain and logistics management has been widely recognised. Effective management of the supply chain can reduce costs and lead times and improve responsiveness to changing customer demands. This paper proposes a multi-matrix real-coded Genetic Algorithm (MRGA) based optimisation tool that minimises the total costs associated with supply chain logistics. Because of the finite capacity constraints of all parties within the chain, a Genetic Algorithm (GA) often produces infeasible chromosomes during the initialisation and evolution processes. In the proposed algorithm, a chromosome initialisation procedure and crossover and mutation operations that always guarantee feasible solutions were embedded. The proposed algorithm was tested using three sizes of benchmark logistics chain network datasets, which are typical of those faced by most global manufacturing companies. A half fractional factorial design was carried out to investigate the influence of alternative crossover and mutation operators by varying GA parameters. The analysis of the experimental results suggested that the quality of the solutions obtained is sensitive to the ways in which the genetic parameters and operators are set.

Keywords: Genetic Algorithm, Logistics, Optimisation, Supply Chain.
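
A generic real-coded GA skeleton conveying the feasibility-preserving idea (initialisation and variation followed by a repair step that keeps allocations within supplier capacities while meeting demand) is sketched below. The encoding, operators, and data are simplified stand-ins and do not reproduce the paper's multi-matrix representation.

```python
# Generic real-coded GA with a repair operator that keeps chromosomes feasible.
import numpy as np

rng = np.random.default_rng(1)
n_suppliers, demand = 4, 100.0
capacity = np.array([60.0, 50.0, 40.0, 30.0])
unit_cost = np.array([2.0, 2.5, 1.8, 3.0])

def repair(x):
    """Scale allocations toward total demand, then clip to supplier capacities."""
    x = np.clip(x, 0.0, capacity)
    if x.sum() > 0:
        x = x * demand / x.sum()
    return np.clip(x, 0.0, capacity)

def cost(x):
    shortfall = max(0.0, demand - x.sum())      # penalise any remaining unmet demand
    return float(unit_cost @ x + 1e3 * shortfall)

pop = np.array([repair(rng.uniform(0, capacity)) for _ in range(30)])
for _ in range(200):
    fit = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(fit)[:10]]         # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        w = rng.uniform(size=n_suppliers)       # blend (arithmetic) crossover
        child = w * a + (1 - w) * b
        child += rng.normal(scale=2.0, size=n_suppliers)  # Gaussian mutation
        children.append(repair(child))          # repair keeps every child feasible
    pop = np.array(children)

best = min(pop, key=cost)
print(np.round(best, 1), round(cost(best), 2))
```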

214 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With the smart tag and the sensing network, a large quantity of dynamic information is immediately presented in the IoT. Through the internal communication and interaction, meaningful objects provide real-time services for users. Therefore, the service with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed the environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for the inference. The statistical analysis was conducted to achieve the following objectives: First, define user behaviors and predict the user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework, and establish the sequential intensity among behaviors to get acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between the probability of objects’ being used and the objects. The indicator can describe the possible limitations of the mechanism. As the process is recorded in the information of the system created in this study, these data can be reused to adjust the procedure of intelligent design services.

Keywords: Behavior, big data, hierarchical Hidden Markov Model, intelligent object.
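
The hierarchical model itself is too involved for a short sketch, but the flat HMM forward recursion underlying such activity inference can be illustrated as follows, with hypothetical activity states and sensed location observations.

```python
# Flat HMM forward recursion over hypothetical activities and sensed locations.
import numpy as np

states = ["resting", "cooking", "working"]
obs_symbols = ["bedroom", "kitchen", "desk"]

pi = np.array([0.5, 0.2, 0.3])                    # initial state probabilities
A = np.array([[0.7, 0.2, 0.1],                    # state transition matrix
              [0.3, 0.6, 0.1],
              [0.2, 0.1, 0.7]])
B = np.array([[0.8, 0.1, 0.1],                    # emission: P(location | activity)
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

def forward(obs):
    """Return P(observation sequence) and the filtered state distribution."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum(), alpha / alpha.sum()

seq = [obs_symbols.index(x) for x in ["bedroom", "kitchen", "kitchen", "desk"]]
likelihood, posterior = forward(seq)
print(f"sequence likelihood = {likelihood:.4f}")
print({s: round(p, 3) for s, p in zip(states, posterior)})
```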

213 Interoperability and Performance Analysis of IEC61850 Based Substation Protection System

Authors: Ming-Ta Yang, Jyh-Cherng Gu, Po-Chun Lin, Yen-Lin Huang, Chun-Wei Huang, Jin-Lung Guan

Abstract:

Since the IEC61850 substation communication standard represents the trend in developing new generations of Substation Automation Systems (SAS), many IED manufacturers pursue this technique and apply for KEMA certification. In order to bring products to market and meet customer demand as fast as possible, manufacturers often have their products certified only against basic environmental standards but claim conformance to IEC61850. Since verification institutes generally perform verification tests only on specific IEDs of the manufacturers, interoperability between all certified IEDs cannot be guaranteed; therefore, the interoperability between IEDs from different manufacturers needs to be tested. For these reasons, this study applies the definitions of the information models, communication services, GOOSE functionality and Substation Configuration Language (SCL) of IEC61850 to build the concept of the communication protocols and to build the test environment. The procedures for testing data collection and exchange in the P2P communication mode and the Client/Server communication mode of IEC61850 are outlined as follows. First, the GOOSE messaging capability of IEDs from different manufacturers is tested. Second, data are collected from each IED with a SCADA system and displayed on the SCADA platform through an HMI. Finally, problems generally encountered in the test procedure are summarized.

Keywords: GOOSE, IEC61850, IED, SCADA.

212 State Estimation of a Biotechnological Process Using Extended Kalman Filter and Particle Filter

Authors: R. Simutis, V. Galvanauskas, D. Levisauskas, J. Repsyte, V. Grincas

Abstract:

This paper deals with advanced state estimation algorithms for the estimation of biomass concentration and specific growth rate in a typical fed-batch biotechnological process. The biotechnological process was represented by a nonlinear mass-balance based process model. An Extended Kalman Filter (EKF) and a Particle Filter (PF) were used to estimate the unmeasured state variables from oxygen uptake rate (OUR) and base consumption (BC) measurements. To obtain more general results, a simplified process model was used in the EKF and PF estimation algorithms. This model does not require any special growth kinetic equations and could be applied for state estimation in various bioprocesses. The focus of this investigation was the comparison of the estimation quality of the EKF and PF estimators under different measurement noises. The simulation results show that the Particle Filter algorithm requires significantly more computation time for state estimation but gives lower estimation errors for both biomass concentration and specific growth rate. The tuning procedure for the Particle Filter is also simpler than for the EKF. Consequently, the Particle Filter should be preferred in real applications, especially for monitoring industrial bioprocesses, where simplified implementation procedures are always desirable.

Keywords: Biomass concentration, Extended Kalman Filter, Particle Filter, State estimation, Specific growth rate.
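
A minimal bootstrap particle filter on a scalar toy growth state is sketched below to illustrate the propagate-weight-resample cycle that the abstract compares against the EKF. The process model, noise levels, and growth rate are illustrative and not the paper's mass-balance model.

```python
# Bootstrap particle filter on a toy exponential-growth state with noisy measurements.
import numpy as np

rng = np.random.default_rng(42)
T, n_particles = 50, 500
mu, q, r = 0.05, 0.02, 0.1          # growth rate, process noise, measurement noise

# Simulate a "true" trajectory and noisy measurements.
x_true = np.empty(T); x_true[0] = 1.0
for t in range(1, T):
    x_true[t] = x_true[t - 1] * np.exp(mu + rng.normal(scale=q))
y = x_true + rng.normal(scale=r, size=T)

# Particle filter: propagate, weight by measurement likelihood, resample.
particles = rng.normal(1.0, 0.1, n_particles)
estimates = []
for t in range(T):
    if t > 0:
        particles = particles * np.exp(mu + rng.normal(scale=q, size=n_particles))
    weights = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)
    weights /= weights.sum()
    estimates.append(float(weights @ particles))          # weighted-mean estimate
    idx = rng.choice(n_particles, size=n_particles, p=weights)  # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(f"particle-filter RMSE over the trajectory: {rmse:.4f}")
```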

211 Power System with PSS and FACTS Controller: Modelling, Simulation and Simultaneous Tuning Employing Genetic Algorithm

Authors: Sidhartha Panda, Narayana Prasad Padhy

Abstract:

This paper presents a systematic procedure for the modelling and simulation of a power system installed with a power system stabilizer (PSS) and a flexible AC transmission system (FACTS)-based controller. For design purposes, the model of an example power system, a single-machine infinite-bus power system installed with the proposed controllers, is developed in MATLAB/SIMULINK. In the developed model, the synchronous generator is represented by model 1.1, which includes both the generator main field winding and the damper winding in the q-axis, so as to evaluate the impact of the PSS and the FACTS-based controller on power system stability. The model can be used for teaching power system stability phenomena and also for research work, especially for developing generator controllers using advanced technologies. Further, to avoid adverse interactions, the PSS and the FACTS-based controller are simultaneously designed employing a genetic algorithm (GA). Non-linear simulation results are presented for the example power system under various disturbance conditions to validate the effectiveness of the proposed modelling and simultaneous design approach.

Keywords: Genetic algorithm, modelling and simulation, MATLAB/SIMULINK, power system stabilizer, thyristor controlled series compensator, simultaneous design, power system stability.

210 Detecting Email Forgery using Random Forests and Naïve Bayes Classifiers

Authors: Emad E. Abdallah, A. F. Otoom, Arwa Saqer, Ola Abu-Aisheh, Diana Omari, Ghadeer Salem

Abstract:

As email communications have no consistent authentication procedure to ensure authenticity, we present an investigative analysis approach for detecting forged emails based on Random Forests and Naïve Bayes classifiers. Instead of investigating the email headers, we use the body content to extract a unique writing style for each of the possible suspects. Our approach consists of four main steps: (1) the cybercrime investigator extracts different effective features, including structural, lexical, linguistic, and syntactic evidence, from previous emails of all the possible suspects; (2) the extracted feature vectors are normalized to increase the accuracy rate; (3) the normalized features are then used to train the learning engine; (4) upon receiving the anonymous email M, we apply the feature extraction process to produce a feature vector. Finally, using the machine learning classifiers, the email is assigned to the suspect whose writing style most closely matches M. Experimental results on real data sets show the improved performance of the proposed method and its ability to identify authors with a very limited number of features.

Keywords: Digital investigation, cybercrimes, email forensics, anonymous emails, writing style, authorship analysis
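
A compact stand-in for the described pipeline (style features from email bodies, normalisation, and Random Forest / Naïve Bayes classification) is sketched below. Character n-gram TF-IDF replaces the paper's structural/lexical/linguistic/syntactic feature set, and the emails are toy data.

```python
# Stylometric attribution sketch: character n-gram features + RF and NB classifiers.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "pls see attached report, rgds",                                  # suspect A
    "pls find attached, will call later, rgds",                       # suspect A
    "Dear team, kindly find the minutes attached. Best regards.",     # suspect B
    "Dear all, kindly note the deadline below. Best regards.",        # suspect B
]
authors = ["A", "A", "B", "B"]

# TF-IDF over character n-grams captures writing style; sublinear_tf plays
# the role of the normalisation step in the described pipeline.
features = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4), sublinear_tf=True)

for clf in (RandomForestClassifier(n_estimators=200, random_state=0), MultinomialNB()):
    model = make_pipeline(features, clf)
    model.fit(emails, authors)
    anonymous = ["pls check attached file, rgds"]      # the anonymous email M
    print(type(clf).__name__, "->", model.predict(anonymous)[0])
```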

209 Delay Preserving Substructures in Wireless Networks Using Edge Difference between a Graph and its Square Graph

Authors: T. N. Janakiraman, J. Janet Lourds Rani

Abstract:

In practice, wireless networks have the property that signal strength attenuates with distance from the base station, so quality of service could be improved if nodes two hops away are also considered. In this paper, we propose a procedure to identify delay preserving substructures for a given wireless ad-hoc network using a new graph operation G² − E(G) = G* (the edge difference between the square graph of a given graph and the original graph). This operation helps to analyze induced substructures that preserve delay in communication among their nodes. The operation applied to a given graph G induces a graph G* in which the 1-hop neighbors of any node are at 2-hop distance in the original network. In this paper, we also identify some delay preserving substructures in G*: (i) any set of nodes that are mutually at 2-hop distance in G forms a clique in G*; (ii) a set of nodes forming an odd cycle C2k+1 in G forms an odd cycle in G*, while a set of nodes forming an even cycle C2k in G forms two disjoint companion cycles (of the same parity, odd/even) of length k in G*; (iii) every path of length 2k+1 or 2k in G induces two disjoint paths of length k in G*; and (iv) a set of nodes in G* that induces a maximal connected subgraph with radius 1 identifies a substructure with radius equal to 2 and diameter at most 4 in G. The above delay preserving substructures behave as good clusters in the original network.

Keywords: Clique, cycles, delay preserving substructures, maximal connected subgraph.
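
The graph operation G* = G² − E(G) can be reproduced directly with networkx, as sketched below on a small illustrative topology (an odd cycle), together with a check that the edges of G* connect exactly the 2-hop pairs of G.

```python
# Build G* = G^2 - E(G): the edge difference between the square graph and the original.
import networkx as nx

G = nx.cycle_graph(7)                       # toy network: an odd cycle C7
G_star = nx.difference(nx.power(G, 2), G)   # keeps only edges between exact 2-hop pairs

# Nodes adjacent in G* are exactly 2 hops apart in G.
for u, v in G_star.edges():
    assert nx.shortest_path_length(G, u, v) == 2

# Example substructure from the abstract: an odd cycle in G induces an odd cycle in G*.
print("G* is connected:", nx.is_connected(G_star))
print("cycles in G*:", nx.cycle_basis(G_star))
```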

208 Ethanol Production from Sugarcane Bagasse by Means of Enzymes Produced by Solid State Fermentation Method

Authors: Nasim Shaibani, Saba Ghazvini, Mohammad R. Andalibi, Soheila Yaghmaei

Abstract:

Nowadays there is growing interest in biofuel production in most countries because of increasing concerns about hydrocarbon fuel shortages and global climate change, as well as for enhancing the agricultural economy and meeting local transportation fuel needs. Ethanol can be produced from biomass by hydrolysis and sugar fermentation processes. In this study, ethanol was produced from sugarcane bagasse without using expensive commercial enzymes. Alkali pretreatment was used to prepare the biomass before enzymatic hydrolysis; the comparison between NaOH, KOH and Ca(OH)2 shows that NaOH is more effective on bagasse. The enzymes required for biomass hydrolysis were produced from sugarcane by solid state fermentation using two fungi: Trichoderma longibrachiatum and Aspergillus niger. The results show that the enzyme solution produced via A. niger performed better than that produced via T. longibrachiatum. Ethanol was produced by simultaneous saccharification and fermentation (SSF) with the crude enzyme solution from T. longibrachiatum and Saccharomyces cerevisiae yeast. To evaluate this procedure, SSF of pretreated bagasse was also carried out using Celluclast 1.5L from Novozymes. The ethanol yields with the commercial enzyme and with the enzyme solution produced via T. longibrachiatum were 81% and 50%, respectively.

Keywords: Alkali pretreatment, bioethanol, cellulase, simultaneous saccharification and fermentation, solid state fermentation, sugarcane bagasse

207 BIDENS: Iterative Density Based Biclustering Algorithm With Application to Gene Expression Analysis

Authors: Mohamed A. Mahfouz, M. A. Ismail

Abstract:

Biclustering is a very useful data mining technique for identifying patterns in which different genes are correlated based on a subset of conditions in gene expression analysis. Association rule mining is an efficient approach to achieve biclustering, as in the BIMODULE algorithm, but it is sensitive to the values given to its input parameters and to the discretization procedure used in the preprocessing step; moreover, when noise is present, classical association rule miners discover multiple small fragments of the true bicluster but miss the true bicluster itself. This paper formally presents a generalized noise tolerant bicluster model, termed μBicluster. An iterative algorithm based on the proposed model, termed BIDENS, is introduced that can discover a set of k possibly overlapping biclusters simultaneously. Our model uses a more flexible method to partition the dimensions in order to preserve meaningful and significant biclusters. The proposed algorithm allows discovering biclusters that are hard to discover with BIMODULE. An experimental study on yeast and human gene expression data and several artificial datasets shows that our algorithm offers substantial improvements over several previously proposed biclustering algorithms.

Keywords: Machine learning, biclustering, bi-dimensional clustering, gene expression analysis, data mining.

206 Development of Face Surrogate for Impact Protection Design for Cyclist

Authors: Sanga Monthatipkul, Pio Iovenitti, Igor Sbarski

Abstract:

Bicycle usage for exercise, recreation, and commuting to work in Australia shows that pedal cycling is the fourth most popular activity, with a 10.6% increase in participants between 2001 and 2007. As with other means of transport, accidents and injuries remain common, even though mandatory bicycle helmet wearing has been introduced. This research aims to develop a face surrogate made of a sandwich of rigid foam and rubber sheets to represent human facial bone under blunt impact. The facial surrogate will serve as an important test device for the further development of facial-impact protection for cyclists. A test procedure was developed to simulate the energy of impact and record data to evaluate the effect of impact on facial bones. Drop tests were performed to establish a suitable combination of materials. It was found that a sandwich structure of rigid extruded-polystyrene foam (density of 40 kg/m3, with a pattern of 6-mm holes), Neoprene rubber sponge, and Abrasaflex rubber backing had impact characteristics comparable to those of human facial bone. In particular, foam thicknesses of 30 mm and 25 mm were found suitable to represent the human zygoma (cheekbone) and maxilla (upper-jaw bone), respectively.

Keywords: Facial impact protection, face surrogate, cyclist, accident prevention

205 Fuzzy Uncertainty Theory for Stealth Fighter Aircraft Selection in Entropic Fuzzy TOPSIS Decision Analysis Process

Authors: C. Ardil

Abstract:

The purpose of this paper is to present fuzzy TOPSIS in an entropic fuzzy environment. Due to the ambiguous concepts often represented in decision data, exact values are insufficient to model real-life situations. In this paper, the rating of each alternative is defined in fuzzy linguistic terms, which can be expressed with triangular fuzzy numbers. The weight of each criterion is then derived from the decision matrix using the entropy weighting method. Next, a vertex method is proposed to calculate the distance between two triangular fuzzy numbers. According to the TOPSIS concept, a closeness coefficient is defined to determine the ranking order of all alternatives by simultaneously calculating the distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). Finally, an illustrative example of selecting stealth fighter aircraft is shown at the end of this article to highlight the procedure of the proposed method. Correlation analysis and validation analysis using TOPSIS, WSM, and WPM methods were performed to compare the ranking order of the alternatives.

Keywords: stealth fighter aircraft selection, fuzzy uncertainty theory (FUT), fuzzy entropic decision (FED), fuzzy linguistic variables, triangular fuzzy numbers, multiple criteria decision making analysis, MCDMA, TOPSIS, WSM, WPM
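
The entropy weighting step, which derives criterion weights from the decision matrix itself, can be sketched as follows on a crisp (defuzzified) matrix. The numbers are illustrative, and zero entries would need special handling before taking logarithms.

```python
# Entropy weighting: criteria with more dispersed columns receive larger weights.
import numpy as np

X = np.array([[0.7, 0.5, 0.9],      # alternatives x criteria (illustrative values)
              [0.6, 0.8, 0.4],
              [0.9, 0.6, 0.5]])

P = X / X.sum(axis=0)                               # column-wise proportions
m = X.shape[0]
E = -(P * np.log(P)).sum(axis=0) / np.log(m)        # entropy of each criterion
d = 1.0 - E                                         # degree of divergence
w = d / d.sum()                                     # entropy weights
print("entropy weights:", np.round(w, 3))
```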

204 Evaluation of Medication Administration Process in a Paediatric Ward

Authors: Zayed N. Alsulami, Asma F. Aldosseri, Ahmed S. Ezziden, Abdulrahman K. Alonazi

Abstract:

Children are more susceptible to medication errors than adults. The medication administration process is the last stage of the medication treatment process, and most errors are detected at this stage. Little research has been undertaken on medication errors in children in Middle Eastern countries. This study aimed to evaluate how paediatric nurses adhere to the medication administration policy and to identify medication preparation and administration errors and associated risk factors. An observational, prospective study of the medication administration process, from medication preparation until administration (May to August 2014), was conducted in Saudi Arabia. Twelve paediatric nurses serving 90 paediatric patients were observed, and 456 administered drug doses were evaluated. The adherence rate was variable in 7 of the 16 steps; checking patient allergy information, dose calculation, and drug expiry date were the steps with the lowest adherence rates. Sixty-three medication preparation and administration errors were identified, an error rate of 13.8% of medication administrations. No potentially life-threatening errors were witnessed. A few logistic and administrative risk factors were reported. The results showed that the medication administration policy and procedure need urgent revision to be more practicable for nurses, and that nurses’ knowledge and skills regarding the medication administration process should be improved.

Keywords: Double checking, Medication administration errors, Medication safety, Nurses.

203 Conceptual Design of the TransAtlantic as a Research Platform for the Development of “Green” Aircraft Technologies

Authors: Victor Maldonado

Abstract:

Recent concern about the growing impact of aviation on climate change has prompted the emergence of a field referred to as Sustainable or “Green” Aviation, dedicated to mitigating the harmful impact of aviation-related CO2 emissions and noise pollution on the environment. In the current paper, a unique “green” business jet aircraft called the TransAtlantic was designed (using analytical formulations common in conceptual design) in order to show the feasibility of transatlantic passenger air travel with an aircraft of less than 10,000 pounds takeoff weight. Such an advance in fuel efficiency will require the development and integration of advanced and emerging aerospace technologies. The TransAtlantic design is intended to serve as a research platform for the development of technologies such as active flow control. Recent advances in the field of active flow control and how this technology can be integrated on a sub-scale flight demonstrator are discussed in this paper. Flow control is a technique for modifying the behavior of coherent structures in wall-bounded flows (over aerodynamic surfaces such as wings and turbine nozzles), resulting in improved aerodynamic cruise and flight control efficiency. One of the key challenges to application in manned aircraft is the development of a robust high-momentum actuator that can penetrate the boundary layer flowing over aerodynamic surfaces. These deficiencies may be overcome in the current development and testing of a novel electromagnetic synthetic jet actuator, which replaces piezoelectric materials as the driving diaphragm. The overarching goals of the TransAtlantic research platform include fostering national and international collaboration to demonstrate (in numerical and experimental models) reduced CO2 and noise pollution via the development and integration of technologies and methodologies in design optimization, fluid dynamics, structures/composites, propulsion, and controls.

Keywords: Aircraft Design, Sustainable “Green” Aviation, Active Flow Control, Aerodynamics.

202 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning

Authors: Kaushik Sathupadi, Sandesh Achar

Abstract:

Human action recognition (HAR) modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and human and outer-shape detection techniques. Next, we extract valuable information in terms of cues, using two distinct features: fuzzy local binary patterns and sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH Multi-view Football dataset. Our HAR framework significantly outperforms other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% on the AAMAZ and KTH Multi-view Football datasets, respectively.

Keywords: Computer vision, human motion analysis, random forest, machine learning.
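
A compact stand-in for the classification stage (local binary pattern histograms as frame features feeding a random forest) is sketched below. The images are synthetic textures rather than the AAMAZ/KTH data, and the paper's fuzzy LBP, sequence representation, and GRASP-based feature selection are not reproduced.

```python
# LBP-histogram features + random forest on two synthetic "action" texture classes.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
P, R, n_bins = 8, 1, 10                     # uniform LBP with 8 neighbours -> 10 codes

def lbp_histogram(img):
    codes = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def make_images(sigma, n):
    imgs = rng.normal(0.5, sigma, (n, 64, 64)).clip(0, 1)
    return (imgs * 255).astype(np.uint8)    # uint8 frames for the LBP operator

smooth, rough = make_images(0.05, 30), make_images(0.25, 30)
X = np.array([lbp_histogram(im) for im in np.concatenate([smooth, rough])])
y = np.array([0] * 30 + [1] * 30)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```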
