Search results for: wasteless method of ores processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21636

20736 Modification of Newton Method in Two Points Block Differentiation Formula

Authors: Khairil Iskandar Othman, Nadhirah Kamal, Zarina Bibi Ibrahim

Abstract:

Block methods for solving stiff systems of ordinary differential equations (ODEs) are based on backward differentiation formulas (BDF) with a PE(CE)2 mode and the Newton method. In this paper, we introduce a modified Newton iteration as a new strategy to obtain more efficient results. The derivation of the block backward differentiation formula (BBDF) using the modified block Newton method is presented. This new block method with a predictor-corrector scheme gives more accurate results than the existing BBDF.
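
As a rough illustration of the implicit-solver step such block methods rely on, the sketch below applies a modified Newton iteration, with the Jacobian frozen and the iteration matrix built once per step, to a single backward-differentiation (backward Euler) step on a standard stiff test problem. The test equation, step size, and tolerances are illustrative assumptions; the two-point block formulation of the paper is not reproduced here.

```python
import numpy as np

def f(t, y):
    # A classic stiff test problem: y' = -1000*(y - cos(t)) - sin(t), exact solution cos(t)
    return -1000.0 * (y - np.cos(t)) - np.sin(t)

def jac(t, y):
    # Analytical Jacobian df/dy (here a 1x1 matrix)
    return np.array([[-1000.0]])

def bdf1_step_modified_newton(t_new, y_old, h, tol=1e-10, max_iter=20):
    """One backward-Euler (first-order BDF) step solved with a *modified*
    Newton iteration: the Jacobian is evaluated once at the predictor and
    kept frozen, so the same iteration matrix is reused in every iteration."""
    y = np.atleast_1d(y_old).astype(float)      # predictor: previous value
    A = np.eye(y.size) - h * jac(t_new, y)      # frozen iteration matrix
    for _ in range(max_iter):
        residual = y - np.atleast_1d(y_old) - h * np.atleast_1d(f(t_new, y))
        delta = np.linalg.solve(A, -residual)   # same matrix every iteration
        y = y + delta
        if np.linalg.norm(delta) < tol:
            break
    return y

# integrate from t=0 to t=0.1 with a step size an explicit method could not take
t, y, h = 0.0, np.array([1.0]), 0.01
while t < 0.1 - 1e-12:
    t += h
    y = bdf1_step_modified_newton(t, y, h)
print(t, y, np.cos(t))   # the numerical solution tracks cos(t) closely
```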

Keywords: modified Newton, stiff, BBDF, Jacobian matrix

Procedia PDF Downloads 377
20735 System Identification of Timber Masonry Walls Using Shaking Table Test

Authors: Timir Baran Roy, Luis Guerreiro, Ashutosh Bagchi

Abstract:

Dynamic studies are important for the design, repair, and rehabilitation of structures. They have played an important role in characterizing the behavior of structures such as bridges, dams, and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency Domain Decomposition (FDD) and Time Domain Decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping, and mode shape. The focus of the present research is to study the dynamic characteristics of typical timber masonry walls commonly used in Portugal. For that purpose, multi-storey structural prototypes of such walls have been tested on a seismic shake table at the National Laboratory for Civil Engineering, Portugal (LNEC). Signal processing has been performed on the output response of the prototype, collected with accelerometers during the shaking table experiment. In the present work, signal processing of the output response with respect to the input has been done in two ways: FDD and Stochastic Subspace Identification (SSI). In order to estimate the values of the modal parameters, algorithms for FDD are formulated, and parametric functions for the SSI are computed. Finally, the values estimated by the two methods are compared to assess the accuracy of both techniques.
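
For orientation, the snippet below sketches the core of the FDD technique on synthetic two-channel acceleration data: the cross power spectral density matrix is estimated with Welch-type averaging and decomposed by SVD at each frequency line, so that peaks of the first singular value indicate natural frequencies and the associated singular vectors approximate mode shapes. The synthetic signals, sampling rate, and segment length are illustrative assumptions, not the shake-table data.

```python
import numpy as np
from scipy.signal import csd

def fdd_peaks(acc, fs, nperseg=1024):
    """Frequency Domain Decomposition: SVD of the cross-spectral density
    matrix at every frequency line. Returns frequencies, first singular
    values and first singular vectors (approximate mode shapes)."""
    n_ch = acc.shape[0]
    f, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):            # build the CSD matrix G(f) channel by channel
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
    s1 = np.zeros(len(f))
    phi1 = np.zeros((len(f), n_ch), dtype=complex)
    for k in range(len(f)):
        U, S, _ = np.linalg.svd(G[k])
        s1[k], phi1[k] = S[0], U[:, 0]
    return f, s1, phi1

# synthetic two-channel response with modes near 3 Hz and 8 Hz (illustrative only)
fs, t = 200.0, np.arange(0, 60, 1 / 200.0)
rng = np.random.default_rng(0)
x1 = np.sin(2*np.pi*3*t) + 0.5*np.sin(2*np.pi*8*t) + 0.1*rng.standard_normal(t.size)
x2 = np.sin(2*np.pi*3*t) - 0.5*np.sin(2*np.pi*8*t) + 0.1*rng.standard_normal(t.size)
f, s1, phi1 = fdd_peaks(np.vstack([x1, x2]), fs)
print("dominant frequency [Hz]:", f[np.argmax(s1)])
```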

Keywords: frequency domain decomposition (FDD), modal parameters, signal processing, stochastic subspace identification (SSI), time domain decomposition

Procedia PDF Downloads 263
20734 Reinforced Concrete Bridge Deck Condition Assessment Methods Using Ground Penetrating Radar and Infrared Thermography

Authors: Nicole M. Martino

Abstract:

Reinforced concrete bridge deck condition assessments primarily use visual inspection methods, where an inspector looks for and records locations of cracks, potholes, efflorescence and other signs of probable deterioration. Sounding is another technique used to diagnose the condition of a bridge deck; however, this method listens for damage within the subsurface as the surface is struck with a hammer or chain. Even though extensive procedures are in place for using these inspection techniques, neither one provides the inspector with a comprehensive understanding of the internal condition of a bridge deck – the location where damage originates.  In order to make accurate estimates of repair locations and quantities, in addition to allocating the necessary funding, a complete understanding of the deck’s deteriorated state is key. The research presented in this paper collected infrared thermography and ground penetrating radar data from reinforced concrete bridge decks without an asphalt overlay. These decks were of various ages, and their condition varied from brand new to in need of replacement. The goals of this work were first to verify that these nondestructive evaluation methods could identify similar areas of healthy and damaged concrete, and then to see if combining the results of both methods would provide higher confidence than if the condition assessment were completed using only one method. The results from each method were presented as plan-view color contour plots. The results from one of the decks assessed as part of this research, including these plan-view plots, are presented in this paper. Furthermore, in order to address the interest of transportation agencies throughout the United States, this research developed a step-by-step guide which demonstrates how to collect and assess a bridge deck using these nondestructive evaluation methods. This guide addresses setup procedures on the deck during the day of data collection, system setups and settings for different bridge decks, data post-processing for each method, and data visualization and quantification.

Keywords: bridge deck deterioration, ground penetrating radar, infrared thermography, NDT of bridge decks

Procedia PDF Downloads 153
20733 Numerical Wave Solutions for Nonlinear Coupled Equations Using Sinc-Collocation Method

Authors: Kamel Al-Khaled

Abstract:

In this paper, numerical solutions for the nonlinear coupled Korteweg-de Vries (abbreviated as KdV) equations are calculated by the Sinc-collocation method. This approach is based on a global collocation method using Sinc basis functions. First, the time derivative of the KdV equations is discretized by a classic finite difference formula, while the space derivatives are approximated by a $\theta$-weighted scheme. Sinc functions are then used to solve these two equations. Soliton solutions are constructed to show the nature of the solution. The numerical results are shown to demonstrate the efficiency of the newly proposed method.
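
To make the ingredients concrete, the sketch below shows the shifted sinc basis on an equally spaced collocation grid (where the collocation matrix reduces to the identity) and a generic $\theta$-weighted time step for a semi-discrete system, closed here with a simple fixed-point corrector. The grid, step size, and toy right-hand side are illustrative assumptions; the actual coupled-KdV discretization of the paper is not reproduced.

```python
import numpy as np

def sinc_basis(x, k, h):
    """Shifted sinc basis function S_k(x) = sinc((x - k*h)/h) used in
    Sinc-collocation (np.sinc already includes the factor pi)."""
    return np.sinc((x - k * h) / h)

def collocation_matrix(nodes, h):
    """Basis functions evaluated at the collocation nodes. For equally
    spaced sinc nodes this is the identity matrix, which is what makes
    sinc collocation attractive."""
    K = np.arange(-(len(nodes) // 2), len(nodes) // 2 + 1)
    return np.array([[sinc_basis(x, k, h) for k in K] for x in nodes])

def theta_step(u, rhs, dt, theta=0.5):
    """One theta-weighted step for u_t = rhs(u): theta = 0 is explicit Euler,
    theta = 1 implicit Euler, theta = 0.5 Crank-Nicolson. The implicit part
    is handled with a simple fixed-point iteration (illustrative only)."""
    u_new = u + dt * rhs(u)                      # explicit predictor
    for _ in range(50):                          # fixed-point corrector
        u_new = u + dt * ((1 - theta) * rhs(u) + theta * rhs(u_new))
    return u_new

h = 0.5
nodes = h * np.arange(-10, 11)                   # 21 equally spaced sinc nodes
print(np.allclose(collocation_matrix(nodes, h), np.eye(21)))   # True

# a toy right-hand side just to exercise theta_step
print(theta_step(np.array([1.0]), lambda u: -u, 0.1))          # ~exp(-0.1)
```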

Keywords: Nonlinear coupled KdV equations, Soliton solutions, Sinc-collocation method, Sinc functions

Procedia PDF Downloads 523
20732 Studying the Spatial Aspects of Visual Attention Processing in Global Precedence Paradigm

Authors: Shreya Borthakur, Aastha Vartak

Abstract:

This behavioral experiment aimed to investigate the global precedence phenomenon in a South Asian sample and its correlation with mobile screen time. The global precedence effect refers to the tendency to process overall structure before attending to specific details. Participants completed attention tasks involving global and local stimuli with varying consistencies. The results showed a tendency towards local precedence, but no significant differences in reaction times were found between consistency levels or attention conditions. However, the correlation analysis revealed that participants with higher screen time exhibited a stronger negative correlation with local attention, suggesting that excessive screen usage may impact perceptual organization. Further research is needed to explore this relationship and understand the influence of screen time on cognitive processing.

Keywords: global precedence, visual attention, perceptual organization, screen time, cognition

Procedia PDF Downloads 66
20731 Numerical Simulation and Laboratory Tests for Rebar Detection in Reinforced Concrete Structures using Ground Penetrating Radar

Authors: Maha Al-Soudani, Gilles Klysz, Jean-Paul Balayssac

Abstract:

The aim of this paper is to use Ground Penetrating Radar (GPR) as a non-destructive testing (NDT) method and to increase its accuracy in recognizing the geometry of reinforced concrete structures and, in particular, the position of steel bars. This information will help managers to assess the state of their structures, on the one hand with respect to safety constraints and, on the other, to quantify the need for maintenance and repair. Several configurations of acquisition and processing of the simulated signal were tested in order to propose and develop an appropriate imaging algorithm in the propagation medium to locate the rebar accurately. The imaging algorithm was subsequently validated experimentally by testing it on real reinforced concrete structures. The results indicate that this algorithm is capable of estimating the position of reinforcing steel bars to within 0-1 mm.

Keywords: GPR, NDT, reinforced concrete structures, rebar location

Procedia PDF Downloads 502
20730 Arabic Light Word Analyser: Roles with Deep Learning Approach

Authors: Mohammed Abu Shquier

Abstract:

This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of web morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also encompasses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which provide justification for updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. It is therefore necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, these systems are also based on statistical/stochastic models. Such stochastic models, for example HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc. As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage was very low for Dialectal Arabic, it is important to investigate how the dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, to show that accounting for dialectal variability can help to improve analysis.

Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN

Procedia PDF Downloads 41
20729 Study of the Stability of Underground Mines by Numerical Method: The Mine Chaabet El Hamra, Algeria

Authors: Nakache Radouane, M. Boukelloul, M. Fredj

Abstract:

Room and pillar sizes are key factors for safe mining and ore recovery in open-stope mining. The room and pillar method is advantageous due to its simplicity and the small amount of information required to apply it. It is probably the most representative of the total load approach methods, while also remaining a safe design method. Using finite element software (PLAXIS 3D), analyses were carried out with an elasto-plastic model, and comparisons were made with methods based on the total load approach. The results are presented as an optimization for improving the ore recovery rate while maintaining a safe working environment.
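
As a point of reference for the total load approach mentioned above, the sketch below computes the average pillar stress from tributary area theory for a square room-and-pillar layout and checks it against a generic empirical pillar strength formula (Obert-Duvall form). All depths, unit weights, dimensions, and strength constants are illustrative assumptions, not data from the Chaabet El Hamra mine.

```python
# Tributary-area (total load) check for a square room-and-pillar layout.
# All numbers below are illustrative, not data from the Chaabet El Hamra mine.

unit_weight = 0.026      # MPa per metre of overburden (~2600 kg/m3)
depth = 200.0            # m
pillar_width = 8.0       # m
room_width = 6.0         # m

# average vertical stress carried by a pillar (tributary area theory)
sigma_v = unit_weight * depth
sigma_pillar = sigma_v * ((pillar_width + room_width) / pillar_width) ** 2

# a generic empirical pillar strength formula S = S0 * (0.778 + 0.222*w/h)
# (Obert-Duvall form); S0 and the pillar height are assumed values
S0, pillar_height = 25.0, 5.0      # MPa, m
strength = S0 * (0.778 + 0.222 * pillar_width / pillar_height)

extraction_ratio = 1 - (pillar_width / (pillar_width + room_width)) ** 2
safety_factor = strength / sigma_pillar

print(f"pillar stress = {sigma_pillar:.1f} MPa, strength = {strength:.1f} MPa")
print(f"extraction ratio = {extraction_ratio:.2f}, FoS = {safety_factor:.2f}")
```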

Keywords: room and pillar, mining, total load approach, elasto-plastic

Procedia PDF Downloads 328
20728 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised-learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, etc., in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and thereby bypassing those frames in emotion classification would save computational power. In this work, we propose a light-weight neutral vs. emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.

Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 336
20727 The Influence of Machine Tool Composite Stiffness to the Surface Waviness When Processing Posture Constantly Switching

Authors: Song Zhiyong, Zhao Bo, Du Li, Wang Wei

Abstract:

Aircraft structures generally have complex surfaces. Because the postures of the motion axes constantly switch, a five-axis CNC machine’s composite stiffness changes during machining. This gives rise to different vibration amplitudes of the processing system, which in turn affect the surface waviness differently. In order to provide a solution to this problem, we take the CNC machining of the “S”-shaped test specimen as the object of study: by calculating the five-axis CNC machine’s composite stiffness and establishing a vibration model, we analyze the mechanism by which vibration amplitude influences surface waviness. Surface quality measurement experiments were carried out to verify the validity and accuracy of the theoretical analysis. The research results of this paper provide a theoretical basis for surface waviness control.

Keywords: five axis CNC machine, “S” shape test specimen, composite stiffness, surface waviness

Procedia PDF Downloads 389
20726 Effects of Safety Intervention Program towards Behaviors among Rubber Wood Processing Workers Using Theory of Planned Behavior

Authors: Junjira Mahaboon, Anongnard Boonpak, Nattakarn Worrasan, Busma Kama, Mujalin Saikliang, Siripor Dankachatarn

Abstract:

Rubber wood processing is one of the most important industries in southern Thailand. The process involves several safety hazards, for example unsafely guarded wood cutting machines, wood dust, noise, and heavy lifting. However, occupational health and safety measures to promote workers’ behaviors are still limited. This quasi-experimental study used the theory of planned behavior to determine factors affecting workers’ safety behaviors after implementing a job safety intervention program. The purposes were to (1) determine factors affecting workers’ behaviors and (2) evaluate the effectiveness of the intervention program. The study sample consisted of 66 workers from a rubber wood processing factory. The factors in the Theory of Planned Behavior (TPB) model were measured before and after the intervention; they included attitude towards behavior, subjective norm, perceived behavioral control, intention, and behavior. Firstly, a Job Safety Analysis (JSA) was conducted and Safety Standard Operation Procedures (SSOP) were established. A questionnaire was also used to collect workers’ characteristics and the TPB factors. Then, a job safety intervention program to promote workers’ behavior according to the SSOP was implemented over a four-month period. The program included SSOP training, personal protective equipment use, and a safety promotional campaign. After that, the TPB factors were collected again. Paired-sample t-tests and independent t-tests were used to analyze the data. The results revealed that attitude towards behavior and intention increased significantly after the intervention at p<0.05. These factors also significantly determined the workers’ safety behavior according to the SSOP at p<0.05. However, subjective norm and perceived behavioral control neither changed significantly nor were related to safety behaviors. In conclusion, attitude towards behavior and workers’ intention should be promoted to encourage workers’ safety behaviors. The SSOP intervention program, e.g. short meetings, safety training, and promotional campaigns, should be implemented continuously on a routine basis to improve workers’ behavior.
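
For readers unfamiliar with the analysis, the snippet below illustrates the paired-sample comparison described above on invented Likert-scale scores for one TPB factor; the sample size of 66 matches the study, but the numbers themselves are placeholders generated for the example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical 5-point Likert scores for "attitude towards behavior"
# from the same 66 workers before and after the intervention
# (placeholder data, not the actual study measurements).
before = rng.normal(3.4, 0.6, 66).clip(1, 5)
after = (before + rng.normal(0.4, 0.5, 66)).clip(1, 5)

# paired-sample t-test: did the mean score change after the intervention?
t_stat, p_value = stats.ttest_rel(after, before)
print(f"mean before = {before.mean():.2f}, mean after = {after.mean():.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("significant change at the 0.05 level")
```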

Keywords: job safety analysis, rubber wood processing workers, safety standard operation procedure, theory of planned behavior

Procedia PDF Downloads 193
20725 Reasons for the Selection of Information-Processing Framework and the Philosophy of Mind as a General Account for an Error Analysis and Explanation on Mathematics

Authors: Michael Lousis

Abstract:

This research study is concerned with learners’ errors in Arithmetic and Algebra. The data resulted from a broader international comparative research program called the Kassel Project. However, its conceptualisation differed from and contrasted with that of the main program, which was mostly based on socio-demographic data. The way in which the research study was conducted was not dependent on the researcher’s discretion, but was absolutely dictated by the nature of the problem under investigation. This is because the phenomenon of learners’ mathematical errors is due neither to the intentions of learners, nor to institutional processes, rules and norms, nor to the educators’ intentions and goals, but rather to the way certain information is presented to learners and how their cognitive apparatus processes this information. Several approaches for the study of learners’ errors have been developed since the beginning of the 20th century, encompassing different belief systems. These approaches were based on behaviourist theory, on the Piagetian-constructivist research framework, on the perspective that followed the philosophy of science, and on the information-processing paradigm. The researcher of the present study was forced to disclose the learners’ course of thinking that led them to specific observable actions, with the result of showing particular errors in specific problems, rather than analysing scripts with the students’ thoughts presented in written form. This, in turn, entailed that the choice of methods would have to be appropriate and conducive to seeing and realising the learners’ errors from the perspective of the participants in the investigation. This particular fact determined important decisions concerning the selection of an appropriate framework for analysing the mathematical errors and giving explanations. Thus the belief systems of behaviourism, the Piagetian-constructivist perspective, and the philosophy of science perspective were rejected, and the information-processing paradigm in conjunction with the philosophy of mind was adopted as a general account for the elaboration of the data. This paper explains why these decisions were appropriate and beneficial for conducting the present study and for the establishment of the ensuing thesis. Additionally, it explains why the adoption of the information-processing paradigm in conjunction with the philosophy of mind provides a sound and legitimate basis for the development of future studies concerning mathematical error analysis.

Keywords: advantages-disadvantages of theoretical prospects, behavioral prospect, critical evaluation of theoretical prospects, error analysis, information-processing paradigm, opting for the appropriate approach, philosophy of science prospect, Piagetian-constructivist research frameworks, review of research in mathematical errors

Procedia PDF Downloads 189
20724 Method Development and Validation for Quantification of Active Content and Impurities of Clodinafop Propargyl and Its Enantiomeric Separation by High-Performance Liquid Chromatography

Authors: Kamlesh Vishwakarma, Bipul Behari Saha, Sunilkumar Sing, Abhishek Mishra, Sreenivas Rao

Abstract:

A rapid, sensitive and inexpensive method has been developed for the complete analysis of Clodinafop Propargyl. Clodinafop Propargyl enantiomers were separated on a chiral column, Chiral Pak AS-H (250 mm x 4.6 mm, 5 µm), with a mobile phase of n-hexane:IPA (96:4) at a flow rate of 1.5 ml/min. The effluent was monitored by a UV detector at 230 nm. Quantification of Clodinafop Propargyl content and impurities was done with reverse phase HPLC. The present study describes an HPLC method using a simple mobile phase for the quantification of Clodinafop Propargyl and its impurities. The method was validated and found to be accurate, precise, convenient and effective. Moreover, the lower solvent consumption along with the short analytical run time led to a cost-effective analytical method.

Keywords: Clodinafop Propargyl, method, validation, HPLC-UV

Procedia PDF Downloads 369
20723 Numerical Investigation of Turbulent Inflow Strategy in Wind Energy Applications

Authors: Arijit Saha, Hassan Kassem, Leo Hoening

Abstract:

Ongoing climate change demands the increasing use of renewable energies. Wind energy plays an important role in this context since it can be applied almost everywhere in the world. To reduce the costs of wind turbines and to make them more competitive, simulations are very important, since experiments are often too costly, if possible at all. A wind turbine in a vast open area experiences turbulence generated by the atmosphere, so it was of great interest in this research to generate turbulence in the computational simulation domain through various inlet turbulence generation methods, such as the precursor cyclic method and Kaimal Spectrum Exponential Coherence (KSEC). To be able to validate computational fluid dynamics simulations of wind turbines against experimental data, it is crucial to set up the conditions in the simulation as close to reality as possible. The present work therefore aims at investigating the turbulent inflow strategy and boundary conditions of KSEC and providing a comparative analysis alongside the precursor cyclic method for Large Eddy Simulation within the context of wind energy applications. For the generation of the turbulent box with the KSEC method, constrained data were first collected from an auxiliary channel flow, and later processing was performed with the open-source tool PyconTurb, whereas for the precursor cyclic method, the data from the auxiliary channel alone were sufficient. The functionality of these methods was studied through various statistical properties, such as variance and turbulence intensity, with respect to different bulk Reynolds numbers, and a conclusion was drawn on the feasibility of the KSEC method. Furthermore, it was found necessary to verify the obtained data against a DNS case setup to assess their applicability to real-field CFD simulations.

Keywords: Inlet Turbulence Generation, CFD, precursor cyclic, KSEC, large Eddy simulation, PyconTurb

Procedia PDF Downloads 94
20722 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs

Authors: Josef Slapal

Abstract:

Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z2 to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, it is required that such a structure allows for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove digital Jordan curve theorems for them, thus showing that the graphs provide convenient structures on Z2 for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition, thinning, etc.
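
For readers less familiar with digital topology, the short sketch below illustrates the two classical adjacency graphs on Z2 (4- and 8-adjacency) and a breadth-first connectedness test with respect to either graph; the specific adjacency graphs introduced in the paper, and their quotients, are not reproduced here.

```python
from collections import deque

def neighbours(p, adjacency=4):
    """4- or 8-adjacent neighbours of a point of the digital plane Z2."""
    x, y = p
    n4 = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    n8 = n4 + [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]
    return n4 if adjacency == 4 else n8

def connected(points, adjacency=4):
    """Breadth-first search test of connectedness of a finite point set
    with respect to the chosen adjacency graph."""
    points = set(points)
    start = next(iter(points))
    seen, queue = {start}, deque([start])
    while queue:
        for q in neighbours(queue.popleft(), adjacency):
            if q in points and q not in seen:
                seen.add(q)
                queue.append(q)
    return seen == points

# a diagonal pair is 8-connected but not 4-connected
print(connected([(0, 0), (1, 1)], adjacency=4))   # False
print(connected([(0, 0), (1, 1)], adjacency=8))   # True
```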

Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency

Procedia PDF Downloads 377
20721 Reduction of Speckle Noise in Echocardiographic Images: A Survey

Authors: Fathi Kallel, Saida Khachira, Mohamed Ben Slima, Ahmed Ben Hamida

Abstract:

Speckle noise is a main characteristic of cardiac ultrasound images; it corresponds to a grainy appearance that degrades image quality. For this reason, ultrasound images are difficult to use automatically in clinical practice, and so treatments are required for this type of image. A filtering procedure is therefore necessary to eliminate the speckle noise and to improve the quality of the ultrasound images, which will then be segmented to extract the necessary structures. In this paper, we present the importance of the pre-treatment step for segmentation. This work is applied to cardiac ultrasound images. In a first step, a comparative study of speckle filtering methods is presented, and then we use a segmentation algorithm to locate and extract the cardiac structures.
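
As one representative of the filters typically included in such comparative studies, the sketch below implements a basic Lee-type adaptive filter driven by local statistics and applies it to a synthetic speckled test image; the window size, noise model, and test image are illustrative assumptions, not the clinical data or the full set of filters compared in the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=5):
    """Lee adaptive filter, a classic local-statistics speckle reducer:
    smooth strongly in homogeneous regions, keep edges mostly untouched."""
    img = img.astype(float)
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    var = sq_mean - mean ** 2                      # local variance
    noise_var = np.mean(var)                       # crude global noise estimate
    weight = var / (var + noise_var + 1e-12)
    return mean + weight * (img - mean)

# synthetic test image: a bright disc corrupted by multiplicative speckle
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:128, 0:128]
clean = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2, 1.0, 0.2)
speckled = clean * rng.gamma(shape=4.0, scale=1.0 / 4.0, size=clean.shape)

filtered = lee_filter(speckled, size=5)
print("noisy MSE   :", np.mean((speckled - clean) ** 2))
print("filtered MSE:", np.mean((filtered - clean) ** 2))
```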

Keywords: medical image processing, ultrasound images, Speckle noise, image enhancement, speckle filtering, segmentation, snakes

Procedia PDF Downloads 525
20720 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics

Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir

Abstract:

Due to their favourable material characteristics, fiber reinforced plastics are amongst the main topics of all current lightweight construction megatrends. Especially in transportation, ranging from aeronautics over the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, lighthouses and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide gas lasers (CO₂), frequency-tripled solid state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each material used with a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat affected zone, process gas pressure, workpiece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm laser offers essential advantages for future laser processes like cutting, welding, ablating for repair, and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber, which enables hand-guided processing; eye safety, which results from the wavelength; and excellent beam quality and brilliance due to the fiber nature. One more feature that is economically important for boat, automotive and military manufacturing projects is that the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables its selective removal for repair procedures.

Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone

Procedia PDF Downloads 192
20719 Preparation and Characterization of Iron/Titanium-Pillared Clays

Authors: Rezala Houria, Valverde Jose Luis, Romero Amaya, Molinari Alessandra, Maldotti Andrea

Abstract:

The escalation of oil prices in 1973 confronted the oil industry with the problem of how to maximize the processing of crude oil, especially the heavy fractions, to give gasoline components. Strong impetus was thus given to the development of catalysts with relatively large pore sizes, which were able to deal with larger molecules than the existing molecular sieves, and with good thermal and hydrothermal stability. The oil embargo in 1973 therefore acted as a stimulus for the investigation and development of pillared clays. Iron-doped titania-pillared montmorillonite clays were prepared using bentonite from deposits of Maghnia in western Algeria. The preparation method consists of different steps (purification of the raw bentonite, preparation of a pillaring agent solution, and exchange of the cations located between the clay layers with the previously formed iron/titanium solution). The characterization of this material was carried out by X-ray fluorescence spectrometry, X-ray diffraction, textural measurements by the BET method, inductively coupled plasma atomic emission spectroscopy, diffuse reflectance UV-visible spectroscopy, temperature-programmed desorption of ammonia, and atomic absorption. This new material was investigated as a photocatalyst for the selective oxygenation of liquid alkylaromatics such as toluene, para-xylene and ortho-xylene, and its photocatalytic properties were compared with those of titanium-pillared clays.

Keywords: iron doping, montmorillonite clays, pillared clays, oil industry

Procedia PDF Downloads 302
20718 Static Light Scattering Method for the Analysis of Raw Cow's Milk

Authors: V. Villa-Cruz, H. Pérez-Ladron de Guevara, J. E. Diaz-Díaz

Abstract:

Static Light Scattering (SLS) was used as a method to analyse raw cow's milk from the town of Lagos de Moreno, Jalisco, Mexico. This method is based on the analysis of the scattering of laser light produced by a set of particles in solution. On this basis, raw milk, which contains fat globule particles with a diameter of about 2000 nm and protein micelle particles about 300 nm in diameter, was analyzed. For this, dilutions of commercial milk were made (1.0%, 2.0% and 3.3%) to obtain laser light scattering patterns, and measurements of raw cow's milk were also made. Readings were taken in an angular sweep from 10° to 170°, and the results were analyzed with the program OriginPro 7. The SLS method gives an estimate of the percentage of fat content in milk samples. It can be concluded that the SLS method is a quick method of analysis for detecting adulteration in raw cow's milk.

Keywords: light scattering, milk analysis, adulteration in milk, micelles, OriginPro

Procedia PDF Downloads 373
20717 Software Engineering Inspired Cost Estimation for Process Modelling

Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller

Abstract:

Up to this point, business process management projects in general, and business process modelling projects in particular, could not rely on a practical and scientifically validated method to estimate cost and effort. Especially the model development phase is not covered by a cost estimation method or model. Later phases of business process modelling, starting with implementation, are covered by initial solutions which are discussed in the literature. This article proposes a method of filling this gap by deriving a cost estimation method from available methods in a closely related domain, namely software development or software engineering. Software development is regarded as closely similar to process modelling, as we show. After the proposition of this method, different ideas for further analysis and validation of the method are proposed. We derive this method from COCOMO II and Function Point analysis, which are established methods of effort estimation in the domain of software development. For this, we lay out similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
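
For context, the sketch below evaluates the standard COCOMO II post-architecture effort equation, PM = A * Size^E * product(EM) with E = B + 0.01 * sum(SF), using nominal scale factors and nominal effort multipliers. Mapping a process model's size to an equivalent of 4 KSLOC (e.g. via function points) is purely an illustrative assumption, not the calibration proposed in the paper.

```python
from math import prod

def cocomo_ii_effort(ksloc, scale_factors, effort_multipliers,
                     A=2.94, B=0.91):
    """COCOMO II post-architecture effort in person-months:
    PM = A * Size^E * product(EM), with E = B + 0.01 * sum(SF)."""
    E = B + 0.01 * sum(scale_factors)
    return A * ksloc ** E * prod(effort_multipliers)

# Illustrative use: a process model whose size has been mapped to an
# equivalent of 4 KSLOC (e.g. via function points); all drivers nominal.
nominal_sf = [3.72, 3.04, 4.24, 3.29, 4.68]   # nominal values of the five scale factors
nominal_em = [1.0] * 17                       # 17 nominal effort multipliers
pm = cocomo_ii_effort(4.0, nominal_sf, nominal_em)
print(f"estimated effort: {pm:.1f} person-months")
```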

Keywords: COCOMO II, business process modeling, cost estimation method, BPM COCOMO

Procedia PDF Downloads 438
20716 Selection of the Most Suitable Method for DNA Extraction from Muscle of Iran's Canned Tuna by Comparison of Different DNA Extraction Methods

Authors: Marjan Heidarzadeh

Abstract:

High quality and purity of DNA isolated from canned tuna are essential for species identification. In this study, the efficiency of five different methods for DNA extraction was compared: the Iranian national standard method, the CTAB precipitation method, the Wizard DNA Clean Up system, Nucleospin, and GenomicPrep. DNA was extracted from two different canned tuna products, in brine and in oil, of the same tuna species. Three samples of each type of product were analyzed with the different methods. The quantity and quality of the extracted DNA were evaluated using the absorbance at 260 nm and the A260/A280 ratio, measured with a Picodrop spectrophotometer. The results showed that DNA extraction from canned tuna preserved in different liquid media could be optimized by employing a specific DNA extraction method in each case. The best results were obtained with the CTAB method for canned tuna in oil and with the Wizard method for canned tuna in brine.
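
The sketch below shows the standard spectrophotometric relations behind such an evaluation: roughly 50 ng/µL of double-stranded DNA per A260 unit for a 1 cm path, and the A260/A280 ratio as a purity indicator (about 1.8 for protein-free DNA). The absorbance readings in the example are placeholders, not the study's measurements.

```python
def dna_quantity_and_purity(a260, a280, dilution_factor=1.0):
    """Standard spectrophotometric DNA check: ~50 ng/uL of double-stranded
    DNA per A260 unit (1 cm path), with purity judged by the A260/A280
    ratio (~1.8 indicates protein-free DNA)."""
    concentration = a260 * 50.0 * dilution_factor   # ng/uL
    purity = a260 / a280
    return concentration, purity

# placeholder absorbance readings, not the study's measurements
for method, (a260, a280) in {"CTAB": (0.210, 0.115),
                             "Wizard": (0.160, 0.089)}.items():
    conc, ratio = dna_quantity_and_purity(a260, a280)
    print(f"{method:7s} {conc:6.1f} ng/uL  A260/A280 = {ratio:.2f}")
```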

Keywords: canned tuna, PCR, DNA, DNA extraction methods, species identification

Procedia PDF Downloads 654
20715 Importance of Ethics in Cloud Security

Authors: Pallavi Malhotra

Abstract:

This paper examines the importance of ethics in cloud computing. In modern society, cloud computing offers individuals and businesses virtually unlimited space for storing and processing data and information. Much of the data and information stored in the cloud by various users, such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions among others, requires a high level of confidentiality and safeguarding. Cloud computing offers centralized storage and processing of data, and this has contributed immensely to the growth of businesses and improved the sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients’ information and the possible manipulation of the data by third parties. This document suggests the approaches various stakeholders should take to address the ethical issues involved in cloud-computing services. Ethical education and training are key for all stakeholders involved in the handling of data and information stored or processed in the cloud.

Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education

Procedia PDF Downloads 323
20714 An Optimized Method for Calculating the Linear and Nonlinear Response of SDOF System Subjected to an Arbitrary Base Excitation

Authors: Hossein Kabir, Mojtaba Sadeghi

Abstract:

Finding the linear and nonlinear responses of a typical single-degree-of-freedom (SDOF) system is generally regarded as a time-consuming process. This study attempts to modify the renowned Newmark method in order to make it more time-efficient than before and more accurate, by modifying the system in its own nonlinear state. The efficacy of the presented method is demonstrated by applying three base excitations, the Tabas 1978, El Centro 1940, and Mexico City/SCT 1985 earthquakes, to an SDOF system in order to compute the strength reduction factor, yield pseudo-acceleration, and ductility factor.
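
As a baseline for the modifications discussed above, the sketch below implements the classical linear Newmark-beta integrator for an SDOF system under base excitation (beta = 1/4, gamma = 1/2 gives the average acceleration variant; beta = 1/6 the linear acceleration method). The ground motion used here is a short sine pulse, and the mass, damping ratio, and natural frequency are illustrative assumptions; the recorded earthquakes and the paper's nonlinear modification are not reproduced.

```python
import numpy as np

def newmark_sdof(ag, dt, m=1.0, zeta=0.05, fn=1.0, beta=0.25, gamma=0.5):
    """Linear Newmark-beta integration of m*u'' + c*u' + k*u = -m*ag(t).
    beta = 1/4, gamma = 1/2 is the (unconditionally stable) average
    acceleration variant; beta = 1/6 gives the linear acceleration method."""
    wn = 2 * np.pi * fn
    k, c = m * wn ** 2, 2 * zeta * m * wn
    n = len(ag)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (-m * ag[0] - c * v[0] - k * u[0]) / m
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    for i in range(n - 1):
        p_eff = (-m * ag[i + 1]
                 + m * (u[i] / (beta * dt ** 2) + v[i] / (beta * dt)
                        + (1 / (2 * beta) - 1) * a[i])
                 + c * (gamma / (beta * dt) * u[i]
                        + (gamma / beta - 1) * v[i]
                        + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = p_eff / k_eff
        v[i + 1] = (gamma / (beta * dt) * (u[i + 1] - u[i])
                    + (1 - gamma / beta) * v[i]
                    + dt * (1 - gamma / (2 * beta)) * a[i])
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt ** 2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
    return u, v, a

# illustrative base excitation: a short sine pulse, not a recorded earthquake
dt = 0.01
t = np.arange(0, 10, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * (t < 2)
u, v, a = newmark_sdof(ag, dt, fn=1.0, zeta=0.05)
print(f"peak relative displacement: {np.abs(u).max():.4f} m")
```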

Keywords: single-degree-of-freedom system (SDOF), linear acceleration method, nonlinear excited system, equivalent displacement method, equivalent energy method

Procedia PDF Downloads 318
20713 Rehabilitation of the Blind Using Sono-Visualization Tool

Authors: Ashwani Kumar

Abstract:

In human beings, eyes play a vital role. Very little research has been done on the rehabilitation of blindness for blind people. This paper discusses work that helps blind people recognize the basic shapes of objects, such as circles, squares, triangles, horizontal lines, vertical lines and diagonal lines, and waveforms such as sinusoidal, square and triangular waves. This is largely achieved by using a digital camera, which captures the visual information present in front of the blind person, and a software program, which performs the image processing operations; finally, the processed image is converted into sound. After the sound generation process, the generated sound is fed to the blind person through headphones for visualizing the imaginary image of the object. Visualizing this imaginary image requires training of the blind person, and various training methods have been applied for recognizing the objects.

Keywords: image processing, pixel, pitch, loudness, sound generation, edge detection, brightness

Procedia PDF Downloads 387
20712 Modeling and Simulation of Fluid Catalytic Cracking Process

Authors: Sungho Kim, Dae Shik Kim, Jong Min Lee

Abstract:

The fluid catalytic cracking (FCC) process is one of the most important processes in the modern refinery industry, and it is the focus of this paper. As the FCC process is difficult to model well, due to its nonlinearities and the various interactions between its process variables, rigorous process modeling of the whole FCC plant is needed for control and plant-wide optimization. In this study, a process design for the FCC plant that includes the riser reactor, main fractionator, and gas processing unit was developed. The reactor model was described based on a four-lumped kinetic scheme. The main fractionator, gas processing unit and other process units were designed to simulate real plant data using a process flowsheet simulator, Aspen PLUS. The custom reactor model was integrated with the process flowsheet simulator to develop an integrated process model.
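
To illustrate what a four-lumped riser kinetic scheme looks like in isolation, the sketch below integrates lumped mass fractions (gas oil, gasoline, light gas, coke) over residence time, with second-order gas-oil cracking and first-order gasoline over-cracking. The rate constants and residence time are illustrative placeholders, not the plant's values, and the coupling to the Aspen PLUS flowsheet is outside the scope of the sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rate constants (1/s) for a four-lump scheme:
# gas oil -> gasoline / light gas / coke, gasoline -> light gas / coke.
k1, k2, k3, k4, k5 = 0.20, 0.04, 0.02, 0.01, 0.005

def four_lump(t, y):
    gasoil, gasoline, gas, coke = y
    crack = (k1 + k2 + k3) * gasoil ** 2        # 2nd-order gas-oil cracking
    return [-crack,
            k1 * gasoil ** 2 - (k4 + k5) * gasoline,
            k2 * gasoil ** 2 + k4 * gasoline,
            k3 * gasoil ** 2 + k5 * gasoline]

# integrate over a few seconds of riser residence time
sol = solve_ivp(four_lump, (0.0, 5.0), [1.0, 0.0, 0.0, 0.0],
                t_eval=np.linspace(0, 5, 11))
for name, y in zip(["gas oil", "gasoline", "light gas", "coke"], sol.y):
    print(f"{name:9s} final mass fraction: {y[-1]:.3f}")
```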

Keywords: fluid catalytic cracking, simulation, plant data, process design

Procedia PDF Downloads 527
20711 Recognition of Grocery Products in Images Captured by Cellular Phones

Authors: Farshideh Einsele, Hassan Foroosh

Abstract:

In this paper, we present a robust algorithm to recognize text extracted from grocery product images captured by mobile phone cameras. Recognition of such text is challenging since text in grocery product images varies in size, orientation, style, and illumination, and can suffer from perspective distortion. Pre-processing is performed to make the characters scale- and rotation-invariant. Since text degradations cannot be appropriately defined using well-known geometric transformations such as translation, rotation, affine transformation and shearing, we use all of the characters' black pixels as our feature vector. Classification is performed with a minimum distance classifier using the maximum likelihood criterion, which delivers a very promising Character Recognition Rate (CRR) of 89%. We achieve a considerably higher Word Recognition Rate (WRR) of 99% when using lower-level linguistic knowledge about product words during the recognition process.
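
The classification step can be pictured with the minimal nearest-class-mean (minimum distance) classifier below, operating directly on flattened binary character bitmaps as feature vectors. The tiny 3x3 glyphs are synthetic stand-ins; the actual character set, pre-processing, and maximum likelihood weighting used in the paper are not reproduced.

```python
import numpy as np

class MinimumDistanceClassifier:
    """Nearest-class-mean classifier on flattened binary character images."""
    def fit(self, images, labels):
        X = np.array([im.ravel() for im in images], dtype=float)
        self.labels_ = sorted(set(labels))
        self.means_ = {c: X[[l == c for l in labels]].mean(axis=0)
                       for c in self.labels_}
        return self

    def predict(self, image):
        x = image.ravel().astype(float)
        dists = {c: np.linalg.norm(x - m) for c, m in self.means_.items()}
        return min(dists, key=dists.get)   # class with the closest mean

# toy 3x3 "glyphs": a vertical bar ('I') and a full block ('B')
I_glyph = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]])
B_glyph = np.ones((3, 3), dtype=int)

clf = MinimumDistanceClassifier().fit([I_glyph, B_glyph], ["I", "B"])
noisy_I = np.array([[0, 1, 0], [1, 1, 0], [0, 1, 0]])   # one flipped pixel
print(clf.predict(noisy_I))    # -> 'I'
```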

Keywords: camera-based OCR, feature extraction, document, image processing, grocery products

Procedia PDF Downloads 405
20710 Iris Feature Extraction and Recognition Based on Two-Dimensional Gabor Wavelet Transform

Authors: Bamidele Samson Alobalorun, Ifedotun Roseline Idowu

Abstract:

Biometric technologies use human body parts for unique and reliable identification based on physiological traits. The iris recognition system is a biometric-based method of identification. The human iris has discriminating characteristics which make the method efficient. In order to achieve this efficiency, the distinct features of the human iris need to be extracted so as to generate accurate authentication of persons. In this study, an approach to iris recognition using 2D Gabor filters for feature extraction is applied to iris templates. The 2D Gabor filter produced the patterns that were used for training and were likewise sent to the Hamming distance matching technique for recognition. A comparison of results is presented using two iris image subjects with different filter matching indices of 1, 2, 3, 4, and 5, based on the CASIA iris image database. By comparing the results for the two subjects, the actual computational time of the developed models, measured in terms of training time and average testing time of the Hamming distance classifier, was found, with a best recognition accuracy of 96.11%. Iris localization and segmentation were carried out using Daugman's integro-differential operator, and normalization was confined to Daugman's rubber sheet model.
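
The matching side of such a system can be sketched as below: a normalized Hamming distance between binary iris codes, minimized over a few circular bit shifts to compensate for eye rotation, as is commonly paired with 2D Gabor phase encoding. The 2048-bit codes and the fraction of flipped bits are placeholders, not data from the CASIA database.

```python
import numpy as np

def hamming_distance(code_a, code_b):
    """Normalised Hamming distance between two binary iris codes."""
    return np.count_nonzero(code_a != code_b) / code_a.size

def match(code_a, code_b, max_shift=8):
    """Best (smallest) distance over circular bit shifts, the usual way of
    compensating for rotation of the eye between captures."""
    return min(hamming_distance(np.roll(code_a, s), code_b)
               for s in range(-max_shift, max_shift + 1))

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048)                       # placeholder iris code
same_eye = enrolled.copy()
same_eye[rng.choice(2048, 150, replace=False)] ^= 1       # ~7% noisy bits
impostor = rng.integers(0, 2, 2048)

print(f"genuine  HD: {match(same_eye, enrolled):.3f}")    # well below 0.32
print(f"impostor HD: {match(impostor, enrolled):.3f}")    # near 0.5
```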

Keywords: Daugman rubber sheet, feature extraction, Hamming distance, iris recognition system, 2D Gabor wavelet transform

Procedia PDF Downloads 63
20709 Imaging Based On Bi-Static SAR Using GPS L5 Signal

Authors: Tahir Saleem, Mohammad Usman, Nadeem Khan

Abstract:

GPS signals are used for navigation and positioning purposes by a diverse set of users. This project, however, intends to utilize reflected GPS L5 signals to locate targets in a region of interest by generating an image that highlights the positions of targets in that area. The principle of bi-static radar is used to detect the targets or any movement or changes. The idea is confirmed by the results obtained during MATLAB simulations. A matched-filter-based technique is employed in the signal processing to improve the system resolution. The simulation is carried out under different conditions with a moving receiver and moving targets. Noise and attenuation are also introduced, and atmospheric conditions that affect the direct and reflected GPS signals have been simulated to generate a more practical scenario. A realistic GPS L5 signal has been simulated; the simulation results verify that the detection and imaging of targets are possible, with acceptable spatial resolution, by employing reflected GPS L5 signals and a matched filter processing technique.
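
The matched-filter step can be illustrated on its own, as in the sketch below: the received signal (an attenuated, delayed replica of the transmission buried in noise) is correlated with the transmitted code, and the correlation peak recovers the target delay. The random bipolar code, oversampling factor, delay, and noise level are illustrative stand-ins; the actual L5 chipping structure and the bi-static imaging geometry are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

# stand-in ranging code: random bipolar chips (the real L5 code structure
# is not reproduced here), oversampled by 4 samples per chip
code = rng.choice([-1.0, 1.0], size=1023)
tx = np.repeat(code, 4)

# reflected signal: attenuated, delayed copy of the transmission plus noise
true_delay = 357                       # samples
rx = np.zeros(tx.size + 1000)
rx[true_delay:true_delay + tx.size] += 0.2 * tx
rx += 0.5 * rng.standard_normal(rx.size)

# matched filter output = correlation of the received signal with the replica
corr = np.correlate(rx, tx, mode="valid")
estimated_delay = int(np.argmax(np.abs(corr)))
print(f"true delay = {true_delay}, estimated delay = {estimated_delay}")
```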

Keywords: GPS, L5 Signal, SAR, spatial resolution

Procedia PDF Downloads 532
20708 Optimization of Waste Plastic to Fuel Oil Plants' Deployment Using Mixed Integer Programming

Authors: David Muyise

Abstract:

Mixed Integer Programming (MIP) is an approach that involves the optimization of a range of decision variables in order to minimize or maximize a particular objective function. The main objective of this study was to apply the MIP approach to optimize the deployment of waste-plastic-to-fuel-oil processing plants in Uganda. The processing plants are meant to reduce plastic pollution by pyrolyzing the waste plastic into a cleaner fuel that can be used to power diesel/paraffin engines, so as to (1) reduce the negative environmental impacts associated with plastic pollution and (2) narrow the energy gap by utilizing the fuel oil. A programming model was established and tested in two case study applications, that is, small-scale applications in rural towns and large-scale deployment across major cities in the country. In order to design the supply chain, optimal decisions on the types of waste plastic to be processed, the size, location and number of plants, and the downstream fuel applications were made concurrently, based on the payback period, investor requirements for capital cost, and the production cost of fuel and electricity. The model comprises qualitative data gathered from waste plastic pickers at landfills and from potential investors, and quantitative data obtained from primary research. It was found from the study that a distributed system is suitable for small rural towns, whereas a decentralized system is only suitable for big cities. The small towns of Kalagi, Mukono, Ishaka, and Jinja were found to be ideal locations for the deployment of distributed processing systems, whereas the cities of Kampala, Mbarara, and Gulu were found to be ideal locations to initially utilize the decentralized pyrolysis technology system. We conclude that the model findings will be most important to investors, engineers, plant developers, and municipalities interested in waste-plastic-to-fuel processing in Uganda and elsewhere in developing economies.
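
A toy version of such a deployment model is sketched below with the open-source PuLP modeler: binary plant-opening variables, continuous waste flows, and a cost objective under supply and capacity constraints. The towns, capacities, and cost figures are invented for illustration, and the study's payback-period and investor constraints are not modeled.

```python
import pulp

# Invented data: waste supply (t/day) at source towns and candidate plant sites
supply = {"Kalagi": 20, "Mukono": 35, "Jinja": 30}
sites = {"Mukono": {"capacity": 60, "capex": 900},     # cost units are arbitrary
         "Jinja": {"capacity": 50, "capex": 700}}
transport = {("Kalagi", "Mukono"): 4, ("Kalagi", "Jinja"): 6,
             ("Mukono", "Mukono"): 1, ("Mukono", "Jinja"): 5,
             ("Jinja", "Mukono"): 5, ("Jinja", "Jinja"): 1}

prob = pulp.LpProblem("waste_plastic_plant_deployment", pulp.LpMinimize)
open_plant = pulp.LpVariable.dicts("open", sites, cat="Binary")
flow = pulp.LpVariable.dicts("flow", transport, lowBound=0)

# objective: capital cost of opened plants + transport cost of the flows
prob += (pulp.lpSum(sites[s]["capex"] * open_plant[s] for s in sites)
         + pulp.lpSum(transport[a] * flow[a] for a in transport))

# every town's waste must be processed somewhere
for town, amount in supply.items():
    prob += pulp.lpSum(flow[(town, s)] for s in sites) == amount

# a plant can only receive waste if it is opened, up to its capacity
for s in sites:
    prob += (pulp.lpSum(flow[(t, s)] for t in supply)
             <= sites[s]["capacity"] * open_plant[s])

prob.solve(pulp.PULP_CBC_CMD(msg=0))
print("status:", pulp.LpStatus[prob.status])
for s in sites:
    print(s, "open" if open_plant[s].value() == 1 else "closed")
```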

Keywords: mixed integer programming, fuel oil plants, optimisation of waste plastics, plastic pollution, pyrolyzing

Procedia PDF Downloads 127
20707 Neural Network Monitoring Strategy of Cutting Tool Wear of Horizontal High Speed Milling

Authors: Kious Mecheri, Hadjadj Abdechafik, Ameur Aissa

Abstract:

The wear of the cutting tool degrades product quality in manufacturing processes. Online monitoring of the cutting tool wear level is necessary to prevent the deterioration of machining quality. Unfortunately, there is no direct way to measure cutting tool wear online. Consequently, an indirect method must be adopted, in which wear is estimated from the measurement of one or more physical parameters that appear during the machining process, such as the cutting force, vibrations, or acoustic emission. In this work, a neural network system is developed in order to estimate the flank wear from cutting force measurements and the cutting conditions.
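
As a rough sketch of such an indirect estimator, the snippet below trains a small multilayer perceptron to map cutting conditions and force components to flank wear. The training data are generated from an assumed wear relation purely so the example runs end to end; the network architecture, inputs, and data are illustrative assumptions, not the authors' model or measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400

# synthetic machining records: cutting speed, feed, depth of cut, force components
speed = rng.uniform(200, 600, n)        # m/min
feed = rng.uniform(0.05, 0.3, n)        # mm/tooth
depth = rng.uniform(0.5, 3.0, n)        # mm
Fx = 80 * depth * feed * rng.normal(1, 0.05, n)
Fy = 120 * depth * feed * rng.normal(1, 0.05, n)

# assumed ground-truth wear relation used only to create training targets (mm)
flank_wear = 1e-4 * speed + 0.4 * feed + 0.002 * Fy + rng.normal(0, 0.01, n)

X = np.column_stack([speed, feed, depth, Fx, Fy])
X_train, X_test, y_train, y_test = train_test_split(X, flank_wear,
                                                    random_state=0)

# small MLP with feature scaling, trained to regress flank wear
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                   random_state=0))
model.fit(X_train, y_train)
print(f"R^2 on held-out records: {model.score(X_test, y_test):.3f}")
```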

Keywords: flank wear, cutting forces, high speed milling, signal processing, neural network

Procedia PDF Downloads 392