Search results for: Order of model reduction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12221

10991 Reduction of Content of Lead and Zinc from Wastewater by Using Metallurgical Waste

Authors: L. Rozumová, J. Seidlerová

Abstract:

The aim of this paper was to study the sorption properties of a blast furnace sludge used as the sorbent. The sorbent was utilized to reduce the content of lead and zinc ions. The sorbent used in this work was obtained from the metallurgical industry, from the wet gas treatment process in iron production. The blast furnace sludge was characterized by X-ray diffraction, scanning electron microscopy, and X-ray fluorescence spectroscopy. Sorption experiments were conducted in batch mode. The sorption of metal ions in the sludge was determined by correlation with adsorption isotherm models. The adsorption of lead and zinc ions was best fitted by the Langmuir adsorption isotherm. The adsorption capacities for lead and zinc ions were 53.8 mg·g⁻¹ and 10.7 mg·g⁻¹, respectively. The results indicated that blast furnace sludge could be used effectively as a secondary material and could also be employed as a low-cost alternative for the removal of heavy metal ions from wastewater.
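
The Langmuir fitting step mentioned here can be illustrated with a short least-squares sketch. The isotherm form is standard; the data points, initial guesses, and units below are hypothetical placeholders, not the paper's measurements.

```python
# Minimal sketch of fitting the Langmuir isotherm q = q_max*K*C / (1 + K*C)
# to batch sorption data. The data arrays are illustrative placeholders,
# not the measurements reported in the paper.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    """q_max (mg/g) is the monolayer capacity, K (L/mg) the affinity constant."""
    return q_max * K * C / (1.0 + K * C)

C_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])  # equilibrium conc., mg/L (hypothetical)
q_eq = np.array([18.0, 28.0, 41.0, 48.0, 52.0])  # metal uptake, mg/g (hypothetical)

(q_max, K), _ = curve_fit(langmuir, C_eq, q_eq, p0=[50.0, 0.05])
print(f"q_max = {q_max:.1f} mg/g, K = {K:.3f} L/mg")
```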

Keywords: Blast furnace sludge, lead, zinc, sorption.

PDF Downloads: 903
10990 Adaptive Digital Watermarking Integrating Fuzzy Inference HVS Perceptual Model

Authors: Sherin M. Youssef, Ahmed Abouelfarag, Noha M. Ghatwary

Abstract:

An adaptive fuzzy inference perceptual model is proposed for the watermarking of digital images. The model depends on the human visual characteristics of image sub-regions in the frequency multi-resolution wavelet domain. In the proposed model, a multi-variable fuzzy-based architecture is designed to produce a perceptual membership degree both for candidate embedding sub-regions and for the watermark embedding strength factor. Benchmark images of different sizes, with watermarks of different sizes, were applied to the model. Several experimental attacks, such as JPEG compression, noise, and rotation, were applied to verify the robustness of the scheme. In addition, the model was compared with different watermarking schemes. The proposed model showed robustness to attacks while achieving a high level of imperceptibility.
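
As a rough illustration of adaptive embedding in the wavelet domain, the sketch below adds watermark bits to DWT detail coefficients with a strength scaled by local coefficient energy. This variance-style proxy deliberately stands in for the paper's fuzzy inference system, which is not reproduced here; the 'haar' wavelet and all constants are illustrative choices.

```python
# Sketch of adaptive additive embedding in the DWT domain. The paper's fuzzy
# inference of perceptual membership is NOT reproduced; as a stand-in, the
# embedding strength is scaled by local coefficient energy (a crude
# texture-masking proxy).
import numpy as np
import pywt

def embed(image, bits, base_alpha=2.0):
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), 'haar')
    flat = cD.reshape(-1)                 # view into the diagonal detail band
    # crude perceptual weight: embed more strongly where coefficients are busy
    weight = np.abs(flat) / (np.abs(flat).max() + 1e-9)
    n = min(len(bits), flat.size)
    flat[:n] += base_alpha * (0.5 + weight[:n]) * (2 * np.asarray(bits[:n]) - 1)
    return pywt.idwt2((cA, (cH, cV, cD)), 'haar')

img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
marked = embed(img, bits=[1, 0, 1, 1, 0])
print(np.abs(marked - img).max())         # peak distortion introduced
```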

Keywords: Watermarking, human visual system (HVS), Fuzzy Inference System (FIS), Local Binary Pattern (LBP), Discrete Wavelet Transform (DWT).

PDF Downloads: 1817
10989 Application of Generalized Autoregressive Score Model to Stock Returns

Authors: Katleho Daniel Makatjane, Diteboho Lawrence Xaba, Ntebogang Dinah Moroke

Abstract:

The current study investigates the behaviour of time-varying parameters that are based on the score function of the predictive model density at time t. The mechanism used to update the parameters over time is the scaled score of the likelihood function. The results revealed high persistence of the time-varying parameters, as the location parameter is high and the skewness parameter implies a departure of the scale parameter from normality, with an unconditional parameter of 1.5. The results also revealed that leptokurtic behaviour persists in the stock returns, which implies the returns are heavy-tailed. Prior to model estimation, the White neural network test indicated that the stock price can be modelled by a GAS model. Finally, we propose further research, specifically to model the time-varying parameters with a more detailed model that accommodates the heavy-tailed distribution of the series and computes the risk measure associated with the returns.
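
The scaled-score updating mechanism can be made concrete with the standard Gaussian example from the GAS literature, in which the GAS(1,1) recursion for a time-varying variance uses the scaled score s_t = y_t² - f_t. The parameter values and simulated returns below are illustrative, not estimates from the paper.

```python
# Minimal sketch of the GAS(1,1) update for a time-varying variance f_t under
# a Gaussian density, where the scaled score reduces to s_t = y_t^2 - f_t.
# Parameter values and the simulated return series are illustrative only.
import numpy as np

def gas_variance(y, omega=0.05, alpha=0.10, beta=0.85):
    f = np.empty(len(y))
    f[0] = np.var(y)                   # initialise at the sample variance
    for t in range(len(y) - 1):
        s = y[t] ** 2 - f[t]           # scaled score of the Gaussian log-density
        f[t + 1] = omega + alpha * s + beta * f[t]
    return f

returns = np.random.default_rng(1).standard_t(df=5, size=500) * 0.01
print(gas_variance(returns)[-5:])      # filtered variance path (last 5 points)
```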

Keywords: Generalized autoregressive score model, stock returns, time-varying.

PDF Downloads: 1033
10988 A New Shock Model for Systems Subject to Random Threshold Failure

Authors: A. Rangan, A. Tansu

Abstract:

This paper generalizes Yeh Lam's shock model to renewal shock arrivals and random thresholds. Several interesting statistical measures are obtained explicitly. A few special cases and an optimal replacement problem are also discussed.

Keywords: Shock model, optimal replacement, random threshold, shocks.

PDF Downloads: 1583
10987 Improving the Effectiveness of Software Testing through Test Case Reduction

Authors: R. P. Mahapatra, Jitendra Singh

Abstract:

This paper proposes a new technique for improving the efficiency of software testing, based on a conventional attempt to reduce the test cases that have to be run for any given software. The approach utilizes the advantage of regression testing, where fewer test cases lessen the time consumption of testing as a whole. The technique also offers a means to perform test case generation automatically. Compared to a technique in the literature where the tester has no option but to perform test case generation manually, the proposed technique provides a better option. As for test case reduction, the technique uses simple algebraic conditions to assign fixed values to variables (maximum, minimum, and constant values), as sketched below. By doing this, the variable values are limited within a definite range, resulting in fewer possible test cases to process. The technique can also be used with program loops and arrays.
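
A minimal sketch of the reduction rule described here: each variable is pinned to its minimum, maximum, and a constant (mid-range) value, and the reduced test suite is the cross-product of those fixed values. The variable names and ranges are hypothetical.

```python
# Sketch of the reduction idea: instead of exercising every value in a
# variable's domain, each variable is pinned to its minimum, maximum, and a
# constant (mid) value, and test cases are the cross-product of those values.
from itertools import product

def reduced_cases(domains):
    """domains: {name: (lo, hi)} -> list of dicts using only lo/mid/hi values."""
    pinned = {n: (lo, (lo + hi) // 2, hi) for n, (lo, hi) in domains.items()}
    names = list(pinned)
    return [dict(zip(names, vals)) for vals in product(*(pinned[n] for n in names))]

cases = reduced_cases({"x": (0, 100), "y": (-50, 50)})
print(len(cases), cases[:3])   # 9 cases instead of 101 * 101 combinations
```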

Keywords: Software testing, test case generation, test case reduction.

PDF Downloads: 3015
10986 A Survey Method and New Lecture Chair Design Complying with Ergonomics Guidelines at Classroom Building 2, Suranaree University of Technology, Thailand

Authors: Sumalee B., Sirinapa L., Jenjira T., Jr., Setasak S.

Abstract:

The paper describes the trend of ergonomics problems among students in classroom B5101, Building 2, Suranaree University of Technology. The objective was to survey ergonomics problems and the effects of the chairs used for sitting in the classroom. One hundred students who use lecture chairs in the classroom for more than 2 hours per day were assessed using RULA [1] and a body discomfort survey [2]. The body discomfort survey revealed fatigue problems at the neck, lower back, upper back, and right shoulder with scores of 2.93, 2.91, 2.33, and 1.75, respectively, and the RULA results revealed fatigue problems at the neck, trunk, and right upper arm with scores of 4.00, 3.75, and 3.00, respectively; the two assessments were consistent. The researchers then prepared an improvement plan to reduce student fatigue by collecting anthropometric data from the sample and designing three ergonomic chair prototypes. The same 100 students trialled the new chair and were evaluated again by RULA, the body discomfort survey, and a satisfaction survey. After the improvement, RULA showed average fatigue reductions for the head and neck from 4.00 to 2.25, the body and trunk from 3.75 to 2.00, and arm force from 1.00 to 0.25. The body discomfort survey showed average fatigue reductions for the lower back from 2.91 to 0.87, the neck from 2.93 to 1.24, the upper back from 2.33 to 0.84, and the right upper arm from 1.75 to 0.74. The fatigue reductions measured by RULA and the body discomfort survey were statistically significant at the 95% confidence level (p-value 0.05). A Chi-square test of the relationship between fatigue and body part showed that the RULA and body discomfort results before and after the improvement were consistent at the 95% confidence level (p-value 0.05). Moreover, in the satisfaction results from a 30-minute trial with the new chair [3], 72% of students were very satisfied with the simple folding of the writing plate, 66% with the width of the writing plate, 64% with the suitability of the writing plate, 62% with the soft seat cushion, and 61% with the ease of being seated in the chair.

Keywords: Ergonomics, workstation design, ergonomics chair, student, fatigue.

PDF Downloads: 3495
10985 Analysis of Model in Pregnant and Non-Pregnant Dengue Patients

Authors: R. Kongnuy, P. Pongsumpun

Abstract:

We used a mathematical model to study the transmission of dengue disease. The model is developed with the human population separated into two groups, pregnant and non-pregnant humans. The dynamical analysis method is used to analyze this modified model. Two equilibrium states are found, and the conditions for stability of these two equilibrium states are established. Numerical results are shown for each equilibrium state. The basic reproduction numbers are found and compared using numerical simulations.
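
A compact sketch of a two-group vector-host model of this kind is shown below, with the host population split into pregnant and non-pregnant compartments sharing one mosquito population. The structure and every rate constant are illustrative stand-ins, not the paper's actual equations.

```python
# Compact sketch of a vector-host dengue model with the host population split
# into pregnant (p) and non-pregnant (n) groups sharing one mosquito
# population. All rate constants are hypothetical placeholders.
import numpy as np
from scipy.integrate import solve_ivp

b, beta_h, beta_v, gamma, mu_v = 0.5, 0.4, 0.4, 0.14, 0.07  # hypothetical rates

def rhs(t, y):
    s_p, i_p, s_n, i_n, s_v, i_v = y
    inf_p = b * beta_h * s_p * i_v        # vector -> pregnant hosts
    inf_n = b * beta_h * s_n * i_v        # vector -> non-pregnant hosts
    inf_v = b * beta_v * s_v * (i_p + i_n)
    return [-inf_p, inf_p - gamma * i_p,
            -inf_n, inf_n - gamma * i_n,
            mu_v - inf_v - mu_v * s_v, inf_v - mu_v * i_v]

sol = solve_ivp(rhs, (0, 200), [0.1, 0.0, 0.9, 1e-4, 1.0, 0.0])
print(sol.y[1, -1], sol.y[3, -1])  # final infective fractions in each group
```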

Keywords: Basic reproductive number, dengue disease, equilibrium states, pregnancy.

PDF Downloads: 1592
10984 Simultaneous Saccharification and Fermentation (SSF) of Sugarcane Bagasse - Kinetics and Modeling

Authors: E. Sasikumar, T. Viruthagiri

Abstract:

Simultaneous saccharification and fermentation (SSF) of sugarcane bagasse by cellulase and Pachysolen tannophilus MTCC 1077 was investigated in the present study. Important process variables for ethanol production from pretreated bagasse were optimized using response surface methodology (RSM) based on central composite design (CCD) experiments. A 2³ five-level CCD experiment with central and axial points was used to develop a statistical model for the optimization of process variables such as incubation temperature (25–45 °C) X1, pH (5.0–7.0) X2, and fermentation time (24–120 h) X3. Data obtained from RSM on ethanol production were subjected to analysis of variance (ANOVA) and analyzed using a second-order polynomial equation, and contour plots were used to study the interactions among the three relevant variables of the fermentation process. The fermentation experiments were carried out using an online-monitored modular fermenter of 2 L capacity. The process parameters giving the maximum response for ethanol production were temperature 32 °C, pH 5.6, and fermentation time 110 h. The maximum ethanol concentration (3.36 g/l) was obtained from 50 g/l pretreated sugarcane bagasse under the optimized process conditions in aerobic batch fermentation. Kinetic models such as the Monod model, the modified logistic model, the modified logistic model incorporating the Luedeking-Piret model, and the modified logistic model incorporating the modified Luedeking-Piret model were evaluated and their constants predicted.
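
The RSM step amounts to fitting a full second-order polynomial in the three coded factors by least squares. The design matrix and responses below are synthetic placeholders, not the paper's CCD data.

```python
# Sketch of the RSM fit: a full second-order polynomial in the coded factors
# X1 (temperature), X2 (pH), X3 (time), solved by least squares. The factor
# levels and responses are synthetic placeholders.
import numpy as np

def quadratic_design(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (20, 3))                   # coded factor levels (hypothetical)
y = 3.0 - 0.5 * X[:, 0] ** 2 - 0.3 * X[:, 1] ** 2 + rng.normal(0, 0.05, 20)  # ethanol, g/l

coef, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(np.round(coef, 3))   # b0, linear, interaction, and quadratic coefficients
```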

Keywords: Sugarcane bagasse, ethanol, optimization, Pachysolen tannophilus.

PDF Downloads: 2300
10983 A BIM-Based Approach to Assess COVID-19 Risk Management Regarding Indoor Air Ventilation and Pedestrian Dynamics

Authors: T. Delval, C. Sauvage, Q. Jullien, R. Viano, T. Diallo, B. Collignan, G. Picinbono

Abstract:

In the context of the international spread of COVID-19, the Centre Scientifique et Technique du Bâtiment (CSTB) has led joint research with the French government authorities of the Hauts-de-Seine department to analyse the risk in school spaces according to their configuration, ventilation system, and spatial segmentation strategy. This paper describes the main results of this joint research. A multidisciplinary team involving experts in indoor air quality/ventilation, pedestrian movement, and IT was established to develop a COVID risk analysis tool based on a Building Information Model. The work started with specific analyses of two pilot schools in order to provide the local administration with specifications to minimize the spread of the virus. Various recommendations were published to optimize and validate the use of ventilation systems and the strategy for student occupancy and student flow segmentation within the building. This COVID expertise has been digitized so that a quick risk analysis can be run on an entire building by the public administration through an easy user interface implemented in a free BIM management software. One of the most interesting results is the ability to dynamically compare different ventilation system scenarios and space occupation strategies inside the BIM model. This concurrent engineering approach provides users with the optimal solution according to both ventilation and pedestrian flow expertise.

Keywords: BIM, knowledge management, expert system, risk management, indoor ventilation, pedestrian movement, integrated design.

PDF Downloads: 760
10982 ANN Based Model Development for Material Removal Rate in Dry Turning in Indian Context

Authors: Mangesh R. Phate, V. H. Tatwawadi

Abstract:

This paper develops an artificial neural network (ANN) based model of material removal rate (MRR) in the turning of ferrous and nonferrous materials in an Indian small-scale industry. The MRR of the formulated model was verified against testing data, and the ANN model was used for the analysis and prediction of the relationship between input and output parameters during the turning of ferrous and nonferrous materials. The input parameters of this model are the operator, workpiece, cutting process, cutting tool, machine, and environment.

The ANN model consists of a three-layer feedforward backpropagation neural network. The network is trained with pairs of independent/dependent datasets generated when machining ferrous and nonferrous materials. Very good performance of the neural network, in terms of agreement with experimental data, was achieved. The model may be used for testing and forecasting the complex relationship between the dependent and independent parameters in turning operations.
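
A minimal stand-in for such a three-layer feedforward network is shown below, built with scikit-learn's MLPRegressor on synthetic data; the feature encoding, hidden-layer size, and data are assumptions for illustration only.

```python
# Sketch of a three-layer feedforward network mapping encoded process inputs
# to MRR. The six features, the hidden-layer size, and the synthetic data are
# illustrative assumptions, not the paper's field data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (300, 6))   # operator, workpiece, process, tool, machine, environment
y = 50 * X[:, 2] + 20 * X[:, 3] + rng.normal(0, 2, 300)   # synthetic MRR, mm^3/min

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out data: {net.score(X_te, y_te):.3f}")
```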

Keywords: Field data based model, artificial neural network, simulation, conventional turning, material removal rate.

PDF Downloads: 1969
10981 Parameters Extraction for Pseudomorphic HEMTs Using Genetic Algorithms

Authors: Mazhar B. Tayel, Amr H. Yassin

Abstract:

A small-signal model parameter extraction procedure for a pseudomorphic high electron mobility transistor (PHEMT) is presented. Both the extrinsic and intrinsic circuit elements of the small-signal model are determined using a genetic algorithm (GA) as a stochastic global search and optimization tool. The parameter extraction for the small-signal model is performed on a 200-μm gate width AlGaAs/InGaAs PHEMT. The equivalent circuit elements of the proposed 18-element model are determined directly from the measured S-parameters. The GA is used to extract the parameters of the proposed small-signal model from 0.5 GHz up to 18 GHz.
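
The GA extraction loop can be sketched generically: a bounded population of candidate element values evolves by selection, crossover, and mutation to minimize the error between the modelled and measured responses. The 3-element RC stand-in below replaces the paper's 18-element PHEMT circuit and real S-parameter data.

```python
# Toy real-coded GA of the kind used for small-signal parameter extraction.
# A 3-element RC response stands in for the 18-element PHEMT model; the
# "measured" data are generated from known values so the fit can be checked.
import numpy as np

rng = np.random.default_rng(4)
freqs = np.linspace(0.5e9, 18e9, 40)              # 0.5-18 GHz sweep, as in the paper
true = np.array([5.0, 0.3e-12, 40.0])             # R (ohm), C (F), gain term (hypothetical)
lo = np.array([1.0, 0.05e-12, 10.0])
hi = np.array([20.0, 1.0e-12, 100.0])

def response(p):
    R, C, g = p
    return g / np.abs(1 + 2j * np.pi * freqs * R * C)

measured = response(true)                          # stand-in for measured S-parameters

def fitness(p):
    return np.mean((response(p) - measured) ** 2)

pop = rng.uniform(lo, hi, (60, 3))
for _ in range(200):
    errs = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(errs)[:20]]           # truncation selection
    a = parents[rng.integers(0, 20, 40)]
    b = parents[rng.integers(0, 20, 40)]
    w = rng.uniform(0, 1, (40, 1))
    children = w * a + (1 - w) * b                 # arithmetic crossover
    children += rng.normal(0, 0.02, children.shape) * (hi - lo)   # Gaussian mutation
    pop = np.clip(np.vstack([parents, children]), lo, hi)

best = pop[np.argmin([fitness(p) for p in pop])]
print(best)                                        # should approach `true`
```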

Keywords: PHEMT, Genetic Algorithms, small signal modeling, optimization.

PDF Downloads: 2261
10980 Lookup Table Reduction and Its Error Analysis of Hall Sensor-Based Rotation Angle Measurement

Authors: Young-San Shin, Seongsoo Lee

Abstract:

The Hall sensor is widely used to measure rotation angle. When the Hall voltage is measured for linear displacement, it is converted to angular displacement using the arctangent function, which requires a large lookup table. In this paper, a lookup table reduction technique is presented for angle measurement. When the input to the lookup table is small, within a certain threshold, the change in the outputs with respect to the change in the inputs is relatively small. Thus, several inputs can share the same output, which significantly reduces the lookup table size. An error analysis was also performed, and the threshold was determined so as to keep the error below 1°. When the Hall voltage has 11-bit resolution, the lookup table size is reduced from 1,024 samples to 279 samples.
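
A sketch of the sharing rule is given below: consecutive inputs reuse one stored arctangent output while the angular error stays under 1°. The input normalization and sharing details are an illustrative reconstruction, so the printed entry count will not match the paper's 279 exactly.

```python
# Sketch of the table-sharing idea: consecutive inputs reuse one stored
# arctangent output as long as the angular error stays below 1 degree. The
# input mapping is an assumption, so counts differ from the paper's table.
import bisect
import math

def build_reduced_lut(bits=11, max_err_deg=1.0):
    n = 1 << bits
    breakpoints, outputs = [], []
    for i in range(n):
        x = i / (n - 1)                      # normalised Hall-voltage ratio in [0, 1]
        angle = math.degrees(math.atan(x))
        if not outputs or abs(angle - outputs[-1]) > max_err_deg:
            breakpoints.append(i)            # start a new shared entry
            outputs.append(angle)
    return breakpoints, outputs

bp, out = build_reduced_lut()
print(f"{len(out)} entries instead of {1 << 11}")

def lookup(i):
    """Return the shared angle for raw input i (error stays below 1 degree)."""
    return out[bisect.bisect_right(bp, i) - 1]
```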

Keywords: Hall sensor, angle measurement, lookup table, arctangent.

PDF Downloads: 1528
10979 Developing a Model for the Relation between Heritage and Place Identity

Authors: A. Arjomand Kermani, N. Charbgoo, M. Alalhesabi

Abstract:

In a situation of greatly accelerating change and the need for new development in cities on the one hand, and conservation and regeneration approaches on the other, place identity and its relation to the heritage context have taken on new importance. This relation is generally a mutual and complex one. The significant point is that the process of identifying something as heritage, rather than just a historical phenomenon, brings that which may be inherited into the realm of identity. In the planning and urban design domains, as well as in environmental psychology and phenomenology, place identity and its attributes and components have been studied and discussed. However, the relation between the physical environment (especially heritage) and identity has been neglected in the planning literature. This article aims to review the knowledge in this field and develop a model of the influence and relation of these two major concepts (heritage and identity). To build this conceptual model, we draw on the available literature on place identity and heritage environments in environmental psychology as well as planning, using a descriptive-analytical methodology to understand how they can inform planning strategies and governance policies. A cross-disciplinary analysis is essential to understand the nature of place identity and heritage context and to develop a more holistic model of their relationship, to be employed in the planning process and decision making. Moreover, this broader and more holistic perspective would enable both social scientists and planners to learn from one another's expertise for a fuller understanding of community dynamics. The result indicates that a combination of these perspectives can provide a richer understanding, not only of how planning impacts our experience of place, but also of how place identity can impact community planning and development.

Keywords: Heritage, interdisciplinary study, place identity, planning.

PDF Downloads: 1911
10978 The Long Run Relationship between Exports and Imports in South Africa: Evidence from Cointegration Analysis

Authors: Sagaren Pillay

Abstract:

This study empirically examines the long-run equilibrium relationship between South Africa's exports and imports using quarterly data from 1985 to 2012. The theoretical framework used for the study is based on Johansen's maximum likelihood cointegration technique, which tests for both the existence and the number of cointegrating vectors. The study finds that both series are integrated of order one and are cointegrated. A statistically significant cointegrating relationship is found to exist between exports and imports. The study models this unique linear and lagged relationship using a vector error correction model (VECM). The findings of the study confirm the existence of a long-run equilibrium relationship between exports and imports.
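
The Johansen test and VECM workflow can be sketched with statsmodels on a simulated cointegrated pair standing in for exports and imports; the lag order, deterministic term, and data are illustrative choices, not the paper's specification.

```python
# Sketch of the Johansen test / VECM workflow with statsmodels, using a
# simulated cointegrated pair in place of the actual export/import series.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(5)
common = np.cumsum(rng.normal(size=112))           # shared stochastic trend, 112 quarters
data = pd.DataFrame({
    "exports": common + rng.normal(scale=0.5, size=112),
    "imports": 0.9 * common + rng.normal(scale=0.5, size=112),
})

jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace stats:", jres.lr1)
print("95% critical:", jres.cvt[:, 1])             # reject -> cointegration present

vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(vecm.beta)                                   # estimated cointegrating vector
```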

Keywords: Cointegration, lagged, linear, maximum likelihood, vector error correction model.

PDF Downloads: 2783
10977 Predictive Semi-Empirical NOx Model for Diesel Engine

Authors: Saurabh Sharma, Yong Sun, Bruce Vernham

Abstract:

Accurate prediction of NOx emission is a continuous challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented to address that issue. NOx formation is highly dependent on the burned gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions against the measured NOx, which limits the prediction of purely empirical models to the region where they have been calibrated. An alternative solution is presented in this paper, which focuses on the utilization of in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The result of this work is a fast and predictive NOx model built from physical parameters and empirical correlation. The model is developed from steady-state data collected over the entire operating region of the engine and a predictive combustion model developed in Gamma Technologies (GT)-Power using the Direct Injected (DI)-Pulse combustion object. In this approach, the temperature in both the burned and unburned zones is considered during the combustion period, i.e. from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered in the reported model. Several statistical methods are used to construct the model, including individual machine learning methods and ensemble machine learning methods. A detailed validation of the model on multiple diesel engines is reported in this work. A substantial number of cases are tested for different engine configurations over a large span of speed and load points. Different sweeps of operating conditions, such as Exhaust Gas Recirculation (EGR), injection timing, and Variable Valve Timing (VVT), are also considered for the validation. The model shows very good predictability and robustness at both sea level and altitude with different ambient conditions. Its advantages, such as high accuracy and robustness at different operating conditions, low computational time, and the lower number of data points required for calibration, establish a platform where the model-based approach can be used for the engine calibration and development process. Moreover, this work aims at establishing a framework for future model development for other targets such as soot, Combustion Noise Level (CNL), and the NO2/NOx ratio.

Keywords: Diesel engine, machine learning, NOx emission, semi-empirical.

PDF Downloads: 854
10976 An Analytical Electron Mobility Model Based on Particle Swarm Computation for Silicon-Based Devices

Authors: F. Djeffal, N. Lakhdar, T. Bendib

Abstract:

The study of transport coefficients in electronic devices is currently carried out with analytical and empirical models. This requires several simplifying assumptions, generally necessary to arrive at analytical expressions, in order to study the different characteristics of silicon-based electronic devices. Further progress in the development, design, and optimization of silicon-based devices necessarily requires new theory and modeling tools. In our study, we use the PSO (Particle Swarm Optimization) technique as a computational tool to develop analytical approaches for studying the transport of electrons in crystalline silicon as a function of temperature and doping concentration. Good agreement between our results and measured data has been found. The optimized analytical models can also be incorporated into circuit simulators to study Si-based devices without impact on computational time and data storage.
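
As a sketch of this approach, the snippet below uses a plain PSO loop to fit a Caughey-Thomas-style doping dependence mu(N) = mu_min + (mu_max - mu_min) / (1 + (N/N_ref)^alpha) to synthetic data. The functional form, the target values, and the PSO settings are assumptions; the paper's exact analytical model is not reproduced.

```python
# Minimal particle swarm fitting a Caughey-Thomas-style mobility expression
# to synthetic doping-dependent data. Constants and PSO settings are
# illustrative, not the paper's model.
import numpy as np

rng = np.random.default_rng(6)
N = np.logspace(14, 19, 30)                         # doping, cm^-3
true = (68.0, 1414.0, 9.2e16, 0.71)                 # textbook-like Si electron values

def mobility(p, N):
    mu_min, mu_max, n_ref, alpha = p
    return mu_min + (mu_max - mu_min) / (1 + (N / n_ref) ** alpha)

target = mobility(true, N) * (1 + rng.normal(0, 0.01, N.size))
lo = np.array([10, 800, 1e15, 0.3]); hi = np.array([200, 2000, 1e18, 1.5])

def cost(p):
    return np.mean((mobility(p, N) - target) ** 2)

x = rng.uniform(lo, hi, (40, 4)); v = np.zeros_like(x)
pbest, pbest_c = x.copy(), np.array([cost(p) for p in x])
for _ in range(300):
    gbest = pbest[pbest_c.argmin()]
    v = 0.7 * v + 1.5 * rng.random(x.shape) * (pbest - x) \
              + 1.5 * rng.random(x.shape) * (gbest - x)   # inertia + cognitive + social
    x = np.clip(x + v, lo, hi)
    c = np.array([cost(p) for p in x])
    improved = c < pbest_c
    pbest[improved], pbest_c[improved] = x[improved], c[improved]
print(np.round(pbest[pbest_c.argmin()], 3))         # should approach `true`
```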

Keywords: Particle Swarm, electron mobility, Si-based devices, Optimization.

PDF Downloads: 1537
10975 Feeder Reconfiguration for Loss Reduction in Unbalanced Distribution System Using Genetic Algorithm

Authors: Ganesh Vulasala, Sivanagaraju Sirigiri, Ramana Thiruveedula

Abstract:

This paper presents an efficient approach to feeder reconfiguration for power loss reduction and voltage profile improvement in unbalanced radial distribution systems (URDS). A genetic algorithm (GA) is used to obtain a solution for the reconfiguration of radial distribution systems that minimizes the losses. A forward and backward sweep algorithm is used to calculate load flows in unbalanced distribution systems, as sketched below. By simulating the survival of the fittest among the strings, the optimum string is searched for through randomized information exchange between strings via crossover and mutation. Results have shown that the proposed algorithm has advantages over previous algorithms. The proposed method is effectively tested on 19-node and 25-node unbalanced radial distribution systems.
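
The load-flow ingredient can be sketched as a backward/forward sweep on a small radial feeder. For brevity this is a balanced single-phase simplification of the paper's three-phase unbalanced case, and the 4-bus feeder data are hypothetical.

```python
# Sketch of the backward/forward sweep load flow on a radial feeder, in a
# balanced single-phase simplification of the unbalanced three-phase case.
# The 4-bus feeder data are hypothetical.
import numpy as np

parent = [None, 0, 1, 1]                         # bus 0 is the substation
z = {1: 0.02 + 0.04j, 2: 0.03 + 0.05j, 3: 0.025 + 0.045j}    # branch impedance, pu
s_load = np.array([0, 0.8 + 0.3j, 0.5 + 0.2j, 0.6 + 0.25j])  # bus loads, pu

V = np.ones(4, dtype=complex)
for _ in range(20):
    I = np.conj(s_load / V)                  # backward sweep: injection currents
    for i in (3, 2, 1):                      # accumulate branch currents to the root
        I[parent[i]] += I[i]
    V_new = V.copy()
    for i in (1, 2, 3):                      # forward sweep: update voltages
        V_new[i] = V_new[parent[i]] - z[i] * I[i]
    if np.abs(V_new - V).max() < 1e-8:
        break
    V = V_new

losses = sum(z[i] * abs(I[i]) ** 2 for i in (1, 2, 3)).real
print(np.abs(V), f"loss = {losses:.4f} pu")
```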

Keywords: Distribution system, Load flows, Reconfiguration, Genetic Algorithm.

PDF Downloads: 3249
10974 CFD of Oscillating Airfoil Pitch Cycle by Using the PISO Algorithm

Authors: Muhammad Amjad Sohail, Rizwan Ullah

Abstract:

This research paper presents a CFD analysis of an oscillating airfoil during its pitch cycle. Unsteady subsonic flow is simulated for a pitching airfoil at Mach number 0.283 and Reynolds number 3.45 million. Turbulence effects are also considered in this study by using the k-ω SST turbulence model. A two-dimensional unsteady compressible Navier-Stokes code, including a two-equation turbulence model and PISO pressure-velocity coupling, is used. A pressure-based implicit solver with first-order implicit unsteady formulation is used. The simulated pitch cycle results are compared with the available experimental data and show good agreement. Aerodynamic characteristics during pitch cycles have been studied and validated.

Keywords: Angle of attack, centre of pressure, subsonic flow, pitching moment coefficient, turbulence model.

PDF Downloads: 2392
10973 Multi-Agent Systems Applied in the Modeling and Simulation of Biological Problems: A Case Study in Protein Folding

Authors: Pedro Pablo González Pérez, Hiram I. Beltrán, Arturo Rojo-Domínguez, Máximo Eduardo Sánchez Gutiérrez

Abstract:

The multi-agent system approach has proven to be an effective and appropriate abstraction level for constructing whole models of a diversity of biological problems, integrating aspects found in both "micro" and "macro" approaches to modeling this type of phenomena. Taking these considerations into account, this paper presents the important computational characteristics to be gathered into a novel bioinformatics framework built upon a multi-agent architecture. The version of the tool presented herein allows studying and exploring complex problems belonging principally to structural biology, such as protein folding. The bioinformatics framework is used as a virtual laboratory to explore a minimalist model of protein folding as a test case. In order to demonstrate the laboratory concept of the platform, as well as its flexibility and adaptability, we studied the folding of two particular sequences, one a 45-mer and the other a 64-mer, both described by an HP model (only hydrophobic and polar residues) on a coarse-grained 2D square lattice. As discussed in this work, these two sequences were chosen as breaking points for the platform, in order to determine the tools to be created or improved so as to meet the needs of computing and analyzing a given tough sequence. The underlying philosophy is that the continuous study of sequences itself provides important points to be added to the platform, continually improving its efficiency, as demonstrated herein.
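
The minimalist HP-model ingredient can be made concrete: a conformation of an H/P sequence on a 2D square lattice is scored by counting non-bonded H-H contacts, each contributing -1. The short sequence and walk below are illustrative, far smaller than the paper's 45-mer and 64-mer cases.

```python
# Sketch of HP-model scoring on a 2D square lattice: count non-bonded H-H
# contacts (energy -1 each). Sequence and conformation are illustrative.
def hp_energy(sequence, coords):
    """sequence: 'H'/'P' string; coords: list of (x, y) lattice positions."""
    assert len(set(coords)) == len(coords), "walk must be self-avoiding"
    pos = {c: i for i, c in enumerate(coords)}
    energy = 0
    for i, (x, y) in enumerate(coords):
        if sequence[i] != 'H':
            continue
        for nb in ((x + 1, y), (x, y + 1)):   # +x/+y only, so each contact counts once
            j = pos.get(nb)
            if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                energy -= 1
    return energy

seq = "HPHPPHHPHH"
walk = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1),
        (-1, 0), (-1, -1), (0, -1), (1, -1), (2, -1)]
print(hp_energy(seq, walk))
```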

Keywords: Multi-agent systems, blackboard-based agent architecture, bioinformatics framework, virtual laboratory, protein folding.

PDF Downloads: 2205
10972 Air Cargo Overbooking Model under Stochastic Weight and Volume Cancellation

Authors: N. Phumchusri, K. Roekdethawesab, M. Lohatepanont

Abstract:

Overbooking is the practice of selling more goods or services than the available capacity because sellers anticipate that some buyers will not show up or may cancel their bookings. At present, many airlines deploy an overbooking strategy in order to deal with the uncertainty of their customers. In particular, some airlines sell more cargo capacity than they have available to freight forwarders, in the belief that some of them will cancel later. In this paper, we propose methods to find the optimal overbooking level of volume and weight for air cargo in order to minimize the total cost, comprising the cost of spoilage and the cost of offloading. Cancellations of volume and weight are jointly random variables with a known joint distribution. Heuristic approaches applying the idea of weight and volume independence are considered to find an appropriate answer to the full problem. Computational experiments are used to explore the performance of the approaches presented in this paper, as compared to a naïve method under different scenarios.
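
The evaluation step behind such heuristics can be sketched by Monte Carlo: for each candidate overbooking level in weight and volume, simulate jointly random cancellations and average the spoilage and offload costs. The capacities, cost rates, and the bivariate-normal cancellation model are illustrative assumptions, not the paper's distributions.

```python
# Sketch of Monte Carlo evaluation of candidate overbooking levels under
# jointly random weight/volume cancellations. All constants are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
cap_w, cap_v = 20_000.0, 350.0                  # aircraft capacity: kg, m^3
c_spoil_w, c_off_w, c_off_v = 2.0, 5.0, 300.0   # per-kg spoilage/offload, per-m^3 offload
mean = np.array([0.12, 0.10])                   # mean cancel fractions (weight, volume)
cov = np.array([[0.002, 0.0015], [0.0015, 0.002]])  # joint cancellation spread

def expected_cost(book_w, book_v, n=20_000):
    cancel = np.clip(rng.multivariate_normal(mean, cov, n), 0, 1)
    show_w = book_w * (1 - cancel[:, 0])
    show_v = book_v * (1 - cancel[:, 1])
    spoil = c_spoil_w * np.maximum(cap_w - show_w, 0)    # unsold weight capacity
    off = c_off_w * np.maximum(show_w - cap_w, 0)        # offloaded weight
    off += c_off_v * np.maximum(show_v - cap_v, 0)       # offloaded volume
    return (spoil + off).mean()

levels = [(cap_w * k, cap_v * k) for k in (1.00, 1.05, 1.10, 1.15, 1.20)]
best = min(levels, key=lambda lv: expected_cost(*lv))
print("best (weight, volume) booking level:", best)
```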

Keywords: Air cargo overbooking, offloaded capacity, optimal overbooking level, revenue management, spoilage capacity.

PDF Downloads: 2186
10971 The Evaluation of Production Line Performance by Using ARENA – A Case Study

Authors: Muhammad Marsudi, Hani Shafeek

Abstract:

The purpose of this paper is to simulate the production process of a metal stamping industry and to evaluate the utilization of the production line using ARENA simulation software. The process time and the standard time for each process on the production line are obtained from data given by the company management; other data are collected through direct observation of the line. There are three workstations performing ten different types of processes in order to produce a single product type. An ARENA simulation model is then developed based on the collected data. Verification and validation are performed on the ARENA model, and finally the results of the ARENA simulation are analyzed. It is found that utilization at each workstation increases when the batch size is increased, even though the throughput rate is kept constant. This study is very useful for the company because the company needs to improve the efficiency and utilization of its production lines.

Keywords: Arena software, case study, production line, utilization.

PDF Downloads: 5373
10970 Wavelet-Based Despeckling of Synthetic Aperture Radar Images Using Adaptive and Mean Filters

Authors: Syed Musharaf Ali, Muhammad Younus Javed, Naveed Sarfraz Khattak

Abstract:

In this paper, we introduce a new wavelet-based algorithm for speckle reduction in synthetic aperture radar images, which uses a combination of the undecimated wavelet transform, a Wiener filter (an adaptive filter), and a mean filter. Furthermore, instead of using existing thresholding techniques such as SURE shrinkage, Bayesian shrinkage, universal thresholding, normal thresholding, visu thresholding, and soft and hard thresholding, we use brute-force thresholding, which iteratively runs the whole algorithm for each possible candidate value of the threshold, saves each result in an array, and finally selects the threshold value that gives the best possible result. It is therefore slow compared with existing thresholding techniques, but gives the best results attainable under the given speckle-reduction algorithm.
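
The brute-force idea is easy to state in code: run the whole denoising chain once per candidate threshold and keep the threshold with the lowest error against a clean reference. The 1D signal and the decimated wavedec/waverec chain below are simplifications standing in for the paper's undecimated transform plus Wiener and mean filtering.

```python
# Sketch of brute-force threshold selection: try every candidate threshold
# end-to-end and keep the best. A decimated 1D wavelet chain stands in for
# the paper's undecimated 2D transform + Wiener/mean filtering.
import numpy as np
import pywt

rng = np.random.default_rng(8)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sign(np.sin(2 * np.pi * 2 * t))
noisy = clean * np.exp(rng.normal(0, 0.3, t.size))      # multiplicative speckle

def denoise(threshold):
    coeffs = pywt.wavedec(noisy, 'db4', level=4)
    coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode='soft')
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, 'db4')[:noisy.size]

candidates = np.linspace(0.01, 1.0, 100)                # exhaustive sweep
errors = [np.mean((denoise(th) - clean) ** 2) for th in candidates]
best = candidates[int(np.argmin(errors))]
print(f"best threshold = {best:.2f}, MSE = {min(errors):.4f}")
```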

Keywords: Brute force thresholding, directional smoothing, direction dependent mask, undecimated wavelet transformation.

PDF Downloads: 2878
10969 Treatment of Tannery Effluents by the Process of Coagulation

Authors: G. Shegani

Abstract:

Coagulation is a process that sanitizes leather effluents. It aims to reduce pollutants such as chemical oxygen demand (COD), chloride, sulfate, chromium, suspended solids, and other dissolved solids. The current study aimed to evaluate the coagulation efficiency for tannery wastewater by analyzing the change in organic matter, odor, color, ammonium ions, nutrients, chloride, H2S, sulfate, suspended solids, total dissolved solids, fecal pollution, and hexavalent chromium before and after treatment. Effluent samples were treated with the coagulants Ca(OH)2 and FeSO4·7H2O. The main benefits of this treatment included the removal of: COD (81.60%); ammonium ions (98.34%); nitrate ions (92%); hexavalent chromium (75.00%); phosphate (70.00%); chloride (69.20%); and H2S (50%). Results also indicated a high level of efficiency in the reduction of fecal pollution indicators. Unfortunately, only a modest reduction of sulfate (19.00%) and TSS (13.00%), and an increase in TDS (15.60%), were observed.

Keywords: Coagulation, Effluent, Tannery, Treatment.

PDF Downloads: 4161
10968 The Relationship between Business-model Innovation and Firm Value: A Dynamic Perspective

Authors: Yung C. Ho, Hui C. Fang, Ming J. Hsieh

Abstract:

Although consistently innovative business models can give companies a competitive advantage, longitudinal empirical research that can reflect dynamic business-model changes has yet to prove a definitive connection. This study consequently employs a dynamic perspective in conjunction with innovation theory to examine the relationship between types of business-model innovation and firm value. The study examines various types of business-model innovation in high-end and low-end technology industries, namely HTC and the 7-Eleven chain stores, with research periods of 14 years and 32 years, respectively. The empirical results suggest that adopting radical business-model innovation in addition to expanding into new target markets can successfully lead to a competitive advantage. Sustained advanced technological competences and service/product innovation are the key success factors in high-end and low-end technology industry business models, respectively. In sum, business-model innovation can yield a higher market value and financial value in high-end technology industries than in low-end ones.

Keywords: Business model, dynamic perspective, firm value, innovation.

PDF Downloads: 2743
10967 Manual Testing of Web Software Systems Supported by Direct Guidance of the Tester Based On Design Model

Authors: Karel Frajtak, Miroslav Bures, Ivan Jelinek

Abstract:

Software testing is an important stage of the software development cycle. The current testing process involves a tester and electronic documents with test case scenarios. In this paper, we focus on a new approach to the testing process using automated test case generation and tester guidance through the system, based on a model of the system. Test case generation and model-based testing are not possible without a proper system model. We aim to provide better feedback from the testing process, thus eliminating unnecessary paperwork.

Keywords: Model based testing, test automation, test generating, tester support.

PDF Downloads: 1958
10966 Biosynthesis and In Vitro Studies of Silver Bionanoparticles Synthesized from Aspergillus Species and Their Antimicrobial Activity against Multi-Drug-Resistant Clinical Isolates

Authors: M. Saravanan

Abstract:

Antimicrobial resistance is becoming a major factor in virtually all hospital-acquired infections, which may soon become untreatable, and is a serious public health problem. These concerns have led to a major research effort to discover alternative strategies for the treatment of bacterial infections. Nanobiotechnology is an upcoming and fast-developing field with potential applications for human welfare. An important area of nanotechnology is the development of reliable and environmentally friendly processes for the synthesis of nanoscale particles through biological systems. The present study reports the use of the fungal strain Aspergillus species for the extracellular synthesis of bionanoparticles from a 1 mM silver nitrate (AgNO3) solution. The work focused on the synthesis of metallic silver bionanoparticles through the reduction of aqueous Ag+ ions with the culture supernatants of the microorganisms. The bio-reduction of the Ag+ ions in solution was monitored in the aqueous component, and the spectrum of the solution was measured with a UV-visible spectrophotometer. The bionanoscale particles were further characterized by Atomic Force Microscopy (AFM), Fourier Transform Infrared Spectroscopy (FTIR), and thin layer chromatography. The synthesized bionanoscale particles showed a maximum absorption at 385 nm in the visible region. Atomic force microscopy investigation of the silver bionanoparticles showed that they ranged in size from 250 nm to 680 nm. The work analyzed the antimicrobial efficacy of the silver bionanoparticles against various multi-drug-resistant clinical isolates. The present study emphasizes the applicability of synthesizing metallic nanostructures and understanding the biochemical and molecular mechanisms of nanoparticle formation by the cell filtrate, in order to achieve better control over the size and polydispersity of the nanoparticles. This would help to develop nanomedicine against various multi-drug-resistant human pathogens.

Keywords: Bionanoparticles, UV-visible spectroscopy, Atomic Force Microscopy, extracellular synthesis, multi-drug resistant, antimicrobial activity, nanomedicine.

PDF Downloads: 2237
10965 An Extension of Multi-Layer Perceptron Based on Layer-Topology

Authors: Jānis Zuters

Abstract:

Many extensions have been made to the classic multi-layer perceptron (MLP) model. A notable number of them have been designed to hasten the learning process without considering the quality of generalization. This paper proposes a new MLP extension based on exploiting the topology of the input layer of the network. Experimental results show that the extended model improves generalization capability in certain cases. The new model requires additional computational resources compared to the classic model; nevertheless, the loss in efficiency is not regarded as significant.

Keywords: Learning algorithm, multi-layer perceptron, topology.

PDF Downloads: 1511
10964 Using the Monte Carlo Simulation to Predict the Assembly Yield

Authors: C. Chahin, M. C. Hsu, Y. H. Lin, C. Y. Huang

Abstract:

Electronics products that achieve high levels of integrated communications, computing, and entertainment multimedia features in small, stylish, and robust new form factors are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profits, IC (Integrated Circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low cost, high performance, and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the outcome of the assembly process. In order to evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors must be taken into consideration: boards, placement, components, the materials from which the components are made, and processes. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depend on repeated random sampling to compute their results. This method is utilized to simulate the placement and assembly processes within a production line, as sketched below.
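
A minimal sketch of the Monte Carlo placement-yield idea: draw random placement offsets per component, call a board good when every component lands within its pad tolerance, and report the pass fraction as the predicted yield. The accuracy, tolerance, and component count are hypothetical values.

```python
# Sketch of Monte Carlo placement-yield prediction: boards pass when every
# component's random placement offset stays within tolerance. The machine
# accuracy, tolerance, and part count are hypothetical.
import numpy as np

rng = np.random.default_rng(9)
n_boards, n_parts = 100_000, 40
sigma = 0.025                # machine placement accuracy, mm (1-sigma per axis)
tol = 0.08                   # allowable radial offset before a defect, mm

dx = rng.normal(0, sigma, (n_boards, n_parts))
dy = rng.normal(0, sigma, (n_boards, n_parts))
offsets = np.hypot(dx, dy)                 # radial placement error per component
board_ok = (offsets <= tol).all(axis=1)    # a board passes only if all parts land
print(f"predicted assembly yield: {board_ok.mean():.3%}")
```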

Keywords: Monte Carlo simulation, placement yield, PCB characterization, electronics assembly.

PDF Downloads: 2165
10963 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and to meet the organizations' objectives the software should be in line with the business process. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram based on a role activity diagram (RAD) model. The approach includes four steps, namely: create a business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the process of student registration at the University of Petra as a case study. Further research is required to validate the new approach in different domains.

Keywords: Business process modelling, system models, role activity diagrams, sequence diagrams.

PDF Downloads: 1524
10962 A Methodological Test to Study the Concrete Workability with the Fractal Model

Authors: F. Achouri, K. Chouicha

Abstract:

The main parameters affecting workability are the water content, the particle size, and the total surface area of the grains, since the mixing water first wets the surface of the grains and then fills the voids between the grains to form entrapped water; the quantity of water remaining is called free water. The aim of this study is to develop a fractal approach to the relationship between the concrete formulation parameters and workability. To develop this approach, a series of concretes taken from the literature was investigated by varying formulation parameters such as G/S, the quantity of cement C, and the quantity of water W. We also consider two further models, the water layer thickness model and the paste layer thickness model, to judge their relevance, with the following results: the water layer thickness model is considered relevant when there is a variation in the water quantity, while the paste layer thickness model is applicable only if the paste is considered to be made with the grain value Dmax = 2.85, the value from which the model shows stability.

Keywords: Concrete, fractal method, paste layer thickness, water layer thickness, workability.

PDF Downloads: 1635