Search results for: moving average curve
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6481

6211 Learning Curve Effect on Materials Procurement Schedule of Multiple Sister Ships

Authors: Vijaya Dixit Aasheesh Dixit

Abstract:

The shipbuilding industry operates in an Engineer Procure Construct (EPC) context. The product mix of a shipyard comprises various types of ships such as bulk carriers, tankers, barges, coast guard vessels, submarines, etc. Each order is unique based on the type of ship and customized requirements, which are engineered into the product right from the design stage. Thus, to execute every new project, a shipyard needs to upgrade its production expertise. As a result, over the long run, holistic learning occurs across different types of projects, which contributes to the knowledge base of the shipyard. Simultaneously, in the short term, during execution of a project comprising multiple sister ships, repetition of similar tasks leads to learning at the activity level. This research aims to capture the above learning of a shipyard and incorporate the learning curve effect in project scheduling and materials procurement to improve project performance. Extant literature supports the existence of such learning in an organization. In shipbuilding, there are sequences of similar activities that are expected to exhibit learning curve behavior, for example, the nearly identical structural sub-blocks which are successively fabricated, erected, and outfitted with piping and electrical systems. A learning curve representation can model not only a decrease in the mean completion time of an activity but also a decrease in the uncertainty of activity duration. Sister ships have similar material requirements, and the same supplier base supplies materials for all the sister ships within a project. On one hand, this provides an opportunity to reduce transportation cost by batching the order quantities of multiple ships; on the other hand, it increases the inventory holding cost at the shipyard and the risk of obsolescence. Further, due to the learning curve effect, the production schedule of each subsequent ship gets compressed. Thus, the material requirement schedule of every next ship differs from that of its previous ship. As more and more ships get constructed, compressed production schedules increase the possibility of batching the orders of sister ships. This work aims at integrating materials management with project scheduling of long-duration projects for the manufacture of multiple sister ships. It incorporates the learning curve effect on progressively compressed material requirement schedules and addresses the above trade-off between transportation cost and inventory holding and shortage costs while satisfying budget constraints at various stages of the project. The activity durations and lead times of items are not crisp and are available in the form of probability distributions. A Stochastic Mixed Integer Programming (SMIP) model is formulated and solved using an evolutionary algorithm. Its output provides ordering dates of items and the degree of order batching for all types of items. Sensitivity analysis determines the threshold number of sister ships required in a project to leverage the advantage of the learning curve effect in materials management decisions. This analysis will help materials managers gain insights about when and to what degree it is beneficial to treat a multiple-ship project as an integrated one by batching the order quantities, and when and to what degree to practice distinct procurement for individual ships.
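As context for the schedule-compression argument above, a minimal sketch of the classical Wright learning curve is given below; the 90% learning rate and 30-day first-unit duration are illustrative assumptions, not values from the paper.

```python
import math

def wright_duration(t_first, unit, learning_rate=0.9):
    """Wright's learning curve: duration of the n-th repetition of an activity.

    t_first       -- duration of the first unit (e.g., days to outfit sub-block 1)
    unit          -- repetition index n (1, 2, 3, ...)
    learning_rate -- doubling cumulative output multiplies the duration by this factor
    """
    b = math.log(learning_rate, 2)          # learning exponent (negative)
    return t_first * unit ** b

# Compressed activity duration across five sister ships (illustrative numbers only)
t1 = 30.0                                   # days for the activity on ship 1
for ship in range(1, 6):
    print(f"ship {ship}: {wright_duration(t1, ship):5.1f} days")
```

The progressively earlier completion dates implied by such a curve are what shift the material requirement schedule of each subsequent sister ship and open the opportunity for order batching.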

Keywords: learning curve, materials management, shipbuilding, sister ships

Procedia PDF Downloads 485
6210 Perforation Analysis of the Aluminum Alloy Sheets Subjected to High Rate of Loading and Heated Using Thermal Chamber: Experimental and Numerical Approach

Authors: A. Bendarma, T. Jankowiak, A. Rusinek, T. Lodygowski, M. Klósak, S. Bouslikhane

Abstract:

An analysis of the mechanical characteristics and dynamic behavior of aluminum alloy sheets in perforation tests, based on experiments coupled with numerical simulation, is presented. Impact problems (penetration and perforation) of metallic plates have been of interest for a long time, and experimental, analytical, as well as numerical studies have been carried out to analyze the perforation process in detail. Based on these approaches, the ballistic properties of the material have been studied. A laser sensor is used during the experiments to measure the initial and residual velocities and to obtain the ballistic curve and the ballistic limit. The energy balance is also reported, together with the energy absorbed by the aluminum. A high-speed camera helps to estimate the failure time and to calculate the impact force. A wide range of initial impact velocities, from 40 up to 180 m/s, has been covered during the tests. The mass of the conical-nose-shaped projectile is 28 g, its diameter is 12 mm, and the thickness of the aluminum sheet is 1.0 mm. The ABAQUS/Explicit finite element code has been used to simulate the perforation process. The ballistic curve obtained numerically is compared with, and verified against, the experimental one, and the failure patterns are presented using the optimal mesh densities, which provide stability of the results. Good agreement between the numerical and experimental results is observed.
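The ballistic curve described here (residual velocity versus initial impact velocity) is commonly summarised with the Lambert–Jonas relation; the short sketch below evaluates that relation with a placeholder ballistic limit and exponent, which are assumptions rather than the paper's fitted values.

```python
import numpy as np

def residual_velocity(v_impact, v_bl, a=1.0, p=2.0):
    """Lambert-Jonas ballistic curve: Vr = a * (Vi**p - Vbl**p)**(1/p) for Vi > Vbl."""
    v_impact = np.asarray(v_impact, dtype=float)
    vr = np.zeros_like(v_impact)            # below the ballistic limit: no perforation
    above = v_impact > v_bl
    vr[above] = a * (v_impact[above] ** p - v_bl ** p) ** (1.0 / p)
    return vr

# Placeholder ballistic limit of 60 m/s over the tested 40-180 m/s range
vi = np.linspace(40.0, 180.0, 8)
print(residual_velocity(vi, v_bl=60.0))
```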

Keywords: aluminum alloy, ballistic behavior, failure criterion, numerical simulation

Procedia PDF Downloads 295
6209 Some Statistical Properties of Residual Sea Level along the Coast of Vietnam

Authors: Doan Van Chinh, Bui Thi Kien Trinh

Abstract:

This paper outlines some statistical properties of the residual sea level (RSL) at six representative tidal stations located along the coast of Vietnam. It was found that the positive RSL varied on average between 9.82 and 19.96 cm and the negative RSL varied on average between -16.62 and -9.02 cm. The maximum positive RSL varied on average between 102.8 and 265.5 cm, while the maximum negative RSL varied on average between -250.4 and -66.4 cm. The largest positive RSL values appeared in the summer months, and the largest negative RSL values appeared in the winter months. The cumulative frequency of RSL less than 50 cm occurred between 95 and 99% of the time, while the frequency of RSL higher than 100 cm accounted for between 0.01 and 0.2%. It was also found that the cumulative frequency of RSL durations of less than 24 hours was between 90 and 99%, while the frequency of durations longer than 72 hours was on the order of 0.1 to 1%.

Keywords: coast of Vietnam, residual sea level, residual water, surge, cumulative frequency

Procedia PDF Downloads 268
6208 Computational Simulations on Stability of Model Predictive Control for Linear Discrete-Time Stochastic Systems

Authors: Tomoaki Hashimoto

Abstract:

Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial time and a moving terminal time. This paper examines the stability of model predictive control for linear discrete-time systems with additive stochastic disturbances. A sufficient condition for the stability of the closed-loop system with model predictive control is derived by means of a linear matrix inequality. The objective of this paper is to show the results of computational simulations in order to verify the validity of the obtained stability condition.
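The stability certificate referred to above is expressed as a linear matrix inequality. As a point of reference only (not the specific condition derived in the paper), the sketch below checks feasibility of a standard discrete-time Lyapunov LMI for a placeholder closed-loop matrix using cvxpy.

```python
import numpy as np
import cvxpy as cp

# Closed-loop dynamics x_{k+1} = A_cl x_k + w_k (placeholder matrix, not from the paper)
A_cl = np.array([[0.8, 0.2],
                 [0.0, 0.7]])
n = A_cl.shape[0]

# Feasibility of P > 0 with A'PA - P < 0 certifies stability of the nominal closed loop
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A_cl.T @ P @ A_cl - P << -eps * np.eye(n)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print("Lyapunov LMI feasible (stable):", problem.status == cp.OPTIMAL)
```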

Keywords: computational simulations, optimal control, predictive control, stochastic systems, discrete-time systems

Procedia PDF Downloads 413
6207 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface water and groundwater resources. The hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually based on geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system, reflecting the system response that combines the hydrogeological structure with groundwater injection and extraction. This study applies analytical tools to the observation database to develop a methodology for the identification of confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification of two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to the time series of groundwater-level observations and analyzes the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, i.e., the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree uses the information obtained from the above analytical tools to produce the best estimate of the hydrogeological structure. The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is a useful tool for identifying hydrogeological structures.
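A compact sketch of three of the feature-extraction steps named above (daily-frequency amplitude from the Fourier transform, rainfall–level cross-correlation lag, and a decision tree on the resulting features) is given below; the array shapes, feature values, and labels are placeholders, not the Taiwanese observation data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def daily_pumping_amplitude(levels, samples_per_day=24):
    """Amplitude of the 1 cycle/day component of an hourly groundwater-level series."""
    levels = np.asarray(levels, dtype=float) - np.mean(levels)
    spectrum = np.fft.rfft(levels)
    freqs = np.fft.rfftfreq(levels.size, d=1.0 / samples_per_day)  # cycles per day
    idx = np.argmin(np.abs(freqs - 1.0))
    return 2.0 * np.abs(spectrum[idx]) / levels.size

def recharge_lag(rainfall, levels):
    """Lag (in samples) of the peak cross-correlation between rainfall and level."""
    r = np.asarray(rainfall, float) - np.mean(rainfall)
    h = np.asarray(levels, float) - np.mean(levels)
    xcorr = np.correlate(h, r, mode="full")
    return np.argmax(xcorr) - (len(r) - 1)   # positive: level responds after rainfall

# Each well contributes a feature vector [daily amplitude, recharge lag, regression rate]
# and a label (0 = unconfined, 1 = confined) taken from drill logs for training.
X_train = np.array([[0.02, 30, 0.4], [0.15, 90, 0.1]])   # made-up illustrative rows
y_train = np.array([0, 1])
clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print(clf.predict([[0.12, 75, 0.15]]))
```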

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 141
6206 Solar-Powered Adsorption Cooling System: A Case Study on the Climatic Conditions of Al Minya

Authors: El-Sadek H. Nour El-deen, K. Harby

Abstract:

Energy saving and environmentally friendly applications are becoming among the most important topics nowadays. In this work, a simulation analysis using the TRNSYS software has been carried out to study the benefit of employing a solar adsorption cooling system under the climatic conditions of Al-Minya city, Egypt. A theoretical model of a two-bed adsorption cooling system employing granular activated carbon/HFC-404A as the working pair was developed. The temporal and averaged histories of the solar collector, adsorbent beds, evaporator, and condenser are shown. System performance in terms of daily average cooling capacity and average coefficient of performance over the year has been investigated. The results showed that the maximum yearly average coefficient of performance (COP) and cooling capacity are about 0.26 and 8 kW, respectively. The maxima of both the average cooling capacity and the cyclic COP are directly proportional to the maximum solar radiation, and the system performance was found to increase with the average ambient temperature. Finally, the proposed solar-powered adsorption cooling system can be used effectively under Al-Minya climatic conditions.

Keywords: adsorption, cooling, Egypt, environment, solar energy

Procedia PDF Downloads 146
6205 Tumor Detection of Cerebral MRI by Multifractal Analysis

Authors: S. Oudjemia, F. Alim, S. Seddiki

Abstract:

This paper shows the application of multifractal analysis as an additional aid in cancer diagnosis. Medical image processing is an important discipline in which many methods are being developed to solve real problems in medicine. In this work, we present results of the multifractal analysis of brain MRI images. The purpose of this analysis was to separate healthy from cancerous tissue of the brain. A nonlinear method based on the multifractal detrended moving average (MFDMA), which is a generalization of detrended fluctuation analysis (DFA), is used for the detection of abnormalities in these images. The proposed method successfully separated the two types of brain tissue. It is important to note that this nonlinear method was chosen because of the complexity and irregularity of tumor tissue, which linear and classical nonlinear methods find difficult to characterize completely. In order to show the performance of this method, we compared its results with those of the conventional box-counting method.
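For orientation, a minimal monofractal sketch of the moving-average detrending step behind MFDMA is given below; it estimates only a single (q = 2) scaling exponent and leaves out the q-order fluctuation spectrum that makes the full analysis multifractal, so it illustrates the detrending idea rather than reproducing the authors' implementation.

```python
import numpy as np

def dma_hurst(signal, window_sizes):
    """Detrending-moving-average (DMA) estimate of a single scaling exponent.

    The profile (cumulative sum) is detrended by its own moving average and the
    RMS fluctuation F(n) is fitted as F(n) ~ n**H on a log-log scale.
    """
    y = np.cumsum(np.asarray(signal, float) - np.mean(signal))   # profile
    fluctuations = []
    for n in window_sizes:
        kernel = np.ones(n) / n
        trend = np.convolve(y, kernel, mode="valid")             # trailing moving average
        residual = y[n - 1:] - trend                             # align profile with trend
        fluctuations.append(np.sqrt(np.mean(residual ** 2)))
    logn, logf = np.log(window_sizes), np.log(fluctuations)
    return np.polyfit(logn, logf, 1)[0]                          # slope = scaling exponent

# Illustrative use on a synthetic 1-D intensity profile standing in for MRI texture data
profile = np.random.default_rng(0).standard_normal(4096)
print(dma_hurst(profile, window_sizes=[8, 16, 32, 64, 128]))
```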

Keywords: irregularity, nonlinearity, MRI brain images, multifractal analysis, brain tumor

Procedia PDF Downloads 430
6204 Extraction of Dyes Using an Aqueous Two-Phase System in Stratified and Slug Flow Regimes of a Microchannel

Authors: Garima, S. Pushpavanam

Abstract:

In this work, an analysis of an aqueous two-phase (polymer-salt) system for the extraction of sunset yellow dye is carried out. A polymer-salt ATPS, i.e., polyethylene glycol-600 and anhydrous sodium sulfate, is used for the extraction. Conditions are chosen to ensure that the extraction results in a concentration of the dye in one of the phases; the dye has a propensity to move to the polyethylene glycol-600 phase. The extracted sunset yellow dye is degraded photocatalytically into less harmful components. The cloud point method was used to obtain the binodal curve of the ATPS. From the binodal curve, the composition of salt and polyethylene glycol-600 was chosen such that the volume of the polyethylene glycol-600-rich phase is low. This was done to concentrate the dye from a dilute solution in a large volume of contaminated solution into a small volume. This pre-concentration step provides a high reaction rate for the photocatalytic degradation reaction. Experimentally, the dye is extracted from the salt phase to the polyethylene glycol-600 phase in batch extraction; this was found to be very fast, and all the dye was extracted. The concentration of sunset yellow dye in the salt and polymer phases is measured at 482 nm by ultraviolet-visible spectrophotometry. The extraction experiment in microchannels under stratified flow is analyzed to determine the factors which affect dye extraction. The focus will be on obtaining slug flow by adding nanoparticles in the microchannel. The primary aim is to exploit the fact that slug flow will help improve the mass transfer rate from one phase to another through internal circulation in the dispersed phase induced by shear.

Keywords: aqueous two phase system, binodal curve, extraction, sunset yellow dye

Procedia PDF Downloads 344
6203 The Research of 'Rope Coiling' Effect in Near-Field Electrospinning

Authors: Feiyu Fang, Han Wang, Xin Chen, Jun Zeng, Feng Liang, Peixuan Wu

Abstract:

The 'rope coiling' effect is a common instability phenomenon that exists widely in viscous fluids, elastic rods, and polymeric fibers subjected to compressive stress when they fall onto a moving belt. Near-field electrospinning is a modified electrospinning technique with the ability to direct-write microfibers. In this research, we study the 'rope coiling' effect in near-field electrospinning. By changing the distance between the nozzle and the collector, or the speed ratio between the charged jet speed and the platform moving speed, we obtain a variety of coiling patterns, including meandering, alternating, and coiling modes. Therefore, this instability can be used to direct-write microstructured fibers in a one-step process.

Keywords: rope coiling effects, near-field electrospinning, direct write, micro structure

Procedia PDF Downloads 329
6202 Research on Transmission Parameters Determination Method Based on Dynamic Characteristic Analysis

Authors: Baoshan Huang, Fanbiao Bao, Bing Li, Lianghua Zeng, Yi Zheng

Abstract:

A parameter control strategy based on statistical characteristics can inform the choice of the transmission ratios of an automobile transmission. According to the differences between transmission gears, the number and spacing of the gears can be determined, and the distribution of transmission ratios needs to satisfy a certain distribution law. According to the statistical characteristics of the driving parameters, the shift control strategy of the vehicle is analyzed. It can be seen from the above analysis that a CVT shift-schedule adjustment algorithm based on statistical characteristic parameters can, by adjusting the shift points according to a certain algorithm, place the target operating point between the best-efficiency curve and the best-dynamics curve and thereby alter the vehicle characteristics. Based on the dynamic characteristics and the practical application of the vehicle, this paper presents a scheme for setting the transmission ratios.

Keywords: vehicle dynamics, transmission ratio, transmission parameters, statistical characteristics

Procedia PDF Downloads 375
6201 A Design of Elliptic Curve Cryptography Processor based on SM2 over GF(p)

Authors: Shiji Hu, Lei Li, Wanting Zhou, DaoHong Yang

Abstract:

Data encryption is the foundation of today's communication. On this basis, how to improve the speed of data encryption and decryption is a problem that scholars continually work on. In this paper, we propose an elliptic curve crypto processor architecture based on the SM2 prime field. In terms of hardware implementation, we optimized the algorithms in different stages of the structure. For finite field modular operations, we propose an optimized improvement of the Karatsuba-Ofman multiplication algorithm and shorten the critical path through a pipelined structure in the algorithm implementation. Based on the SM2 recommended prime field, a fast modular reduction algorithm is used to reduce the 512-bit-wide data obtained from the multiplication unit. The radix-4 extended Euclidean algorithm is used to realize the conversion between the affine coordinate system and the Jacobian projective coordinate system. In the parallel scheduling of point operations on elliptic curves, we propose a three-level parallel structure for point addition and point doubling based on the Jacobian projective coordinate system. Combined with the scalar multiplication algorithm, we added mutual pre-operations to the point addition and point doubling operations to improve the efficiency of scalar point multiplication. The proposed ECC hardware architecture was verified and implemented on Xilinx Virtex-7 and ZYNQ-7 platforms, and each 256-bit scalar multiplication operation took 0.275 ms. The performance for handling scalar multiplication is 32 times that of a CPU (dual-core ARM Cortex-A9).
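To make the scalar multiplication step concrete, a textbook affine-coordinate double-and-add reference model over a tiny prime field is sketched below. The processor described above works over the 256-bit SM2 prime in Jacobian projective coordinates with pipelined Karatsuba-Ofman multipliers, so this is only a functional illustration of the arithmetic, not the hardware design; the small curve parameters are chosen for readability.

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97) with base point (3, 6), for illustration only.
p, a, b = 97, 2, 3
G = (3, 6)
INF = None  # point at infinity

def point_add(P, Q):
    if P is INF:
        return Q
    if Q is INF:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return INF                                           # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p     # tangent slope (doubling)
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p            # chord slope (addition)
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def scalar_mult(k, P):
    """Left-to-right double-and-add: computes k*P."""
    R = INF
    for bit in bin(k)[2:]:
        R = point_add(R, R)                                  # double
        if bit == "1":
            R = point_add(R, P)                              # add
    return R

print(scalar_mult(20, G))
```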

Keywords: elliptic curve cryptosystems, SM2, modular multiplication, point multiplication

Procedia PDF Downloads 76
6200 Traffic Sign Recognition System Using Convolutional Neural Network

Authors: Devineni Vijay Bhaskar, Yendluri Raja

Abstract:

We propose a model for traffic sign detection based on Convolutional Neural Networks (CNN). We first convert the original image into a grayscale image with support vector machines, and then use convolutional neural networks with fixed and learnable layers for detection and classification. The fixed layer can reduce the number of attention areas to examine and crop the limits very close to the boundaries of traffic signs. The learnable layers can raise the accuracy of detection significantly. In addition, we use bootstrap procedures to improve the accuracy and avoid the overfitting problem. In the German Traffic Sign Detection Benchmark, we obtained competitive results, with an area under the precision-recall curve (AUC) of 99.49% in the group 'Risk' and an AUC of 96.62% in the group 'Obligatory'.
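As a point of reference for the CNN component, a minimal PyTorch classifier for 32x32 traffic-sign patches is sketched below; the layer sizes, 43-class output, and dummy input are assumptions for illustration and do not reproduce the paper's fixed/learnable layer split, SVM preprocessing, or bootstrapping.

```python
import torch
import torch.nn as nn

# Minimal CNN sketch for 32x32 RGB traffic-sign patches (GTSRB-style data).
class TrafficSignNet(nn.Module):
    def __init__(self, num_classes=43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)   # 32 -> 16 -> 8 after pooling

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

model = TrafficSignNet()
logits = model(torch.randn(4, 3, 32, 32))    # batch of 4 dummy patches
print(logits.shape)                          # torch.Size([4, 43])
```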

Keywords: convolutional neural network, support vector machine, detection, traffic signs, bootstrap procedures, precision-recall curve

Procedia PDF Downloads 97
6199 Evaluation of Quick Covering Machine for Grain Drying Pavement

Authors: Fatima S. Rodriguez, Victorino T. Taylan, Manolito C. Bulaong, Helen F. Gavino, Vitaliana U. Malamug

Abstract:

In sun drying, grain quality is greatly reduced when paddy grains are caught by the rain unsacked and unstored, resulting in reduced profit. The objectives of this study were to design and fabricate a quick covering machine for a grain drying pavement; to test and evaluate the operating characteristics of the machine in terms of its deployment speed, recovery speed, deployment time, recovery time, power consumption, and the aesthetics of the laminated sack; and to conduct partial budget and cost curve analyses. The machine was able to cover the grains in a 12.8 m x 22.5 m grain drying pavement in an average time of 17.13 s. It consumed 0.53 W-hr for the deployment and recovery of the cover. The machine entailed an investment cost of $1,344.40 and an annual cost charge of $647.32. Moreover, the savings per year using the quick covering machine were $101.83.

Keywords: quick covering machine, grain drying pavement, laminated polypropylene, recovery time

Procedia PDF Downloads 301
6198 Synthesis and Characterization of Non-Aqueous Electrodeposited ZnSe Thin Film

Authors: S. R. Kumar, Shashikant Rajpal

Abstract:

A nanocrystalline thin film of ZnSe was successfully electrodeposited on a copper substrate using a non-aqueous solution and subsequently annealed in air at 400°C. XRD analysis indicates a polycrystalline deposit oriented along the (111) plane in both cases. The sharpness of the peak increases on annealing of the film, and the average grain size increases from 20 nm to 27 nm. SEM photographs indicate that the grains are uniform and densely distributed over the surface; due to annealing, the average grain size increased by 20%. EDS spectroscopy shows that the Zn:Se ratio is 1.1 in the case of the annealed film. AFM analysis indicates that the average roughness of the film is reduced from 181 nm to 165 nm on annealing. The bandgap also decreases from 2.71 eV to 2.62 eV.
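Grain sizes of this kind are typically obtained from XRD peak broadening via the Scherrer equation; the sketch below evaluates that relation with placeholder peak values, not the paper's measured data.

```python
import math

def scherrer_grain_size(two_theta_deg, fwhm_deg, wavelength_nm=0.15406, k=0.9):
    """Scherrer estimate of crystallite size D = K * lambda / (beta * cos(theta)).

    two_theta_deg -- peak position 2-theta in degrees
    fwhm_deg      -- peak broadening (FWHM) in degrees
    wavelength_nm -- X-ray wavelength (Cu K-alpha by default)
    """
    theta = math.radians(two_theta_deg / 2.0)
    beta = math.radians(fwhm_deg)
    return k * wavelength_nm / (beta * math.cos(theta))

# Illustrative numbers only (not the paper's data): a (111) peak near 27.2 deg 2-theta
print(f"D = {scherrer_grain_size(27.2, 0.35):.1f} nm")
```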

Keywords: electrodeposition, non-aqueous medium, SEM, XRD

Procedia PDF Downloads 471
6197 Development of an Integrated Route Information Management Software

Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel

Abstract:

The need for complete automation of every surveying procedure, and especially of its engineering applications, cannot be overemphasized, given the many demerits of the conventional manual or analogue approach. This paper presents a summary of the development of a Route Information Management (RIM) software package. The software, codenamed 'AutoROUTE', was developed using the Microsoft Visual Studio Visual Basic package, and it offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road stretching from Dama to Lunko village in Minna, Niger State, acquired with a Hi-Target DGPS receiver. The developed software (AutoROUTE) is capable of computing the various simple curve parameters, horizontal curves, and vertical curves, and it can also plot the road alignment, longitudinal profile, and cross-sections, with the capability to store these in the SQL database incorporated into the Microsoft Visual Basic software. The plans plotted with AutoROUTE were compared with the plans produced with the conventional AutoCAD Civil 3D software, and AutoROUTE proved to be more user-friendly and accurate because it plots to three decimal places whereas AutoCAD plots to two decimal places. It was also found that the AutoROUTE software is faster in plotting and that the stages involved are less cumbersome compared to AutoCAD Civil 3D.
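The simple curve parameters such a package computes are standard circular-curve elements; a small sketch of those formulas follows, with an illustrative radius and deflection angle that are not taken from the Dama-Lunko survey.

```python
import math

def simple_curve_elements(radius, deflection_deg):
    """Standard elements of a simple circular (horizontal) curve.

    radius         -- curve radius R (m)
    deflection_deg -- deflection (intersection) angle Delta in degrees
    """
    delta = math.radians(deflection_deg)
    return {
        "tangent_length": radius * math.tan(delta / 2),         # T = R tan(Delta/2)
        "curve_length":   radius * delta,                       # L = R * Delta (radians)
        "long_chord":     2 * radius * math.sin(delta / 2),     # C = 2R sin(Delta/2)
        "external_dist":  radius * (1 / math.cos(delta / 2) - 1),
        "mid_ordinate":   radius * (1 - math.cos(delta / 2)),
    }

# Illustrative: a 300 m radius curve with a 40 degree deflection angle
for name, value in simple_curve_elements(300.0, 40.0).items():
    print(f"{name:>15s}: {value:8.3f} m")
```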

Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying

Procedia PDF Downloads 119
6196 Cycle Number Estimation Method on Fatigue Crack Initiation Using Voronoi Tessellation and the Tanaka Mura Model

Authors: Mohammad Ridzwan Bin Abd Rahim, Siegfried Schmauder, Yupiter HP Manurung, Peter Binkele, Meor Iqram B. Meor Ahmad, Kiarash Dogahe

Abstract:

This paper deals with the short crack initiation of the material P91 under cyclic loading at two different temperatures, concluded with the estimation of the short crack initiation Wöhler (S/N) curve. An artificial but representative model microstructure was generated using Voronoi tessellation and the Finite Element Method, and the non-uniform stress distribution was calculated accordingly afterward. The number of cycles needed for crack initiation is estimated on the basis of the stress distribution in the model by applying the physically-based Tanaka-Mura model. Initial results show that the number of cycles to generate crack initiation is strongly correlated with temperature.
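The Tanaka-Mura estimate applied to each slip band is commonly written as N_i = 8 G W_s / (pi (1 - nu) d (delta_tau - 2k)^2); the sketch below evaluates this form with placeholder material parameters rather than the P91 values and microstructure used in the paper.

```python
import math

def tanaka_mura_cycles(delta_tau, grain_size, shear_modulus, poisson, w_s, k_friction):
    """Tanaka-Mura estimate of cycles to micro-crack initiation on one slip band:

        N_i = 8 * G * W_s / (pi * (1 - nu) * d * (delta_tau - 2*k)^2)

    delta_tau, k_friction in MPa; grain_size d in m; shear_modulus G in MPa;
    w_s (specific fracture energy) in MPa*m.
    """
    drive = delta_tau - 2.0 * k_friction
    if drive <= 0.0:
        return math.inf                      # below the friction stress: no initiation
    return (8.0 * shear_modulus * w_s) / (math.pi * (1.0 - poisson) * grain_size * drive ** 2)

# Placeholder inputs only, not the P91 parameters of the paper
print(tanaka_mura_cycles(delta_tau=280.0, grain_size=40e-6, shear_modulus=80e3,
                         poisson=0.3, w_s=2.0e-3, k_friction=115.0))
```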

Keywords: short crack initiation, P91, Wöhler curve, Voronoi tessellation, Tanaka-Mura model

Procedia PDF Downloads 86
6195 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs

Authors: Josef Slapal

Abstract:

Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z2 to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, such a structure is required to allow for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove a digital Jordan curve theorem for them, thus showing that the graphs provide convenient structures on Z2 for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition, thinning, etc.

Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency

Procedia PDF Downloads 359
6194 Design and Analysis for a 4-Stage Crash Energy Management System for Railway Vehicles

Authors: Ziwen Fang, Jianran Wang, Hongtao Liu, Weiguo Kong, Kefei Wang, Qi Luo, Haifeng Hong

Abstract:

A 4-stage crash energy management (CEM) system for subway rail vehicles used by the Massachusetts Bay Transportation Authority (MBTA) in the USA is developed in this paper. The four stages of this new CEM system are 1) an energy-absorbing coupler (draft gear and shear bolts), 2) primary energy absorbers (aluminum honeycomb structured box), 3) secondary energy absorbers (crush tube), and 4) the collision post and corner post. A sliding anti-climber and a fixed anti-climber are designed at the front of the vehicle, cooperating with the 4-stage CEM to maximize the energy absorbed and minimize harm to passengers and crew. In order to investigate the effectiveness of this CEM system, both finite element (FE) methods and crashworthiness tests have been employed. The whole vehicle consists of three married pairs, i.e., six cars. In the FE approach, full-scale railway car models are developed, and different collision cases are investigated, such as a single moving car impacting a rigid wall, two moving cars into a rigid wall, two moving cars into two stationary cars, six moving cars into six stationary cars, and so on. The FE analysis results show that the railway vehicle incorporating this CEM system has superior crashworthiness performance. In the crashworthiness test, a simplified vehicle front end including the sliding anti-climber, the fixed anti-climber, the primary energy absorbers, the secondary energy absorber, the collision post, and the corner post is built and impacted against a rigid wall. The same test model is also analyzed with FE, and results such as the crushing force, the stress and strain of critical components, and the acceleration and velocity curves are compared and studied. The FE results show very good agreement with the test results.

Keywords: railway vehicle collision, crash energy management design, finite element method, crashworthiness test

Procedia PDF Downloads 377
6193 Microbial Quality of Raw Camel Milk Produced in South of Morocco

Authors: Maha Alaoui Ismaili, Bouchta Saidi, Mohamed Zahar, Abed Hamama

Abstract:

Thirty-one samples of raw camel milk obtained from the region of Laâyoune (South of Morocco) were examined for their microbial quality and the presence of some pathogenic bacteria (Staphylococcus aureus and Salmonella sp.). The pH of the samples ranged from 6.31 to 6.64, and their titratable acidity had a mean value of 18.56 °Dornic. The data obtained showed strong microbial contamination, with an average total aerobic flora of 1.76 × 10^8 cfu ml^-1 and very high fecal counts: on average 1.82 × 10^7, 3.25 × 10^6, and 3.75 × 10^6 cfu ml^-1 for total coliforms, fecal coliforms, and enterococci, respectively. Yeasts and moulds were also found, at average levels of 3.13 × 10^6 and 1.60 × 10^5 cfu ml^-1, respectively. Salmonella sp. and S. aureus were detected in 13% and 30% of the milk samples, respectively. These results clearly indicate the lack of hygienic conditions of camel milk production and storage in this region. Lactic acid bacteria were found at the following average numbers: 4.25 × 10^7, 4.45 × 10^7, and 3.55 × 10^7 cfu ml^-1 for lactococci, leuconostocs, and lactobacilli, respectively.

Keywords: camel milk, microbial quality, Salmonella, Staphylococcus aureus

Procedia PDF Downloads 451
6192 Application of Lean Six Sigma Tools to Minimize Time and Cost in Furniture Packaging

Authors: Suleiman Obeidat, Nabeel Mandahawi

Abstract:

In this work, the packaging process for a household move is improved. The customers of such a move need their household goods to be moved from their current house to the new one with minimum damage, in an organized manner, on time, and at minimum cost. Our goal was to improve time efficiency by 10% to 20%, achieve a 90% reduction in damaged parts, and obtain an acceptable improvement in the cost of the total move process; the expected ROI was 833%. Many improvement techniques have been applied to the way the boxes are prepared, their preparation cost, packing the goods, labeling them, and moving them to a staging area for the move-out. The DMAIC technique is used in this work: a SIPOC diagram, a value stream map of the 'As Is' process, root cause analysis, maps of the 'Future State' and 'Ideal State', and an improvement plan. A value of ROI = 624% is obtained, which is lower than the expected value of 833%. The work explains the improvement techniques and the deficiencies in the old process.

Keywords: packaging, lean tools, six sigma, DMAIC methodology, SIPOC

Procedia PDF Downloads 412
6191 Development of IDF Curves for Precipitation in Western Watershed of Guwahati, Assam

Authors: Rajarshi Sharma, Rashidul Alam, Visavino Seleyi, Yuvila Sangtam

Abstract:

The Intensity-Duration-Frequency (IDF) relationship of rainfall amounts is one of the most commonly used tools in water resources engineering for the planning, design, and operation of water resources projects, and for protecting various engineering projects against design floods. The establishment of such relationships was reported as early as 1932 (Bernard); since then, many sets of relationships have been constructed for several parts of the globe. The objective of this research is to derive the IDF relationship of rainfall for the western watershed of Guwahati, Assam. These relationships are useful in the design of urban drainage works, e.g., storm sewers, culverts, and other hydraulic structures. In this study, rainfall depths for 10 years, viz. 2001 to 2010, were collected from the Regional Meteorological Centre, Borjhar, Guwahati. Firstly, the data were used to construct mass curves for rainfall durations of more than 7 hours, to calculate the maximum intensities and to form the intensity-duration curves. Gumbel's frequency analysis technique was then used to calculate the probable maximum rainfall intensities for return periods of 2, 5, 10, 50, and 100 years from the maximum intensities. Finally, regression analysis was used to develop the intensity-duration-frequency (IDF) curve, from which the values of the constants 'a', 'b', and 'c' were found. The value of 'a' for which the sum of the squared deviations is minimum was found to be 40, and the corresponding values of 'c' and 'b' for this minimum are 0.744 and 1981.527, respectively. The results showed that in all cases the correlation coefficient is very high, indicating the goodness of fit of the formulae for estimating IDF curves in the region of interest.
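The Gumbel frequency-analysis step named above uses the standard frequency-factor form x_T = mean + K_T * std with K_T = -(sqrt(6)/pi) * [0.5772 + ln(ln(T/(T-1)))]; a small sketch follows, in which the annual-maximum intensities are placeholders standing in for the 2001-2010 Borjhar record.

```python
import math

def gumbel_quantile(mean, std, return_period_years):
    """Gumbel (EV-I) frequency-factor method: x_T = mean + K_T * std."""
    T = float(return_period_years)
    k_t = -(math.sqrt(6.0) / math.pi) * (0.5772 + math.log(math.log(T / (T - 1.0))))
    return mean + k_t * std

# Annual-maximum rainfall intensities (mm/h) for one duration; placeholder values only
annual_max = [52.0, 61.5, 48.2, 70.3, 55.9, 66.1, 59.4, 73.8, 50.7, 64.2]
mean = sum(annual_max) / len(annual_max)
std = (sum((x - mean) ** 2 for x in annual_max) / (len(annual_max) - 1)) ** 0.5

for T in (2, 5, 10, 50, 100):
    print(f"T = {T:3d} yr: design intensity = {gumbel_quantile(mean, std, T):6.1f} mm/h")
```

The intensities obtained this way for each duration are what the subsequent regression step fits to give the constants 'a', 'b', and 'c' of the IDF curve.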

Keywords: intensity-duration-frequency relationship, mass curve, regression analysis, correlation coefficient

Procedia PDF Downloads 223
6190 Revealing Single Crystal Quality by Insight Diffraction Imaging Technique

Authors: Thu Nhi Tran Caliste

Abstract:

X-ray Bragg diffraction imaging ('topography') entered practical use when Lang designed an 'easy' technical setup to characterise the defects and distortions in the high-perfection crystals produced for the microelectronics industry. The use of this technique extended to all kinds of high-quality crystals and deposited layers, and a series of publications explained, starting from the dynamical theory of diffraction, the contrast of the images of the defects. A quantitative version of 'monochromatic topography', known as 'Rocking Curve Imaging' (RCI), was implemented by using synchrotron light and taking advantage of the dramatic improvement of 2D detectors and computerised image processing. The raw data consist of a number (~300) of images recorded along the diffraction ('rocking') curve. If the quality of the crystal is such that a one-to-one relation between a pixel of the detector and a voxel within the crystal can be established (this approximation is very well fulfilled if the local mosaic spread of the voxel is < 1 mradian), software we developed provides, from the rocking curve recorded on each pixel of the detector, not only the 'voxel' integrated intensity (the only data provided by previous techniques) but also its 'mosaic spread' (FWHM) and peak position. We will show, based on many examples, that these new data, never recorded before, open the field to a highly enhanced characterization of crystals and deposited layers. These examples include the characterization of dislocations and twins occurring during silicon growth, various growth features in Al2O3, GaN and CdTe (where the diffraction displays the Borrmann anomalous absorption, which leads to a new type of image), and the characterisation of defects within deposited layers, or their effect on the substrate. We could also observe (due to the very high sensitivity of the setup installed on BM05, which allows revealing these faint effects) that, when dealing with very perfect crystals, the Kato interference fringes predicted by dynamical theory are also associated with very small modifications of the local FWHM and peak position (of the order of the µradian). This rather unexpected (at least for us) result appears to be in keeping with preliminary dynamical theory calculations.
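To illustrate the per-pixel analysis described (one rocking curve per detector pixel, reduced to integrated intensity, FWHM, and peak position), a simplified sketch using a Gaussian fit is given below; the synthetic stack, angle range, and fit model are assumptions and do not reproduce the actual BM05 processing software.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(omega, amplitude, center, sigma, offset):
    return amplitude * np.exp(-0.5 * ((omega - center) / sigma) ** 2) + offset

def rocking_curve_maps(stack, angles):
    """Per-pixel analysis of a rocking-curve imaging stack.

    stack  -- array of shape (n_angles, ny, nx): one detector image per rocking angle
    angles -- rocking angles (e.g. in microradians), length n_angles
    Returns maps of integrated intensity, FWHM and peak position for every pixel.
    """
    n, ny, nx = stack.shape
    integrated = stack.sum(axis=0)
    fwhm = np.full((ny, nx), np.nan)
    peak = np.full((ny, nx), np.nan)
    for j in range(ny):
        for i in range(nx):
            curve = stack[:, j, i]
            p0 = [curve.max() - curve.min(), angles[np.argmax(curve)],
                  (angles[-1] - angles[0]) / 10.0, curve.min()]
            try:
                popt, _ = curve_fit(gaussian, angles, curve, p0=p0, maxfev=2000)
                fwhm[j, i] = 2.3548 * abs(popt[2])     # FWHM of a Gaussian
                peak[j, i] = popt[1]
            except RuntimeError:
                pass                                   # leave un-fittable pixels as NaN
    return integrated, fwhm, peak

# Tiny synthetic stack: 300 angles, 4x4 pixels (real data are full detector frames)
angles = np.linspace(-150.0, 150.0, 300)
rng = np.random.default_rng(1)
stack = gaussian(angles[:, None, None], 100.0, 10.0, 20.0, 5.0) + rng.normal(0, 1, (300, 4, 4))
print([m.shape for m in rocking_curve_maps(stack, angles)])
```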

Keywords: rocking curve imaging, X-ray diffraction, defect, distortion

Procedia PDF Downloads 116
6189 An Accelerated Stochastic Gradient Method with Momentum

Authors: Liang Liu, Xiaopeng Luo

Abstract:

In this paper, we propose an accelerated stochastic gradient method with momentum. The momentum term is a weighted average of the generated gradients, with weights that decay inversely proportionally with the iteration count. Stochastic gradient descent with momentum (SGDM) uses weights that decay exponentially with the iteration count to generate the momentum term. Using exponential decay weights, variants of SGDM with inexplicable and complicated forms have been proposed to achieve better performance; however, the momentum update rules of our method are as simple as those of SGDM. We provide theoretical convergence analyses, which show that both the exponential decay weights and our inverse proportional decay weights can limit the variance of the parameter movement directly to a region. Experimental results show that our method works well on many practical problems and outperforms SGDM.
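A minimal sketch contrasting the two weightings is given below; the exact weighting scheme of the paper's method is an assumption here (a 1/t running average is used as the simplest instance of inverse-proportionally decaying weights), so this only illustrates the general idea, not the published algorithm.

```python
import numpy as np

def sgd_inverse_decay_momentum(grad_fn, w0, lr=0.2, n_iters=500):
    """SGD whose momentum is a running average of past gradients, i.e. the weight on
    each stored gradient decays like 1/t with the iteration count (an assumed form).
    Classical SGDM would instead use m = beta*m + (1-beta)*g with a fixed beta.
    """
    w = np.array(w0, dtype=float)
    momentum = np.zeros_like(w)
    for t in range(1, n_iters + 1):
        g = grad_fn(w)
        momentum = (1.0 - 1.0 / t) * momentum + (1.0 / t) * g    # 1/t weights
        w -= lr * momentum
    return w

# Illustrative noisy quadratic: minimiser at w = 3; the iterate drifts toward it slowly
rng = np.random.default_rng(0)
grad = lambda w: 2.0 * (w - 3.0) + rng.normal(0.0, 0.5, size=w.shape)
print(sgd_inverse_decay_momentum(grad, w0=[10.0]))
```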

Keywords: exponential decay rate weight, gradient descent, inverse proportional decay rate weight, momentum

Procedia PDF Downloads 144
6188 Integrating the Modbus SCADA Communication Protocol with Elliptic Curve Cryptography

Authors: Despoina Chochtoula, Aristidis Ilias, Yannis Stamatiou

Abstract:

Modbus is a protocol that enables communication among devices connected to the same network. This protocol is often deployed to connect sensor and monitoring units to central supervisory servers in Supervisory Control and Data Acquisition, or SCADA, systems. These systems monitor critical infrastructures, such as factories, power generation stations, nuclear power reactors, etc., in order to detect malfunctions and trigger alerts and corrective actions. However, due to their criticality, SCADA systems are vulnerable to attacks that range from simple eavesdropping on operation parameters, exchanged messages, and valuable infrastructure information to malicious modification of vital infrastructure data towards infliction of damage. Thus, the SCADA research community has been active in strengthening SCADA systems with suitable data protection mechanisms based, to a large extent, on cryptographic methods for data encryption, device authentication, and message integrity protection. However, due to the limited computation power of many SCADA sensor and embedded devices, the usual public key cryptographic methods are not appropriate because of their high computational requirements. As an alternative, Elliptic Curve Cryptography has been proposed, which requires smaller key sizes and, thus, less demanding cryptographic operations. Until now, however, no such implementation has been proposed in the SCADA literature, to the best of our knowledge. In order to fill this gap, our methodology focused on integrating Modbus, a frequently used SCADA communication protocol, with Elliptic Curve based cryptography, and on developing a server/client application to demonstrate a proof of concept. For the implementation we deployed two C language libraries, which were suitably modified in order to be successfully integrated: libmodbus (https://github.com/stephane/libmodbus) and ecc-lib (https://www.ceid.upatras.gr/webpages/faculty/zaro/software/ecc-lib/). The first library provides a C implementation of the Modbus/TCP protocol, while the second one offers the functionality to develop cryptographic protocols based on Elliptic Curve Cryptography. These two libraries were combined, after suitable modifications and enhancements, to give a modified version of the Modbus/TCP protocol focusing on the security of the data exchanged among the devices and the supervisory servers. The mechanisms we implemented include key generation, key exchange/sharing, message authentication, data integrity checking, and encryption/decryption of data. The key generation and key exchange protocols were implemented with the use of Elliptic Curve Cryptography primitives. The keys established by each device are saved in its local memory, are retained during the whole communication session, and are used in encrypting and decrypting exchanged messages as well as certifying entities and the integrity of the messages. Finally, the modified library was compiled for the Android environment in order to run the server application as an Android app. The client program runs on a regular computer. The communication between these two entities is an example of the successful establishment of an Elliptic Curve Cryptography based, secure Modbus wireless communication session between a portable device acting as a supervisor station and a monitoring computer.
Our first performance measurements are, also, very promising and demonstrate the feasibility of embedding Elliptic Curve Cryptography into SCADA systems, filling in a gap in the relevant scientific literature.
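As a language-agnostic illustration of the key-agreement and payload-encryption idea described above (not the authors' libmodbus/ecc-lib C integration), a minimal Python sketch using the `cryptography` package follows; the curve choice, HKDF parameters, and example Modbus payload are assumptions.

```python
import base64
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.fernet import Fernet

# Each peer (supervisory server / field device) generates an EC key pair
server_priv = ec.generate_private_key(ec.SECP256R1())
device_priv = ec.generate_private_key(ec.SECP256R1())

def session_key(own_private, peer_public):
    """ECDH shared secret stretched into a symmetric session key."""
    shared = own_private.exchange(ec.ECDH(), peer_public)
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"modbus-session").derive(shared)
    return base64.urlsafe_b64encode(key)          # Fernet expects a base64 key

k_server = session_key(server_priv, device_priv.public_key())
k_device = session_key(device_priv, server_priv.public_key())
assert k_server == k_device                       # both sides derive the same key

# Encrypt a Modbus application payload (example PDU bytes, purely illustrative)
payload = b"\x01\x03\x02\x00\x2a"
token = Fernet(k_device).encrypt(payload)
print(Fernet(k_server).decrypt(token) == payload)
```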

Keywords: elliptic curve cryptography, ICT security, modbus protocol, SCADA, TCP/IP protocol

Procedia PDF Downloads 243
6187 Analysing the Behaviour of Local Hurst Exponent and Lyapunov Exponent for Prediction of Market Crashes

Authors: Shreemoyee Sarkar, Vikhyat Chadha

Abstract:

In this paper, the local fractal properties and chaotic properties of financial time series are investigated by calculating two exponents, the Local Hurst Exponent (LHE) and the Lyapunov Exponent, in a moving time window of a financial series. For the purpose of this paper, the Dow Jones Industrial Average (DJIA) and the S&P 500, two of the major indices of the United States, have been considered. The behaviour of the above-mentioned exponents prior to some major crashes (the 1998 and 2008 crashes in the S&P 500 and the 2002 and 2008 crashes in the DJIA) is discussed. Also, the optimal length of the window for obtaining the best possible results is decided. Based on the outcomes of the above, an attempt is made to predict the crashes, and the accuracy of such an algorithm is assessed.
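A minimal sketch of the moving-window Hurst estimation referred to above is given below, using the classical rescaled-range (R/S) estimator on synthetic returns; the window length, lag set, and data are illustrative assumptions, and the Lyapunov-exponent computation is not shown.

```python
import numpy as np

def rs_hurst(series):
    """Rescaled-range (R/S) estimate of the Hurst exponent for one window of returns."""
    x = np.asarray(series, dtype=float)
    lags = [8, 16, 32, 64]
    rs = []
    for n in lags:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        vals = []
        for c in chunks:
            z = np.cumsum(c - c.mean())
            r = z.max() - z.min()                 # range of the cumulative deviations
            s = c.std(ddof=1)                     # standard deviation of the chunk
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(lags), np.log(rs), 1)
    return slope

def rolling_hurst(returns, window=256):
    """Local Hurst exponent in a moving window, as used to watch for pre-crash behaviour."""
    return [rs_hurst(returns[t - window:t]) for t in range(window, len(returns))]

# Synthetic returns standing in for S&P 500 / DJIA daily returns
rng = np.random.default_rng(2)
returns = rng.normal(0.0, 0.01, 2000)
lhe = rolling_hurst(returns, window=256)
print(len(lhe), round(float(np.mean(lhe)), 3))    # close to 0.5 for uncorrelated noise
```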

Keywords: local hurst exponent, lyapunov exponent, market crash prediction, time series chaos, time series local fractal properties

Procedia PDF Downloads 132
6186 Optimization of Bills Assignment to Different Skill-Levels of Data Entry Operators in a Business Process Outsourcing Industry

Authors: M. S. Maglasang, S. O. Palacio, L. P. Ogdoc

Abstract:

Business Process Outsourcing (BPO) has been one of the fastest growing and emerging industries in the Philippines today. Unlike most contact service centers, more popularly known as 'call centers', the BPO firm studied here primarily performs audits of its global clients' logistics as its outsourced service. As a service industry, manpower is considered the most important yet the most expensive resource in the company. Because of this, there is a need to maximize the human resources so that people are effectively and efficiently utilized. The main purpose of the study is to optimize the current manpower resources through effective distribution and assignment of different types of bills to the different skill levels of data entry operators. The assignment model parameters include the average observed time matrix gathered through a time study, which incorporates the learning curve concept. Subsequently, a simulation model was built to duplicate the arrival rate of demand, which includes the different batches and types of bills per day. Next, a mathematical linear programming model was formulated; its objective is to minimize the direct labor cost per bill by allocating the different types of bills to the different skill levels of operators. Finally, a hypothesis test was done to validate the model by comparing the actual and simulated results. The analysis of results revealed that there is low utilization of effective capacity because of the failure to use the product mix, skill mix, and simulated demand as model parameters. Moreover, failure to consider the effects of the learning curve leads to overestimation of labor needs. From the current 107 operators, the proposed model gives a result of 79 operators. This results in an increase in the utilization of effective capacity of 14.94%. It is recommended that the excess 28 operators be reallocated to other areas of the department. Finally, a manpower capacity planning model is also recommended in support of management's decisions on what to do when the current capacity reaches its limit with the expected increasing demand.
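To make the assignment formulation concrete, a small linear-programming sketch using scipy.optimize.linprog is shown below; the wage rates, processing times, demand, and capacities are invented illustrative numbers, not the company's data, and the learning-curve adjustment of the time matrix is omitted.

```python
import numpy as np
from scipy.optimize import linprog

# Cost per bill of assigning bill type j to skill level i = wage of level i x processing time
wage = np.array([3.0, 4.5, 6.0])                       # USD/hour per skill level
proc_time = np.array([[0.30, 0.45, 0.60],              # hours per bill; rows = skill level,
                      [0.25, 0.35, 0.50],              # columns = bill type
                      [0.20, 0.30, 0.40]])
demand = np.array([400.0, 250.0, 150.0])               # bills of each type per day
capacity = np.array([160.0, 160.0, 120.0])             # operator-hours available per level

cost = (wage[:, None] * proc_time).ravel()             # objective: direct labour cost
n_levels, n_types = proc_time.shape

# Equality: all bills of each type are assigned; inequality: hours used <= capacity
A_eq = np.zeros((n_types, n_levels * n_types))
for j in range(n_types):
    A_eq[j, j::n_types] = 1.0
A_ub = np.zeros((n_levels, n_levels * n_types))
for i in range(n_levels):
    A_ub[i, i * n_types:(i + 1) * n_types] = proc_time[i]

res = linprog(cost, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand, method="highs")
print(res.x.reshape(n_levels, n_types).round(1))       # bills of type j handled by level i
```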

Keywords: optimization modelling, linear programming, simulation, time and motion study, capacity planning

Procedia PDF Downloads 495
6185 Unsupervised Segmentation Technique for Acute Leukemia Cells Using Clustering Algorithms

Authors: N. H. Harun, A. S. Abdul Nasir, M. Y. Mashor, R. Hassan

Abstract:

Leukaemia is a blood cancer that contributes to the increasing mortality rate in Malaysia each year. There are two main categories of leukaemia, namely acute and chronic leukaemia. The production and development of acute leukaemia cells occur rapidly and uncontrollably; therefore, if acute leukaemia cells could be identified quickly and effectively, proper treatment and medicine could be delivered. Due to the requirement for prompt and accurate diagnosis of leukaemia, the current study proposes unsupervised pixel segmentation based on clustering algorithms in order to obtain a fully segmented abnormal white blood cell (blast) in acute leukaemia images. In order to obtain the segmented blast, three clustering algorithms, namely k-means, fuzzy c-means, and moving k-means, have been applied to the saturation component image. Then, a median filter and a seeded region growing area extraction algorithm have been applied to smooth the region of the segmented blast and to remove large unwanted regions from the image, respectively. Comparisons among the three clustering algorithms are made in order to measure the performance of each clustering algorithm in segmenting the blast area. Based on the good sensitivity values obtained, the results indicate that the moving k-means clustering algorithm successfully produced fully segmented blast regions in acute leukaemia images. Hence, the resultant images could be helpful to haematologists for further analysis of acute leukaemia.
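The pipeline of clustering the saturation component and then smoothing the result can be sketched as follows; this shows only a standard k-means variant with a median filter (the fuzzy c-means and moving k-means alternatives and the seeded region growing step are not reproduced), and the file path is a placeholder.

```python
import numpy as np
import cv2
from sklearn.cluster import KMeans

def segment_blast(bgr_image, n_clusters=3):
    """Cluster the saturation channel of a stained blood-smear image, keep the most
    saturated cluster as the candidate blast region, and smooth it with a median filter.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    saturation = hsv[:, :, 1].astype(np.float32)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = km.fit_predict(saturation.reshape(-1, 1)).reshape(saturation.shape)
    blast_cluster = int(np.argmax(km.cluster_centers_.ravel()))   # most saturated cluster
    mask = (labels == blast_cluster).astype(np.uint8) * 255
    return cv2.medianBlur(mask, 5)                                # smooth the blast region

# Example usage (path is a placeholder):
# mask = segment_blast(cv2.imread("acute_leukaemia_sample.png"))
# cv2.imwrite("blast_mask.png", mask)
```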

Keywords: acute leukaemia images, clustering algorithms, image segmentation, moving k-means

Procedia PDF Downloads 276
6184 The Improved Element Free Galerkin Method for 2D Heat Transfer Problems

Authors: Imen Debbabi, Hédi BelHadjSalah

Abstract:

The Improved Element Free Galerkin (IEFG) method is presented to treat steady-state and transient heat transfer problems. As the result of a combination of the Improved Moving Least Squares (IMLS) approximation and the Element Free Galerkin (EFG) method, the IEFG shape functions do not have the Kronecker delta property, and the penalty method is therefore used to impose the Dirichlet boundary conditions. In this paper, two heat transfer problems, transient and steady-state, are studied to improve the efficiency of this meshfree method for 2D heat transfer problems. The performance of the IEFG method is shown through a comparison between numerical and analytical results.

Keywords: meshfree methods, the Improved Moving Least Square approximation (IMLS), the Improved Element Free Galerkin method (IEFG), heat transfer problems

Procedia PDF Downloads 378
6183 Performance of LTE Multicast Systems in the Presence of the Colored Noise Jamming

Authors: S. Malisuwan, J. Sivaraks, N. Madan, N. Suriyakrai

Abstract:

The ongoing evolution of advanced wireless technologies makes it financially impossible for military operations to manufacture all of their own equipment. Therefore, Commercial-Off-The-Shelf (COTS) and Modified-Off-The-Shelf (MOTS) products are being considered for military missions with low-cost modifications. In this paper, we focus on LTE multicast systems for military communication systems in tactical environments under jamming conditions. We examine the influence of colored noise jamming on the performance of LTE multicast systems in terms of average throughput. The simulation results demonstrate the degradation of the average throughput for different dynamic ranges of the colored noise jamming versus average SNR.

Keywords: performance, LTE, multicast, jamming, throughput

Procedia PDF Downloads 403
6182 A Critical Review of the Success Model of Indian Pharmaceutical Industry

Authors: Ekta Pandey

Abstract:

The Indian pharmaceutical industry is ranked third largest by volume and fourteenth by value. It accounts for 10% of the world's production by volume and 1.5% by value, according to the Department of Pharmaceuticals, Government of India. The industry has shown phenomenal growth over the past few years, moving from a turnover of US $1 billion in 1990 to around US $30 billion in 2015. The Indian pharmaceutical sector is ranked seventeenth in terms of the export value of active pharmaceutical ingredients and dosage forms, exporting to more than 200 countries around the globe. It has shown tremendous changes, especially after the Trade Related Aspects of Intellectual Property Rights (TRIPS) agreement. Recognizing the immense potential for growth and its direct impact on the Indian economy, it is important to examine the industrial policies adopted since Indian independence that turned around the Indian pharmaceutical industry. A systematic review of changes in the market structure of the Indian pharmaceutical industry due to shifts in policy regimes is carried out for the period 1850 to 2015 using secondary peer-reviewed published research. The aim is to understand how anti-trust laws, intellectual property rights, and industry competition acts and regulations are crucial in determining effective economic policy and have lasting effects on international trade and ties. The paper examines the position of Indian domestic firms relative to multinational pharmaceutical firms and tries to throw some light on the growth curve of the Indian pharmaceutical sector.

Keywords: active pharmaceutical ingredients, competition act, pharmaceutical industry, TRIPS

Procedia PDF Downloads 429