Search results for: Desirability Function Approach.
4079 Research on Residential Block Fabric: A Case Study of Hangzhou West Area
Abstract:
Residential block construction in China's big cities began in the 1950s, and four models have had a far-reaching influence on the modern residential block over its development: the unit compound and the residential district from the 1950s to the 1980s, and the gated community and the open community from the 1990s to the present. Based on an analysis of the four models' fabric, the article takes residential blocks in the Hangzhou west area as an example and carries out studies at the urban structure level and the block spatial level, covering the urban road network, land use, community function, road organization, public space and building fabric. Finally, the article puts forward a “Semi-open Sub-community” strategy to improve the current fabric.
Keywords: Hangzhou West Area, residential block model, residential block fabric, “Semi-open Sub-community” strategy.
4078 Teager-Huang Analysis Applied to Sonar Target Recognition
Authors: J.-C. Cexus, A.O. Boudraa
Abstract:
In this paper, a new approach to target recognition based on the Empirical Mode Decomposition (EMD) algorithm of Huang et al. [11] and the energy tracking operator of Teager [13]-[14] is introduced. The conjunction of these two methods is called Teager-Huang analysis. This approach is well suited to the analysis of nonstationary signals. The impulse response (IR) of the target is first band-pass filtered into subsignals (components) called Intrinsic Mode Functions (IMFs) with well-defined Instantaneous Frequency (IF) and Instantaneous Amplitude (IA). Each IMF is a zero-mean AM-FM component. In the second step, the energy of each IMF is tracked using the Teager Energy Operator (TEO). The IF and IA, useful for describing the time-varying characteristics of the signal, are estimated using the Energy Separation Algorithm (ESA) of Maragos et al. [16]-[17]. In the third step, a set of features such as skewness and kurtosis is extracted from the IF, IA and IMF energy functions. Teager-Huang analysis is tested on a set of synthetic IRs of sonar targets with different physical characteristics (density, velocity, shape, etc.). PCA is first applied to the features to discriminate between manufactured and natural targets. The manufactured patterns are then classified into spheres and cylinders. One hundred percent correct recognition is achieved with twenty-three echoes, where the sixteen IRs used for training are noise free and the seven IRs used for the testing phase are corrupted with white Gaussian noise.
Keywords: Target recognition, Empirical mode decomposition, Teager-Kaiser energy operator, Features extraction.
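The central computation in the second step, the discrete Teager-Kaiser energy Ψ[x](n) = x(n)² − x(n−1)x(n+1), and the per-IMF skewness/kurtosis features are simple to prototype. The sketch below is illustrative only and assumes the IMFs have already been obtained from an EMD routine; it is not the authors' implementation.

```python
# Hedged sketch of the per-IMF feature step (assumes IMFs are already available).
import numpy as np
from scipy.stats import skew, kurtosis

def teager_energy(x):
    """Discrete Teager-Kaiser energy: psi[x](n) = x(n)^2 - x(n-1)*x(n+1)."""
    x = np.asarray(x, dtype=float)
    psi = np.empty_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    psi[0], psi[-1] = psi[1], psi[-2]      # simple edge handling
    return psi

def imf_features(imfs):
    """Skewness and kurtosis of the Teager energy of each IMF."""
    feats = []
    for imf in imfs:
        e = teager_energy(imf)
        feats.extend([skew(e), kurtosis(e)])
    return np.array(feats)
```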
4077 A 3D Virtual Navigation System Integrating User Positioning and Pre-Download Mechanism
Authors: Ching-Sheng Wang, Yu-Hung Su, Ching-Yang Hong
Abstract:
This paper takes the actual scene of the Aletheia University campus – a Class 2 national monument and the first educational institute in northern Taiwan – as an example to present a 3D virtual navigation system that supports user positioning and a pre-download mechanism. The proposed system was designed based on the principle of the Voronoi Diagram to divide the virtual scenes and their multimedia information, combining outdoor GPS positioning with indoor RFID location detection. When users carry mobile equipment such as notebook computers, UMPCs, or Eee PCs while walking around the indoor and outdoor areas of the campus, the system automatically detects the users' moving path and pre-downloads the needed data so that users have smooth and seamless navigation without waiting.
Keywords: GPS, Positioning, RFID, Virtual Navigation, Voronoi Diagram.
4076 A Computational Fluid Dynamic Model of Human Sniffing
Authors: M.V. Shyla, K.B. Naidu
Abstract:
The objective of this paper is to develop a computational model of the human nasal cavity from computed tomography (CT) scans using MIMICS software. Computational fluid dynamic techniques were employed to understand nasal airflow. Gambit and Fluent software were used to perform the CFD simulation. Velocity profiles, iteration plots, pressure distribution, streamline and pathline patterns for steady, laminar airflow inside the human nasal cavity of both healthy and infected persons are presented in detail. The implications for olfaction are visualized. The results are validated with the available numerical and experimental data. The graphs reveal that airflow varies with different anatomical nasal structures and that only a fraction of the inspired air reaches the olfactory region. The deviations in the results suggest that treatment of the infected volunteers would improve their olfactory function.
Keywords: CFD techniques, Finite Volume Method, Fluid dynamic sniffing, Human nasal cavity.
4075 An Optimal Control Problem for Rigid Body Motions on Lie Group SO(2, 1)
Authors: Nemat Abazari, Ilgin Sager
Abstract:
In this paper, smooth trajectories are computed in the Lie group SO(2, 1) as a motion planning problem by assigning a Frenet frame to the rigid body system and optimizing the cost function of the elastic energy spent to track a timelike curve in Minkowski space. A method is proposed to solve the motion planning problem that minimizes the integral of the squared norm of the Darboux vector of a timelike curve. This method uses the coordinate-free Maximum Principle of optimal control and results from the theory of integrable Hamiltonian systems. The presence of several conserved quantities inherent in these Hamiltonian systems aids in the explicit computation of the rigid body motions.
Keywords: Optimal control, Hamiltonian vector field, Darboux vector, maximum principle, Lie group, rigid body motion, Lorentz metric.
4074 Application of Life Data Analysis for the Reliability Assessment of Numerical Overcurrent Relays
Authors: Mohd Iqbal Ridwan, Kerk Lee Yen, Aminuddin Musa, Bahisham Yunus
Abstract:
Protective relays are components of a power system protection scheme that provide the decision-making element for correct protection and fault-clearing operations. Failure of these protection devices may reduce the integrity and reliability of the power system protection and affect the overall performance of the power system. Hence, it is imperative for power utilities to assess the reliability of protective relays to ensure they will perform their intended function without failure. This paper discusses the application of reliability analysis using a statistical method called Life Data Analysis in the Transmission Division of Tenaga Nasional Berhad (TNB), a government-linked power utility company in Malaysia, to assess and evaluate the reliability of numerical overcurrent protective relays from two different manufacturers.
Keywords: Life data analysis, Protective relays, Reliability, Weibull Distribution.
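As a hedged illustration of the kind of life data analysis described (not the study's own data or code), a two-parameter Weibull distribution can be fitted to relay times-to-failure and used to read off the reliability at a given age; the failure times below are invented placeholders.

```python
# Illustrative two-parameter Weibull fit to hypothetical relay times-to-failure.
import numpy as np
from scipy.stats import weibull_min

failure_hours = np.array([8.2e3, 1.1e4, 1.9e4, 2.4e4, 3.0e4, 4.2e4])  # assumed example data
shape, loc, scale = weibull_min.fit(failure_hours, floc=0)  # fix the location at zero

t = 2.0e4
reliability = weibull_min.sf(t, shape, loc=loc, scale=scale)  # R(t) = P(T > t)
print(f"beta={shape:.2f}, eta={scale:.0f} h, R({t:.0f} h)={reliability:.2f}")
```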
4073 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition
Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu
Abstract:
In this paper three different approaches to person verification and identification, i.e. by means of fingerprint, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach. The assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Components Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As to voice/speaker recognition, mel cepstral and delta-delta mel cepstral analysis were used as the main methods in order to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g. “accept-reject” for a verification task) is taken by applying a majority voting technique to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%) permitting the utilization of our system for a future complex biometric card.
Keywords: Biometry, image processing, pattern recognition, speech analysis.
4072 Dual-Polarized Multi-Antenna System for Massive MIMO Cellular Communications
Authors: Naser Ojaroudi Parchin, Haleh Jahanbakhsh Basherlou, Raed A. Abd-Alhameed, Peter S. Excell
Abstract:
In this paper, a multiple-input/multiple-output (MIMO) antenna design with polarization and radiation pattern diversity is presented for future smartphones. The configuration consists of four double-fed circular-ring antenna elements located at different edges of the printed circuit board (PCB), with an FR-4 substrate and an overall dimension of 75×150 mm². The antenna elements are fed by 50-Ohm microstrip lines and provide polarization and radiation pattern diversity due to the orthogonal placement of their feed lines. A good impedance bandwidth (S11 ≤ -10 dB) of 3.4-3.8 GHz has been obtained for the smartphone antenna array; for S11 ≤ -6 dB, this range is 3.25-3.95 GHz. More than 3 dB realized gain and 80% total efficiency are achieved for the single-element radiator. The presented design not only provides the required radiation coverage but also generates the polarization diversity characteristic.
Keywords: Cellular communications, MIMO systems, mobile-phone antenna, polarization diversity.
4071 Automatic LV Segmentation with K-means Clustering and Graph Searching on Cardiac MRI
Authors: Hae-Yeoun Lee
Abstract:
Quantification of cardiac function is performed by calculating blood volume and ejection fraction in routine clinical practice. However, this is usually done by manual contouring, which is time-consuming and varies with the observer. In this paper, an automatic left ventricle segmentation algorithm for cardiac magnetic resonance images (MRI) is presented. Using knowledge of cardiac MRI, a K-means clustering technique is applied to segment the blood region on a coil-sensitivity corrected image. Then, a graph searching technique is used to correct segmentation errors caused by coil distortion and noise. Finally, blood volume and ejection fraction are calculated. Using cardiac MRI from 15 subjects, the presented algorithm was tested and compared with expert manual contouring, showing outstanding performance.
Keywords: Cardiac MRI, Graph searching, Left ventricle segmentation, K-means clustering.
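A minimal sketch of the intensity-clustering step only (a hypothetical helper, not the authors' pipeline): coil-sensitivity correction and the subsequent graph searching are assumed to happen elsewhere, and the brightest cluster is taken as the blood pool on bright-blood images.

```python
# Hedged sketch: K-means on pixel intensities, keeping the brightest cluster.
import numpy as np
from sklearn.cluster import KMeans

def blood_pool_mask(slice_img, k=3):
    """Cluster pixel intensities with K-means and keep the brightest cluster,
    which on bright-blood cardiac MRI roughly corresponds to the LV blood pool."""
    x = slice_img.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(x)
    centers = [x[labels == c].mean() for c in range(k)]
    blood_label = int(np.argmax(centers))
    return (labels == blood_label).reshape(slice_img.shape)
```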
4070 Inverse Dynamic Active Ground Motion Acceleration Inputs Estimation of the Retaining Structure
Authors: Ming-Hui Lee, Iau-Teh Wang
Abstract:
An innovative fuzzy estimator is used to estimate the ground motion acceleration acting on a retaining structure in this study. The Kalman filter without the input term and the fuzzy weighting recursive least square estimator are the two main portions of this method. The innovation vector produced by the Kalman filter is applied to the fuzzy weighting recursive least square estimator to estimate the acceleration input over time. The excellent performance of this estimator is demonstrated by comparing results under different weighting functions, distinct levels of the measurement noise covariance, and different initial process noise covariances. The availability and precision of the method proposed in this study are verified by comparing the actual values with those obtained by numerical simulation.
Keywords: Earthquake, Fuzzy Estimator, Kalman Filter, Recursive Least Square Estimator.
4069 Dielectric and Impedance Spectroscopy of Samarium and Lanthanum Doped Barium Titanate at Room Temperature
Authors: Sukhleen Bindra Narang, Dalveer Kaur, Kunal Pubby
Abstract:
Dielectric ceramic samples in the BaO-Re2O3-TiO2 ternary system were synthesized with the structural formula Ba2-xRe4+2x/3Ti8O24, where Re is a rare earth metal (Re = Sm, La) and x varies from 0.0 to 0.6 in steps of 0.1. Polycrystalline samples were prepared by the conventional solid state reaction technique. Dielectric, electrical and impedance analysis of all the samples in the frequency range 1 kHz-1 MHz at room temperature (25°C) was carried out to gain an understanding of electrical conduction, dielectric relaxation and their correlation. The dielectric response of the samples shows dielectric dispersion at lower frequencies and dielectric relaxation at higher frequencies. The ac conductivity is well fitted by the Jonscher law. The spectroscopic data in the impedance plane confirm the existence of a grain contribution to the relaxation. All the properties are found to be functions of frequency as well as of the amount of substitution.
Keywords: Dielectric ceramics, Dielectric constant, Loss tangent, AC conductivity, Impedance spectroscopy.
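For reference, the Jonscher universal power law mentioned above has the form σ(ω) = σdc + Aωⁿ; a hedged sketch of fitting it to measured AC conductivity is shown below, with placeholder frequencies and conductivities rather than the paper's data.

```python
# Illustrative Jonscher power-law fit: sigma(omega) = sigma_dc + A * omega**n.
import numpy as np
from scipy.optimize import curve_fit

def jonscher(omega, sigma_dc, A, n):
    return sigma_dc + A * omega ** n

freq_hz = np.logspace(3, 6, 20)                 # 1 kHz to 1 MHz
omega = 2 * np.pi * freq_hz
sigma_meas = jonscher(omega, 1e-8, 2e-12, 0.8)  # synthetic stand-in data, S/cm

p0 = [1e-8, 1e-12, 0.5]                         # rough starting guess
(sigma_dc, A, n), _ = curve_fit(jonscher, omega, sigma_meas, p0=p0, maxfev=10000)
print(f"sigma_dc={sigma_dc:.2e} S/cm, n={n:.2f}")
```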
4068 Intelligent Fuzzy Input Estimator for the Input Force on the Rigid Bar Structure System
Authors: Ming-Hui Lee, Tsung-Chien Chen, Yuh-Shiou Tai
Abstract:
An intelligent fuzzy input estimator is used to estimate the input force of a rigid bar structural system in this study. The fuzzy Kalman filter without the input term and the fuzzy weighting recursive least square estimator are the two main portions of this method. The practicability and accuracy of the proposed method were verified with numerical simulations in which the input forces of a rigid bar structural system were estimated from the output responses. To examine the accuracy of the proposed method, the rigid bar structural system is subjected to periodic sinusoidal dynamic loading. The excellent performance of this estimator is demonstrated by comparing results under different weighting functions and an improper initial process noise covariance. The estimated results are in good agreement with the true values in all cases tested.
Keywords: Fuzzy Input Estimator, Kalman Filter, Recursive Least Square Estimator.
4067 Feature Preserving Nonlinear Diffusion for Ultrasonic Image Denoising and Edge Enhancement
Authors: Shujun Fu, Qiuqi Ruan, Wenqia Wang, Yu Li
Abstract:
Utilizing the echo intensity and distribution of different organs and the local details of the human body, ultrasonic images can capture important pathological changes, which unfortunately may be affected by ultrasonic speckle noise. A feature-preserving ultrasonic image denoising and edge enhancement scheme is put forth, which includes two terms, anisotropic diffusion and edge enhancement, controlled by the optimum smoothing time. In this scheme, the anisotropic diffusion is governed by the local coordinate transformation and the first- and second-order normal derivatives of the image, while the edge enhancement is performed by the hyperbolic tangent function. Experiments on real ultrasonic images indicate that our scheme effectively preserves edges, local details and ultrasonic echoic bright strips during denoising.
Keywords: Anisotropic diffusion, coordinate transformation, directional derivatives, edge enhancement, hyperbolic tangent function, image denoising.
4066 Requirements Driven Multiple View Paradigm for Developing Security Architecture
Authors: K. Chandra Sekaran
Abstract:
This paper describes a paradigmatic approach to developing the architecture of secure systems by describing the requirements from four different points of view: that of the owner, the administrator, the user, and the network. Deriving requirements and developing architecture implies jointly eliciting and describing the problem and the structure of the solution. The viewpoints proposed in this paper are those of the parties we consider major contributors to the design, implementation, usage and maintenance of secure systems. The dramatic growth of Internet technology and of the applications deployed on the World Wide Web has led to a situation where security has become a very important concern in the development of such systems. Many security approaches are currently being used in organizations, yet in spite of the widespread use of many different security solutions, security remains a problem. It is argued that the approach described in this paper for the development of secure architecture is practical by all means. The models representing these multiple points of view are termed the requirements model (views of the owner and administrator) and the operations model (views of the user and network). In this paper, the multiple view paradigm is explained by first describing the specific requirements and/or characteristics of secure systems (particularly in the domain of networks) and then the secure architecture / system development methodology.
Keywords: Multiple view paradigms, requirements model, operations model, secure system, owner, administrator, user, network.
4065 A Comparison between Heuristic and Meta-Heuristic Methods for Solving the Multiple Traveling Salesman Problem
Authors: San Nah Sze, Wei King Tiong
Abstract:
The multiple traveling salesman problem (mTSP) can be used to model many practical problems. The mTSP is more complicated than the traveling salesman problem (TSP) because it requires determining which cities to assign to each salesman, as well as the optimal ordering of the cities within each salesman's tour. Previous studies proposed that the Genetic Algorithm (GA), Integer Programming (IP) and several neural network (NN) approaches could be used to solve the mTSP. This paper compares results for the mTSP solved with the Genetic Algorithm (GA) and the Nearest Neighbor Algorithm (NNA). The cities are first clustered into groups using the k-means clustering technique, with the number of groups determined by the number of salesmen. Each group is then solved as an independent TSP with NNA and GA. It is found that k-means clustering and NNA are superior to GA in terms of performance (evaluated by the fitness function) and computing time.
Keywords: Multiple Traveling Salesman Problem, Genetic Algorithm, Nearest Neighbor Algorithm, k-Means Clustering.
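The cluster-first, route-second idea described above is straightforward to prototype; the sketch below is an illustrative reading of it (Euclidean distances, one cluster per salesman, each cluster toured greedily), not the authors' code.

```python
# Hedged sketch: k-means clustering of cities, then a nearest-neighbour tour per cluster.
import numpy as np
from sklearn.cluster import KMeans

def nearest_neighbour_tour(points, start=0):
    """Greedy nearest-neighbour tour over a set of 2-D points."""
    unvisited = list(range(len(points)))
    tour = [unvisited.pop(start)]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - last))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def cluster_then_route(cities, n_salesmen):
    cities = np.asarray(cities, dtype=float)
    labels = KMeans(n_clusters=n_salesmen, n_init=10, random_state=0).fit_predict(cities)
    tours = []
    for k in range(n_salesmen):
        idx = np.where(labels == k)[0]
        order = nearest_neighbour_tour(cities[idx])
        tours.append(idx[order].tolist())   # tour expressed in original city indices
    return tours
```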
4064 Featured based Segmentation of Color Textured Images using GLCM and Markov Random Field Model
Authors: Dipti Patra, Mridula J
Abstract:
In this paper, we propose a new image segmentation approach for colour textured images. The proposed method consists of two stages. In the first stage, textural features using the gray level co-occurrence matrix (GLCM) are computed for the regions of interest (ROI) considered for each class; the ROIs act as ground truth for the classes. The Ohta model (I1, I2, I3) is the colour model used for segmentation. The statistical mean feature of the I2 component at a certain inter-pixel distance (IPD) was considered the optimal textural feature for further segmentation. In the second stage, the feature matrix obtained is treated as a degraded version of the image labels, and a Markov Random Field (MRF) model is used to model the unknown image labels. The labels are estimated through the maximum a posteriori (MAP) estimation criterion using the ICM algorithm. The performance of the proposed approach is compared with that of existing schemes: JSEG and another scheme that uses GLCM and MRF in RGB colour space. The proposed method is found to outperform the existing ones in terms of segmentation accuracy, with an acceptable rate of convergence. The results are validated on synthetic and real textured images.
Keywords: Texture Image Segmentation, Gray Level Co-occurrence Matrix, Markov Random Field Model, Ohta colour space, ICM algorithm.
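A simplified sketch of the first-stage feature is given below: the mean of a GLCM computed on the I2 (Ohta) component at a chosen inter-pixel distance. The horizontal offset, 16 grey levels and helper names are illustrative assumptions, not the paper's exact settings.

```python
# Hedged sketch of the GLCM mean feature on the Ohta I2 component.
import numpy as np

def ohta_i2(rgb):
    """I2 = (R - B) / 2 in the Ohta colour model."""
    r, b = rgb[..., 0].astype(float), rgb[..., 2].astype(float)
    return (r - b) / 2.0

def glcm_mean(channel, distance=1, levels=16):
    """Quantise the channel, build a horizontal-offset GLCM and return its mean feature."""
    edges = np.linspace(channel.min(), channel.max(), levels + 1)[1:-1]
    q = np.digitize(channel, edges)                       # values in 0 .. levels-1
    glcm = np.zeros((levels, levels), dtype=float)
    left, right = q[:, :-distance].ravel(), q[:, distance:].ravel()
    np.add.at(glcm, (left, right), 1.0)
    glcm /= glcm.sum()
    i = np.arange(levels)
    return float((glcm * i[:, None]).sum())               # mu_i = sum_i i * p(i, j)
```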
4063 Investigation of Droplet Size Produced in Two-Phase Gravity Separators
Authors: Kul Pun, F. A. Hamad, T. Ahmed, J. O. Ugwu, J. Eyers, G. Lawson, P. A. Russell
Abstract:
Determining droplet size and distribution is essential when assessing the separation efficiency of a two/three-phase separator. This paper investigates the effect of liquid flow and oil pad thickness on droplet size at the lab scale. The findings show that increasing the inlet flow rates of the oil and water reduces the droplet size, while increasing the thickness of the oil pad increases it. The data were fitted with a simple Gaussian model, and the parameters of mean, standard deviation and amplitude were determined. Trends were obtained for the fitted parameters as a function of the Reynolds number, which suggest a way forward to better predict the starting parameters for population models when simulating separation using CFD packages. The key parameter for fixing the position of the Gaussian distribution was found to be the mean droplet size.
Keywords: Two-phase separator, average bubble droplet, bubble size distribution, liquid-liquid phase.
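As a hedged illustration of the three-parameter Gaussian fit described above (amplitude, mean, standard deviation), the sketch below fits a histogram of droplet sizes; the bin values and counts are synthetic placeholders, not the experimental data.

```python
# Illustrative three-parameter Gaussian fit to a droplet-size histogram.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(d, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((d - mu) / sigma) ** 2)

size_um = np.linspace(20, 400, 20)                 # droplet diameter bin centres (assumed)
counts = gaussian(size_um, 120.0, 180.0, 45.0)     # synthetic stand-in counts

(amp, mu, sigma), _ = curve_fit(gaussian, size_um, counts, p0=[100, 150, 50])
print(f"mean droplet size ~ {mu:.0f} um, std ~ {sigma:.0f} um")
```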
4062 A Scatter Search and Help Policies Approaches for a New Mixed Model Assembly Lines Sequencing Problem
Authors: N. Manavizadeh , M. Rabbani , H. Sotudian , F. Jolai
Abstract:
Mixed Model Production is the practice of assembling several distinct and different models of a product on the same assembly line without changeovers and then sequencing those models in a way that smoothes the demand for upstream components. In this paper, we consider an objective function that minimizes total stoppage time and total idle time, with sequence-dependent setup times. Many studies have been done on mixed model assembly lines, but here we focus specifically on reducing idle times, which is possible through various help policies. To improve the solutions, several cases were developed and about 40 test problems were considered. We use scatter search for optimization, and the experimental results show the behaviour of the method and the efficiency of our algorithm. Scatter search and help policies can produce high-quality solutions, which is why they are used in this paper.
Keywords: Mixed model assembly lines, scatter search, help policies, idle time, stoppage time.
4061 Exterior Calculus: Economic Growth Dynamics
Authors: Troy L. Story
Abstract:
Mathematical models of dynamics employing exterior calculus are mathematical representations of the same unifying principle; namely, the description of a dynamic system with a characteristic differential one-form on an odd-dimensional differentiable manifold leads, by analysis with exterior calculus, to a set of differential equations and a characteristic tangent vector (vortex vector) which define transformations of the system. Using this principle, a mathematical model for economic growth is constructed by proposing a characteristic differential one-form for economic growth dynamics (analogous to the action in Hamiltonian dynamics), then generating a pair of characteristic differential equations and solving these equations for the rate of economic growth as a function of labor and capital. By contracting the characteristic differential one-form with the vortex vector, the Lagrangian for economic growth dynamics is obtained.
Keywords: Differential geometry, exterior calculus, Hamiltonian geometry, mathematical economics.
4060 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis
Authors: Komeil Valipourian
Abstract:
Urban advances and the growing need for developing infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of Bangkok's Metro as a case study. For this, a numerical probability model was developed based on the Finite Difference Method and a Monte Carlo sampling approach. The results indicate that disregarding probability in this project would result in an inappropriate design of the retaining structure. Therefore, probabilistic redesign of the support is proposed and carried out as one of the applications of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by only 8%, keeping it within the allowable range, and helps improve economic conditions while maintaining mechanical efficiency. Given the lack of efficient design in most deep excavations, an attempt was also made, by considering geometrical and geotechnical variability, to develop an optimum practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without carrying out a full probability analysis.
Keywords: Numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method, FDM.
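The Monte Carlo part of such a workflow can be sketched as below. This is a conceptual illustration only: the displacement function, parameter distributions and allowable threshold are invented placeholders standing in for the Finite Difference runs and site data used in the study.

```python
# Hedged sketch: Monte Carlo estimate of the probability that the maximum wall
# displacement exceeds an allowable value.
import numpy as np

rng = np.random.default_rng(0)

def max_wall_displacement(E_soil, phi):
    """Placeholder response; a real analysis would run an FDM model here."""
    return 0.05 * (30e3 / E_soil) * (35.0 / phi)   # metres, illustrative only

n_samples = 10_000
E_soil = rng.lognormal(mean=np.log(30e3), sigma=0.25, size=n_samples)  # soil modulus, kPa
phi = rng.normal(loc=35.0, scale=3.0, size=n_samples)                  # friction angle, deg

allowable = 0.06   # m, assumed threshold
disp = max_wall_displacement(E_soil, phi)
p_failure = np.mean(disp > allowable)
print(f"estimated failure probability = {p_failure:.3f}")
```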
4059 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death, and early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using a novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
Keywords: CAD system, difference of feature, Fuzzy c-means, Liver segmentation.
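The contrasting feature-difference idea can be expressed compactly: compute the same statistics inside the lesion and in the surrounding normal tissue, and use their difference as the descriptor. The sketch below is a hedged illustration with arbitrary example statistics, not the paper's feature set; the resulting vector would then be fed to any standard classifier.

```python
# Hedged sketch of a lesion-minus-surrounding feature-difference descriptor.
import numpy as np

def region_stats(image, mask):
    """A few simple intensity statistics over a boolean region mask (illustrative)."""
    vals = image[mask].astype(float)
    return np.array([vals.mean(), vals.std(),
                     np.percentile(vals, 10), np.percentile(vals, 90)])

def contrasting_descriptor(image, lesion_mask, surround_mask):
    """Descriptor = statistics(lesion) - statistics(surrounding normal tissue)."""
    return region_stats(image, lesion_mask) - region_stats(image, surround_mask)
```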
4058 Near Perfect Reconstruction Quadrature Mirror Filter
Authors: A. Kumar, G. K. Singh, R. S. Anand
Abstract:
In this paper, various algorithms for designing quadrature mirror filters are reviewed, and a new algorithm is presented for the design of a near perfect reconstruction quadrature mirror filter bank. In the proposed algorithm, the objective function is formulated using the perfect reconstruction condition, i.e. the magnitude response condition of the prototype filter at the frequency ω = 0.5π in the ideal case. The cutoff frequency is iteratively changed to adjust the filter coefficients using an optimization algorithm. The performance of the proposed algorithm is evaluated in terms of computation time, reconstruction error and number of iterations. The design examples illustrate that the proposed algorithm is superior in terms of peak reconstruction error, computation time and number of iterations. The proposed algorithm is simple, easy to implement, and linear in nature.
Keywords: Aliasing cancellations filter bank, Filter banks, quadrature mirror filter (QMF), subband coding.
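A hedged sketch of the cutoff-tuning idea follows: for a two-channel QMF bank the ideal prototype satisfies |H(e^{j0.5π})|² = 0.5, so the cutoff of a windowed FIR low-pass can be adjusted (here by simple bisection) until its response at ω = 0.5π reaches 1/√2. The filter length, window and bisection are illustrative choices, not the paper's algorithm.

```python
# Hedged sketch: tune the prototype cutoff so |H(e^{j*pi/2})| ~= 1/sqrt(2).
import numpy as np
from scipy.signal import firwin, freqz

def objective(cutoff, numtaps=32):
    h = firwin(numtaps, cutoff)                      # cutoff in (0, 1), 1 = Nyquist
    _, H = freqz(h, worN=np.array([0.5 * np.pi]))    # response at quarter band
    return abs(H[0]) - 1.0 / np.sqrt(2.0)            # signed error at w = 0.5*pi

lo, hi = 0.3, 0.7
for _ in range(50):                                  # bisection on the cutoff
    mid = 0.5 * (lo + hi)
    if objective(mid) > 0:                           # response too high -> lower the cutoff
        hi = mid
    else:
        lo = mid
prototype = firwin(32, 0.5 * (lo + hi))
```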
4057 Strategic Mine Planning: A SWOT Analysis Applied to KOV Open Pit Mine in the Democratic Republic of Congo
Authors: Patrick May Mukonki
Abstract:
The KOV pit (Kamoto Oliveira Virgule) is located 10 km from Kolwezi town, one of the mineral-rich towns in the Lualaba province of the Democratic Republic of Congo. The KOV pit currently operates under Katanga Mining Limited (KML), a joint venture between Glencore and Gecamines (a state-owned company). Recently, the mine optimization process produced a life of mine of approximately 10 years with nine pushbacks, using the Datamine NPV Scheduler software. In previous KOV pit studies, we outlined the impact of the accuracy of the geological information on the long-term mine plan for a large copper mine such as the KOV pit. That work discussed three main scenarios and outlined some weaknesses on the geological information side; in this paper we highlight, as an overview, those weaknesses together with the strengths and opportunities in a global SWOT analysis. The approach taken here is essentially descriptive of the steps followed to optimize the KOV pit and, at every step, we categorize the challenges we faced so as to achieve a better trade-off between what we call strengths and what we call weaknesses. The same logic is applied to the opportunities and threats. The SWOT analysis conducted in this paper demonstrates that, despite a generally poor ore body definition and very harsh groundwater conditions, there is room for improvement for such a high-grade ore body.
Keywords: Mine planning, mine optimization, mine scheduling, SWOT analysis.
4056 Robust Coherent Noise Suppression by Point Estimation of the Cauchy Location Parameter
Authors: Ephraim Gower, Thato Tsalaile, Monageng Kgwadi, Malcolm Hawksford
Abstract:
This paper introduces a new point estimation algorithm, with particular focus on coherent noise suppression, given several measurements of the device under test, where it is assumed that 1) the noise is first-order stationary and 2) the device under test is linear and time-invariant. The algorithm exploits the robustness of the Pitman estimator of the Cauchy location parameter through an initial scaling of the test signal by a centred Gaussian variable of predetermined variance. It is illustrated through mathematical derivations and simulation results that the proposed algorithm is more accurate and more consistently robust to outliers, for density functions with different tail behaviour, than the conventional methods of the sample mean (coherent averaging technique) and sample median search.
Keywords: Central limit theorem, Fisher-Cramer Rao, gamma function, Pitman estimator.
4055 Multifunctional Barcode Inventory System for Retailing. Are You Ready for It?
Authors: Ling Shi Cai, Leau Yu Beng, Charlie Albert Lasuin, Tan Soo Fun, Chin Pei Yee
Abstract:
This paper explains the development of a Multifunctional Barcode Inventory Management System (MBIMS) to manage inventory and stock ordering. Today, most of the retailing market still records stock manually, and the effectiveness of this practice is quite low. MBIMS will bring effectiveness to the retailing market in inventory management: it will not only save time in recording stock input and output and in refilling inventory, but will also calculate the remaining stock and provide an auto-ordering function. The system was developed through the System Development Life Cycle (SDLC), and its flow and structure are fully built on the requirements of a retailing market. Furthermore, the system has been developed through methodical research and study in which each part is carefully designed. Thus, MBIMS offers a good solution for the retailing market in achieving effectiveness and efficiency in inventory management.
Keywords: Inventory, Retailing Market, Barcode, Automated Alerting and Ordering
4054 Integrated Modeling Approach for Energy Planning and Climate Change Mitigation Assessment in the State of Florida
Authors: Kuntal Thakkar, Chaouki Ghenai, Ahmed Hachicha
Abstract:
An integrated modeling approach was used in this study for energy planning and climate change mitigation assessment. The main objective was to develop various greenhouse gas (GHG) mitigation scenarios for the energy demand and supply sectors of the state of Florida. The Long-range Energy Alternatives Planning (LEAP) model was used to examine energy alternatives and GHG emission reduction scenarios for the short and long term (2010-2050). One of the energy analysis and GHG mitigation scenarios was developed by taking into account the potential of the available renewable energy resources for power generation in the state of Florida. This makes it possible to compare and analyze the GHG reduction measures against the "Business As Usual" and "State of Florida Policy" scenarios. Two master scenarios, "Electrification" and "Energy Efficiency and Lifestyle", were developed through a combination of various mitigation measures: technological changes, and energy efficiency and conservation. The results show a net reduction of energy demand and GHG emissions when these two energy scenarios are adopted, compared to business as usual.
Keywords: Integrated modeling, energy planning, climate change mitigation assessment, greenhouse gas emissions, renewable energy, energy efficiency.
4053 Particle Swarm Optimization with Reduction for Global Optimization Problems
Authors: Michiharu Maeda, Shinya Tsuda
Abstract:
This paper presents a particle swarm optimization algorithm with reduction for global optimization problems. Particle swarm optimization is an algorithm inspired by collective motion, such as that of flocks of birds or schools of fish, and is a multi-point search algorithm that finds the best solution using multiple particles. Particle swarm optimization is flexible enough to adapt to a wide range of optimization problems. When an objective function has many intricate local minima, a particle may fall into one of them. To avoid this, a large number of particles are initially prepared and their positions are updated by particle swarm optimization. The particles are then sequentially reduced, on the basis of their evaluation values, until a predetermined number remains, and particle swarm optimization continues until the termination condition is met. To show the effectiveness of the proposed algorithm, we examine the minima obtained on test functions and compare them with existing algorithms. Furthermore, the influence of the initial number of particles on the best value obtained by our algorithm is discussed.
Keywords: Particle swarm optimization, Global optimization, Metaheuristics, Reduction.
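A compact sketch of PSO with particle reduction is given below, using the sphere function purely as a stand-in objective; the reduction schedule, swarm sizes and coefficients are illustrative assumptions, not the values studied in the paper.

```python
# Hedged sketch: standard PSO whose swarm is periodically reduced by dropping
# the worst particles until a predetermined minimum number remains.
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return float(np.sum(x ** 2))

def pso_with_reduction(f, dim=5, n_start=60, n_min=10, iters=200,
                       w=0.7, c1=1.5, c2=1.5, reduce_every=20):
    pos = rng.uniform(-5, 5, (n_start, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for t in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()

        # periodically drop the worst particles until only n_min remain
        if (t + 1) % reduce_every == 0 and len(pos) > n_min:
            keep = np.argsort(pbest_val)[:max(n_min, len(pos) - 10)]
            pos, vel = pos[keep], vel[keep]
            pbest, pbest_val = pbest[keep], pbest_val[keep]
    return gbest, f(gbest)

best_x, best_val = pso_with_reduction(sphere)
print(best_val)
```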
4052 Investigating Feed Mix Problem Approaches: An Overview and Potential Solution
Authors: Rosshairy Abd Rahman, Chooi-Leng Ang, Razamin Ramli
Abstract:
Feed is one of the factors that play an important role in determining the successful development of an aquaculture industry. It is always critical to produce the best aquaculture diet at minimum cost in order to trim operational costs and gain more profit. However, the feed mix problem becomes increasingly difficult since many issues need to be considered simultaneously. The purpose of this paper is therefore to review the current techniques used by nutritionists and researchers to tackle these issues. Additionally, this paper introduces an enhanced algorithm deemed suitable for dealing with all the issues that arise. The proposed technique is a hybrid Genetic Algorithm, which is expected to obtain the minimum-cost diet for farmed animals while satisfying nutritional requirements. The hybrid GA technique combined with an artificial bee algorithm is expected to reduce the penalty function and provide a better solution to the feed mix problem.
Keywords: Artificial bee algorithm, feed mix problem, hybrid genetic algorithm.
4051 A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions
Authors: Mohammad Reza Ghasemi, Ali Ehsani
Abstract:
In this paper, the optimum weight and cost of a laminated composite plate are sought while the plate undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique that works directly with real variables, was employed. However, since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Keywords: Composite Laminates, GA, Multi-objective Optimization, Neural Networks, RBFNN.
4050 Determination of the Gain in Learning the Free-Fall Motion of Bodies by Applying the Resource of Previous Concepts
Authors: Ricardo Merlo
Abstract:
In this paper, we analyzed the different didactic proposals available online for teaching the free-fall motion of bodies. An important aspect was the interpretation of the direction and sense of the acceleration of gravity and of the falling velocity of a body, which is why we found different uses of the Cartesian reference system and different graphical presentations of the velocity as a function of time and of the vertical distance traveled by the body after being dropped from a height h0. In this framework, a survey of previous concepts was applied to a voluntary group of first-year university Engineering students before and after the class on the subject in question. Hake's index (0.52) was then determined, indicating an average learning gain attributable to the meaningful use of the reference system and the corresponding graphs of velocity versus time and height versus time.
Keywords: Didactic gain, free–fall, physics teaching, previous knowledge.
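For reference, Hake's average normalized gain is computed from class-average pre- and post-test scores; the worked numbers below are illustrative only (they are not the study's scores, merely one pair of values that yields 0.52).

```latex
% Hake's average normalized gain (standard definition); the 40% / 71.2% pair is
% an invented example that happens to reproduce a gain of 0.52.
\[
  \langle g \rangle
  = \frac{\langle S_{\mathrm{post}} \rangle - \langle S_{\mathrm{pre}} \rangle}
         {100\% - \langle S_{\mathrm{pre}} \rangle},
  \qquad
  \text{e.g.}\quad
  \frac{71.2\% - 40\%}{100\% - 40\%} = 0.52 .
\]
```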