Search results for: centroid ranking technique
1344 A Comparison between Heuristic and Meta-Heuristic Methods for Solving the Multiple Traveling Salesman Problem
Authors: San Nah Sze, Wei King Tiong
Abstract:
The multiple traveling salesman problem (mTSP) can be used to model many practical problems. The mTSP is more complicated than the traveling salesman problem (TSP) because it requires determining which cities to assign to each salesman, as well as the optimal ordering of the cities within each salesman's tour. Previous studies proposed that the Genetic Algorithm (GA), Integer Programming (IP) and several neural network (NN) approaches could be used to solve the mTSP. This paper compares results for the mTSP solved with the Genetic Algorithm (GA) and the Nearest Neighbor Algorithm (NNA). The cities are clustered into a few groups using the k-means clustering technique, with the number of groups depending on the number of salesmen. Each group is then solved with NNA and GA as an independent TSP. It is found that k-means clustering with NNA is superior to GA in terms of performance (evaluated by the fitness function) and computing time.
Keywords: Multiple Traveling Salesman Problem, Genetic Algorithm, Nearest Neighbor Algorithm, k-Means Clustering.
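As a rough illustration of the clustering-then-routing idea summarized above, the following Python sketch partitions city coordinates with k-means (one cluster per salesman) and builds each salesman's tour with a nearest-neighbour heuristic. The coordinates, city count and cluster count are invented for the example and are not the instances used in the paper.
```python
# Minimal sketch: k-means clustering followed by a nearest-neighbour tour per cluster.
# The city coordinates below are illustrative, not the instances used in the paper.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cities = rng.uniform(0, 100, size=(30, 2))   # 30 random cities
n_salesmen = 3

labels = KMeans(n_clusters=n_salesmen, n_init=10, random_state=0).fit_predict(cities)

def nearest_neighbour_tour(points):
    """Greedy tour: start at the first point, always visit the closest unvisited city."""
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - last))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

for k in range(n_salesmen):
    idx = np.where(labels == k)[0]
    tour = nearest_neighbour_tour(cities[idx])
    length = sum(np.linalg.norm(cities[idx][tour[i]] - cities[idx][tour[i + 1]])
                 for i in range(len(tour) - 1))
    print(f"salesman {k}: visits {len(idx)} cities, tour length {length:.1f}")
```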
1343 Design of a Mould System for Horizontal Continuous Casting of Bilayer Aluminium Strips
Authors: Ch. Nerl, M. Wimmer, P. Hofer, E. Kaschnitz
Abstract:
The present article deals with a composite casting process that allows bilayer AlSn6-Al strips to be produced using the technique of horizontal continuous casting. In the first part, experimental investigations on the production of a single-layer AlSn6 strip are described. Afterwards, essential results of basic compound casting trials using simple test specimens are presented to define the thermal conditions required for a metallurgical compound between the alloy AlSn6 and pure aluminium. Subsequently, numerical analyses are described. A finite element model was used to examine a continuous composite casting process. From the simulations, the main parameters influencing the thermal conditions within the composite casting region could be identified. Finally, basic guidance is given for the design of an appropriate composite mould system.
Keywords: Aluminium alloys, composite casting, compound casting, continuous casting, numerical simulation
1342 Strategic Risk Issues for Film Distributors of Hindi Film Industry in Mumbai: A Grounded Theory Approach
Abstract:
The purpose of the paper is to address the strategic risk issues surrounding Hindi film distribution in Mumbai for a film distributor, who acts as an entrepreneur when launching a product (movie) in the market (film territory). The paper undertakes a fundamental review of films and risk in the Hindi film industry and applies the Grounded Theory technique to understand the complex phenomenon of risk-taking behavior among film distributors (both independent and studios) in Mumbai. Rich in-depth interviews with distributors are coded to develop core categories through constant comparison, leading to conceptualization of the phenomenon of interest. This paper is a first-of-its-kind attempt to understand the risk behavior of a distributor, which is akin to entrepreneurial risk behavior under conditions of uncertainty.
Keywords: Entrepreneurial Risk Behavior, Film Distribution Strategy, Hindi Film Industry, Risk.
1341 Synthetic Transmit Aperture Method in Medical Ultrasonic Imaging
Authors: Ihor Trots, Andrzej Nowicki, Marcin Lewandowski
Abstract:
The work describes the use of a synthetic transmit aperture (STA), with a single element transmitting and all elements receiving, in medical ultrasound imaging. The STA technique is a novel alternative to today's commercial systems, in which an image is acquired sequentially one image line at a time, placing a strict limit on the frame rate and on the amount of data available for high image quality. STA imaging acquires data from all directions simultaneously over a number of emissions, after which the full image can be reconstructed. In the experiments, a 32-element linear transducer array with 0.48 mm inter-element spacing was used. A single-element transmission aperture was used to generate a spherical wave covering the full image region. 2D ultrasound images of a wire phantom, obtained using both the STA method and the commercial ultrasound scanner Antares, are presented to demonstrate the benefits of SA imaging.
Keywords: Ultrasound imaging, synthetic aperture, frame rate, beamforming.
1340 Incremental Learning of Independent Topic Analysis
Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda
Abstract:
In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of document data. The volume of document data has been increasing since the spread of the Internet, and ITA was proposed as one method to analyze such data. ITA extracts independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing document collection, because ITA must process all of the documents at once, so its temporal and spatial cost is very high. Therefore, we present Incremental ITA, which extracts independent topics from a growing document collection by updating the topics whenever new documents are added, starting from the topics extracted from the previous data. We show the results of applying Incremental ITA to benchmark datasets.
Keywords: Text mining, topic extraction, independent, incremental, independent component analysis.
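The core batch step of ITA, extracting statistically independent topics from a document-term representation with ICA, can be sketched as follows using scikit-learn's FastICA on a toy corpus. This illustrates only the non-incremental case; the corpus, the topic count and the interpretation of the mixing matrix as topic-term weights are assumptions, not the authors' implementation.
```python
# Sketch of batch topic extraction with ICA on a toy corpus (not the authors' code).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA

docs = [
    "stocks market shares trading profit",
    "match goal team player season",
    "market investors shares economy",
    "player coach team football match",
]

vec = TfidfVectorizer()
X = vec.fit_transform(docs).toarray()     # documents x terms
n_topics = 2

ica = FastICA(n_components=n_topics, random_state=0)
doc_topic = ica.fit_transform(X)          # document loadings on independent components
topic_term = ica.mixing_.T                # each row weights terms for one "topic"

terms = vec.get_feature_names_out()
for t, row in enumerate(topic_term):
    top = [terms[i] for i in abs(row).argsort()[::-1][:3]]
    print(f"topic {t}: {top}")

for d, load in enumerate(doc_topic):
    print(f"doc {d}: dominant topic {int(abs(load).argmax())}")
```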
1339 Active Imagination: The Effective Factor in the Practice of Psychotherapy
Authors: Sonia Regina Lyra
Abstract:
The desire for unequivocal clarity is understandable, but it can make one forget that things of the soul are experiential processes, or transformations, which should never be designated unilaterally unless one wants to turn something that moves, a living thing, into something static. Among the so-called ‘things of the soul’ are, in particular, the spontaneous fantasies that emerge during these processes as a result of using the active imagination technique; when fantasy is not forced, violated, or subjugated by an illegitimate, intellectually preconceived idea, it is a legitimate and authentic product of the unconscious mind. This is how one can gain access to unadulterated information about everything that transcends the conscious mind. However, it is vital to discern between ego and non-ego, because this principle results in a release of energy and a renewal of life, which then comes to have meaning. This study deals with active imagination as a form of knowledge that depends on the individual experience of the therapist, because the patient can only be taken as far as the point to which the therapist's own unconscious has been assimilated into consciousness. In this way, the therapist becomes the method itself, his personality being a fundamental part of the effective factor.
Keywords: Active imagination, effective factor, symptom, transformation.
1338 Hybrid Reliability-Similarity-Based Approach for Supervised Machine Learning
Authors: Walid Cherif
Abstract:
Data mining has, over recent years, seen big advances because of the spread of the Internet, which generates a tremendous volume of data every day, and because of the immense advances in technologies that facilitate the analysis of these data. In particular, classification techniques are a subdomain of data mining that determines to which group each data instance belongs within a given dataset. They are used to classify data into different classes according to desired criteria. Generally, a classification technique is either statistical or machine-learning based, and each type has its own limits. Nowadays, data are becoming increasingly heterogeneous; consequently, current classification techniques are encountering many difficulties. This paper defines new measure functions to quantify the resemblance between instances and then combines them in a new approach that differs from existing algorithms in its reliability computations. Results of the proposed approach exceeded those of the most common classification techniques, with an F-measure above 97% on the Iris dataset.
Keywords: Data mining, knowledge discovery, machine learning, similarity measurement, supervised classification.
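The abstract does not disclose the authors' measure functions, but the general idea of classifying by instance resemblance can be sketched with a similarity-weighted nearest-neighbour rule on the Iris data. The inverse-distance similarity and the choice of k below are illustrative assumptions, not the paper's method.
```python
# Generic similarity-based classification sketch on Iris (not the paper's measure functions).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

def similarity(a, b):
    # simple inverse-distance resemblance; the paper defines its own measure functions
    return 1.0 / (1.0 + np.linalg.norm(a - b))

def predict(x, k=5):
    sims = np.array([similarity(x, t) for t in X_tr])
    nearest = sims.argsort()[::-1][:k]
    # similarity-weighted vote among the k most resembling training instances
    votes = {}
    for i in nearest:
        votes[y_tr[i]] = votes.get(y_tr[i], 0.0) + sims[i]
    return max(votes, key=votes.get)

y_pred = np.array([predict(x) for x in X_te])
print("macro F-measure:", round(f1_score(y_te, y_pred, average="macro"), 3))
```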
1337 New Multisensor Data Fusion Method Based on Probabilistic Grids Representation
Authors: Zhichao Zhao, Yi Liu, Shunping Xiao
Abstract:
A new data fusion method called the joint probability density matrix (JPDM) is proposed, which can associate and fuse measurements from spatially distributed heterogeneous sensors to identify the real target in a surveillance region. Using the probabilistic grids representation, we numerically combine the uncertainty regions of all the measurements in a general framework. The NP-hard multisensor data fusion problem is thereby converted into a peak-picking problem on the grid map. Unlike most existing data fusion methods, the JPDM method does not need association processing and does not lead to combinatorial explosion. Its convergence to the CRLB with a diminishing grid size has been proved. Simulation results are presented to illustrate the effectiveness of the proposed technique.
Keywords: Cramer-Rao lower bound (CRLB), data fusion, probabilistic grids, joint probability density matrix, localization, sensor network.
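A minimal numerical sketch of the grid-based fusion idea follows: each sensor's measurement uncertainty is rasterized onto a common grid as a likelihood surface, the surfaces are combined cell by cell into a joint map, and the target estimate is read off at the peak. The sensor geometry, range-only measurement model and noise level are invented for illustration and are not taken from the paper.
```python
# Sketch of fusing range-only measurements on a probabilistic grid (illustrative geometry).
import numpy as np

grid = np.linspace(0, 100, 201)
gx, gy = np.meshgrid(grid, grid)                 # common surveillance grid

sensors = np.array([[10.0, 10.0], [90.0, 20.0], [50.0, 90.0]])
target = np.array([60.0, 40.0])
sigma = 2.0                                      # range noise std (assumed)

joint = np.ones_like(gx)
rng = np.random.default_rng(1)
for s in sensors:
    meas_range = np.linalg.norm(target - s) + rng.normal(0, sigma)
    grid_range = np.hypot(gx - s[0], gy - s[1])
    # Gaussian likelihood of each grid cell given this sensor's range measurement
    like = np.exp(-0.5 * ((grid_range - meas_range) / sigma) ** 2)
    joint *= like                                # cell-by-cell combination of uncertainty regions

peak = np.unravel_index(joint.argmax(), joint.shape)
print("estimated target:", (gx[peak], gy[peak]), "true target:", tuple(target))
```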
1336 Fault Detection of Broken Rotor Bars Using Stator Current Spectrum for the Direct Torque Control Induction Motor
Authors: Ridha Kechida, Arezki Menacer, Abdelhamid Benakcha
Abstract:
The numerous qualities of squirrel cage induction machines encourage their use in industry. However, various faults can occur, such as stator short-circuits and rotor failures. In this paper, we use a technique based on the spectral analysis of the stator current in order to detect a specific fault in the machine: broken rotor bars. The effect of the number of broken bars is highlighted, considering a machine controlled by Direct Torque Control (DTC). The key to fault detection is the development of a simplified dynamic model of a squirrel cage induction motor that takes the broken-bar fault into account, combined with spectral analysis (FFT) of the stator current.
Keywords: Rotor faults, diagnosis, induction motor, DTC, stator current spectrum.
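The spectral signature exploited here is the pair of sidebands at (1 ± 2s)f_s around the supply frequency in the stator current, where s is the slip. The sketch below synthesizes a current containing such sidebands and locates them with an FFT; the slip, amplitudes and sampling parameters are invented for illustration.
```python
# Sketch: locating (1 +/- 2s)fs broken-rotor-bar sidebands in a synthetic stator current.
import numpy as np

fs_supply = 50.0          # supply frequency (Hz)
slip = 0.03               # assumed slip
f_sb = 2 * slip * fs_supply

T, N = 10.0, 20000        # 10 s record, 2 kHz sampling
t = np.linspace(0, T, N, endpoint=False)
current = (np.sin(2 * np.pi * fs_supply * t)
           + 0.02 * np.sin(2 * np.pi * (fs_supply - f_sb) * t)    # lower sideband
           + 0.02 * np.sin(2 * np.pi * (fs_supply + f_sb) * t)    # upper sideband
           + 0.005 * np.random.default_rng(0).normal(size=N))

spectrum = np.abs(np.fft.rfft(current * np.hanning(N))) / N
freqs = np.fft.rfftfreq(N, d=T / N)

band = (freqs > 40) & (freqs < 60)
for f, a in zip(freqs[band], spectrum[band]):
    if a > 0.001:                        # crude peak report around the supply frequency
        print(f"{f:6.2f} Hz  amplitude {a:.4f}")
```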
1335 Software Effort Estimation Using Soft Computing Techniques
Authors: Parvinder S. Sandhu, Porush Bassi, Amanpreet Singh Brar
Abstract:
Various models have been derived by studying a large number of completed software projects from various organizations and applications, to explore how project size maps to project effort. However, there is still a need to improve the prediction accuracy of these models. A neuro-fuzzy based system is able to approximate non-linear functions with greater precision, so a neuro-fuzzy system is used as a soft computing approach to generate a model by formulating the relationship based on its training. In this paper, the neuro-fuzzy technique is used for software effort estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili and Doty models mentioned in the literature.
Keywords: Effort Estimation, Neural-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model.
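For context, the baseline models named above are simple size-based regressions. The commonly cited textbook forms (effort in person-months, size in KLOC) are reproduced below as a sketch; the exact coefficients used in the paper may differ, so treat these formulas and the sample sizes as illustrative.
```python
# Commonly cited size-based effort models (person-months vs. KLOC); coefficients are the
# textbook forms, which may differ from those used in the paper.
def halstead(kloc):       return 0.7 * kloc ** 1.50
def walston_felix(kloc):  return 5.2 * kloc ** 0.91
def bailey_basili(kloc):  return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):           return 5.288 * kloc ** 1.047   # form usually quoted for KLOC > 9

for kloc in (10, 50, 100):
    print(kloc, "KLOC:",
          f"Halstead={halstead(kloc):.1f}",
          f"Walston-Felix={walston_felix(kloc):.1f}",
          f"Bailey-Basili={bailey_basili(kloc):.1f}",
          f"Doty={doty(kloc):.1f}")
```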
1334 Robust Coherent Noise Suppression by Point Estimation of the Cauchy Location Parameter
Authors: Ephraim Gower, Thato Tsalaile, Monageng Kgwadi, Malcolm Hawksford.
Abstract:
This paper introduces a new point estimation algorithm, with particular focus on coherent noise suppression, given several measurements of the device under test, where it is assumed that 1) the noise is first-order stationary and 2) the device under test is linear and time-invariant. The algorithm exploits the robustness of the Pitman estimator of the Cauchy location parameter through an initial scaling of the test signal by a centred Gaussian variable of predetermined variance. It is illustrated through mathematical derivations and simulation results that the proposed algorithm is more accurate and more consistently robust to outliers, for density functions with different tails, than the conventional methods of sample mean (coherent averaging technique) and sample median search.
Keywords: Central limit theorem, Fisher-Cramer Rao, gamma function, Pitman estimator.
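The robustness claim can be illustrated numerically. The sketch below computes a Pitman-type estimate of the Cauchy location parameter by direct numerical integration over a grid (flat prior, posterior mean) and compares it with the sample mean and median on heavy-tailed data. The scale parameter and sample are invented, and the paper's Gaussian pre-scaling step is not reproduced.
```python
# Sketch: Pitman estimate of the Cauchy location parameter via numerical integration,
# compared with sample mean and median on heavy-tailed data (illustrative only; the
# paper's Gaussian pre-scaling step is omitted).
import numpy as np

rng = np.random.default_rng(0)
true_loc, scale = 5.0, 1.0
x = true_loc + scale * rng.standard_cauchy(50)       # heavy-tailed sample with outliers

theta = np.linspace(true_loc - 20, true_loc + 20, 4001)
# log-likelihood of a Cauchy(theta, scale) sample, evaluated on the theta grid
loglik = -np.sum(np.log1p(((x[None, :] - theta[:, None]) / scale) ** 2), axis=1)
w = np.exp(loglik - loglik.max())                    # unnormalized posterior under a flat prior

pitman = float((theta * w).sum() / w.sum())          # grid approximation of the Pitman estimate
print("Pitman:", round(pitman, 3),
      "mean:", round(float(x.mean()), 3),
      "median:", round(float(np.median(x)), 3))
```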
1333 Control and Simulation of FOPDT Food Processes with Constraints using PI Controller
Authors: M.Y. Pua, M.C. Tan, L.W. Tan, N. Ab.Aziz, F.S. Taip
Abstract:
The most common type of controller used in industry is the PI(D) controller, which has been in use since 1945 and is still widely used due to its efficiency and simplicity. In most cases, the PI(D) controller is tuned without taking the effect of actuator saturation into consideration. In real processes, the most common actuator, the valve, acts as a constraint and restricts the controller output. Since the controller is not designed to handle saturation, the process may wind up, resulting in large oscillations, or may even become unstable. Usually, an anti-windup compensator is added to the feedback control loop to reduce the deteriorating effect of integral windup. This research aims specifically at controlling processes with constraints. The proposed method was applied to two different types of food processes: blending and spray drying. Simulations were done using MATLAB, and the performance of the proposed method was compared with other conventional methods. The proposed technique was able to control the processes and avoid saturation, such that no anti-windup compensator is needed.
Keywords: constraints, food process control, first order plus dead time process, PI
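To make the saturation issue concrete, the sketch below simulates a first order plus dead time (FOPDT) process under a discrete PI controller whose output is clamped to valve limits, with simple conditional integration as the anti-windup measure. The process gain, time constant, dead time, limits and PI tuning are invented and do not correspond to the blending or spray-drying models used in the paper.
```python
# Sketch: discrete PI control of a FOPDT process with actuator saturation and simple
# conditional-integration anti-windup (all process and tuning values are illustrative).
K, tau, theta = 2.0, 50.0, 10.0          # FOPDT gain, time constant, dead time (s)
dt, t_end = 1.0, 600.0
Kp, Ki = 0.4, 0.02                       # PI gains (illustrative tuning)
u_min, u_max = 0.0, 1.0                  # actuator (valve) limits
setpoint = 1.5

n = int(t_end / dt)
delay = int(theta / dt)
u_hist = [0.0] * delay                   # dead-time buffer
y, integral = 0.0, 0.0

for k in range(n):
    e = setpoint - y
    u_unsat = Kp * e + Ki * integral
    u = min(max(u_unsat, u_min), u_max)  # valve constraint
    # anti-windup: only integrate when not saturated, or when the error helps unwind
    if u == u_unsat or (u == u_max and e < 0) or (u == u_min and e > 0):
        integral += e * dt
    u_hist.append(u)
    u_delayed = u_hist.pop(0)
    y += dt * (-y + K * u_delayed) / tau # Euler step of the FOPDT model

print("final output y =", round(y, 3), "(setpoint", setpoint, ")")
```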
1332 Performance Evaluation of an Efficient Asynchronous Protocol for WDM Ring MANs
Authors: Peristera A. Baziana
Abstract:
The idea of asynchronous transmission in wavelength division multiplexing (WDM) ring MANs is studied in this paper. In particular, we present an efficient access technique to coordinate the collision-free transmission of variable-size IP traffic in WDM ring core networks. Each node is equipped with a tunable transmitter and a tunable receiver, so that all wavelengths are exploited for both transmission and reception. In order to evaluate the performance measures of average throughput, queuing delay and packet dropping probability at the buffers, a simulation model that assumes symmetric access rights among the nodes is developed based on Poisson statistics. Extensive numerical results show that, apart from high bandwidth exploitation over a wide range of offered load, the proposed protocol achieves fairness in queuing delay and dropping events among the different packet-size categories.
Keywords: Asynchronous transmission, collision avoidance, wavelength division multiplexing.
1331 Treatment of Leaden Sludge of Algiers Refinery by Electrooxidation
Authors: K. Ighilahriz, M. Taleb Ahmed, R. Maachi
Abstract:
Oil industries are responsible for most cases of contamination of our ecosystem by oil and heavy metals, which are toxic and considered carcinogenic and dangerous even when present in trace amounts. At the Algiers refinery, the production, transportation, and refining of crude oil generate considerable waste in storage tanks; these residues result from gravitational settling. The composition of these residues is essentially a mixture of hydrocarbons and lead. In this work, we propose the application of an electrooxidation treatment to the leachate of the leaden sludge. The effects of pH, current density and electrolysis time were studied, and the effectiveness of the process is evaluated by measuring the chemical oxygen demand (COD). Dissolution is the best way to mobilize pollutants from the leaden sludge, so leaching was conducted before starting the electrochemical treatment. The process was carried out in batch mode using a graphite anode and a stainless steel cathode. The results clearly demonstrate the compatibility of the technique used with the type of pollution studied; in fact, it achieved COD removal of about 80%.
Keywords: Electrooxidation, leaching, leaden sludge, oil industry.
1330 A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions
Authors: Mohammad Reza Ghasemi, Ali Ehsani
Abstract:
In this paper, the optimum weight and cost of a laminated composite plate are sought while the plate undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique that works directly with real variables, was employed. Since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
Keywords: Composite Laminates, GA, Multi-objective Optimization, Neural Networks, RBFNN.
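Since the failure check drives the optimization, it may help to see the Tsai-Hill criterion in code. The plane-stress form below (failure predicted when the index reaches 1) is the standard textbook expression; the ply stresses and strength values are invented examples, not data from the paper.
```python
# Tsai-Hill failure index under plane stress (textbook form); ply stresses and strength
# values below are illustrative, not data from the paper.
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Failure is predicted when the returned index reaches or exceeds 1."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# example ply stresses (MPa) and lamina strengths (MPa)
index = tsai_hill_index(s1=800.0, s2=20.0, t12=35.0, X=1500.0, Y=40.0, S=68.0)
print("Tsai-Hill index:", round(index, 3), "-> fails" if index >= 1.0 else "-> survives")
```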
1329 Prediction of Tool and Nozzle Flow Behavior in Ultrasonic Machining Process
Authors: Vinod Kumar, Jatinder Kumar
Abstract:
The use of hard and brittle materials has become increasingly extensive in recent years, so processing these materials for parts fabrication has become a challenging problem. It is time-consuming to machine hard, brittle materials with the traditional metal-cutting technique that uses abrasive wheels, and the tool suffers excessive wear as well. However, if ultrasonic energy is applied to the machining process and coupled with the use of hard abrasive grits, hard and brittle materials can be machined effectively. The ultrasonic machining process is mostly used for brittle materials. The present research work has developed models using a finite element approach to predict the mechanical stresses and strains produced in the tool during the ultrasonic machining process. The flow behavior of the abrasive slurry coming out of the nozzle has also been studied through simulation using the ANSYS CFX module. Different abrasives of different grit sizes have been used in the experimental work.
Keywords: Stress, MRR, Flow, Ultrasonic Machining
1328 High Performance VLSI Architecture of 2D Discrete Wavelet Transform with Scalable Lattice Structure
Authors: Juyoung Kim, Taegeun Park
Abstract:
In this paper, we propose a fully utilized, block-based 2D DWT (discrete wavelet transform) architecture, which consists of four 1D DWT filters with a two-channel QMF lattice structure. The proposed architecture requires about 2MN-3N registers to save the intermediate results for higher-level decomposition, where M and N stand for the filter length and the row width of the image, respectively. Furthermore, the proposed 2D DWT processes the horizontal and vertical directions simultaneously without an idle period, so that it computes the DWT of an N×N image in a period of N^2(1-2^(-2J))/3. Compared to existing approaches, the proposed architecture shows 100% hardware utilization and high throughput rates. To mitigate the long critical path delay due to the cascaded lattices, the pipeline technique can be applied with four stages while retaining 100% hardware utilization. The proposed architecture can be applied in real-time video signal processing.
Keywords: discrete wavelet transform, VLSI architecture, QMF lattice filter, pipelining.
1327 Mining Implicit Knowledge to Predict Political Risk by Providing Novel Framework with Using Bayesian Network
Authors: Siavash Asadi Ghajarloo
Abstract:
Nowadays, predicting the political risk level of a country has become a critical issue for investors who need accurate information concerning the stability of business environments. Since investors are most of the time laymen and non-professional IT personnel, this paper proposes a framework named GECR to help non-expert persons discover political risk stability across time based on political news and events. To achieve this goal, the Bayesian Network approach was applied to 186 political news items from Pakistan as a sample dataset. Bayesian Networks, an artificial intelligence approach, have been employed in the presented framework since they are a powerful technique for modeling uncertain domains. The results showed that our framework, with Bayesian Networks as the decision support tool, predicted the political risk level with a high degree of accuracy.
Keywords: Bayesian Networks, Data mining, GECR framework, Predicting political risk.
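To illustrate how a Bayesian network turns news-derived evidence into a risk estimate, here is a hand-coded two-node sketch (event category -> risk level) with invented probabilities. The GECR framework's actual structure, variables and conditional probability tables are not given in the abstract, so everything below is a toy stand-in.
```python
# Hand-coded two-node Bayesian network sketch: P(Risk | Event) with invented numbers.
# The real GECR network structure and probabilities are not given in the abstract.

# Prior over event categories extracted from news
p_event = {"protest": 0.3, "economic": 0.5, "violence": 0.2}

# Conditional probability table P(risk_level | event)
p_risk_given_event = {
    "protest":  {"low": 0.3, "medium": 0.5, "high": 0.2},
    "economic": {"low": 0.6, "medium": 0.3, "high": 0.1},
    "violence": {"low": 0.1, "medium": 0.3, "high": 0.6},
}

# Marginal risk distribution: P(risk) = sum_e P(risk | e) P(e)
p_risk = {r: sum(p_event[e] * p_risk_given_event[e][r] for e in p_event)
          for r in ("low", "medium", "high")}
print("marginal risk:", p_risk)

# Diagnostic query by Bayes' rule: P(event | risk = high)
p_event_given_high = {e: p_event[e] * p_risk_given_event[e]["high"] / p_risk["high"]
                      for e in p_event}
print("P(event | high risk):", p_event_given_high)
```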
1326 Robot Vision Application based on Complex 3D Pose Computation
Authors: F. Rotaru, S. Bejinariu, C. D. Niţâ, R. Luca, I. Pâvâloi, C. Lazâr
Abstract:
The paper presents a technique suitable for robot vision applications where it is not possible to establish the object position from a single view. Usually, one-view pose calculation methods are based on the correspondence between image features established at a training step and exactly the same image features extracted at the execution step for a different object pose. When such a correspondence is not feasible because of a lack of specific features, a new method is proposed. In the first step, the method computes the 3D pose of feature points from two views. Subsequently, using a registration algorithm, the set of 3D feature points extracted at the execution phase is aligned with the set of 3D feature points extracted at the training phase. The result is a Euclidean transform which has to be used by the robot head for reorientation at the execution step.
Keywords: features correspondence, registration algorithm, robot vision, triangulation method.
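The abstract does not name the registration algorithm; a common choice for aligning two 3D point sets with known correspondences is the SVD-based (Kabsch) rigid registration sketched below. The point sets and transform are invented, and this stands in for, rather than reproduces, the authors' method.
```python
# SVD-based rigid registration (Kabsch) of two corresponding 3D point sets; an
# illustrative stand-in for the registration step, with invented data.
import numpy as np

def rigid_register(A, B):
    """Return rotation R and translation t such that R @ A_i + t ~= B_i."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

rng = np.random.default_rng(0)
train_pts = rng.uniform(-1, 1, size=(6, 3))                  # 3D features from the training phase
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
exec_pts = train_pts @ R_true.T + np.array([0.2, -0.1, 0.5])  # same features at execution phase

R, t = rigid_register(train_pts, exec_pts)
print("recovered translation:", np.round(t, 3))
print("rotation error:", np.linalg.norm(R - R_true))
```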
1325 A Wavelet-Based Watermarking Method Exploiting the Contrast Sensitivity Function
Authors: John N. Ellinas, Panagiotis Kenterlis
Abstract:
The efficiency of an image watermarking technique depends on the preservation of visually significant information, which is attained by embedding the watermark transparently with the maximum possible strength. The current paper presents an approach to still-image digital watermarking in which the watermark embedding process employs the wavelet transform and incorporates Human Visual System (HVS) characteristics. The sensitivity of a human observer to contrast with respect to spatial frequency is described by the Contrast Sensitivity Function (CSF). The strength of the watermark within the decomposition subbands, each of which occupies an interval of spatial frequencies, is adjusted according to this sensitivity. Moreover, the watermark embedding is carried out over the subband coefficients that lie on edges, where distortions are less noticeable. The experimental evaluation of the proposed method shows very good results in terms of robustness and transparency.
Keywords: Image watermarking, wavelet transform, human visual system, contrast sensitivity function.
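The perceptual weighting described above hinges on the CSF. One widely used analytic model is the Mannos-Sakrison curve; the sketch below evaluates it and derives a per-level embedding strength by sampling the curve at a nominal centre frequency for each wavelet decomposition level. The level-to-frequency mapping, the normalization at 8 cycles/degree and the base strength are illustrative assumptions, not the paper's exact scheme.
```python
# Sketch: Mannos-Sakrison CSF and a per-level watermark strength derived from it.
# The subband-to-frequency mapping and base strength are illustrative assumptions.
import numpy as np

def csf(f):
    """Mannos-Sakrison contrast sensitivity, f in cycles/degree."""
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

max_freq_cpd = 32.0          # assumed frequency of the finest subband
base_strength = 10.0         # assumed maximum embedding strength

for level in (1, 2, 3, 4):
    f_center = max_freq_cpd / (2 ** level)      # nominal centre frequency of the level
    sensitivity = csf(f_center)
    # embed more strongly where the eye is less sensitive (normalized near the CSF peak)
    strength = base_strength * (1.0 - sensitivity / csf(8.0))
    print(f"level {level}: f ~ {f_center:4.1f} cpd, CSF = {sensitivity:.3f}, "
          f"strength = {strength:.2f}")
```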
1324 Retrieving Similar Segmented Objects Using Motion Descriptors
Authors: Konstantinos C. Kartsakalis, Angeliki Skoura, Vasileios Megalooikonomou
Abstract:
The fuzzy composition of objects depicted in images acquired through MR imaging or the use of bio-scanners has often been a point of controversy for field experts attempting to effectively delineate the visualized objects. Modern approaches in medical image segmentation tend to consider fuzziness as a characteristic and inherent feature of the depicted object, instead of an undesirable trait. In this paper, a novel technique is presented for efficient image retrieval in the context of images in which segmented objects are either crisp or fuzzily bounded. Moreover, the proposed method is applied in the case of multiple, even conflicting, segmentations from field experts. Experimental results demonstrate the efficiency of the suggested method in retrieving similar objects from the aforementioned categories while taking into account the fuzzy nature of the depicted data.
Keywords: Fuzzy Object, Fuzzy Image Segmentation, Motion Descriptors, MRI Imaging, Object-Based Image Retrieval.
1323 Flux Cored Arc Welding Parameter Optimization of AISI 316L (N) Austenitic Stainless Steel
Authors: D.Katherasan, Madana Sashikant, S.Sandeep Bhat, P.Sathiya
Abstract:
Bead-on-plate welds were carried out on AISI 316L (N) austenitic stainless steel (ASS) using the flux cored arc welding (FCAW) process. The bead-on-plate welds were conducted as per an L25 orthogonal array. In this paper, weld bead geometry characteristics such as depth of penetration (DOP), bead width (BW) and weld reinforcement (R) of AISI 316L (N) ASS are investigated. The Taguchi approach is used as the statistical design of experiments (DOE) technique for optimizing the selected welding input parameters. Grey relational analysis and the desirability approach are applied to optimize the input parameters while considering multiple output variables simultaneously. A confirmation experiment has also been conducted to validate the optimized parameters.
Keywords: bead-on-plate welding, bead profiles, desirability approach, grey relational analysis
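For readers unfamiliar with the multi-response step, the following sketch shows a standard grey relational analysis: each response is normalized, grey relational coefficients are computed with the usual distinguishing coefficient of 0.5, and their average (the grade) ranks the experimental runs. The response values and the larger/smaller-is-better choices are invented, not the welding data.
```python
# Standard grey relational analysis sketch with invented response data
# (DOP assumed larger-the-better; bead width and reinforcement assumed smaller-the-better).
import numpy as np

# rows = experiments, columns = [depth of penetration, bead width, reinforcement]
responses = np.array([
    [4.2, 9.1, 2.4],
    [5.0, 8.7, 2.9],
    [3.8, 9.8, 2.1],
    [4.6, 9.3, 2.6],
])
larger_is_better = [True, False, False]    # assumed preference per response

zeta = 0.5                                 # distinguishing coefficient
norm = np.zeros_like(responses, dtype=float)
for j, lb in enumerate(larger_is_better):
    col = responses[:, j]
    if lb:
        norm[:, j] = (col - col.min()) / (col.max() - col.min())
    else:
        norm[:, j] = (col.max() - col) / (col.max() - col.min())

delta = 1.0 - norm                         # deviation from the ideal sequence
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = grc.mean(axis=1)                   # grey relational grade per experiment
print("grades:", np.round(grade, 3), "best run:", int(grade.argmax()) + 1)
```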
1322 Use of Linear Programming for Optimal Production in a Production Line in Saudi Food Co.
Authors: Qasim M. Kriri
Abstract:
Some production companies in Saudi Arabia still face profitability issues. This work presents a linear integer programming model that solves a production problem of a Saudi food company in Saudi Arabia. An optimal solution to the above-mentioned problem is obtained through Linear Programming, the main purpose being to maximize profit. The Linear Programming technique has been used to derive the maximum profit from the production of natural juice at Saudi Food Co. The company's production operations were formulated, and optimal results were found using Lindo software, employing sensitivity analysis and parametric linear programming. In addition, as the parameter values are increased, the value of the objective function increases.
Keywords: Parametric linear programming, objective function, sensitivity analysis, optimize profit.
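As a generic illustration of the profit-maximization formulation (not the company's actual model or data), the sketch below maximizes profit over two hypothetical juice products subject to raw-material and labour constraints using scipy's linprog; re-solving with a changed right-hand side mimics the parametric analysis mentioned above.
```python
# Generic profit-maximization LP sketch (invented products, profits and resource limits),
# solved with scipy; not the company's actual model or data.
from scipy.optimize import linprog

profit = [4.0, 3.0]                 # profit per unit of juice A and juice B
# resource usage per unit: rows are [fruit concentrate (kg)], [labour (h)]
A_ub = [[2.0, 1.0],                 # fruit concentrate constraint
        [1.0, 2.0]]                 # labour constraint
b_ub = [100.0, 80.0]                # available concentrate and labour

res = linprog(c=[-p for p in profit], A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("production plan:", res.x, "maximum profit:", -res.fun)

# crude parametric step: increase available labour and observe the objective grow
res2 = linprog(c=[-p for p in profit], A_ub=A_ub, b_ub=[100.0, 120.0],
               bounds=[(0, None), (0, None)], method="highs")
print("profit with more labour:", -res2.fun)
```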
1321 Nature Inspired Metaheuristic Algorithms for Multilevel Thresholding Image Segmentation - A Survey
Authors: C. Deepika, J. Nithya
Abstract:
Segmentation is one of the essential tasks in image processing, and thresholding is one of the simplest techniques for performing it. Multilevel thresholding is a simple and effective technique. The primary objective of bi-level or multilevel thresholding for image segmentation is to determine the best threshold values. Various techniques have been proposed to achieve multilevel thresholding. A study of some nature-inspired metaheuristic algorithms for multilevel thresholding for image segmentation is conducted here: the Particle Swarm Optimization (PSO) algorithm, Artificial Bee Colony optimization (ABC), the Ant Colony Optimization (ACO) algorithm and the Cuckoo Search (CS) algorithm.
Keywords: Ant colony optimization, Artificial bee colony optimization, Cuckoo search algorithm, Image segmentation, Multilevel thresholding, Particle swarm optimization.
1320 Chemical and Vibrational Nonequilibrium Hypersonic Viscous Flow around an Axisymmetric Blunt Body
Authors: R. Haoui
Abstract:
Hypersonic flows around space vehicles during their reentry phase in planetary atmospheres are characterized by intense aerothermodynamic phenomena. The aim of this work is to analyze high-temperature flows around an axisymmetric blunt body, taking into account chemical and vibrational non-equilibrium for the air mixture species and the no-slip condition at the wall. For this purpose, the Navier-Stokes equation system is solved by the finite volume methodology to determine the flow parameters around the axisymmetric blunt body, especially at the stagnation point and in the boundary layer along the wall of the blunt body. The code allows the capture of the shock wave ahead of a blunt body placed in a hypersonic free stream. The numerical technique uses the Flux Vector Splitting method of Van Leer. The CFL coefficient and the mesh refinement level are selected to ensure numerical convergence.
Keywords: Hypersonic flow, viscous flow, chemical kinetic, dissociation, finite volumes, frozen and non-equilibrium flow.
1319 Risk Assessment of Building Information Modelling Adoption in Construction Projects
Authors: Amirhossein Karamoozian, Desheng Wu, Behzad Abbasnejad
Abstract:
Building information modelling (BIM) is a new technology to enhance the efficiency of project management in the construction industry. In addition to the potential benefits of this useful technology, there are various risks and obstacles to applying it in construction projects. In this study, a decision-making approach is presented for risk assessment of BIM adoption in construction projects. Various risk factors of applying BIM during different phases of the project lifecycle are identified with the help of the Delphi method, experts’ opinions and the related literature. Afterward, Shannon’s entropy and Fuzzy TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) are applied to derive priorities for the identified risk factors. Results indicated that lack of knowledge among professional engineers about BIM workflows and conflicts of opinion between different stakeholders are the risk factors with the highest priority.
Keywords: Risk, BIM, Shannon’s entropy, Fuzzy TOPSIS, construction projects.
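A compact sketch of the two-step prioritization follows: Shannon's entropy derives objective criterion weights from a decision matrix, and TOPSIS ranks the alternatives by closeness to the ideal solution. The risk scores are invented placeholders, and an ordinary (crisp) TOPSIS is shown rather than the fuzzy variant used in the study.
```python
# Sketch: entropy-based weights + crisp TOPSIS ranking of risk factors.
# Scores are invented placeholders; the study uses expert judgements and fuzzy TOPSIS.
import numpy as np

# rows = risk factors, columns = criteria (e.g. probability, impact, detectability)
X = np.array([
    [7.0, 8.0, 5.0],
    [4.0, 6.0, 7.0],
    [8.0, 5.0, 6.0],
    [6.0, 7.0, 4.0],
])
m, n = X.shape

# Shannon entropy weights: low-entropy (more discriminating) criteria get higher weight
P = X / X.sum(axis=0)
entropy = -(P * np.log(P)).sum(axis=0) / np.log(m)
w = (1 - entropy) / (1 - entropy).sum()

# TOPSIS (all criteria treated as "higher score = higher risk priority")
R = X / np.sqrt((X ** 2).sum(axis=0))
V = R * w
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)

for rank, i in enumerate(np.argsort(-closeness), start=1):
    print(f"rank {rank}: risk factor {i + 1}, closeness {closeness[i]:.3f}")
```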
1318 Screened Potential in a Reverse Monte Carlo (RMC) Simulation
Authors: M. Habchi, S. M. Mesli, M. Kotbi
Abstract:
A structural study is presented of an aqueous electrolyte for which experimental results are available: a solution of the LiCl-6H2O type in the glassy state (120 K), contrasted with pure water at room temperature, by means of Partial Distribution Functions (PDF) obtained from the neutron scattering technique. Based on these partial functions, the Reverse Monte Carlo (RMC) method computes radial and angular correlation functions, which allow a number of structural features of the system to be explored. The obtained curves include some artifacts. To remedy this, we propose to introduce a screened potential as an additional constraint. The obtained results show a good match between experimental and computed functions and a significant improvement of the PDF curves with the potential constraint, suggesting an efficient fit of the pair distribution function curves.
Keywords: RMC simulation; screened potential; partial and pair distribution functions; glassy and liquid state
1317 Novel Process Formulation of Multiple Unit Tablet of Pantoprazole
Authors: Vipin Saini, Sunil Kamboj, Suman Bala, A. Pandurangan
Abstract:
The present invention relates to multiple-unit tablet dosage forms, which are composed of several subunits (multiparticulates/pellets). Each small multiparticulate is further composed of many layers: some layers contain the drug substance, while others are rate-controlling polymer. The resulting multiple-unit tablet dosage forms of pantoprazole were satisfactorily fabricated. The pelletization technique has some advantages over a coated tablet formulation. In a coated tablet, the coating may be damaged and a pinhole formed, which would result in increased release of the drug in the stomach, where it may be deactivated by gastric juices. If the coat of some pellets is damaged, the release properties of the multiple-unit tablet are not affected; hence pellets are beneficial in this respect. The results confirmed the successful preparation of stable and bioequivalent once-daily controlled-release multiple-unit tablets of pantoprazole.
Keywords: Controlled release, multiple unit tablets, pantoprazole, pelletization.
1316 Observations about the Principal Components Analysis and Data Clustering Techniques in the Study of Medical Data
Authors: Cristina G. Dascâlu, Corina Dima Cozma, Elena Carmen Cotrutz
Abstract:
The statistical analysis of medical data often requires the use of special techniques, because of the particularities of these data. Principal components analysis and data clustering are two statistical data mining methods that are very useful in the medical field: the first as a method to decrease the number of studied parameters, and the second as a method to analyze the connections between the diagnosis and the data about the patient's condition. In this paper we investigate the implications of a specific data analysis technique: data clustering preceded by a selection of the most relevant parameters, made using principal components analysis. Our assumption was that using principal components analysis before data clustering, in order to select and classify only the most relevant parameters, would improve the clustering accuracy, but the practical results showed the opposite: the clustering accuracy decreases, by a percentage approximately equal to the percentage of information loss reported by the principal components analysis.
Keywords: Data clustering, medical data, principal components analysis.
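The experiment described above, clustering with and without a preliminary PCA reduction, can be reproduced in miniature on a public dataset. The sketch below uses Iris instead of the authors' medical data and measures clustering agreement with the known labels via the adjusted Rand index; it illustrates the comparison only and does not reproduce the paper's figures.
```python
# Miniature version of the experiment on a public dataset (Iris, not the medical data):
# k-means clustering agreement with and without a preliminary PCA reduction.
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

X, y = load_iris(return_X_y=True)
X = StandardScaler().fit_transform(X)

labels_raw = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

pca = PCA(n_components=2).fit(X)
X_reduced = pca.transform(X)
labels_pca = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_reduced)

print("retained variance:", round(pca.explained_variance_ratio_.sum(), 3))
print("ARI without PCA:", round(adjusted_rand_score(y, labels_raw), 3))
print("ARI with PCA   :", round(adjusted_rand_score(y, labels_pca), 3))
```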
1315 Effect of Preheating Temperature and Chamber Pressure on the Properties of Porous NiTi Alloy Prepared by SHS Technique
Authors: Wisutmethangoon S., Denmud N., Sikong L.
Abstract:
The fabrication of porous NiTi shape memory alloys (SMAs) from elemental powder compacts was conducted by self-propagating high temperature synthesis (SHS). The effects of the preheating temperature and the chamber pressure on the combustion characteristics, as well as on the final morphology and composition of the products, were studied. Samples with porosity between 56.4 and 59.0% were obtained under preheating temperatures in the range of 200-300°C and Ar-gas chamber pressures of 138 and 201 kPa. The pore structures were found to differ only among samples processed with different preheating temperatures. The major phase in the porous product is NiTi, with small amounts of the secondary phases NiTi2 and Ni4Ti3. The preheating temperature and the chamber pressure have very little effect on the phase constitution. While the combustion temperature of the sample was notably increased by raising the preheating temperature, it was only slightly changed by varying the chamber pressure.
Keywords: Combustion synthesis, porous materials, self propagating high temperature synthesis, shape memory alloy.