Search results for: fling step
332 Optimization Approaches for a Complex Dairy Farm Simulation Model
Authors: Jagannath Aryal, Don Kulasiri, Dishi Liu
Abstract:
This paper describes the optimization of a complex dairy farm simulation model using two quite different optimization methods, the genetic algorithm (GA) and the Lipschitz Branch-and-Bound (LBB) algorithm. These techniques have been used to improve an agricultural system model developed by Dexcel Limited, New Zealand, which provides a detailed representation of pastoral dairying scenarios and contains an 8-dimensional parameter space. The model incorporates sub-models of pasture growth and animal metabolism, which are themselves complex in many cases. Each evaluation of the objective function, a composite 'Farm Performance Index (FPI)', requires simulation of at least a one-year period of farm operation with a daily time-step, and is therefore computationally expensive. The problem of visualizing the objective function (response surface) in high-dimensional spaces is also considered in the context of the farm optimization problem. Adaptations of the Sammon mapping and parallel coordinates visualizations are described which help visualize some important properties of the model's output topography. From this study, it is found that the GA requires fewer function evaluations in optimization than the LBB algorithm.
Keywords: Genetic Algorithm, Linux Cluster, Lipschitz Branch-and-Bound, Optimization
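To make the optimization setting concrete, the following is a minimal real-coded GA sketch in Python for maximizing an expensive objective over an 8-dimensional box; the fpi function, bounds and GA settings are placeholders for illustration, not the Dexcel model or the parameters used in the study.

```python
# Minimal sketch of a real-coded genetic algorithm over an 8-D parameter box,
# assuming a hypothetical stand-in for the expensive Farm Performance Index (FPI).
import numpy as np

rng = np.random.default_rng(0)
DIM, POP, GENS = 8, 20, 30
LOW, HIGH = np.zeros(DIM), np.ones(DIM)   # hypothetical normalized parameter bounds

def fpi(x):
    # Placeholder objective: in the study each call runs a one-year,
    # daily-time-step farm simulation and is therefore expensive.
    return -np.sum((x - 0.3) ** 2)

def evolve():
    pop = rng.uniform(LOW, HIGH, size=(POP, DIM))
    for _ in range(GENS):
        fit = np.array([fpi(ind) for ind in pop])
        # Tournament selection between random pairs
        idx = rng.integers(0, POP, size=(POP, 2))
        parents = pop[np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # Blend crossover and Gaussian mutation
        alpha = rng.uniform(size=(POP, DIM))
        children = alpha * parents + (1 - alpha) * parents[::-1]
        children += rng.normal(0, 0.05, size=children.shape)
        pop = np.clip(children, LOW, HIGH)
    best = max(pop, key=fpi)
    return best, fpi(best)

print(evolve())
```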
331 Morphing Human Faces: Automatic Control Points Selection and Color Transition
Authors: Stephen Karungaru, Minoru Fukumi, Norio Akamatsu
Abstract:
In this paper, we propose a morphing method by which face color images can be freely transformed. The main focus of this work is the transformation of one face image into another. The method is fully automatic in that it can morph two face images by automatically detecting all the control points necessary to perform the morph. A face detection neural network, edge detection and median filters are employed to detect the face position and features. Five control points, for both the source and target images, are then extracted based on the facial features. A triangulation method is then used to match and warp the source image to the target image using the control points. Finally, color interpolation is done using a color Gaussian model that calculates the color for each particular frame depending on the number of frames used. A real-coded genetic algorithm is used in both the image warping and color blending steps to assist in step-size decisions and to speed up the morphing. This method results in "very smooth" morphs and is fast to process.
Keywords: color transition, genetic algorithms, morphing, warping
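As an illustration of the frame-dependent color transition step, the sketch below blends two already-warped face images over a sequence of frames; the paper's Gaussian color model is not detailed in the abstract, so a plain linear cross-dissolve is used here as a simplified stand-in.

```python
# Simplified sketch of per-frame color blending between two aligned face images;
# a linear cross-dissolve over the frame count replaces the paper's Gaussian model.
import numpy as np

def blend_sequence(src, dst, n_frames):
    """Return n_frames images fading from src to dst (both HxWx3 float arrays)."""
    frames = []
    for f in range(n_frames):
        t = f / (n_frames - 1)          # 0 at the source image, 1 at the target
        frames.append((1.0 - t) * src + t * dst)
    return frames

# Tiny usage example with random "images"
src = np.random.rand(4, 4, 3)
dst = np.random.rand(4, 4, 3)
seq = blend_sequence(src, dst, 10)
print(len(seq), seq[5].shape)
```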
330 Site Selection of Traffic Camera based on Dempster-Shafer and Bagging Theory
Authors: S. Rokhsari, M. Delavar, A. Sadeghi-Niaraki, A. Abed-Elmdoust, B. Moshiri
Abstract:
Traffic incidents have a detrimental effect on all parts of society, so controlling road networks with enough traffic devices can help decrease the number of accidents, and using the best method for optimum site selection of these devices can help implement a good monitoring system. This paper considers the important criteria for optimum site selection of traffic cameras based on aggregation methods such as Bagging and Dempster-Shafer concepts. In the first step, important criteria, such as annual traffic flow and distance from critical places such as parks that need more traffic control, were identified for the selection of important road links for traffic camera installation. Then, classification methods such as artificial neural network and decision tree algorithms were employed to classify road links based on their importance for camera installation. Finally, to improve the classifiers' results, aggregation methods such as Bagging and Dempster-Shafer theories were used.
Keywords: Aggregation, Bagging theory, Dempster-Shafer theory, Site selection
329 Optimization of Two-Stage Pretreatment Combined with Microwave Radiation Using Response Surface Methodology
Authors: Jidapa Manaso, Apanee Luengnaruemitchai, Sujitra Wongkasemjit
Abstract:
Pretreatment is an essential step in the conversion of lignocellulosic biomass to the fermentable sugar used for biobutanol production. Among pretreatment processes, microwave treatment is considered to improve pretreatment efficiency due to its high heating efficiency, easy operation, and ease of combination with chemical reactions. The main objectives of this work are to investigate the feasibility of microwave pretreatment to enhance enzymatic hydrolysis of corncobs and to determine the optimal conditions using response surface methodology. Corncobs were pretreated via two-stage pretreatment in dilute sodium hydroxide (2%) followed by dilute sulfuric acid (1%). Pretreated corncobs were subjected to enzymatic hydrolysis to produce reducing sugar. A statistical experimental design was used to optimize pretreatment parameters, including temperature, residence time and solid-to-liquid ratio, to achieve the highest amount of glucose. The results revealed that the solid-to-liquid ratio and temperature had a significant effect on the amount of glucose.
Keywords: Corncobs, Microwave radiation, Pretreatment, Response Surface Methodology.
328 Adaptive Fuzzy Control of a Nonlinear Tank Process
Authors: A. R. Tavakolpour-Saleh, H. Jokar
Abstract:
Liquid level control of a conical tank system is known to be a great challenge in many industries, such as food processing, hydrometallurgical industries and wastewater treatment plants, due to its highly nonlinear characteristics. In this research, an adaptive fuzzy PID control scheme is applied to the problem of liquid level control in a nonlinear tank process. A conical tank process is first modeled and simulated. A PID controller is then applied to the plant model as a suitable benchmark for comparison, and the dynamic responses of the control system to different step inputs are investigated. It is found that the conventional PID controller is not able to fulfill the controller design criteria, such as the desired time constant, due to the highly nonlinear characteristics of the plant model. Consequently, a nonlinear control strategy based on gain-scheduling adaptive control incorporating a fuzzy logic observer is proposed to accurately control the nonlinear tank system. The simulation results clearly demonstrate the superiority of the proposed adaptive fuzzy control method over the conventional PID controller.
Keywords: Adaptive control, fuzzy logic, conical tank, PID controller.
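The nonlinearity that defeats a fixed-gain PID comes from the level-dependent cross-section of the cone; a minimal sketch, assuming a vertex-down cone with Torricelli outflow and illustrative constants and gains (none taken from the paper), is given below.

```python
# Sketch of why a conical tank is hard for a fixed-gain PID: the cross-section
# pi*(R*h/H)^2 shrinks near the bottom, so the same inflow changes the level
# much faster at low levels. All plant constants and gains are illustrative only.
import math

R, H = 0.5, 1.0          # top radius and height of the cone (m), assumed values
C_OUT = 2.0e-3           # outflow coefficient, assumed
KP, KI, KD = 3.0e-3, 1.0e-4, 0.0   # illustrative PID gains
DT, SETPOINT = 0.1, 0.6

def simulate(t_end=600.0):
    h, integ, prev_err = 0.05, 0.0, SETPOINT - 0.05
    for _ in range(int(t_end / DT)):
        err = SETPOINT - h
        integ += err * DT
        q_in = max(0.0, KP * err + KI * integ + KD * (err - prev_err) / DT)
        prev_err = err
        area = math.pi * (R * h / H) ** 2 + 1e-9      # level-dependent area
        dh = (q_in - C_OUT * math.sqrt(max(h, 0.0))) / area
        h = max(0.0, h + dh * DT)
    return h

print(f"level after 10 min: {simulate():.3f} m")
```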
327 Removal of Heavy Metals from Water in the Presence of Organic Wastes: Fruit Peels
Authors: Berk Kılıç, Derin Dalgıç, Ela Mia Sevilla Levi, Ömer Aydın
Abstract:
In this experiment, our goal was to remove heavy metals from water. In general, the removal of toxic heavy-metal ions (Cu2+, Cr6+ and Fe3+) from their aqueous solutions has been investigated using different kinds of plant peels; this study focuses on banana, peach, orange, and potato peels. The first step of the experiment was to wash the peels with distilled water and then dry them in an oven for 80 h at 80 °C. The peels were washed with NaOH and dried again at 80 °C for 2 days. Once the peels were washed and dried, 0.4 grams were weighed and added to a 200 mL sample of 0.1% heavy metal solution by mass. The mixing was done with a magnetic stirrer. A sample of each was taken at 15-minute intervals and the change in absorbance of the solutions was measured using a UV-Vis spectrophotometer. Among the waste products used, orange peel was the most efficient for our purposes, followed by banana peel. Moreover, the amount of fruit peel, the pH value of the initial heavy metal solution, and the initial concentration of the heavy metal solutions were investigated to determine the effectiveness of the fruit peels for absorbency.
Keywords: Absorbance, heavy metal, removal of heavy metals, fruit peels.
326 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating
Authors: A. Amrani, O. Allali, A. Ben Hamida, F. Defrance, S. Morland, E. Pineau, T. Lacroix
Abstract:
The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We built our tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and finally, the evaluation of scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory's energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.
Keywords: Climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city.
325 A Text Clustering System based on k-means Type Subspace Clustering and Ontology
Authors: Liping Jing, Michael K. Ng, Xinhua Yang, Joshua Zhexue Huang
Abstract:
This paper presents a text clustering system developed based on a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster so that the important words of a cluster can be identified by the weight values. For understanding and interpretation of the clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to their weights calculated by our new algorithm. Then, the candidates are fed to WordNet to identify the set of noun words and consolidate the synonymy and hyponymy words. Experimental results have shown that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to k-means type algorithms, e.g., Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting the words that represent the topics of the clusters.
Keywords: Subspace Clustering, Text Mining, Feature Weighting, Cluster Interpretation, Ontology
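A minimal sketch of the per-cluster keyword-weighting idea follows; the paper's exact weighting formula is not reproduced in the abstract, so an entropy-style weighting with an assumed smoothing parameter gamma is used here, in which keywords with low within-cluster dispersion receive higher weight.

```python
# Sketch of per-cluster keyword weighting for a k-means type subspace clusterer:
# keywords whose values are tight around the cluster centroid get higher weight.
# The exact formula of the paper is not given in the abstract; this is a common
# entropy-style weighting with a smoothing parameter gamma (an assumption here).
import numpy as np

def keyword_weights(X, labels, k, gamma=0.5):
    """X: docs x terms tf-idf matrix, labels: cluster id per doc."""
    n_terms = X.shape[1]
    W = np.zeros((k, n_terms))
    for l in range(k):
        Xl = X[labels == l]
        centroid = Xl.mean(axis=0)
        dispersion = ((Xl - centroid) ** 2).sum(axis=0)        # within-cluster spread per term
        w = np.exp(-dispersion / gamma)
        W[l] = w / w.sum()                                     # normalize per cluster
    return W

# Top keywords of cluster 0 for a toy matrix
X = np.random.rand(6, 5)
labels = np.array([0, 0, 0, 1, 1, 1])
W = keyword_weights(X, labels, k=2)
print(np.argsort(W[0])[::-1][:3])   # indices of the 3 best-weighted terms
```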
324 Sequential Straightforward Clustering for Local Image Block Matching
Authors: Mohammad Akbarpour Sekeh, Mohd. Aizaini Maarof, Mohd. Foad Rohani, Malihe Motiei
Abstract:
Duplicated region detection is a technical method to expose copy-paste forgeries in digital images. Copy-paste is one of the common types of forgery, cloning a portion of an image in order to conceal or duplicate a particular object. In this type of forgery detection, extracting robust block features and the high time complexity of the matching step are the two main open problems. This paper concentrates on computational time and proposes a local block matching algorithm based on block clustering to improve the time complexity. The time complexity of the proposed algorithm is formulated, and the effects of two parameters, block size and number of clusters, on the efficiency of the algorithm are considered. The experimental results and mathematical analysis demonstrate that this algorithm is more cost-effective than lexicographic algorithms in terms of time complexity when the image is complex.
Keywords: Copy-paste forgery detection, Duplicated region, Time complexity, Local block matching, Sequential block clustering.
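The sketch below illustrates the block-clustering idea for local matching: blocks are grouped by a cheap feature and compared only within their own group instead of across a full lexicographic sort; the block size, feature and thresholds are illustrative assumptions, not the paper's settings.

```python
# Sketch of local block matching via clustering: blocks are binned by mean intensity
# and only blocks in the same bin are compared, avoiding a full lexicographic sort.
import numpy as np

def duplicated_blocks(img, block=8, n_clusters=16, match_thresh=1e-6):
    h, w = img.shape
    blocks, coords = [], []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            blocks.append(img[y:y + block, x:x + block].ravel())
            coords.append((y, x))
    blocks = np.array(blocks, dtype=float)
    feature = blocks.mean(axis=1)
    bins = np.digitize(feature, np.linspace(feature.min(), feature.max(), n_clusters))
    matches = []
    for b in np.unique(bins):                      # compare only within a cluster
        idx = np.where(bins == b)[0]
        for i in range(len(idx)):
            for j in range(i + 1, len(idx)):
                if np.mean((blocks[idx[i]] - blocks[idx[j]]) ** 2) < match_thresh:
                    matches.append((coords[idx[i]], coords[idx[j]]))
    return matches

img = np.random.rand(64, 64)
img[0:8, 0:8] = img[32:40, 32:40]                  # plant a copy-paste region
print(duplicated_blocks(img))
```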
323 Modified Fuzzy ARTMAP and Supervised Fuzzy ART: Comparative Study with Multispectral Classification
Authors: F. Alilat, S. Loumi, H. Merrad, B. Sansal
Abstract:
In this article, a modification of the fuzzy ART network algorithm, aimed at making it supervised, is carried out. It consists of searching for the comparison, training and vigilance parameters giving the minimum quadratic distances between the outputs of the training base and those obtained by the network. The same process is applied to determine the parameters of the fuzzy ARTMAP giving the most powerful network. The modification consists in making the fuzzy ARTMAP learn a base of examples not just once, as is usual, but as many times as its architecture is still evolving or the objective error has not yet been reached. In this way, we do not worry about the values to impose on the eight (08) parameters of the network. To evaluate each of these three modified networks, a comparison of their performances is carried out. As an application, we carried out a classification of an image of the Bay of Algiers taken by SPOT XS. As evaluation criteria, we use the training duration, the mean square error (MSE) at the control step and the rate of correct classification per class. The results of this study, presented as curves, tables and images, show that the modified fuzzy ARTMAP offers the best quality/computing-time compromise.
Keywords: Neural Networks, fuzzy ART, fuzzy ARTMAP, Remote sensing, multispectral Classification.
322 Implementing Knowledge Transfer Solution through Web-based Help Desk System
Authors: Mazeyanti M. Ariffin, Noreen Izza Arshad, Ainol Rahmah Shaarani, Syed Uzair Shah
Abstract:
Knowledge management (KM) is the process of taking whatever steps are needed to get the most out of available knowledge resources. KM involves several steps: capturing knowledge, discovering new knowledge, sharing knowledge and applying knowledge in the decision-making process. In applying knowledge, it is not necessary for the individual using the knowledge to fully comprehend it, as long as the available knowledge is used to guide decision making and actions. When an expert is called and provides a step-by-step procedure for solving the problem to the caller, the expert is transferring the knowledge or giving direction to the caller, and the caller is 'applying' the knowledge by following the instructions given by the expert. An appropriate mechanism is needed to ensure effective knowledge transfer, which in this case is by telephone or email. The problem with email and telephone is that the knowledge is not fully circulated and disseminated to all users. In this paper, drawing on the related experience of a local university help desk, the use of information technology (IT) to effectively support knowledge transfer in the organization is proposed. The issues covered include the existing knowledge, the related works, the methodology used in defining the knowledge management requirements, as well as an overview of the prototype.
Keywords: Knowledge Management, Knowledge Transfer, Help Desk, Web-based system.
321 The Effect of Methionine and Acetate Concentrations on Mycophenolic Acid Production by Penicillium bervicompactum MUCL 19011 in Submerged Culture
Authors: Fatemeh Ardestani, Seyed Safa-ali Fatemi, Bagher Yakhchali, Seyed Morteza Hosseyni, Ghasem Najafpour
Abstract:
Mycophenolic acid (MPA) is a secondary metabolite of Penicillium bervicompactum with antibiotic and immunosuppressive properties. In this study, a fermentation process was established for the production of mycophenolic acid by Penicillium bervicompactum MUCL 19011 in shake flasks. The maximum MPA production, product yield and productivity were 1.379 g/L, 18.6 mg/g glucose and 4.9 mg/L.h, respectively. Glucose consumption, biomass and MPA production profiles were investigated over the fermentation time. It was found that MPA production starts after approximately 180 hours and reaches a maximum at 280 h. In the next step, the effects of methionine and acetate concentrations on MPA production were evaluated. Maximum MPA production, product yield and productivity (1.763 g/L, 23.8 mg/g glucose and 6.30 mg/L.h, respectively) were obtained using 2.5 g/L methionine in the culture medium. Further addition of methionine had no additional positive effect on MPA production. Finally, the results showed that the addition of acetate to the culture medium had no observable effect on MPA production.
Keywords: Penicillium bervicompactum, Methionine, Mycophenolic acid, Submerged culture.
320 The Application of FSI Techniques in Modeling of Realist Pulmonary Systems
Authors: Abdurrahim Bolukbasi, Hassan Athari, Dogan Ciloglu
Abstract:
Modeling the lung respiratory system, which has complex anatomy and biophysics, presents several challenges, including tissue-driven flow patterns and wall motion. Also, because the lungs stretch and recoil with each breath, the pulmonary system does not have static walls and structures. The direct relationship between air flow and tissue motion in the lung structures naturally calls for an FSI simulation technique. Therefore, the development of a coupled FSI computational model is an important step toward the realistic simulation of pulmonary breathing mechanics. A simple but physiologically relevant three-dimensional deep lung geometry is designed, and a fluid-structure interaction (FSI) coupling technique is utilized to simulate the deformation of the lung parenchyma tissue that produces the airflow fields. The behavior of the respiratory tissue system as a complex phenomenon has been investigated with respect to respiratory patterns, fluid dynamics, tissue viscoelasticity and the tidal breathing period.
Keywords: Lung deformation and mechanics, tissue mechanics, viscoelasticity, fluid-structure interactions, ANSYS.
319 The First Prevalence Report of Direct Identification and Differentiation of B. abortus and B. melitensis using Real Time PCR in House Mouse of Iran
Authors: A. Doosti, S. Moshkelani
Abstract:
Brucellosis is a zoonotic disease; its symptoms and presentation are not specific in humans, and its traditional diagnosis is based on culture, serological methods and conventional PCR. For more sensitive and specific detection and differentiation of Brucella spp., the real-time PCR method is recommended. This research was performed to determine the presence and prevalence of Brucella spp. and to differentiate Brucella abortus and Brucella melitensis in the house mouse (Mus musculus) in the west of Iran. TaqMan analysis and single-step PCR were carried out on a total of 326 DNA samples from mouse spleens. Of the 326 samples, 128 (39.27%) gave positive results for Brucella spp. by conventional PCR; in addition, 65 and 32 of the 128 specimens were positive for B. melitensis and B. abortus, respectively. These results indicate a high presence of this pathogen in this area and that real-time PCR is considerably faster than current standard methods for the identification and differentiation of Brucella species. To our knowledge, this study is the first prevalence report of direct identification and differentiation of B. abortus and B. melitensis by real-time PCR in mouse tissue samples in Iran.
Keywords: Differentiation, B. abortus, B. melitensis, TaqMan probe, Iran.
318 DCBOR: A Density Clustering Based on Outlier Removal
Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan
Abstract:
Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single link clustering algorithm, which we will refer to as DCBOR. The proposed algorithm alleviates the chain effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it depends on merging the most k-nearest objects in one step, and the cluster continues to grow as long as possible under a specified condition. The algorithm thus consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. This algorithm discovers clusters of different shapes, sizes and densities and requires only one input parameter, which represents a threshold for outlier points. The value of the input parameter ranges from 0 to 1, and the algorithm supports the user in determining an appropriate value for it. We have tested this algorithm on different datasets that contain outliers and clusters connected by chains of density points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and the efficiency of DCBOR.
Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.
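A minimal sketch of the two-phase idea is given below, assuming a k-distance density measure as a stand-in for the paper's density definition: points whose k-nearest-neighbour distance falls beyond a quantile set by the single 0-to-1 parameter are removed, and single-link clustering is then run on the remaining points.

```python
# Sketch of a DCBOR-like two-phase procedure under stated assumptions:
# phase 1 removes low-density points via their k-th nearest-neighbour distance,
# phase 2 runs single-link (single linkage) clustering on the kept points.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.cluster.hierarchy import linkage, fcluster

def dcbor_like(X, outlier_param=0.9, k=4, cut_distance=2.0):
    D = cdist(X, X)
    kdist = np.sort(D, axis=1)[:, k]               # distance to the k-th neighbour
    keep = kdist <= np.quantile(kdist, outlier_param)
    Z = linkage(X[keep], method='single')          # phase 2: single-link clustering
    labels = fcluster(Z, t=cut_distance, criterion='distance')
    return keep, labels

X = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 8, [[30.0, 30.0]]])
keep, labels = dcbor_like(X)
print(keep.sum(), "points kept,", len(set(labels)), "clusters")
```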
317 Numerical Study of Transient Laminar Natural Convection Cooling of High Prandtl Number Fluids in a Cubical Cavity: Influence of the Prandtl Number
Authors: O. Younis, J. Pallares, F. X. Grau
Abstract:
This paper presents and discusses numerical simulations of transient laminar natural convection cooling of high Prandtl number fluids in cubical cavities, in which the six walls of the cavity are subjected to a step change in temperature. The effect of the fluid Prandtl number on the heat transfer coefficient is studied for three different fluids (golden syrup, glycerin and a 50% glycerin-water solution). The simulations are performed at two different Rayleigh numbers (5·10^6 and 5·10^7) and six different Prandtl numbers (3·10^5 ≥ Pr ≥ 50). Heat conduction through the cavity glass walls is also considered. The proposed correlations of the averaged heat transfer coefficient (Nu) show that it is dependent on the initial Ra and almost independent of Pr. The instantaneous flow patterns, temperature contours and time evolution of the volume-averaged temperature and heat transfer coefficient are presented and analyzed.
Keywords: Transient natural convection, High Prandtl number, variable viscosity.
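The reported finding, an averaged Nusselt number that depends on the initial Rayleigh number but is nearly independent of the Prandtl number, suggests a correlation of the generic power-law form below; the constants C and n are fit parameters and are not given in the abstract.

```latex
\overline{Nu} = C \, Ra_{0}^{\,n}, \qquad 5\cdot 10^{6} \le Ra_{0} \le 5\cdot 10^{7}, \quad 50 \le Pr \le 3\cdot 10^{5}
```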
316 Reduction of Linear Time-Invariant Systems Using Routh-Approximation and PSO
Authors: S. Panda, S. K. Tomar, R. Prasad, C. Ardil
Abstract:
Order reduction of linear time-invariant systems employing two methods, one using the advantages of Routh approximation and the other an evolutionary technique, is presented in this paper. In the Routh approximation method, the denominator of the reduced order model is obtained using Routh approximation, while the numerator of the reduced order model is determined using the indirect approach of retaining the time moments and/or Markov parameters of the original system. With this method, the reduced order model is guaranteed to be stable if the original high order model is stable. In the second method, Particle Swarm Optimization (PSO) is employed to reduce the higher order model. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher order model and the reduced order model pertaining to a unit step input. Both methods are illustrated through numerical examples.
Keywords: Model Order Reduction, Markov Parameters, Routh Approximation, Particle Swarm Optimization, Integral Squared Error, Steady State Stability.
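The ISE criterion minimized by the PSO can be written explicitly as the standard integral squared error between the unit-step response y(t) of the original system and y_r(t) of the reduced-order model:

```latex
ISE = \int_{0}^{\infty} \left[ y(t) - y_{r}(t) \right]^{2} dt
```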
315 Segmentation of Lungs from CT Scan Images for Early Diagnosis of Lung Cancer
Authors: Nisar Ahmed Memon, Anwar Majid Mirza, S.A.M. Gilani
Abstract:
Segmentation is an important step in medical image analysis and classification for radiological evaluation or computer aided diagnosis. Computer aided diagnosis (CAD) of lung CT generally first segments the area of interest (the lung) and then analyzes the segmented area for nodule detection in order to diagnose the disease. For a normal lung, segmentation can be performed by making use of the excellent contrast between air and the surrounding tissues. However, this approach fails when the lung is affected by high density pathology. Dense pathologies are present in approximately a fifth of clinical scans, and for computer analysis such as detection and quantification of abnormal areas it is vital that the entire lung part of the image is provided and that no part present in the original image is eradicated. In this paper we propose a lung segmentation technique which accurately segments the lung parenchyma from lung CT scan images. The algorithm was tested against 25 datasets of different patients received from Akron University, USA and Aga Khan Medical University, Karachi, Pakistan.
Keywords: Computer Aided Diagnosis, Medical Image Processing, Region Growing, Segmentation, Thresholding.
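A minimal sketch of the thresholding/region-labelling route to a lung mask is given below as a simplified stand-in for the paper's pipeline; the HU threshold and the choice of the two largest air components are assumptions for illustration only.

```python
# Sketch of a basic lung-mask extraction from a CT slice by thresholding and
# connected-component labelling; threshold and component selection are assumed.
import numpy as np
from scipy import ndimage

def lung_mask(ct_slice_hu, air_thresh=-400):
    binary = ct_slice_hu < air_thresh                 # air-like voxels
    binary = ndimage.binary_fill_holes(binary)
    labels, n = ndimage.label(binary)
    if n == 0:
        return np.zeros_like(binary)
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    # Keep the two largest air components (left and right lungs); a real pipeline
    # would also discard components touching the image border (outside air).
    order = np.argsort(sizes)[::-1][:2] + 1
    return np.isin(labels, order)

slice_hu = np.full((128, 128), 40.0)                  # soft-tissue background
slice_hu[30:90, 20:55] = -800                         # fake left lung
slice_hu[30:90, 70:105] = -800                        # fake right lung
print(lung_mask(slice_hu).sum(), "lung pixels")
```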
314 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines
Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé
Abstract:
The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and model it as realistically as possible. However, there are few empirical scavenging models and these are highly specialized. In a design optimization process, they appear very restricted and their field of use is limited. This paper presents a comparison of two methods to establish and reduce a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires a reduction. Two techniques have been tested, depending on their fields of application: the NTF method and neural networks. Both appear highly appropriate, drastically reducing the model's size (over 90% reduction) with a low relative error rate (under 10%). Furthermore, each method produces a reduced model which can be used in a distinct specialized field of application: the distribution of a quantity (mass fraction, for example) in the cylinder at each time step (pseudo-dynamic model), or the qualification of scavenging at the end of the process (pseudo-static model).
Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.
313 Segmentation Problems and Solutions in Printed Degraded Gurmukhi Script
Authors: M. K. Jindal, G. S. Lehal, R. K. Sharma
Abstract:
Character segmentation is an important preprocessing step for text recognition. In degraded documents, the existence of touching characters drastically decreases the recognition rate of any optical character recognition (OCR) system. In this paper we propose a complete solution for segmenting touching characters in all three zones of printed Gurmukhi script. A study of touching Gurmukhi characters was carried out, and these characters were divided into various categories after careful analysis. Structural properties of the Gurmukhi characters are used for defining the categories. New algorithms are proposed to segment the touching characters in the middle zone, upper zone and lower zone. These algorithms have shown a reasonable improvement in segmenting the touching characters in degraded printed Gurmukhi script. The algorithms proposed in this paper are applicable only to machine-printed text. We also discuss a new and useful technique to segment horizontally overlapping lines.
Keywords: Character Segmentation, Middle Zone, Upper Zone, Lower Zone, Touching Characters, Horizontally Overlapping Lines.
312 Faster FPGA Routing Solution using DNA Computing
Authors: Manpreet Singh, Parvinder Singh Sandhu, Manjinder Singh Kahlon
Abstract:
There are many classical algorithms for finding routings in FPGAs, but using DNA computing we can solve the routing problem efficiently and quickly. The run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for solving FPGA routing. Research in DNA computing is at an early stage. The high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool. Many research accomplishments have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach for the FPGA routing solution. First, the geometric FPGA detailed routing task is solved by transforming it into a Boolean satisfiability equation with the property that any assignment of input variables that satisfies the equation specifies a valid routing. A satisfying assignment for a particular route results in a valid routing, and the absence of a satisfying assignment implies that the layout is unroutable. In the second step, a DNA search algorithm is applied to this Boolean equation to solve routing alternatives, utilizing the properties of DNA computation. The simulated results are satisfactory and indicate the applicability of DNA computing to solving the FPGA routing problem.
Keywords: FPGA, Routing, DNA Computing.
311 Molecular Dynamic Simulation and Receptor-based Pharmacophore Modeling on Human Renin for Discovery of Novel Inhibitors
Authors: Chanin Park, Sundarapandian Thangapandian, Yuno Lee, Minky Son, Shalini John, Young-sik Sohn, Keun Woo Lee
Abstract:
Hypertension is characterized by stress on the heart and blood vessels, thus increasing the risk of heart attack and renal diseases. The renin-angiotensin system (RAS) plays a major role in blood pressure control, and renin is the enzyme that controls the RAS at the rate-limiting step. Our aim is to develop new drug-like leads which can inhibit renin and thereby emerge as therapeutics for hypertension. To achieve this, molecular dynamics (MD) simulation and receptor-based pharmacophore modeling were implemented, and three renin-inhibitor complex structures were selected based on IC50 value and the scaffolds of the inhibitors. Three pharmacophore models were generated considering the conformations induced by the inhibitors. The compounds mapped to these models were selected and subjected to drug-likeness screening. The identified hits were docked into the active site of renin. Finally, hit1, which satisfied the binding mode and interaction energy criteria, was selected as a possible lead candidate for novel renin inhibitors.
Keywords: Renin inhibitor, Molecular dynamics simulation, Structure-based pharmacophore modeling
310 A Numerical Description of a Fibre Reinforced Concrete Using a Genetic Algorithm
Authors: Henrik L. Funke, Lars Ulke-Winter, Sandra Gelbrich, Lothar Kroll
Abstract:
This work reports an approach for the automatic adaptation of concrete formulations based on genetic algorithms (GA) to optimize a wide range of different fit-functions. In order to achieve this goal, a method was developed which provides a numerical description of a fibre reinforced concrete (FRC) mixture regarding the production technology and the property spectrum of the concrete. In a first step, the FRC mixture with seven fixed components was characterized by varying the amounts of the components. For that purpose, ten concrete mixtures were prepared and tested. The testing procedure comprised flow spread, compressive and bending tensile strength. The analysis and approximation of the determined data were carried out by GAs. The aim was to obtain a closed mathematical expression which best describes the given seven-parameter point cloud of the FRC by applying a Gene Expression Programming with Free Coefficients (GEP-FC) strategy. The seven-parametric FRC-mixture model generated according to this method correlated well with the measured data. The developed procedure can be used to find closed mathematical expressions for concrete mixtures based on the measured data.
Keywords: Concrete design, fibre reinforced concrete, genetic algorithms, GEP-FC.
309 A Hybrid Differential Transform Approach for Laser Heating of a Double-Layered Thin Film
Authors: Cheng-Ying Lo
Abstract:
This paper adopts the hybrid differential transform approach to study heat transfer in a gold/chromium thin film with an ultra-short-pulsed laser beam projected on the gold side. The physical system, formulated based on the hyperbolic two-step heat transfer model, covers three characteristics: (i) coupling effects between the electron/lattice systems, (ii) thermal wave propagation in metals, and (iii) radiation effects along the interface. The differential transform method is used to transform the governing equations in the time domain into spectrum equations, which are further discretized in the space domain by the finite difference method. The results, obtained through a recursive process, show that the electron temperature in the gold film can rise up to several thousand degrees before its electron/lattice systems reach equilibrium at only several hundred degrees. The electron and lattice temperatures in the chromium film are much lower than those in the gold film.
Keywords: Differential transform, hyperbolic heat transfer, thin film, ultrashort-pulsed laser.
308 Determining of Threshold Levels of Burst by Burst AQAM/CDMA in Slow Rayleigh Fading Environments
Authors: F. Nejadebrahimi, M. ArdebiliPour
Abstract:
In this paper, we determine the threshold levels of adaptive modulation in a burst-by-burst CDMA system by a suboptimum method that attempts to increase the average bits per symbol (BPS) rate of the transceiver system by switching between different modulation modes under varying channel conditions. In this method, we choose the minimum values of the average bit error rate (BER) and the maximum values of the average BPS at different values of the average channel signal-to-noise ratio (SNR), and then calculate the corresponding threshold levels, so that when the instantaneous SNR increases, a higher order modulation is employed to increase throughput and, vice versa, when the instantaneous SNR decreases, a lower order modulation is employed to improve the BER. In the transmission step, according to a comparison between the estimates obtained from pilot symbols and the set of suboptimum threshold levels, the system chooses one of the states no transmission, BPSK, 4QAM or square 16QAM for the modulation of the data. The channel considered in this paper is slow Rayleigh fading.
Keywords: AQAM, burst, BER, BPS, CDMA, threshold.
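The mode-switching rule can be sketched as a simple comparison of the estimated instantaneous SNR against the threshold levels; the numeric thresholds below are placeholders, not the values determined in the paper.

```python
# Sketch of burst-by-burst adaptive modulation mode selection: the estimated
# instantaneous SNR from the pilot symbols is compared against the suboptimum
# threshold levels to pick one of the four states named in the abstract.
def select_mode(snr_db, thresholds=(3.0, 9.0, 15.0)):
    """thresholds = (no-TX/BPSK, BPSK/4QAM, 4QAM/16QAM) switching levels in dB."""
    if snr_db < thresholds[0]:
        return "no transmission", 0     # bits per symbol
    if snr_db < thresholds[1]:
        return "BPSK", 1
    if snr_db < thresholds[2]:
        return "4QAM", 2
    return "square 16QAM", 4

for snr in (1.0, 6.0, 12.0, 20.0):
    print(snr, "dB ->", select_mode(snr))
```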
307 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach that combines the join-idle-queue and join-shortest-queue approaches. The authors have used the Cloud Analyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
Keywords: Cloud Analyst, Cloud Computing, Join Idle Queue, Join Shortest Queue, Load balancing, Task Scheduling.
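A minimal sketch of the two-level dispatch idea, combining a join-idle-queue registry with a join-shortest-queue fallback, is given below; the data structures are simplified and are not taken from the paper.

```python
# Sketch of two-level dispatch: a task goes to a registered idle server if one
# exists, otherwise to the server with the shortest queue.
from collections import deque

class TwoLevelBalancer:
    def __init__(self, n_servers):
        self.queues = [deque() for _ in range(n_servers)]
        self.idle = set(range(n_servers))            # level 1: idle-queue registry

    def dispatch(self, task):
        if self.idle:                                # join idle queue
            sid = self.idle.pop()
        else:                                        # fall back to join shortest queue
            sid = min(range(len(self.queues)), key=lambda i: len(self.queues[i]))
        self.queues[sid].append(task)
        return sid

    def complete(self, sid):
        if self.queues[sid]:
            self.queues[sid].popleft()
        if not self.queues[sid]:
            self.idle.add(sid)

lb = TwoLevelBalancer(3)
print([lb.dispatch(f"task{i}") for i in range(5)])
```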
306 Effect of Curing Profile to Eliminate the Voids / Black Dots Formation in Underfill Epoxy for Hi-CTE Flip Chip Packaging
Authors: Zainudin Kornain, Azman Jalar, Rozaidi Rasid, Fong Chee Seng
Abstract:
Void formation in underfill is considered a failure in the flip chip manufacturing process. Void formation is possibly caused by several factors such as poor soldering and flux residue during the die attach process, void entrapment due to moisture contamination, the dispense pattern process and the set-up of the curing process. This paper presents a comparison of single-step and two-step curing profiles with respect to void and black dot formation in the underfill of a Hi-CTE Flip Chip Ceramic Ball Grid Array Package (FC-CBGA). Statistical analysis was conducted to analyze how different factors such as wafer lot, sawing technique, underfill fillet height and curing profile recipe affected the formation of voids and black dots. C-mode scanning acoustic microscopy (C-SAM) was used to scan the total count of voids and black dots. It was shown that the two-step curing profile provided a solution for eliminating voids and black dots in the underfill after the curing process.
Keywords: Black dots formation, curing profile, FC-CBGA, underfill, void formation.
305 Low Power and Less Area Architecture for Integer Motion Estimation
Authors: C Hisham, K Komal, Amit K Mishra
Abstract:
The full search block matching algorithm is widely used for hardware implementation of motion estimators in video compression algorithms. In this paper we propose a new architecture, which consists of a 2D parallel processing unit and a 1D unit, both working in parallel. The proposed architecture reduces both data access power and computational power, which are the main causes of power consumption in integer motion estimation. It also completes the operations with nearly the same number of clock cycles as a 2D systolic array architecture. In this work the sum of absolute differences (SAD), the most repeated operation in block matching, is calculated in two steps. The first step is to calculate the SAD of alternate rows with the 2D parallel unit. If the SAD calculated by the parallel unit is less than the stored minimum SAD, the SAD of the remaining rows is calculated by the 1D unit. Early termination, which stops avoidable computations, has been achieved with the help of the alternate rows method proposed in this paper and by finding a low initial SAD value based on motion vector prediction. Data reuse has been applied to the reference blocks in the same search area, which significantly reduces memory accesses.
Keywords: Sum of absolute difference, high speed DSP.
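The two-step SAD with early termination described above can be sketched as follows; the block size and random test data are illustrative only.

```python
# Sketch of the two-step SAD: the SAD of alternate rows is computed first, and the
# remaining rows are evaluated only if that partial SAD is still below the best
# SAD found so far (early termination).
import numpy as np

def two_step_sad(cur_block, ref_block, best_sad):
    partial = np.abs(cur_block[::2] - ref_block[::2]).sum()   # step 1: alternate rows
    if partial >= best_sad:
        return None                                           # early termination
    return partial + np.abs(cur_block[1::2] - ref_block[1::2]).sum()  # step 2

cur = np.random.randint(0, 256, (16, 16)).astype(int)
ref = cur.copy()
ref[0, 0] += 3
print(two_step_sad(cur, ref, best_sad=10**6))
```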
304 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. The need to describe the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements, and it is needed to reduce software errors at the early stages of software development. The importance of each of the steps in requirements engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with the expectations from the new system. This paper identifies inadequate requirements engineering practice as the major cause of poor software development in developing nations, using a case study of final year computer science students of a tertiary-education institution in Nigeria.
Keywords: Client/Customer, Problem Statement, Requirements Engineering, Software Developers.
303 Risk Assessment in Durations and Costs for Construction of Industrial Facilities in Egypt Using Equations and Computer
Authors: M. Kamal Elbokl, Negadi Kheira
Abstract:
Risk evaluation is an important step in protecting your workers and your business, as well as complying with the law. It helps you focus on the risks that really matter in your workplace, the ones with the potential to cause real harm. In this paper we introduce the basics of risk assessment and then mention some ways of evaluating risk by computer, especially Monte Carlo simulation and Microsoft Project.
We use the Program Evaluation and Review Technique (PERT) to evaluate and assess risks in industrial facilities. Using the PERT technique in Microsoft Project through the PERT toolbar, and using the PERTMASTER program with the Primavera program, we evaluate many hazards and perform the corresponding calculations with mathematical equations in order to make the right decisions. We define and calculate the risk factor and the risk severity to rank the type of risk and then deal with it in many ways, such as probability computation, curves and tables. By introducing variables into the equations of functions in computer programs, we calculate the risk in time and cost for the general case and then present some examples from the field of industrial facilities.
Keywords: Risk, Industrial Facilities, PERT, Monte Carlo Simulation.
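The PERT three-point estimate that underlies such duration and cost risk calculations is, per activity with optimistic a, most likely m and pessimistic b estimates (the authors' specific risk factor and severity definitions are not given in the abstract):

```latex
t_{e} = \frac{a + 4m + b}{6}, \qquad \sigma = \frac{b - a}{6}
```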