Search results for: map optimization tool


6325 Concentrated Whey Protein Drink with Orange Flavor: Protein Modification and Formulation

Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh

Abstract:

The application of whey protein in the drink industry to enhance the nutritional value of products is important. However, the gelation of the protein during thermal treatment and shelf life limits its application. The main goal of this research was therefore to manufacture a highly concentrated whey protein orange drink with an appropriate shelf life. Whey protein was hydrolyzed to degrees of 5 to 30% (in 5% intervals, six levels), and the thermal stability of samples with a 10% protein concentration was then tested under acidic conditions (T = 90 °C, pH = 4.2, 5 minutes) and neutral conditions (T = 120 °C, pH = 6.7, 20 minutes). Furthermore, to study the shelf life of the heat-treated samples over 4 months at 4 and 24 °C, time sweep rheological tests were performed. Under neutral conditions, the 5 to 20% hydrolyzed samples gelled during thermal treatment, whereas under acidic conditions gelling occurred only in the 5 to 10% hydrolyzed samples. This phenomenon could be related to differences in the hydrodynamic radius and zeta potential of samples with different levels of hydrolysis under acidic and neutral conditions. To study the gelation of the heat-resistant protein solutions during shelf life, time sweep analyses were performed for 4 months at 7-day intervals. A crossover was observed for all heat-resistant neutral samples at both storage temperatures, while it was not seen in the heat-resistant acidic samples with degrees of hydrolysis of 25 and 30% at 4 and 20 °C. It could be concluded that the acidic samples with 25 and 30% hydrolysis were stable during heat treatment and 4 months of storage, which makes them a good choice for manufacturing high-protein drinks. The Scheffe polynomial model and numerical optimization were employed for modeling and optimizing the high-protein orange drink formula. The Scheffe model significantly predicted the overall acceptance index of the sensory analysis (p < 0.05). The coefficient of determination (R2) of 0.94, the adjusted coefficient of determination (R2Adj) of 0.90, the insignificance of the lack-of-fit test, and an F value of 64.21 showed the accuracy of the model. Moreover, the coefficient of variation (C.V.) was 6.8%, which suggests the replicability of the experimental data. The desirability function value achieved was 0.89, which indicates the high accuracy of the optimization. The optimum formulation was found to be: modified whey protein solution (65.30%), natural orange juice (33.50%), stevia sweetener (0.05%), orange peel oil (0.15%) and citric acid (1%). It is worth mentioning that this study provides an appropriate model for the application of whey protein in the drink industry without bitter flavor and without gelation during heat treatment and shelf life.
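
As a rough illustration of the modeling step described above, the sketch below fits a second-order Scheffe mixture model to hypothetical sensory scores and then maximizes the predicted acceptance subject to the mixture constraint. The component names, design points, scores and bounds are invented for illustration and are not the paper's data; scipy's general-purpose optimizer stands in for the commercial desirability routine.

```python
# Hedged sketch: fitting a second-order Scheffe mixture model to sensory scores
# and maximizing predicted acceptance. All component names, bounds and data
# below are hypothetical illustrations, not the paper's actual measurements.
import numpy as np
from scipy.optimize import minimize

def scheffe_terms(x):
    """Second-order Scheffe polynomial terms for a 3-component mixture."""
    x1, x2, x3 = x
    return np.array([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# Hypothetical design points (proportions of protein solution, juice, other) and scores
X = np.array([
    [0.70, 0.28, 0.02],
    [0.60, 0.38, 0.02],
    [0.65, 0.33, 0.02],
    [0.55, 0.43, 0.02],
    [0.75, 0.23, 0.02],
    [0.62, 0.36, 0.02],
    [0.68, 0.30, 0.02],
])
y = np.array([7.1, 6.4, 7.8, 5.9, 6.7, 7.0, 7.5])  # overall acceptance scores

A = np.vstack([scheffe_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # least-squares fit of the Scheffe model

def neg_acceptance(x):
    return -coef @ scheffe_terms(x)

# Components must sum to 1; bounds keep each proportion in a plausible range.
cons = {"type": "eq", "fun": lambda x: np.sum(x) - 1.0}
bounds = [(0.5, 0.8), (0.2, 0.45), (0.0, 0.05)]
res = minimize(neg_acceptance, x0=[0.65, 0.33, 0.02], bounds=bounds, constraints=cons)
print("optimal mixture:", res.x, "predicted acceptance:", -res.fun)
```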

Keywords: cross over, orange beverage, protein modification, optimization

Procedia PDF Downloads 57
6324 Cross-Linked Amyloglucosidase Aggregates: A New Carrier Free Immobilization Strategy for Continuous Saccharification of Starch

Authors: Sidra Pervez, Afsheen Aman, Shah Ali Ul Qader

Abstract:

The importance of attaining the optimum performance of an enzyme is often a question of devising an effective method for its immobilization. Cross-linked enzyme aggregates (CLEAs) are a new approach to enzyme immobilization using a carrier-free strategy. The method is exquisitely simple (involving precipitation of the enzyme from aqueous buffer followed by cross-linking of the resulting physical aggregates of enzyme molecules) and amenable to rapid optimization. Among the many industrial enzymes, amyloglucosidase is an important amylolytic enzyme that hydrolyzes alpha (1→4) and alpha (1→6) glycosidic bonds in the starch molecule and produces glucose as the sole end product. The glucose liberated by amyloglucosidase can be used for the production of ethanol and glucose syrups. Besides this, amyloglucosidase is widely used in various food and pharmaceutical industries. For production of amyloglucosidase on a commercial scale, filamentous fungi of the genus Aspergillus are mostly used because they secrete large amounts of enzymes extracellularly. The current investigation was based on the isolation and identification of filamentous fungi from the genus Aspergillus for the production of amyloglucosidase in submerged fermentation and the optimization of cultivation parameters for starch saccharification. Natural isolates were identified as Aspergillus niger KIBGE-IB36, Aspergillus fumigatus KIBGE-IB33, Aspergillus flavus KIBGE-IB34 and Aspergillus terreus KIBGE-IB35 on a taxonomical basis and by 18S rDNA analysis, and their sequences were submitted to GenBank. Among them, Aspergillus fumigatus KIBGE-IB33 was selected on the basis of maximum enzyme production. After optimization of the fermentation conditions, the enzyme was immobilized as CLEAs. Different parameters were optimized for maximum immobilization of amyloglucosidase. Data on enzyme stability (thermal and storage) and reusability suggest the applicability of immobilized amyloglucosidase for continuous saccharification of starch in industrial processes.

Keywords: aspergillus, immobilization, industrial processes, starch saccharification

Procedia PDF Downloads 485
6323 Identifying the Factors Affecting the Success of Energy Usage Saving in the Municipality of Tehran

Authors: Rojin Bana Derakhshan, Abbas Toloie

Abstract:

For the purpose of optimizing and developing energy efficiency in buildings, it is necessary to recognize the key elements of success in the optimization of energy consumption before performing any actions. Principal component analysis, a simple and non-parametric method from linear algebra, is a valuable tool for this purpose. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this paper, data mining is used to determine the key elements influencing energy saving in buildings. The approach is based on statistical data mining techniques, using a feature selection method and fuzzy logic to convert the data from a massive to a compressed form and to strengthen the selected features. In addition, the share of each energy-consuming element in the overall energy dissipation, expressed as a percentage, is identified as a separate measure using the results obtained from the energy audit, after all energy-consuming parameters and identified variables were measured. Accordingly, the energy saving solutions are divided into three categories: low-, medium- and high-expense solutions.

Keywords: energy saving, key elements of success, optimization of energy consumption, data mining

Procedia PDF Downloads 456
6322 Steepest Descent Method with New Step Sizes

Authors: Bib Paruhum Silalahi, Djihad Wungguli, Sugi Guritman

Abstract:

The steepest descent method is a simple gradient method for optimization. The method converges slowly toward the optimal solution because of the zigzag pattern of its steps. Barzilai and Borwein modified the algorithm so that it performs well for problems with large dimensions. The results of the Barzilai-Borwein method have sparked a lot of research on the steepest descent method, including the alternate minimization gradient method and the Yuan method. Inspired by these previous works, we modified the step size of the steepest descent method. We then compare the modified method against the Barzilai-Borwein method, the alternate minimization gradient method and the Yuan method on quadratic function cases in terms of the number of iterations and the running time. On average, the results indicate that the steepest descent method with the new step sizes provides good results for small dimensions and is able to compete with the Barzilai-Borwein method and the alternate minimization gradient method for large dimensions. The new step sizes give faster convergence compared to the other methods, especially for cases with large dimensions.
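
To make the comparison concrete, the sketch below contrasts classical steepest descent (exact line search) with the Barzilai-Borwein step on a random strictly convex quadratic. It is a minimal illustration of the baseline methods discussed above; the authors' new step sizes are not reproduced, and the test matrix is arbitrary.

```python
# Hedged sketch: classical steepest descent vs. a Barzilai-Borwein (BB1) step size
# on a quadratic f(x) = 0.5 x'Ax - b'x. The matrix, vector and tolerances are
# illustrative; the paper's "new step sizes" are not reproduced here.
import numpy as np

def gradient_descent(A, b, x0, step_rule, max_iter=10000, tol=1e-8):
    x = x0.copy()
    g = A @ x - b                       # gradient of the quadratic
    g_prev, x_prev = None, None
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k
        if step_rule == "exact" or g_prev is None:
            alpha = (g @ g) / (g @ (A @ g))      # exact line search for quadratics
        else:
            s, ydiff = x - x_prev, g - g_prev    # BB1 step: s's / s'y
            alpha = (s @ s) / (s @ ydiff)
        x_prev, g_prev = x, g
        x = x - alpha * g
        g = A @ x - b
    return x, max_iter

rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)             # symmetric positive definite test matrix
b = rng.standard_normal(n)

for rule in ("exact", "bb1"):
    _, iters = gradient_descent(A, b, np.zeros(n), rule)
    print(rule, "iterations:", iters)
```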

Keywords: steepest descent, line search, iteration, running time, unconstrained optimization, convergence

Procedia PDF Downloads 533
6321 Scheduling Residential Daily Energy Consumption Using Bi-criteria Optimization Methods

Authors: Li-hsing Shih, Tzu-hsun Yen

Abstract:

Because of the long-term commitment to net zero carbon emission, utility companies include more renewable energy supply, which generates electricity with time and weather restrictions. This leads to time-of-use electricity pricing to reflect the actual cost of energy supply. From an end-user point of view, better residential energy management is needed to incorporate the time-of-use prices and assist end users in scheduling their daily use of electricity. This study uses bi-criteria optimization methods to schedule daily energy consumption by minimizing the electricity cost and maximizing the comfort of end users. Different from most previous research, this study schedules users’ activities rather than household appliances to have better measures of users’ comfort/satisfaction. The relation between each activity and the use of different appliances could be defined by users. The comfort level is at the highest when the time and duration of an activity completely meet the user’s expectation, and the comfort level decreases when the time and duration do not meet expectations. A questionnaire survey was conducted to collect data for establishing regression models that describe users’ comfort levels when the execution time and duration of activities are different from user expectations. Six regression models representing the comfort levels for six types of activities were established using the responses to the questionnaire survey. A computer program is developed to evaluate electricity cost and the comfort level for each feasible schedule and then find the non-dominated schedules. The Epsilon constraint method is used to find the optimal schedule out of the non-dominated schedules. A hypothetical case is presented to demonstrate the effectiveness of the proposed approach and the computer program. Using the program, users can obtain the optimal schedule of daily energy consumption by inputting the intended time and duration of activities and the given time-of-use electricity prices.
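
The sketch below illustrates the bi-criteria idea on a toy problem: candidate start hours for a few activities are enumerated, each schedule is scored by electricity cost and by a simple comfort penalty, the non-dominated schedules are kept, and one is selected with an epsilon constraint on cost. The prices, activity loads and comfort penalty are hypothetical stand-ins for the regression-based comfort models of the study.

```python
# Hedged sketch of bi-criteria scheduling with an epsilon constraint.
# Time-of-use prices, activity data and the comfort penalty are invented.
from itertools import product

tou_price = [2.0] * 7 + [5.0] * 11 + [8.0] * 4 + [2.0] * 2   # price per kWh, hours 0..23
activities = {                      # name: (duration h, load kW, preferred start hour)
    "laundry": (2, 1.5, 20),
    "cooking": (1, 2.0, 18),
    "ev_charge": (3, 3.0, 22),
}

def evaluate(starts):
    cost, discomfort = 0.0, 0.0
    for (dur, load, pref), start in zip(activities.values(), starts):
        cost += sum(tou_price[(start + h) % 24] * load for h in range(dur))
        discomfort += abs(start - pref)          # comfort drops as the schedule shifts
    return cost, discomfort

# enumerate candidate start hours in the evening window for all three activities
schedules = [(s, evaluate(s)) for s in product(range(16, 24), repeat=len(activities))]

# keep the non-dominated (Pareto) schedules
pareto = [s for s in schedules
          if not any(o[1][0] <= s[1][0] and o[1][1] <= s[1][1] and o[1] != s[1]
                     for o in schedules)]

eps = 40.0                                        # epsilon constraint on cost
best = min((s for s in pareto if s[1][0] <= eps), key=lambda s: s[1][1])
print("chosen start hours:", best[0], "cost, discomfort:", best[1])
```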

Keywords: bi-criteria optimization, energy consumption, time-of-use price, scheduling

Procedia PDF Downloads 48
6320 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation

Authors: Sopheak Sorn, Kwok Yip Szeto

Abstract:

Reliability is an important measure of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem, which is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction with each cell containing exactly one point of the point pattern and compute the reliability of the Voronoi diagram's dual, i.e., the Delaunay graph. We further investigate the communicability of the Delaunay network. We find a positive correlation between the homogeneity of a Delaunay network's degree distribution and its reliability, and a negative correlation between that homogeneity and its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network toward the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
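
A minimal sketch of the setup is given below: the Delaunay graph of a random planar point set is built, its total communicability is computed, and its all-terminal reliability is estimated by Monte Carlo sampling of independent link failures. The failure probability and sample counts are illustrative, and the edge-flip optimization itself is not reproduced.

```python
# Hedged sketch: Delaunay graph of a random 2-D point pattern, its total
# communicability, and a Monte Carlo estimate of all-terminal reliability.
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
points = rng.random((30, 2))                      # random 2-D point pattern
tri = Delaunay(points)

G = nx.Graph()
for simplex in tri.simplices:                     # each triangle contributes 3 edges
    for i in range(3):
        G.add_edge(simplex[i], simplex[(i + 1) % 3])

# Communicability: sum over all node pairs of the matrix exponential of the adjacency matrix
comm = nx.communicability_exp(G)
total_comm = sum(sum(row.values()) for row in comm.values())

def reliability(G, p_fail=0.1, samples=2000, rng=rng):
    """Monte Carlo estimate of all-terminal reliability under independent link failures."""
    ok = 0
    edges = list(G.edges())
    for _ in range(samples):
        H = nx.Graph()
        H.add_nodes_from(G.nodes())
        H.add_edges_from(e for e in edges if rng.random() > p_fail)
        ok += nx.is_connected(H)
    return ok / samples

print("nodes:", G.number_of_nodes(), "links:", G.number_of_edges())
print("total communicability:", total_comm)
print("estimated reliability:", reliability(G))
```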

Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi

Procedia PDF Downloads 409
6319 Modeling Studies on the Elevated Temperatures Formability of Tube Ends Using RSM

Authors: M. J. Davidson, N. Selvaraj, L. Venugopal

Abstract:

The elevated-temperature formability of thin-walled tube ends during expansion has been studied in the present work. The influence of the process parameters, namely the die angle, the die ratio and the operating temperature, on the expansion of tube ends at elevated temperatures is investigated. The range of operating parameters has been identified by performing extensive simulation studies. The hot forming parameters have been evaluated for the AA2014 alloy for the simulation studies. The experimental matrix has been developed from the feasible range obtained from the simulation results. Design of experiments is used for the optimization of the process parameters. The Response Surface Method (RSM) with a Box-Behnken design (BBD) is used to develop the mathematical model for expansion. Analysis of variance (ANOVA) is used to analyze the influence of the process parameters on the expansion of tube ends. The effects of various process combinations on expansion are analyzed through graphical representations. The developed model is found to be appropriate, as the coefficient of determination is very high, at 0.9726. The predicted values coincide well with the experimental results, within acceptable error limits.
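
For readers unfamiliar with the workflow, the sketch below fits a second-order response-surface model of the kind used with a Box-Behnken design, relating die angle, die ratio and temperature (in coded units) to expansion. The design points and responses are invented placeholders, not the AA2014 experimental data.

```python
# Hedged sketch: fitting a second-order response-surface model (Box-Behnken style)
# for tube-end expansion. The design points and responses are invented for
# illustration only; they are not the experimental data of the study.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Coded factors: die angle, die ratio, temperature (Box-Behnken style points)
X = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], dtype=float)
y = np.array([12.1, 14.3, 13.0, 16.2, 11.5, 13.8, 13.9, 16.5,
              12.4, 13.6, 14.8, 16.0, 15.1, 15.0, 15.2])  # % expansion (hypothetical)

quad = PolynomialFeatures(degree=2, include_bias=False)    # full quadratic model
model = LinearRegression().fit(quad.fit_transform(X), y)
pred = model.predict(quad.transform(X))
print("R^2 of the fitted response surface:", round(r2_score(y, pred), 4))

# Predict expansion at a new factor combination (coded units)
new_point = quad.transform(np.array([[0.5, 0.5, 1.0]]))
print("predicted expansion:", model.predict(new_point)[0])
```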

Keywords: expansion, optimization, Response Surface Method (RSM), ANOVA, BBD, residuals, regression, tube

Procedia PDF Downloads 496
6318 Optimization of Strategies and Models Review for Optimal Technologies-Based on Fuzzy Schemes for Green Architecture

Authors: Ghada Elshafei, A. Elazim Negm

Abstract:

Recently, green architecture has become a significant route to a sustainable future. Green building design involves finding the balance between comfortable homebuilding and a sustainable environment. Moreover, new technologies such as artificial intelligence techniques are used to complement current practices in creating greener structures and keeping the built environment more sustainable. The most common objective is that green buildings should be designed to minimize the overall impact of the built environment on ecosystems in general, and particularly on human health and the natural environment. This leads to protecting occupant health, improving employee productivity, reducing pollution and sustaining the environment. In green building design, multiple parameters, which may be interrelated, contradictory, vague and of a qualitative/quantitative nature, are widely used. This paper presents a comprehensive, critical state-of-the-art review of current practices based on fuzzy techniques and their combinations. It also presents how green architecture and buildings can be improved using the technologies employed in the analysis, in order to seek optimal green solution strategies and models that assist in making the best possible decision among different alternatives.

Keywords: green architecture/building, technologies, optimization, strategies, fuzzy techniques, models

Procedia PDF Downloads 459
6317 A Fast, Portable Computational Framework for Aerodynamic Simulations

Authors: Mehdi Ghommem, Daniel Garcia, Nathan Collier, Victor Calo

Abstract:

We develop a fast, user-friendly implementation of a potential flow solver based on the unsteady vortex lattice method (UVLM). The computational framework uses the Python programming language, which integrates easily with scripts whose computationally expensive operations are written in Fortran. The mixed-language approach enables high performance in terms of solution time and high flexibility in terms of ease of code adaptation to different system configurations and applications. This computational tool is intended to predict the unsteady aerodynamic behavior of multiple moving bodies (e.g., flapping wings, rotating blades, suspension bridges) subject to an incoming airflow. We simulate different aerodynamic problems to validate and illustrate the usefulness and effectiveness of the developed computational tool.

Keywords: unsteady aerodynamics, numerical simulations, mixed-language approach, potential flow

Procedia PDF Downloads 278
6316 Optimization of Assay Parameters of L-Glutaminase from Bacillus cereus MTCC1305 Using Artificial Neural Network

Authors: P. Singh, R. M. Banik

Abstract:

An artificial neural network (ANN) was employed to optimize the assay parameters, viz. time, temperature, pH of the reaction mixture, enzyme volume and substrate concentration, of L-glutaminase from Bacillus cereus MTCC 1305. The ANN model showed a high coefficient of determination (0.9999), a low root mean square error (0.6697) and a low absolute average deviation. A multilayer perceptron neural network trained with an error back-propagation algorithm was used to develop the predictive model, and its topology was obtained as 5-3-1 after applying the Levenberg-Marquardt (LM) training algorithm. The predicted activity of L-glutaminase was 633.7349 U/l at the optimum assay parameters, viz. pH of the reaction mixture (7.5), reaction time (20 minutes), incubation temperature (35 °C), substrate concentration (40 mM) and enzyme volume (0.5 ml). The prediction was verified by running the experiment at the simulated optimum assay conditions, and an activity of 634.00 U/l was obtained. The application of the ANN model for optimization of the assay conditions improved the activity of L-glutaminase 1.499-fold.
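
A minimal sketch of such a predictive model is shown below using scikit-learn. Since scikit-learn does not offer Levenberg-Marquardt training, L-BFGS is used instead, and the training data are synthetic stand-ins generated around the reported optimum, not the experimental measurements.

```python
# Hedged sketch of a 5-3-1 multilayer perceptron mapping the five assay
# parameters to L-glutaminase activity. L-BFGS replaces Levenberg-Marquardt,
# and the data below are randomly generated stand-ins, not the paper's measurements.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(7)
# columns: pH, time (min), temperature (C), substrate (mM), enzyme volume (ml)
X = rng.uniform([6.0, 5, 25, 10, 0.1], [9.0, 40, 45, 60, 1.0], size=(60, 5))
# A made-up activity response peaking near the reported optimum conditions
y = 630 - 40 * (X[:, 0] - 7.5) ** 2 - 0.3 * (X[:, 1] - 20) ** 2 \
        - 0.5 * (X[:, 2] - 35) ** 2 - 0.05 * (X[:, 3] - 40) ** 2 \
        - 100 * (X[:, 4] - 0.5) ** 2 + rng.normal(0, 2, 60)

scaler = MinMaxScaler()
net = MLPRegressor(hidden_layer_sizes=(3,), solver="lbfgs", max_iter=5000, random_state=0)
net.fit(scaler.fit_transform(X), y)

optimum = scaler.transform([[7.5, 20, 35, 40, 0.5]])
print("predicted activity at the reported optimum (U/l):", net.predict(optimum)[0])
```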

Keywords: Bacillus cereus, L-glutaminase, assay parameters, artificial neural network

Procedia PDF Downloads 424
6315 Multi-Objective Electric Vehicle Charge Coordination for Economic Network Management under Uncertainty

Authors: Ridoy Das, Myriam Neaimeh, Yue Wang, Ghanim Putrus

Abstract:

Electric vehicles are a popular transportation medium renowned for potential environmental benefits. However, large and uncontrolled charging volumes can impact distribution networks negatively. Smart charging is widely recognized as an efficient solution to achieve both improved renewable energy integration and grid relief. Nevertheless, different decision-makers may pursue diverse and conflicting objectives. In this context, this paper proposes a multi-objective optimization framework to control electric vehicle charging to achieve both energy cost reduction and peak shaving. A weighted-sum method is developed due to its intuitiveness and efficiency. Monte Carlo simulations are implemented to investigate the impact of uncertain electric vehicle driving patterns and provide decision-makers with a robust outcome in terms of prospective cost and network loading. The results demonstrate that there is a conflict between energy cost efficiency and peak shaving, with the decision-makers needing to make a collaborative decision.
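
The weighted-sum formulation can be written as a small linear program, as in the hedged sketch below for a single EV: hourly charging powers and an auxiliary peak variable are optimized against a weighted sum of energy cost and peak load. The prices, base load, battery demand and weights are illustrative values only.

```python
# Hedged sketch of the weighted-sum idea for one EV: choose hourly charging power
# to trade off energy cost against the network peak (base load + charging).
import numpy as np
from scipy.optimize import linprog

hours = 24
price = np.array([0.08] * 7 + [0.15] * 11 + [0.25] * 4 + [0.08] * 2)   # $/kWh
base_load = 2.0 + 1.5 * np.sin(np.linspace(0, 2 * np.pi, hours))       # kW feeder load
energy_needed = 30.0        # kWh to deliver to the EV battery
p_max = 7.0                 # kW charger limit
w_cost, w_peak = 1.0, 2.0   # weighted-sum coefficients

# Decision variables: p_0..p_23 (charging power) and z (peak of base_load + p)
c = np.concatenate([w_cost * price, [w_peak]])

# base_load[t] + p[t] - z <= 0  for every hour
A_ub = np.hstack([np.eye(hours), -np.ones((hours, 1))])
b_ub = -base_load

A_eq = np.concatenate([np.ones(hours), [0.0]]).reshape(1, -1)   # sum of p = energy needed
b_eq = [energy_needed]

bounds = [(0, p_max)] * hours + [(0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
p = res.x[:hours]
print("energy cost ($):", round(price @ p, 2), "  resulting peak (kW):", round(res.x[-1], 2))
```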

Keywords: electric vehicles, multi-objective optimization, uncertainty, mixed integer linear programming

Procedia PDF Downloads 171
6314 Machine Learning Assisted Performance Optimization in Memory Tiering

Authors: Derssie Mebratu

Abstract:

As a large variety of microservices, web services, social graph applications and media applications are continuously developed, it is vital to design and build a reliable, efficient and fast memory tiering system. Despite limited design, implementation and deployment in the last few years, several techniques have been developed to improve memory tiering systems in the cloud. Some of these techniques develop an optimal scanning frequency; improve and track page movement; identify recently accessed pages; distribute pages across the tiers; and classify pages as hot, warm or cold, so that hot pages are stored in the first tier, Dynamic Random Access Memory (DRAM), warm pages in the second tier, Compute Express Link (CXL) attached memory, and cold pages in the third tier, Non-Volatile Memory (NVM). Beyond the current proposals and implementations, we also develop a new technique based on a machine learning algorithm, with which throughput improves by 25% and latency improves by 95% compared to the baseline.
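
The sketch below illustrates the classification side of this idea: a small machine-learning model labels pages as hot, warm or cold from simple recency and frequency features and maps each class to a tier. The synthetic access trace, labeling rule and model choice are assumptions for illustration, not the technique or workload used in the study.

```python
# Hedged sketch: classify pages as hot / warm / cold from access features and
# map each class to a tier (DRAM, CXL-attached memory, NVM). The synthetic trace
# and labeling thresholds are illustrative stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n_pages = 5000
# features per page: accesses in the last scan window, seconds since last access
freq = rng.poisson(3, n_pages)
recency = rng.exponential(30, n_pages)
X = np.column_stack([freq, recency])

# Label pages for training with a simple rule (stand-in for ground truth)
labels = np.where((freq >= 5) & (recency < 10), "hot",
                  np.where(freq >= 2, "warm", "cold"))

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

tier_of = {"hot": "DRAM", "warm": "CXL", "cold": "NVM"}
new_pages = np.array([[8, 2.0], [3, 40.0], [0, 200.0]])   # freq, recency
for feats, cls in zip(new_pages, clf.predict(new_pages)):
    print(f"page features {feats} -> class {cls} -> tier {tier_of[cls]}")
```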

Keywords: machine learning, bayesian optimization, memory tiering, CXL, DRAM

Procedia PDF Downloads 86
6313 An Adiabatic Quantum Optimization Approach for the Mixed Integer Nonlinear Programming Problem

Authors: Maxwell Henderson, Tristan Cook, Justin Chan Jin Le, Mark Hodson, YoungJung Chang, John Novak, Daniel Padilha, Nishan Kulatilaka, Ansu Bagchi, Sanjoy Ray, John Kelly

Abstract:

We present a method of using adiabatic quantum optimization (AQO) to solve a mixed integer nonlinear programming (MINLP) problem instance. The MINLP problem is a general form of a set of NP-hard optimization problems that are critical to many business applications. It requires optimizing a set of discrete and continuous variables with nonlinear and potentially nonconvex constraints. Obtaining an exact, optimal solution for MINLP problem instances of non-trivial size using classical computation methods is currently intractable. Current leading algorithms leverage heuristic and divide-and-conquer methods to determine approximate solutions. Creating more accurate and efficient algorithms is an active area of research. Quantum computing (QC) has several theoretical benefits compared to classical computing, through which QC algorithms could obtain MINLP solutions that are superior to current algorithms. AQO is a particular form of QC that could offer more near-term benefits compared to other forms of QC, as hardware development is in a more mature state and devices are currently commercially available from D-Wave Systems Inc. It is also designed for optimization problems: it uses an effect called quantum tunneling to explore all lowest points of an energy landscape where classical approaches could become stuck in local minima. Our work used a novel algorithm formulated for AQO to solve a special type of MINLP problem. The research focused on determining: 1) if the problem is possible to solve using AQO, 2) if it can be solved by current hardware, 3) what the currently achievable performance is, 4) what the performance will be on projected future hardware, and 5) when AQO is likely to provide a benefit over classical computing methods. Two different methods, integer range and 1-hot encoding, were investigated for transforming the MINLP problem instance constraints into a mathematical structure that can be embedded directly onto the current D-Wave architecture. For testing and validation a D-Wave 2X device was used, as well as QxBranch’s QxLib software library, which includes a QC simulator based on simulated annealing. Our results indicate that it is mathematically possible to formulate the MINLP problem for AQO, but that currently available hardware is unable to solve problems of useful size. Classical general-purpose simulated annealing is currently able to solve larger problem sizes, but does not scale well and such methods would likely be outperformed in the future by improved AQO hardware with higher qubit connectivity and lower temperatures. If larger AQO devices are able to show improvements that trend in this direction, commercially viable solutions to the MINLP for particular applications could be implemented on hardware projected to be available in 5-10 years. Continued investigation into optimal AQO hardware architectures and novel methods for embedding MINLP problem constraints on to those architectures is needed to realize those commercial benefits.
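
The 1-hot encoding mentioned above can be illustrated with a small QUBO: an integer variable taking one of four values is represented by four binary variables, and a quadratic penalty enforces that exactly one of them is set. In the hedged sketch below, exhaustive enumeration stands in for the annealer, and the objective coefficients are invented.

```python
# Hedged sketch of 1-hot encoding: an integer variable x in {0,1,2,3} is
# represented by four binary variables b_i, and a quadratic penalty forces
# exactly one b_i to equal 1. Exhaustive enumeration stands in for the quantum
# annealer; the objective coefficients below are invented for illustration.
import itertools
import numpy as np

values = np.array([0, 1, 2, 3])          # admissible integer values of x
cost = np.array([3.0, 1.0, 4.0, 2.0])    # objective value of choosing each x (hypothetical)
penalty = 10.0                           # weight of the 1-hot constraint

n = len(values)
Q = np.zeros((n, n))
Q[np.diag_indices(n)] += cost            # objective: sum_i cost[i] * b_i
# penalty * (sum_i b_i - 1)^2 expands (using b_i^2 = b_i) to
# penalty * (1 - sum_i b_i + 2 * sum_{i<j} b_i b_j)
Q[np.diag_indices(n)] += -penalty
for i, j in itertools.combinations(range(n), 2):
    Q[i, j] += 2 * penalty

def qubo_energy(bits):
    b = np.array(bits)
    return b @ Q @ b + penalty           # + penalty is the constant term above

best = min(itertools.product([0, 1], repeat=n), key=qubo_energy)
print("best bitstring:", best, "-> x =", values[np.argmax(best)],
      "energy:", qubo_energy(best))
```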

Keywords: adiabatic quantum optimization, mixed integer nonlinear programming, quantum computing, NP-hard

Procedia PDF Downloads 515
6312 Long-Term Results of Coronary Bifurcation Stenting with Drug Eluting Stents

Authors: Piotr Muzyk, Beata Morawiec, Mariusz Opara, Andrzej Tomasik, Brygida Przywara-Chowaniec, Wojciech Jachec, Ewa Nowalany-Kozielska, Damian Kawecki

Abstract:

Background: Coronary bifurcation is one of the most complex lesions in patients with coronary artery disease. Provisional T-stenting is currently one of the recommended techniques. The aim was to assess the optimal methods of treatment in the era of drug-eluting stents (DES). Methods: The registry consisted of data from 1916 patients treated with percutaneous coronary interventions (PCI) using either first- or second-generation DES. Patients with a bifurcation lesion entered the analysis. Major adverse cardiac and cardiovascular events (MACCE) were assessed at one year of follow-up and comprised death, acute myocardial infarction (AMI), repeated PCI (re-PCI) of the target vessel and stroke. Results: Of the 1916 registry patients, 204 patients (11%) were diagnosed with a bifurcation lesion >50% and entered the analysis. The most commonly used technique was provisional T-stenting (141 patients, 69%). Optimization with the kissing-balloon technique was performed in 45 patients (22%). In 59 patients (29%) a second-generation DES was implanted, while in 112 patients (55%) a first-generation DES was used. In 33 patients (16%) both types of DES were used. Procedural success (TIMI 3 flow) was achieved in 98% of patients. At one-year follow-up, there were 39 MACCE (19%) (9 deaths, 17 AMI, 16 re-PCI and 5 strokes). Provisional T-stenting resulted in a rate of MACCE similar to the other techniques (16% vs. 5%, p=0.27) and a similar occurrence of re-PCI (6% vs. 2%, p=0.78). The post-PCI kissing-balloon technique gave outcomes equal to those of patients in whom no optimization technique was used (3% vs. 16% MACCE, p=0.39). The type of implanted DES (second- vs. first-generation) had no influence on MACCE (4% vs. 14%, respectively, p=0.12) or re-PCI (1.7% vs. 51% of patients, respectively, p=0.28). Conclusions: The treatment of bifurcation lesions with PCI represents a high-risk procedure with a high rate of MACCE. The stenting technique, the optimization of PCI and the generation of the implanted stent should be personalized for each case to balance the risk of the procedure. In this setting, operator experience might be a factor in better outcomes, which should be further investigated.

Keywords: coronary bifurcation, drug eluting stents, long-term follow-up, percutaneous coronary interventions

Procedia PDF Downloads 193
6311 Presenting of 'Local Wishes Map' as a Tool for Promoting Dialogue and Developing Healthy Cities

Authors: Ana Maria G. Sperandio, Murilo U. Malek-Zadeh, João Luiz de S. Areas, Jussara C. Guarnieri

Abstract:

Intersectoral governance is a requirement for developing healthy cities. However, this is difficult to achieve, especially in regions with low resources. Therefore, a low-cost investigative procedure was developed to diagnose sectoral wishes related to urban planning and health promotion. The procedure is composed of two phases, which can be applied to different groups in order to compare the results. The first phase is a conversation guided by a list of questions. Some of those questions aim to gather information about how individuals understand concepts such as the healthy city or health promotion and what they believe constitutes the relation between urban planning and urban health. Other questions investigate local issues and how citizens would like to promote dialogue between sectors. In the second phase, individuals stand around a map of the investigated city (or city region) and are asked to represent their wishes on it. They can represent them by writing text notes or inserting icons on the map, with the icons representing city elements, for example, some trees, a square, a playground, a hospital or a cycle track. After the groups have represented their wishes, the map can be photographed, and the results from distinct groups can then be compared. This procedure was conducted in a small city in Brazil (Holambra) in 2017, the first of the four years of the mayor's term. The prefecture asked for this tool in order to bring Holambra into the Potential Healthy Municipalities Network in Brazil. Two sectors were investigated: the government and the urban population. By the end of the investigation, the intersection of the group maps (i.e., population and government) was used to create a map of common wishes. The material produced can therefore be used as a guide for promoting dialogue between sectors and as a tool for monitoring policy progress. The report of this procedure was directed to public managers so that they could see the wishes they share with the local population and use the tool as a guide for creating urban policies intended to enhance health promotion and to develop a healthy city, even under low-resource conditions.

Keywords: governance, health promotion, intersectorality, urban planning

Procedia PDF Downloads 129
6310 Design Optimization of Miniature Mechanical Drive Systems Using Tolerance Analysis Approach

Authors: Eric Mxolisi Mkhondo

Abstract:

Geometrical deviations and the interaction of mechanical parts influence the performance of miniature systems. These deviations tend to cause costly problems during assembly due to imperfections of components, which are invisible to the naked eye. They also tend to cause unsatisfactory performance during operation due to deformation caused by environmental conditions. One of the effective tools to manage the deviations and interaction of parts in a system is tolerance analysis. This is a quantitative tool for predicting the tolerance variations, which are defined during the design process. Traditional tolerance analysis assumes that the assembly is static and that the deviations come from manufacturing discrepancies, overlooking the functionality of the whole system and the deformation of parts due to the effect of environmental conditions. This paper presents an integrated tolerance analysis approach for a miniature system in operation. In this approach, a computer-aided design (CAD) model is developed from the system's specification. The CAD model is then used to specify the geometrical and dimensional tolerance limits (upper and lower limits) that vary the components' geometries and sizes while conforming to the functional requirements. Worst-case tolerances are analyzed to determine the influence of dimensional changes due to the effects of operating temperatures. The method is used to evaluate the nominal conditions and the worst-case conditions in the maximum and minimum dimensions of the assembled components. These three conditions are evaluated at specific operating temperatures (-40 °C, -18 °C, 4 °C, 26 °C, 48 °C and 70 °C). A case study on the mechanism of a zoom lens system is used to illustrate the effectiveness of the methodology.
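
A minimal sketch of a worst-case stack-up that folds in thermal expansion over the listed operating temperatures is given below. The part names, nominal lengths, tolerances and expansion coefficients are invented for illustration; they are not the zoom-lens case-study values.

```python
# Hedged sketch of a worst-case tolerance stack-up with thermal expansion across
# the operating temperatures listed above. All part data are hypothetical.
REFERENCE_T = 20.0                          # temperature at which nominals are defined (C)
temperatures = [-40, -18, 4, 26, 48, 70]    # operating temperatures (C)

# Each part: nominal length (mm), symmetric tolerance (mm), CTE (1/C)
parts = [
    {"name": "barrel",    "nominal": 45.00, "tol": 0.05, "cte": 23e-6},  # aluminium
    {"name": "spacer",    "nominal": 12.00, "tol": 0.02, "cte": 23e-6},
    {"name": "lens cell", "nominal":  8.50, "tol": 0.01, "cte":  9e-6},  # titanium
]

def stack(temp_c, sign):
    """Total assembly length at temp_c with every tolerance pushed to one extreme."""
    total = 0.0
    for p in parts:
        length = p["nominal"] + sign * p["tol"]               # worst-case dimension
        length *= 1.0 + p["cte"] * (temp_c - REFERENCE_T)     # thermal expansion
        total += length
    return total

nominal = sum(p["nominal"] for p in parts)
for t in temperatures:
    lo, hi = stack(t, -1), stack(t, +1)
    print(f"{t:>4} C: min {lo:.4f} mm, nominal {nominal:.2f} mm, max {hi:.4f} mm")
```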

Keywords: geometric dimensioning, tolerance analysis, worst-case analysis, zoom lens mechanism

Procedia PDF Downloads 157
6309 Experimental Chip/Tool Temperature FEM Model Calibration by Infrared Thermography: A Case Study

Authors: Riccardo Angiuli, Michele Giannuzzi, Rodolfo Franchi, Gabriele Papadia

Abstract:

Temperature knowledge in machining is fundamental to improve the numerical and FEM models used for the study of some critical process aspects, such as the behavior of the worked material and tool. The extreme conditions in which they operate make it impossible to use traditional measuring instruments; infrared thermography can be used as a valid measuring instrument for temperature measurement during metal cutting. In the study, a large experimental program on superduplex steel (ASTM A995 gr. 5A) cutting was carried out, the relevant cutting temperatures were measured by infrared thermography when certain cutting parameters changed, from traditional values to extreme ones. The values identified were used to calibrate a FEM model for the prediction of residual life of the tools. During the study, the problems related to the detection of cutting temperatures by infrared thermography were analyzed, and a dedicated procedure was developed that could be used during similar processing.

Keywords: machining, infrared thermography, FEM, temperature measurement

Procedia PDF Downloads 174
6308 Optimization and Evaluation of Different Pathways to Produce Biofuel from Biomass

Authors: Xiang Zheng, Zhaoping Zhong

Abstract:

In this study, Aspen Plus was used to simulate the whole process of biomass conversion to liquid fuel by different routes, and the main results of the material and energy flows were obtained. Process optimization and evaluation were carried out on four routes: cellulosic biomass pyrolysis-gasification followed by low-carbon olefin synthesis and olefin oligomerization; biomass hydrothermal pyrolysis and polymerization to jet fuel; biomass fermentation to ethanol; and biomass pyrolysis to liquid fuel. The environmental impacts of three biomass species (poplar wood, corn stover and rice husk) were compared for the gasification-synthesis pathway. The global warming potential, acidification potential and eutrophication potential of the three biomasses followed the order rice husk > poplar wood > corn stover. In terms of human health hazard potential and solid waste potential, the order was poplar > rice husk > corn stover. In the poplar pathway, 100 kg of poplar biomass was input to obtain 11.9 kg of aviation kerosene fraction and 6.3 kg of gasoline fraction. The energy conversion rate of the system was 31.6% when the output energy included only the aviation kerosene product. In the basic hydrothermal depolymerization process, 14.41 kg of aviation kerosene was produced per 100 kg of biomass. The energy conversion rate of the basic process was 33.09%, which could be increased to 38.47% after optimal utilization of the lignin by gasification and steam reforming for hydrogen production. The total exergy efficiency of the system increased from 30.48% to 34.43% after optimization, with the exergy loss coming mainly from the concentration of the dilute precursor solution. The global warming potential among the environmental impacts is affected mostly by the production process. Poplar wood was used as the raw material in the process of ethanol production from cellulosic biomass. The simulation results showed that 827.4 kg of pretreatment mixture, 450.6 kg of fermentation broth and 24.8 kg of ethanol were produced per 100 kg of biomass. The power output of boiler combustion reached 94.1 MJ, the unit power consumption of the process was 174.9 MJ, and the energy conversion rate was 33.5%. The environmental impact was mainly concentrated in the production and agricultural processes. On the basis of the original biomass pyrolysis-to-liquid-fuel route, the enzymatic hydrolysis lignin residue produced by cellulose fermentation to ethanol was used as the pyrolysis raw material, coupling the fermentation and pyrolysis processes. In the coupled process, 24.8 kg of ethanol and 4.78 kg of upgraded liquid fuel were produced per 100 kg of biomass, with an energy conversion rate of 35.13%.

Keywords: biomass conversion, biofuel, process optimization, life cycle assessment

Procedia PDF Downloads 64
6307 Improving the Teaching and Learning of Basic Mathematics: An Imperative for Sustainable Development

Authors: Dahiru Bawa Muhammad

Abstract:

Mathematics is accorded a prime position in the basic education curriculum because it is envisaged to be an important tool in preparing children for life after school, as well as equipping them with the skills needed for secondary and higher education. As a result, the subject is made compulsory from primary through secondary school, and candidates are expected to offer it and pass before fulfilling the requirements for higher education. Against this backdrop, this paper reviews the basic education programme, the context of teaching and learning mathematics at the basic education level in Katsina State of Nigeria, the relevance of the subject to different fields of human endeavour, and the challenges threatening the utility of the subject as a tool for the achievement of the goals of the basic education programme, and concludes by recommending how the teaching and learning of mathematics can be improved for the even development of citizens within nation states and the enhanced, mutual and sustainable development of nations in the global village.

Keywords: basic education, junior secondary school education, mathematical centre

Procedia PDF Downloads 449
6306 Daylightophil Approach towards High-Performance Architecture for Hybrid-Optimization of Visual Comfort and Daylight Factor in BSk

Authors: Mohammadjavad Mahdavinejad, Hadi Yazdi

Abstract:

The greatest influence we receive from the world is shaped through visual form; thus, light is an inseparable element of human life. The use of daylight in visual perception and environment readability is an important issue for users. With regard to the hazards of greenhouse gas emissions from fossil fuels, and in line with attitudes on the reduction of energy consumption, the correct use of daylight results in lower levels of energy consumed by artificial lighting, heating and cooling systems. Windows are usually the starting points for analysis and simulations to achieve visual comfort and energy optimization; therefore, attention should be paid to the orientation of buildings to minimize electrical energy use and maximize the use of daylight. In this paper, using the Design Builder software, the effect of the orientation of an 18 m² (3 m × 6 m) room with a height of 3 m in the city of Tehran has been investigated, considering the design constraints. In these simulations, the orientation of the building has been changed in one-degree steps, and the window is located on the smaller face (3 m × 3 m) of the building with an 80% ratio. The results indicate that the orientation of the building has a lot to do with energy efficiency in meeting high-performance architecture and planning goals and objectives.

Keywords: daylight, window, orientation, energy consumption, design builder

Procedia PDF Downloads 219
6305 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Earlier regarded as a support function in an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data, which was unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. The presence of IT has given the oil and gas industry the leverage to store, manage and process data in the most efficient way possible, thus deriving economic value in day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, systems approach towards asset optimization and thus have functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset in an analytics application, SAS. The reason for using SAS for the analysis is that it provides an analytics-based framework to improve the uptime, performance and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, maintenance problems can be predicted before they happen and root causes determined in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 342
6304 A Constrained Neural Network Based Variable Neighborhood Search for the Multi-Objective Dynamic Flexible Job Shop Scheduling Problems

Authors: Aydin Teymourifar, Gurkan Ozturk, Ozan Bahadir

Abstract:

In this paper, a new neural network based variable neighborhood search is proposed for multi-objective dynamic flexible job shop scheduling problems. The neural network controls the problems' constraints to prevent infeasible solutions, while the Variable Neighborhood Search (VNS) applies moves based on the critical block concept to improve the solutions. Two approaches are used for managing the constraints: in the first approach, infeasible solutions are modified according to the constraints after the moves are applied, while in the second, infeasible moves are prevented. Several neighborhood structures from the literature, with some modifications, as well as new structures, are used in the VNS. The suggested neighborhoods are defined more systematically and are easy to implement. The comparison is based on a multi-objective flexible job shop scheduling problem that is dynamic because of the jobs' different release times and machine breakdowns. The results show that the presented method performs better than the compared VNS variants selected from the literature.
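
The VNS skeleton itself can be summarized as in the sketch below: shaking in neighborhood k, a simple swap-based local search, and moving to a larger neighborhood when no improvement is found. A single-machine weighted-tardiness objective stands in for the flexible job shop objectives; the critical-block moves and the neural constraint handler are not reproduced.

```python
# Hedged sketch of a variable neighborhood search loop. The objective (total
# weighted tardiness on a single machine) and the job data are illustrative
# stand-ins for the flexible job shop problem of the study.
import random

random.seed(0)
jobs = [{"p": random.randint(1, 10), "d": random.randint(5, 40), "w": random.randint(1, 5)}
        for _ in range(12)]              # processing time, due date, weight (hypothetical)

def cost(seq):
    t, total = 0, 0
    for j in seq:
        t += jobs[j]["p"]
        total += jobs[j]["w"] * max(0, t - jobs[j]["d"])
    return total

def shake(seq, k):
    s = seq[:]
    for _ in range(k):                   # k random swaps define neighborhood N_k
        i, j = random.sample(range(len(s)), 2)
        s[i], s[j] = s[j], s[i]
    return s

def local_search(seq):
    improved = True
    while improved:                      # first-improvement search over adjacent swaps
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:]
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            if cost(cand) < cost(seq):
                seq, improved = cand, True
    return seq

best = list(range(len(jobs)))
k_max, k = 3, 1
for _ in range(200):
    candidate = local_search(shake(best, k))
    if cost(candidate) < cost(best):
        best, k = candidate, 1           # improvement: return to the first neighborhood
    else:
        k = k % k_max + 1                # otherwise move to the next neighborhood
print("best sequence:", best, "weighted tardiness:", cost(best))
```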

Keywords: constrained optimization, neural network, variable neighborhood search, flexible job shop scheduling, dynamic multi-objective optimization

Procedia PDF Downloads 335
6303 PRISM: An Analytical Tool for Forest Plan Development

Authors: Dung Nguyen, Yu Wei, Eric Henderson

Abstract:

Analytical tools have been used for decades to assist in the development of forest plans. In 2016, a new decision support system, PRISM, was jointly developed by United States Forest Service (USFS) Northern Region and Colorado State University to support the forest planning process. Prism has a friendly user interface with functionality for database management, model development, data visualization, and sensitivity analysis. The software is tailored for USFS planning, but it is flexible enough to support planning efforts by other forestland owners and managers. Here, the core capability of PRISM and its applications in developing plans for several United States national forests are presented. The strengths of PRISM are also discussed to show its potential of being a preferable tool for managers and experts in the domain of forest management and planning.

Keywords: decision support, forest management, forest plan, graphical user interface, software

Procedia PDF Downloads 98
6302 High-Speed Cutting of Inconel 625 Using Carbide Ball End Mill

Authors: Kazumasa Kawasaki, Katsuya Fukazawa

Abstract:

Nickel-based superalloys are an important class of engineering material within the aerospace and power generation industries due to their excellent combination of corrosion resistance and mechanical properties, including in high-temperature applications. Inconel 625 is one such superalloy and a difficult-to-machine material. In cutting Inconel 625 superalloy with a ball end mill, the problem of adhesive wear often occurs. However, the proper cutting conditions are not well known because of a lack of studies. In this study, experiments using ball end mills made of carbide were carried out to find the best cutting conditions. Using Inconel 625 superalloy as the work material, three kinds of experiments, at revolution speeds of 5000 rpm, 8000 rpm and 10000 rpm, were performed under dry cutting conditions at a feed of 0.045 mm/tooth and a depth of cut of 0.1 mm. As a result, the 8000 rpm condition cut the longest with the least wear.

Keywords: Inconel 625, ball end mill, carbide tool, high speed cutting, tool wear

Procedia PDF Downloads 195
6301 Optimization in the Compressive Strength of Iron Slag Self-Compacting Concrete

Authors: Luis E. Zapata, Sergio Ruiz, María F. Mantilla, Jhon A. Villamizar

Abstract:

Sand as a fine aggregate for concrete production needs a feasible substitute due to several environmental issues. In this work, a study of the behavior of self-compacting concrete mixtures with sand replaced by iron slag from 0.0% to 50.0% by weight and with variations of the water/cementitious material ratio between 0.3 and 0.5 is presented. Fresh-state control tests of slump flow, T500, J-ring and L-box were carried out. In the hardened state, the compressive strength was determined and an optimization based on response surface analysis was performed. The study of the variables in the hardened state was developed through inferential statistical analyses using a central composite design methodology and a subsequent analysis of variance (ANOVA). An increase in the compressive strength of up to 50% over the control mixtures at 7, 14 and 28 days of maturity was the most relevant result regarding the presence of iron slag as a replacement for natural sand. Considering this result, it is possible to infer that iron slag is an acceptable replacement for natural fine aggregate in structural concrete.

Keywords: ANOVA, iron slag, response surface analysis, self-compacting concrete

Procedia PDF Downloads 133
6300 Can (E-)Mentoring Be a Tool for the Career of Future Translators?

Authors: Ana Sofia Saldanha

Abstract:

The answer is yes. Globalization is changing the translation world day after day, year after year. The need to know more about new technologies, clients, companies, project management and social networks is becoming more and more demanding and increasingly competitive. The great majority of recently graduated translators do not know where to go, what to do or even whom to contact to start their careers in translation. It is well known that there are innumerable webinars, books, blogs and webpages with the so-called "tips to become a professional translator", indicating, for example, what to do, what not to do, rates, how your resume should look, etc., but are these pieces of advice coming from real translators? Translators who work daily with clients, who understand their demands, requests and questions? As far as today's trends go, the answer is no. Most of these pieces of advice are just theoretical and come from "brilliant minds" who are more interested in spreading their word and winning "likes" to become, in some way, "important people" in some area. Mentoring is, indeed, a highly important tool to help and guide new translators starting their careers. Effective and well-oriented mentoring is a powerful way to orient these translators on how to create their resumes, where to send them, how to approach clients, how to answer emails and how to negotiate rates efficiently. Mentoring is a crucial tool, and even a kind of "psychological trigger" when properly delivered by professional and experienced translators, for the desired career development. The advice and orientation sessions, which can be done 100% online, using Skype for example, are almost a "weapon" to break down the barriers created by opinions, by influences or even by universities. This new orientation trend is the future path for new translators and for the translation industry, whose professionals and universities must update their way of approaching the real translation world; therefore, minds and spirits need to be opened and engaged in this new trend of developing skills.

Keywords: mentoring, orientation, professional follow-up, translation

Procedia PDF Downloads 106
6299 Voxel Models as Input for Heat Transfer Simulations with Siemens NX Based on X-Ray Microtomography Images of Random Fibre Reinforced Composites

Authors: Steven Latré, Frederik Desplentere, Ilya Straumit, Stepan V. Lomov

Abstract:

A method is proposed in order to create a three-dimensional finite element model representing fibre reinforced insulation materials for the simulation software Siemens NX. VoxTex software, a tool for quantification of µCT images of fibrous materials, is used for the transformation of microtomography images of random fibre reinforced composites into finite element models. An automatic tool was developed to execute the import of the models to the thermal solver module of Siemens NX. The paper describes the numerical tools used for the image quantification and the transformation and illustrates them on several thermal simulations of fibre reinforced insulation blankets filled with low thermal conductive fillers. The calculation of thermal conductivity is validated by comparison with the experimental data.

Keywords: analysis, modelling, thermal, voxel

Procedia PDF Downloads 280
6298 Optimization of Monitoring Networks for Air Quality Management in Urban Hotspots

Authors: Vethathirri Ramanujam Srinivasan, S. M. Shiva Nagendra

Abstract:

Air quality management in urban areas is a serious concern in both developed and developing countries. In this regard, more air quality monitoring stations are planned to mitigate air pollution in urban areas. In India, the Central Pollution Control Board has set up 574 air quality monitoring stations across the country and proposes to set up another 500 stations in the next few years. The number of monitoring stations for each city has been decided based on population data. Setting up ambient air quality monitoring stations and their operation and maintenance are highly expensive. Therefore, there is a need to optimize monitoring networks for air quality management. The present paper discusses various methods, such as the Indian Standards (IS) method, the US EPA method and the European Union (EU) method, to arrive at the minimum number of air quality monitoring stations. In addition, the optimization of the rain-gauge method and the Inverse Distance Weighted (IDW) method using a Geographical Information System (GIS) is also explored in the present work for the design of the air quality network in Chennai city. In summary, an additional 18 stations are required for Chennai city, and the potential monitoring locations, with their corresponding land use patterns, are ranked and identified from the 1 km x 1 km grids.
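
As one ingredient of such an analysis, the sketch below interpolates station readings over a 1 km grid with inverse distance weighting. The station coordinates and pollutant values are invented, and the GIS-based ranking of candidate grids is not shown.

```python
# Hedged sketch of inverse distance weighted (IDW) interpolation over a grid.
# Station locations and PM10 readings below are hypothetical.
import numpy as np

stations = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 7.0], [9.0, 8.0]])   # km grid coords
pm10 = np.array([60.0, 85.0, 45.0, 70.0])                               # ug/m3 readings

def idw(point, xy, values, power=2.0):
    d = np.linalg.norm(xy - point, axis=1)
    if np.any(d < 1e-9):                      # exactly on a station
        return values[np.argmin(d)]
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# interpolate on a 1 km x 1 km grid covering a 10 km x 10 km area
grid = [(x + 0.5, y + 0.5) for x in range(10) for y in range(10)]
surface = np.array([idw(np.array(g), stations, pm10) for g in grid]).reshape(10, 10)
print("interpolated PM10, min/max over the grid:", surface.min(), surface.max())
```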

Keywords: air quality monitoring network, inverse distance weighted method, population based method, spatial variation

Procedia PDF Downloads 176
6297 Least-Square Support Vector Machine for Characterization of Clusters of Microcalcifications

Authors: Baljit Singh Khehra, Amar Partap Singh Pharwaha

Abstract:

Clusters of microcalcifications (MCCs) are the most frequent symptom of ductal carcinoma in situ (DCIS) recognized by mammography. The Least-Square Support Vector Machine (LS-SVM) is a variant of the standard SVM. In this paper, LS-SVM is proposed as a classifier for classifying MCCs as benign or malignant based on relevant features extracted from enhanced mammograms. To establish the credibility of the LS-SVM classifier for classifying MCCs, a comparative evaluation of the relative performance of the LS-SVM classifier for different kernel functions is made. For this comparative evaluation, the confusion matrix and ROC analysis are used. Experiments are performed on data extracted from mammogram images of the DDSM database. A total of 380 suspicious areas, containing 235 malignant and 145 benign samples, are collected from mammogram images of the DDSM database. A set of 50 features is calculated for each suspicious area. After this, an optimal subset of the 23 most suitable features is selected from the 50 features by Particle Swarm Optimization (PSO). The results of the proposed study are quite promising.
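
A minimal sketch of LS-SVM training with an RBF kernel is shown below: the dual problem reduces to a single linear system for the bias and the support values. The synthetic two-class data stand in for the 23 selected mammographic features.

```python
# Hedged sketch of a least-squares SVM classifier with an RBF kernel: training
# reduces to solving one linear system for (b, alpha). The two synthetic clusters
# stand in for the benign/malignant feature vectors of the study.
import numpy as np

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(3, 1, (40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])           # benign = -1, malignant = +1

def rbf(A, B, sigma=1.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

gamma = 10.0                       # regularization parameter
n = len(y)
K = rbf(X, X)
# LS-SVM linear system:  [ 0   1^T         ] [b    ]   [0]
#                        [ 1   K + I/gamma ] [alpha] = [y]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
rhs = np.concatenate([[0.0], y])
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

def predict(Xnew):
    return np.sign(rbf(Xnew, X) @ alpha + b)

test = np.array([[0.2, -0.3], [3.1, 2.8]])
print("predicted classes:", predict(test))   # points near each cluster -> -1 and +1
```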

Keywords: clusters of microcalcifications, ductal carcinoma in situ, least-square support vector machine, particle swarm optimization

Procedia PDF Downloads 347
6296 Improved Blood Glucose-Insulin Monitoring with Dual-Layer Predictive Control Design

Authors: Vahid Nademi

Abstract:

In response to the widespread use of wearable medical devices equipped with a continuous glucose monitor (CGM) and an insulin pump, advanced control methods are still needed to get the full benefit of these devices. Unlike costly clinical trials, implementing effective insulin-glucose control strategies can provide significant benefits to patients suffering from chronic diseases such as diabetes. This study deals with the key role of a two-layer insulin-glucose regulator based on a model-predictive-control (MPC) scheme, so that the patient's predicted glucose profile is kept in compliance with the insulin level injected automatically through the insulin pump. This is achieved by an iterative optimization algorithm called the integrated perturbation analysis and sequential quadratic programming (IPA-SQP) solver, which handles uncertainties due to unexpected variations in glucose-insulin values and the body's characteristics. The feasibility of the discussed control approach is also evaluated by means of numerical simulations of two case scenarios using measured data. The obtained results are presented to verify the superior and reliable performance of the proposed control scheme, with no negative impact on patient safety.
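
The receding-horizon idea behind such a regulator is sketched below: at each step, insulin doses over a short horizon are optimized so that a toy linear glucose model tracks a target, and only the first dose is applied. The model, parameters and weights are illustrative assumptions and do not represent the IPA-SQP solver or a validated physiological model.

```python
# Hedged sketch of receding-horizon glucose-insulin control with a toy linear
# model. Dynamics, targets and weights are illustrative only.
import numpy as np
from scipy.optimize import minimize

A_G, B_I, D = 0.98, -2.0, 0.6        # glucose persistence, insulin effect, meal drift
TARGET, HORIZON, U_MAX = 110.0, 6, 3.0

def simulate(g0, doses):
    g, traj = g0, []
    for u in doses:
        g = A_G * g + B_I * u + D    # next glucose (mg/dl), toy dynamics
        traj.append(g)
    return np.array(traj)

def mpc_step(g0, w_insulin=0.5):
    def objective(u):
        traj = simulate(g0, u)
        return np.sum((traj - TARGET) ** 2) + w_insulin * np.sum(u ** 2)
    res = minimize(objective, x0=np.zeros(HORIZON),
                   bounds=[(0.0, U_MAX)] * HORIZON)
    return res.x[0]                   # apply only the first dose (receding horizon)

glucose, history = 180.0, []
for step in range(20):
    dose = mpc_step(glucose)
    glucose = A_G * glucose + B_I * dose + D
    history.append((round(glucose, 1), round(dose, 2)))
print(history[:5], "...")
```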

Keywords: blood glucose monitoring, insulin pump, predictive control, optimization

Procedia PDF Downloads 130