Search results for: optimization algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4942

3052 Integrated Genetic-A* Graph Search Algorithm Decision Model for Evaluating Cost and Quality of School Renovation Strategies

Authors: Yu-Ching Cheng, Yi-Kai Juan, Daniel Castro

Abstract:

Energy consumption of buildings has been an increasing concern for researchers and practitioners in the last decade. Sustainable building renovation can reduce energy consumption and carbon dioxide emissions; meanwhile, it can also extend existing buildings' useful life and facilitate environmental sustainability while providing social and economic benefits to society. School buildings differ from other designed spaces in that they are more crowded and host the largest portion of daily activities and occupants. Strategies that not only reduce energy use but also improve students' learning environments have become a significant subject in the development of sustainable school buildings. A decision model is developed in this study to solve complicated and large-scale combinatorial, discrete, and deterministic problems such as school renovation projects. The task of this model is to automatically search for the most cost-effective (lower cost and higher quality) renovation strategies. The search for optimal school building renovation solutions is by nature a large-scale zero-one programming problem. A* is suitable for solving deterministic problems due to its stable and effective search process, while genetic algorithms (GA) provide opportunities to acquire global optimal solutions in a short time via an indeterminate, probability-based search process. These two algorithms are combined in this study to consider trade-offs between renovation cost and improved quality, and the resulting decision model is able to evaluate current school environmental conditions and suggest an optimal scheme of sustainable school building renovation strategies. Through adoption of this decision model, school managers can overcome existing limitations and transform school buildings into spaces more beneficial to students and friendly to the environment.
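To make the genetic-algorithm side of the hybrid concrete, the sketch below evolves a zero-one vector of renovation actions under a budget; the actions, costs, quality gains, and GA settings are invented for illustration and are not the paper's model, which additionally couples the GA with A* search.

```python
import random

# Hypothetical renovation actions as (cost, quality gain); values are illustrative.
ACTIONS = [(120, 8), (60, 3), (200, 15), (90, 6), (150, 10), (40, 2)]
BUDGET = 400

def fitness(bits):
    cost = sum(c for (c, q), b in zip(ACTIONS, bits) if b)
    quality = sum(q for (c, q), b in zip(ACTIONS, bits) if b)
    return quality - max(0, cost - BUDGET)  # soft penalty for exceeding the budget

def evolve(pop_size=30, generations=200, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in ACTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                        # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(ACTIONS))           # one-point crossover
            child = [1 - g if random.random() < p_mut else g  # bit-flip mutation
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```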

Keywords: decision model, school buildings, sustainable renovation, genetic algorithm, A* search algorithm

Procedia PDF Downloads 124
3051 Revolutionizing Accounting: Unleashing the Power of Artificial Intelligence

Authors: Sogand Barghi

Abstract:

The integration of artificial intelligence (AI) in accounting practices is reshaping the landscape of financial management. This paper explores the innovative applications of AI in the realm of accounting, emphasizing its transformative impact on efficiency, accuracy, decision-making, and financial insights. By harnessing AI's capabilities in data analysis, pattern recognition, and automation, accounting professionals can redefine their roles, elevate strategic decision-making, and unlock unparalleled value for businesses. This paper delves into AI-driven solutions such as automated data entry, fraud detection, predictive analytics, and intelligent financial reporting, highlighting their potential to revolutionize the accounting profession. Artificial intelligence has swiftly emerged as a game-changer across industries, and accounting is no exception. This paper seeks to illuminate the profound ways in which AI is reshaping accounting practices, transcending conventional boundaries, and propelling the profession toward a new era of efficiency and insight-driven decision-making. One of the most impactful applications of AI in accounting is automation. Tasks that were once labor-intensive and time-consuming, such as data entry and reconciliation, can now be streamlined through AI-driven algorithms. This not only reduces the risk of errors but also allows accountants to allocate their valuable time to more strategic and analytical tasks. AI's ability to analyze vast amounts of data in real time enables it to detect irregularities and anomalies that might go unnoticed by traditional methods. Fraud detection algorithms can continuously monitor financial transactions, flagging any suspicious patterns and thereby bolstering financial security. AI-driven predictive analytics can forecast future financial trends based on historical data and market variables. This empowers organizations to make informed decisions, optimize resource allocation, and develop proactive strategies that enhance profitability and sustainability. Traditional financial reporting often involves extensive manual effort and data manipulation. With AI, reporting becomes more intelligent and intuitive. Automated report generation not only saves time but also ensures accuracy and consistency in financial statements. While the potential benefits of AI in accounting are undeniable, there are challenges to address. Data privacy and security concerns, the need for continuous learning to keep up with evolving AI technologies, and potential biases within algorithms demand careful attention. The convergence of AI and accounting marks a pivotal juncture in the evolution of financial management. By harnessing the capabilities of AI, accounting professionals can transcend routine tasks, becoming strategic advisors and data-driven decision-makers. The applications discussed in this paper underline the transformative power of AI, setting the stage for an accounting landscape that is smarter, more efficient, and more insightful than ever before. The future of accounting is here, and it's driven by artificial intelligence.
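As a toy illustration of the fraud-detection idea, the sketch below flags anomalous ledger entries with scikit-learn's IsolationForest; the two features and all values are synthetic and do not represent any real accounting workflow.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic ledger rows: [amount, hour of day] -- purely illustrative features.
normal = rng.normal(loc=[120.0, 14.0], scale=[40.0, 3.0], size=(1000, 2))
odd = rng.normal(loc=[5000.0, 3.0], scale=[500.0, 1.0], size=(10, 2))
transactions = np.vstack([normal, odd])

model = IsolationForest(contamination=0.01, random_state=0)
flags = model.fit_predict(transactions)        # -1 marks suspected anomalies
print("flagged rows:", np.where(flags == -1)[0])
```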

Keywords: artificial intelligence, accounting, automation, predictive analytics, financial reporting

Procedia PDF Downloads 77
3050 Impact of Climate Change on Flow Regime in Himalayan Basins, Nepal

Authors: Tirtha Raj Adhikari, Lochan Prasad Devkota

Abstract:

This research studied the hydrological regime of three glacierized river basins in the Khumbu, Langtang, and Annapurna regions of Nepal using the Hydrologiska Byråns Vattenbalansavdelning (HBV) model, HBV-light 3.0. Future discharge scenarios are also studied using downscaled climate data derived from a statistical downscaling method. General Circulation Models (GCMs) successfully simulate future climate variability and climate change on a global scale; however, poor spatial resolution constrains their application for impact studies at a regional or local level. The downscaled precipitation and temperature data from the Coupled Global Circulation Model 3 (CGCM3) were used for the climate projection under the A2 and A1B SRES scenarios. In addition, observed historical temperature, precipitation, and discharge data were collected from 14 different hydro-meteorological locations for the implementation of this study, which includes watershed and hydro-meteorological characterization, trend analysis, and water balance computation. The simulated precipitation and temperature were corrected for bias before being implemented in the HBV-light 3.0 conceptual rainfall-runoff model to predict the flow regime, in which the GAP optimization approach followed by calibration was used to obtain several parameter sets that finally reproduced the observed streamflow. The analysis showed increasing trends in annual as well as seasonal precipitation (except in summer) during the period 2001-2060 for both the A2 and A1B scenarios over the three basins under investigation. In these river basins, the model projected warmer days in every season over the entire period from 2001 to 2060 for both scenarios. These warming trends are higher in maximum than in minimum temperatures throughout the year, indicating an increasing daily temperature range due to the recent global warming phenomenon. Furthermore, summer discharge shows a decreasing trend in the Langtang Khola (Langtang region) but an increasing trend in the Modi Khola (Annapurna region) and Dudh Koshi (Khumbu region) river basins. The changes in flow regime are more pronounced during the later parts of the future decades than the earlier parts in all basins. Annual water surpluses of 1419 mm, 177 mm, and 49 mm are observed in the Annapurna, Langtang, and Khumbu regions, respectively.
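For instance, a common form of bias correction is linear scaling, sketched below with invented numbers: a multiplicative factor for precipitation and an additive offset for temperature. This is a minimal illustration of the idea, not necessarily the exact correction applied in the study.

```python
import numpy as np

def correct_precipitation(obs, sim_hist, sim_future):
    """Linear scaling: multiplicative factor that matches the observed mean."""
    return sim_future * (obs.mean() / sim_hist.mean())

def correct_temperature(obs, sim_hist, sim_future):
    """Linear scaling: additive offset that matches the observed mean."""
    return sim_future + (obs.mean() - sim_hist.mean())

# Illustrative monthly precipitation series in mm (not data from the study).
obs_p = np.array([110.0, 95.0, 130.0, 80.0])
sim_p = np.array([90.0, 70.0, 100.0, 60.0])
fut_p = np.array([85.0, 75.0, 120.0, 65.0])
print(correct_precipitation(obs_p, sim_p, fut_p).round(1))
```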

Keywords: temperature, precipitation, water discharge, water balance, global warming

Procedia PDF Downloads 345
3049 Automation of Finite Element Simulations for the Design Space Exploration and Optimization of Type IV Pressure Vessel

Authors: Weili Jiang, Simon Cadavid Lopera, Klaus Drechsler

Abstract:

Fuel cell vehicles have become the most competitive solution for the transportation sector in the hydrogen economy. The Type IV pressure vessel is currently the most popular and widely developed technology for on-board storage, owing to its high reliability and relatively low cost. Due to the stringent requirements on mechanical performance, the pressure vessel requires a great amount of composite material, a major cost driver for hydrogen tanks. Evidently, the optimization of the composite layup design shows great potential for reducing overall material usage, yet it requires a comprehensive understanding of the underlying mechanisms as well as the influence of different design parameters on mechanical performance. Given the materials and manufacturing processes by which Type IV pressure vessels are produced, their design and optimization are a nuanced subject. The manifold of possible stacking sequences and fiber orientations has an outstanding effect on vessel strength due to the anisotropy of carbon fiber composites, which makes the design space high-dimensional. Each variation of the design parameters requires computational resources. Using finite element analysis to evaluate different designs is the most common method; however, the modeling, setup, and simulation process can be very time-consuming and result in high computational cost. For this reason, it is necessary to build a reliable automation scheme to set up and analyze the diverse composite layups. In this research, the simulation process for different tank designs with respect to various parameters is conducted and automated in the commercial finite element analysis framework Abaqus. Notably, the model of the composite overwrap is generated automatically using the Abaqus-Python scripting interface. The prediction of the winding angle of each layer and the corresponding thickness variation in the dome region is the most crucial step of the modeling; it is calculated and implemented using analytical methods. Subsequently, the different composite layups are simulated as axisymmetric models to limit the computational complexity and reduce the calculation time. Finally, the results are evaluated and compared with respect to the ultimate tank strength. By automatically modeling, evaluating, and comparing various composite layups, this system is applicable to the optimization of tank structures. As mentioned above, the mechanical performance of the pressure vessel is highly dependent on the composite layup, which requires a large number of simulations. Consequently, automating the simulation process provides a rapid way to compare the various designs and indicate the optimum one. Moreover, this automation process can also be operated to create a data bank of layups and corresponding mechanical properties with few preliminary configuration steps for further case analyses; machine learning, for example, could then obtain the optimum directly from the data pool without the simulation process.
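One common analytical ingredient for such dome modeling is Clairaut's relation for geodesic winding, sin α(r) = r₀/r, together with a fiber-mass-conservation estimate of the thickness build-up toward the polar opening. The sketch below, with invented dimensions, illustrates these relations; the paper does not state which analytical method it uses, so treat this as an assumption.

```python
import numpy as np

def winding_angle_deg(r, r0):
    """Clairaut's relation for a geodesic path on the dome: sin(alpha) = r0 / r."""
    return np.degrees(np.arcsin(np.clip(r0 / r, 0.0, 1.0)))

def layer_thickness(r, r_cyl, t_cyl, r0):
    """Thickness growth toward the polar opening from fiber-mass conservation.
    Note the estimate diverges as r approaches r0 and needs smoothing there."""
    alpha = np.arcsin(np.clip(r0 / r, 0.0, 1.0))
    alpha_cyl = np.arcsin(r0 / r_cyl)
    return t_cyl * (r_cyl * np.cos(alpha_cyl)) / (r * np.cos(alpha))

r = np.linspace(30.0, 100.0, 4)   # dome parallel radii in mm (illustrative)
print(winding_angle_deg(r, r0=20.0).round(1))
print(layer_thickness(r, r_cyl=100.0, t_cyl=0.6, r0=20.0).round(3))
```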

Keywords: type IV pressure vessels, carbon composites, finite element analysis, automation of simulation process

Procedia PDF Downloads 139
3048 Parallel Multisplitting Methods for Differential Systems

Authors: Malika El Kyal, Ahmed Machmoum

Abstract:

We prove the superlinear convergence of asynchronous multisplitting methods applied to differential equations. This study is based on the technique of nested sets, which makes it possible to specify the kind of convergence obtained in the asynchronous mode. The main characteristic of an asynchronous mode is that the local algorithms do not have to wait for predetermined messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Note that synchronous algorithms in the computer science sense are particular cases of our formulation of asynchronous ones.
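For readers unfamiliar with multisplitting, the sketch below shows the basic synchronous fixed-point form on a small linear system, using two splittings A = M_l − N_l with diagonal weighting matrices E_l that sum to the identity; the matrices are invented, and the paper's subject is the asynchronous variant applied to differential systems.

```python
import numpy as np

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

# Two splittings A = M_l - N_l: Jacobi-like (diagonal) and Gauss-Seidel-like (lower).
M1 = np.diag(np.diag(A)); N1 = M1 - A
M2 = np.tril(A);          N2 = M2 - A
E1 = 0.5 * np.eye(3); E2 = np.eye(3) - E1   # weights summing to the identity

x = np.zeros(3)
for _ in range(60):  # x_{k+1} = sum_l E_l M_l^{-1} (N_l x_k + b)
    x = E1 @ np.linalg.solve(M1, N1 @ x + b) + E2 @ np.linalg.solve(M2, N2 @ x + b)
print(x, np.linalg.solve(A, b))  # iterate vs. direct solution
```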

Keywords: parallel methods, asynchronous mode, multisplitting, ODE

Procedia PDF Downloads 527
3047 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangian relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
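A drastically reduced sketch of the modeling idea is shown below with the open-source PuLP/CBC stack in place of CPLEX: one agent sweeps a toy corridor, and the objective collects the prior probability mass of visited cells as a crude stand-in for the paper's cumulative detection probability with feedback. The grid, horizon, and probabilities are invented.

```python
# pip install pulp
import pulp

# Toy 1-D corridor of 4 cells, 3 time steps, one agent starting in cell 0.
# p[c] is a hypothetical prior probability that the target is in cell c.
p = {0: 0.1, 1: 0.4, 2: 0.2, 3: 0.3}
cells, T = range(4), 3
adj = {c: {c, max(c - 1, 0), min(c + 1, 3)} for c in cells}   # stay / left / right

prob = pulp.LpProblem("sar_path", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", [(t, c) for t in range(T + 1) for c in cells], cat="Binary")
y = pulp.LpVariable.dicts("seen", cells, cat="Binary")

prob += pulp.lpSum(p[c] * y[c] for c in cells)       # probability mass swept
prob += x[0, 0] == 1                                 # start position
for t in range(T + 1):
    prob += pulp.lpSum(x[t, c] for c in cells) == 1  # exactly one cell per step
for t in range(T):
    for c in cells:                                  # moves restricted to adjacency
        prob += x[t + 1, c] <= pulp.lpSum(x[t, d] for d in cells if c in adj[d])
for c in cells:
    prob += y[c] <= pulp.lpSum(x[t, c] for t in range(T + 1))  # must visit to sweep

prob.solve(pulp.PULP_CBC_CMD(msg=False))
path = [c for t in range(T + 1) for c in cells if x[t, c].value() == 1]
print(path, pulp.value(prob.objective))
```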

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 376
3046 Development of a Bioprocess Technology for the Production of Vibrio midae, a Probiotic for Use in Abalone Aquaculture

Authors: Ghaneshree Moonsamy, Nodumo N. Zulu, Rajesh Lalloo, Suren Singh, Santosh O. Ramchuran

Abstract:

The abalone industry of South Africa is under severe pressure due to illegal harvesting and poaching of this seafood delicacy. Abalones are harvested excessively; as a result, the animals do not have a chance to replace themselves in their habitats, resulting in a drastic decrease in natural stocks. Abalone has an extremely slow growth rate and takes approximately four years to reach a marketable size; therefore, it is imperative to investigate methods to boost the overall growth rate and immunity of the animal. Research at the University of Cape Town (UCT) resulted in the isolation of two microorganisms from the gut of the abalone, a yeast isolate Debaryomyces hansenii and a bacterial isolate Vibrio midae, which were characterised for their probiotic abilities. This work resulted in an internationally competitive concept technology that was patented. The next stage of research was to develop a suitable bioprocess to enable commercial production. Numerous steps were taken to develop an efficient production process for V. midae, one of the isolates found by UCT. The initial stages of research involved the development of a stable and robust inoculum and the optimization of physiological growth parameters such as temperature and pH. A range of temperature and pH conditions were evaluated, and the data obtained revealed an optimum growth temperature of 30°C and a pH of 6.5. Once these critical growth parameters were established, further media optimization studies were performed. Corn steep liquor (CSL) and high test molasses (HTM) were selected as suitable alternatives to more expensive, conventionally used growth medium additives. The optimization of the CSL (6.4 g.l⁻¹) and HTM (24 g.l⁻¹) concentrations in the growth medium resulted in a 180% increase in cell concentration, a 5716-fold increase in cell productivity, and a 97.2% decrease in the material cost of production in comparison to the conventional growth conditions and parameters used at the onset of the study. In addition, a stable market-ready liquid probiotic product, based on the viable but not culturable (VBNC) state of Vibrio midae cells, was developed during the downstream processing aspect of the study. The demonstration of this technology at full manufacturing scale has further enhanced the attractiveness and commercial feasibility of the production process.

Keywords: probiotics, abalone aquaculture, bioprocess technology, manufacturing scale technology development

Procedia PDF Downloads 157
3045 Simulation of Wet Scrubbers for Flue Gas Desulfurization

Authors: Anders Schou Simonsen, Kim Sorensen, Thomas Condra

Abstract:

Wet scrubbers are used for flue gas desulfurization by injecting water directly into the flue gas stream from a set of sprayers. The water droplets flow freely inside the scrubber and flow down along the scrubber walls as a thin wall film while reacting with the gas phase to remove SO₂. This complex multiphase phenomenon can be divided into three main contributions: the continuous gas phase, the liquid droplet phase, and the liquid wall film phase. This study proposes a complete model, where all three main contributions are taken into account and resolved using OpenFOAM for the continuous gas phase and MATLAB for the liquid droplet and wall film phases. The 3D continuous gas phase is composed of five species: CO₂, H₂O, O₂, SO₂, and N₂, which are resolved along with momentum, energy, and turbulence. Source terms are present for four species, energy, and momentum, which affect the steady-state solution. The liquid droplet phase experiences breakup, collisions, dynamics, internal chemistry, evaporation and condensation, species mass transfer, energy transfer, and wall film interactions. Numerous sub-models have been implemented and coupled to realise the above-mentioned phenomena. The liquid wall film experiences impingement, acceleration, atomization, separation, internal chemistry, evaporation and condensation, species mass transfer, and energy transfer, which have all been resolved using numerous sub-models as well. The continuous gas phase has been coupled with the liquid phases through source terms by an approach in which the two software packages are coupled using a link structure. The complete CFD model has been verified using 16 experimental tests from an existing scrubber installation, where a gradient-based pattern search optimization algorithm has been used to tune numerous model parameters to match the experimental results. The CFD model needed to be fast to evaluate in order to apply this optimization routine, which required approximately 1000 simulations. The results show that the complex multiphase phenomena governing wet scrubbers can be resolved in a single model. The optimization routine was able to tune the model to accurately predict the performance of an existing installation. Furthermore, the study shows that a coupling between OpenFOAM and MATLAB is realizable, where the data and source term exchange increases the computational requirements by approximately 5%. This allows for exploiting the benefits of both software programs.
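To illustrate the tuning step in isolation, the sketch below implements a basic derivative-free compass (pattern) search on an invented two-parameter mismatch function standing in for the CFD-versus-experiment error; the study itself tuned many more parameters with a gradient-based pattern search variant.

```python
import numpy as np

def pattern_search(f, x0, step=0.5, tol=1e-6, max_iter=500):
    """Compass search: poll +/- step along each axis, accept improvements,
    shrink the step when no polled point improves the objective."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):
            for s in (step, -step):
                trial = x.copy()
                trial[i] += s
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x, fx

# Invented stand-in for the scrubber-model-vs-measurement mismatch.
mismatch = lambda p: (p[0] - 1.2) ** 2 + 10.0 * (p[1] - 0.3) ** 2
print(pattern_search(mismatch, [0.0, 0.0]))
```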

Keywords: desulfurization, discrete phase, scrubber, wall film

Procedia PDF Downloads 276
3044 Multi-Objective Optimization of Assembly Manufacturing Factory Setups

Authors: Andreas Lind, Aitor Iriondo Pascual, Dan Hogberg, Lars Hanson

Abstract:

Factory setup lifecycles are most often described and prepared in CAD environments; the preparation is based on experience and inputs from several cross-disciplinary processes. Early in the factory setup preparation, a so-called block layout is created. The intention is to describe a high-level view of the intended factory setup and to claim area reservations and allocations. Factory areas are then blocked, i.e., targeted to be used for specific intended resources and processes, later redefined with detailed factory setup layouts. Each detailed layout is based on the block layout and inputs from cross-disciplinary preparation processes, such as manufacturing sequence, productivity, workers' workplace requirements, and resource setup preparation. However, this activity is often not carried out with all variables considered simultaneously, which entails a risk of sub-optimizing the detailed layout based on manual decisions. Therefore, this work aims to realize a digital method for assembly manufacturing layout planning where productivity, area utilization, and ergonomics can be considered simultaneously in a cross-disciplinary manner. The purpose of the digital method is to support engineers in finding optimized designs of detailed layouts for assembly manufacturing factories, thereby facilitating better decisions regarding setups of future factories. Input datasets are company-specific descriptions of required dimensions for specific area reservations, such as defined dimensions of a worker's workplace, material façades, aisles, and the sequence to realize the product assembly manufacturing process. To test and iteratively develop the digital method, a demonstrator has been developed as an adaptation of existing software that simulates and proposes optimized designs of detailed layouts. Since the method is to consider productivity, ergonomics, area utilization, and constraints from the automatically generated block layout, a multi-objective optimization approach is utilized. In the demonstrator, the input data are sent to the simulation software Industrial Path Solutions (IPS). Based on the input and Lua scripts, the IPS software generates a block layout in compliance with the company's defined dimensions of area reservations. Communication is then established between IPS and the software EPP (Ergonomics in Productivity Platform), including intended resource descriptions, the assembly manufacturing process, and manikin (digital human) resources. Using multi-objective optimization approaches, the EPP software then calculates layout proposals that are iteratively sent to IPS, where they are simulated and rendered, following the rules and regulations defined in the block layout as well as the productivity and ergonomics constraints and objectives. The software demonstrator is promising: it can handle several parameters to optimize the detailed layout simultaneously and can put forward several proposals. It can optimize multiple parameters or weight the parameters to fine-tune the optimal result of the detailed layout. The intention of the demonstrator is to make the preparation between cross-disciplinary silos transparent and to achieve a common preparation of the assembly manufacturing factory setup, thereby facilitating better decisions.
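A minimal sketch of the weighted multi-objective scoring idea follows, with invented layout candidates and criterion weights; the demonstrator itself evaluates full CAD layouts inside IPS and EPP rather than precomputed scores.

```python
# Hypothetical layout candidates scored on three normalized criteria (higher is better).
candidates = {
    "layout_A": {"productivity": 0.82, "ergonomics": 0.70, "area_use": 0.90},
    "layout_B": {"productivity": 0.75, "ergonomics": 0.88, "area_use": 0.85},
    "layout_C": {"productivity": 0.90, "ergonomics": 0.60, "area_use": 0.95},
}
weights = {"productivity": 0.4, "ergonomics": 0.4, "area_use": 0.2}  # tunable trade-off

def score(criteria):
    return sum(weights[k] * v for k, v in criteria.items())

best = max(candidates, key=lambda name: score(candidates[name]))
print(best, round(score(candidates[best]), 3))
```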

Keywords: factory setup, multi-objective, optimization, simulation

Procedia PDF Downloads 157
3043 Sensitivity Analysis in Fuzzy Linear Programming Problems

Authors: S. H. Nasseri, A. Ebrahimnejad

Abstract:

Fuzzy set theory has been applied to many fields, such as operations research, control theory, and management sciences. In this paper, we consider two classes of fuzzy linear programming (FLP) problems: Fuzzy number linear programming and linear programming with trapezoidal fuzzy variables problems. We state our recently established results and develop fuzzy primal simplex algorithms for solving these problems. Finally, we give illustrative examples.
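One standard, simplified route related to these methods is to defuzzify the trapezoidal coefficients with a linear ranking function, R(a, b, c, d) = (a + b + c + d)/4, and solve the resulting crisp LP; the sketch below does this with SciPy on invented numbers. The paper's fuzzy primal simplex algorithms work on the fuzzy problem directly, so this is a stand-in, not the authors' method.

```python
from scipy.optimize import linprog

def rank(tfn):
    """Linear ranking function for a trapezoidal fuzzy number (a, b, c, d)."""
    a, b, c, d = tfn
    return (a + b + c + d) / 4.0

# Fuzzy profits per unit for two products, as trapezoidal numbers (illustrative).
fuzzy_profit = [(3, 4, 5, 6), (1, 2, 3, 4)]
c = [-rank(t) for t in fuzzy_profit]     # negate because linprog minimizes

A_ub = [[1, 2], [3, 1]]                  # crisp resource constraints
b_ub = [14, 18]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(res.x, -res.fun)                   # optimal plan and ranked objective value
```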

Keywords: fuzzy linear programming, fuzzy numbers, duality, sensitivity analysis

Procedia PDF Downloads 570
3042 Optimum Performance of the Gas Turbine Power Plant Using Adaptive Neuro-Fuzzy Inference System and Statistical Analysis

Authors: Thamir K. Ibrahim, M. M. Rahman, Marwah Noori Mohammed

Abstract:

This study deals with modeling and performance enhancement of a gas-turbine combined cycle power plant. Clean and safe energy is among the greatest challenges in meeting the requirements of a green environment. These requirements have ended the long-standing governing authority of the steam turbine (ST) in world power generation, and the gas turbine (GT) is replacing it. Therefore, it is necessary to predict the characteristics of the GT system and optimize its operating strategy by developing a simulation system. An integrated model and simulation code for exploiting the performance of gas turbine power plants is developed using MATLAB. The performance code for heavy-duty GT and CCGT power plants is validated against the real Baiji GT and MARAFIQ CCGT power plants, and the results have been satisfactory. A new correlation technique was considered for all types of simulation data, whose coefficient of determination (R²) was calculated as 0.9825. Some of the most recently published correlations were checked on the Baiji GT plant, and error analysis was applied. The GT performance was judged by particular parameters selected from the simulation model, and the Adaptive Neuro-Fuzzy Inference System (ANFIS), an advanced optimization technology, was also utilized. The best thermal efficiency and power output attained were about 56% and 345 MW, respectively. Thus, the operating conditions and ambient temperature strongly influence the overall performance of the GT. The optimum efficiency and power are found at higher turbine inlet temperatures. It can be concluded that the developed models are powerful tools for estimating the overall performance of GT plants.

Keywords: gas turbine, optimization, ANFIS, performance, operating conditions

Procedia PDF Downloads 429
3041 Optimization Aluminium Design for the Facade Second Skin toward Visual Comfort: Case Studies & Dialux Daylighting Simulation Model

Authors: Yaseri Dahlia Apritasari

Abstract:

Visual comfort is an important need of building occupants. It can be fulfilled through natural lighting (daylighting) and artificial lighting. One strategy to optimize natural lighting is the design of a second-skin facade. This strategy can reduce glare and help fulfill the need for visual comfort. However, the design strategy alone cannot guarantee the light intensity required for visual comfort, because the material, design, and opening percentage of the second-skin facade block sunlight. This paper discusses aluminum material for second-skin facade design that can fulfill optimal visual comfort, using the Multi Media Tower building as a case study. The research methodology combines quantitative and qualitative approaches through field observation, lighting measurements, and a visual comfort questionnaire. Simulation modeling (DIALux 4.13, 2016) was then used for three second-skin facade design models, through the following steps: (1) measuring the visual comfort factor, namely indoor and outdoor light intensity; (2) collecting visual comfort data from building occupants; (3) building models with different second-skin facade designs; (4) simulating and analyzing the light intensity value for each model against the occupants' visual comfort standard of 350 lux (Indonesian National Standard, 2010). The results show that optimization of aluminum material for second-skin facade design can meet optimal visual comfort for building occupants, and they yield a recommended aluminum opening percentage of the second-skin facade that achieves this comfort.

Keywords: aluminium material, facade, second skin, visual comfort

Procedia PDF Downloads 356
3040 Generative Design of Acoustical Diffuser and Absorber Elements Using Large-Scale Additive Manufacturing

Authors: Saqib Aziz, Brad Alexander, Christoph Gengnagel, Stefan Weinzierl

Abstract:

This paper explores a generative design, simulation, and optimization workflow for the integration of acoustical diffuser and/or absorber geometry with embedded coupled Helmholtz resonators for full-scale 3D printed building components. Large-scale additive manufacturing in conjunction with algorithmic CAD design tools enables a vast amount of control when creating geometry. This is advantageous regarding the increasing demands of comfort standards for indoor spaces and the use of more resourceful and sustainable construction methods and materials. The presented methodology highlights these new technological advancements and offers a multimodal and integrative design solution with the potential for immediate application in the AEC industry. In principle, the methodology can be applied to a wide range of structural elements that can be manufactured by additive manufacturing processes. The current paper focuses on a case study of an application for a biaxial load-bearing beam grillage made of reinforced concrete, which allows for a variety of applications through the combination of additively prefabricated semi-finished parts and in-situ concrete supplementation. The semi-prefabricated parts or formwork bodies form the basic framework of the supporting structure and at the same time have acoustic absorption and diffusion properties that are precisely acoustically programmed for the space underneath the structure. To this end, a hybrid validation strategy is being explored using a digital and cross-platform simulation environment, verified with physical prototyping. The iterative workflow starts with the generation of a parametric design model for the acoustical geometry using the algorithmic visual scripting editor Grasshopper3D inside the building information modeling (BIM) software Revit. Various geometric attributes (i.e., bottleneck and cavity dimensions) of the resonator are parameterized and fed to a numerical optimization algorithm which can modify the geometry with the goal of increasing absorption at resonance and increasing the bandwidth of the effective absorption range. Using Rhino.Inside and LiveLink for Revit, the generative model was imported directly into the multiphysics simulation environment COMSOL. The geometry was further modified and prepared for simulation in a semi-automated process. The incident and scattered pressure fields were simulated, from which the surface normal absorption coefficients were calculated. This reciprocal process was repeated to further optimize the geometric parameters. Subsequently, the numerical models were compared to a set of 3D concrete printed physical twin models, which were tested in a 0.25 m × 0.25 m impedance tube. The empirical results served to improve the starting parameter settings of the initial numerical model. The geometry resulting from the numerical optimization was finally returned to Grasshopper for further implementation in an interdisciplinary study.
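The first-order design intuition behind such resonators is the lumped Helmholtz formula f = (c/2π)·√(A/(V·L_eff)); the sketch below evaluates it for an invented bottleneck and cavity using a common end correction. The paper's COMSOL models resolve the full incident and scattered pressure fields, so this is only the starting estimate such an optimization would refine.

```python
import math

def helmholtz_resonance(neck_radius, neck_length, cavity_volume, c=343.0):
    """f = (c / 2*pi) * sqrt(A / (V * L_eff)) with a common end
    correction L_eff = L + 1.7 * r for the effective neck length."""
    area = math.pi * neck_radius ** 2
    l_eff = neck_length + 1.7 * neck_radius
    return (c / (2.0 * math.pi)) * math.sqrt(area / (cavity_volume * l_eff))

# Illustrative bottleneck and cavity dimensions in metres / cubic metres.
print(round(helmholtz_resonance(0.01, 0.03, 0.8e-3), 1), "Hz")
```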

Keywords: acoustical design, additive manufacturing, computational design, multimodal optimization

Procedia PDF Downloads 161
3039 Automatic Approach for Estimating the Protection Elements of Electric Power Plants

Authors: Mahmoud Mohammad Salem Al-Suod, Ushkarenko O. Alexander, Dorogan I. Olga

Abstract:

New algorithms using microprocessor systems have been proposed for protecting the diesel-generator unit in autonomous power systems. The software structure is designed to enhance the control automata of the system, in which every protection module of the diesel-generator unit encapsulates a finite state machine.

Keywords: diesel-generator unit, protection, state diagram, control system, algorithm, software components

Procedia PDF Downloads 422
3038 Optimization of Springback Prediction in U-Channel Process Using Response Surface Methodology

Authors: Muhamad Sani Buang, Shahrul Azam Abdullah, Juri Saedon

Abstract:

There is little effective guidance on the selection of design parameters for springback of advanced high-strength steel sheet metal in the U-channel cold forming process. This paper presents the development of a predictive model for springback in the U-channel process on advanced high-strength steel sheet employing Response Surface Methodology (RSM). The experiments were performed on dual-phase steel sheet (DP590) in the U-channel forming process, while a design of experiments (DoE) approach was used to investigate the effects of four factors, namely blank holder force (BHF), clearance (C), punch travel (Tp), and rolling direction (R), used as input parameters at two levels each in a full factorial design (2⁴). A statistical analysis of variance (ANOVA) showed that blank holder force (BHF), clearance (C), and punch travel (Tp) displayed a significant effect on the springback of the flange angle (β2) and wall opening angle (β1), while the rolling direction (R) factor was insignificant. The significant parameters were then optimized in order to reduce the springback behavior using a Central Composite Design (CCD) in RSM, and the optimum parameters were determined. A regression model for springback was developed. The effects of the individual parameters and their responses were also evaluated. The results obtained from the optimum model are in agreement with the experimental values.
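To show the mechanics of the CCD/RSM step on a reduced scale, the sketch below fits a second-order response surface over two coded factors and locates its stationary point; the design points follow a standard two-factor CCD, but the response values are invented, not the paper's measurements.

```python
import numpy as np

# Two-factor central composite design in coded units (factorial, axial, center).
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41],
              [0, 0], [0, 0], [0, 0]], dtype=float)
y = np.array([6.2, 4.9, 5.8, 4.1, 6.5, 4.3, 5.9, 5.2, 5.0, 5.1, 4.9])  # invented

x1, x2 = X[:, 0], X[:, 1]
D = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)   # second-order model coefficients
print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], beta.round(3))))

# Stationary point of the fitted surface (candidate optimum in coded units).
B = np.array([[2 * beta[4], beta[3]], [beta[3], 2 * beta[5]]])
print("stationary point:", np.linalg.solve(B, -beta[1:3]).round(3))
```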

Keywords: advanced high strength steel, u-channel process, springback, design of experiments, optimization, response surface methodology (RSM)

Procedia PDF Downloads 546
3037 Aggregating Buyers and Sellers for E-Commerce: How Demand and Supply Meet in Fairs

Authors: Pierluigi Gallo, Francesco Randazzo, Ignazio Gallo

Abstract:

In recent years, many new and interesting models of successful online business have been developed. Many of these are based on competition between users, such as online auctions, where the product price is not fixed and tends to rise. Other models, including group buying, are based on cooperation between users, characterized by a dynamic product price that tends to go down. There is not yet a business model in which both sellers and buyers are grouped in order to negotiate on a specific product or service. The present study investigates a new extension of the group-buying model, called the fair, which allows aggregation of demand and supply for price optimization in a cooperative manner. Additionally, our system also aggregates products and destinations for shipping optimization. We introduced the following new relevant input parameters in order to implement a double-sided aggregation: (a) price-quantity curves provided by the seller; (b) waiting time, that is, the longer buyers wait, the greater the discount they get; (c) payment time, which determines whether the buyer pays before, during, or after receiving the product; (d) the distance between the place where products are available and the place of shipment, provided in advance by the buyer or dynamically suggested by the system. To analyze the proposed model, we implemented a system prototype and a simulator that allows studying the effects of changing some input parameters. We analyzed the dynamic price model in fairs with a single seller and with a combination of selected sellers. The results are very encouraging and motivate further investigation of this topic.

Keywords: auction, aggregation, fair, group buying, social buying

Procedia PDF Downloads 297
3036 Ultrasound-Assisted Extraction of Carotenoids from Tangerine Peel Using Ostrich Oil as a Green Solvent and Optimization of the Process by Response Surface Methodology

Authors: Fariba Tadayon, Nika Gharahgolooyan, Ateke Tadayon, Mostafa Jafarian

Abstract:

Carotenoid pigments are a diverse group of lipophilic compounds that generate the yellow to red colors of many plants, foods, and flowers. A well-known carotenoid with pro-vitamin A activity is β-carotene. Owing to the color of citrus fruit peel, the peel can be a good source of different carotenoids. Ostrich oil is one of the most valuable raw materials in many branches of industry, medicine, cosmetics, and nutrition, and this animal-based oil can be considered an alternative, green solvent. In this study, citrus peel waste is recycled by a simple method, and the extracted carotenoids can enhance the properties of ostrich oil. In this work, a simple and efficient method for the extraction of carotenoids from tangerine peel was designed. Ultrasound-assisted extraction (UAE) showed a significant effect on the extraction rate by increasing the mass transfer rate. Ostrich oil can be used as a green solvent in such studies to eliminate petroleum-based solvents. Since tangerine peel is a complex source of different carotenoids, separation and determination were performed by high-performance liquid chromatography (HPLC). In addition, the abilities of ostrich oil and sunflower oil to extract carotenoids from tangerine peel and carrot were compared. The highest yields of β-carotene extracted from tangerine peel using sunflower oil and ostrich oil were 75.741 and 88.110 mg/L, respectively. Optimization of the process was achieved by response surface methodology (RSM), and the optimal extraction conditions were a tangerine peel powder particle size of 0.180 mm, an ultrasonic intensity of 19 W/cm², and a sonication time of 30 minutes.

Keywords: β-carotene, carotenoids, citrus peel, ostrich oil, response surface methodology, ultrasound-assisted extraction

Procedia PDF Downloads 318
3035 Integrating Optuna and Synthetic Data Generation for Optimized Medical Transcript Classification Using BioBERT

Authors: Sachi Nandan Mohanty, Shreya Sinha, Sweeti Sah, Shweta Sharma

Abstract:

The advancement of natural language processing has significantly influenced the field of medical transcript classification, providing a robust framework for enhancing the accuracy of clinical data processing. It has enormous potential to transform healthcare and improve people's livelihoods. This research focuses on improving the accuracy of medical transcript categorization using Bidirectional Encoder Representations from Transformers (BERT) and its specialized variants, including BioBERT, ClinicalBERT, SciBERT, and BlueBERT. The experimental work employs Optuna, an optimization framework, for hyperparameter tuning to identify the most effective variant, concluding that BioBERT yields the best performance. Furthermore, various optimizers, including Adam, RMSprop, and layerwise adaptive large batch optimization (LAMB), were evaluated alongside BERT's default AdamW optimizer. The findings show that the LAMB optimizer achieves performance on par with AdamW's. Synthetic data generation techniques from Gretel were utilized to augment the dataset, expanding the original dataset from 5,000 to 10,000 rows. Subsequent evaluations demonstrated that the model maintained its performance with synthetic data, with the LAMB optimizer showing marginally better results. The enhanced dataset and optimized model configurations improved classification accuracy, showcasing the efficacy of the BioBERT variant and the LAMB optimizer, and resulting in accuracies of up to 98.2% and 90.8% for the original and combined datasets, respectively.
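A minimal sketch of the Optuna tuning loop is shown below. The search space is representative of common BERT fine-tuning ranges, and train_and_evaluate is a synthetic stand-in for actually fine-tuning BioBERT on the transcripts, so the specific ranges and response surface are assumptions, not the paper's settings.

```python
import math
import optuna

def train_and_evaluate(lr, batch_size, epochs):
    """Stand-in for fine-tuning BioBERT and returning validation accuracy.
    In the real workflow this would train on the transcript dataset."""
    # Synthetic response surface peaking near lr=3e-5, batch 16, 4 epochs.
    return (0.9 - (math.log10(lr) + 4.5) ** 2 * 0.05
                - abs(batch_size - 16) * 0.002 - abs(epochs - 4) * 0.01)

def objective(trial):
    lr = trial.suggest_float("learning_rate", 1e-5, 5e-5, log=True)
    batch_size = trial.suggest_categorical("batch_size", [8, 16, 32])
    epochs = trial.suggest_int("epochs", 2, 5)
    return train_and_evaluate(lr, batch_size, epochs)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=25)
print(study.best_params, round(study.best_value, 4))
```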

Keywords: BioBERT, clinical data, healthcare AI, transformer models

Procedia PDF Downloads 13
3034 Rain Gauges Network Optimization in Southern Peninsular Malaysia

Authors: Mohd Khairul Bazli Mohd Aziz, Fadhilah Yusof, Zulkifli Yusop, Zalina Mohd Daud, Mohammad Afif Kasno

Abstract:

Recently developed rainfall network design techniques have been discussed and compared by many researchers worldwide, owing to the demand for higher levels of accuracy from collected data. In many studies, rain-gauge networks are designed to provide good estimates of areal rainfall and to support flood modelling and prediction. One study showed that, even when using lumped models for flood forecasting, a proper gauge network can significantly improve the results. Therefore, the existing rainfall network in Johor must be optimized and redesigned in order to meet the required level of accuracy preset by rainfall data users. The well-known geostatistical variance-reduction method, combined with simulated annealing, was used as the optimization algorithm in this study to obtain the optimal number and locations of the rain gauges. Rain gauge network structure is not only dependent on station density; station location also plays an important role in determining whether information is acquired accurately. The existing network of 84 rain gauges in Johor was optimized and redesigned using rainfall, humidity, solar radiation, temperature, and wind speed data during the monsoon season (November-February) for the period 1975-2008. Three different semivariogram models, namely spherical, Gaussian, and exponential, were used, and their performances were compared in this study. A cross-validation technique was applied to compute the errors, and the results showed that the exponential model is the best semivariogram. It was found that the requirements were satisfied by a network of 64 rain gauges with the minimum estimated variance, with 20 of the existing gauges removed and relocated. An existing network may contain redundant stations that make little or no contribution to the network's ability to provide quality data. Therefore, two different cases were considered in this study. In the first case, the removed stations were optimally relocated to new locations to investigate their influence on the calculated estimated variance; the second case explored the possibility of relocating all 84 existing stations to new locations to determine the optimal positions. The relocations in both cases showed that the new optimal locations reduce the estimated variance, proving that location plays an important role in determining the optimal network.
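The sketch below illustrates the two ingredients on invented coordinates: an exponential semivariogram model and a simulated-annealing loop that keeps 64 of 84 candidate gauges while minimizing a simple proxy for the estimation variance (the mean semivariogram value from a prediction grid to its nearest retained gauge). The real study minimizes the kriging variance itself over Johor's actual station coordinates.

```python
import math, random

def exp_semivariogram(h, nugget=0.1, sill=1.0, rng=50.0):
    """Exponential model with practical range: nugget + (sill - nugget)(1 - e^(-3h/rng))."""
    return nugget + (sill - nugget) * (1.0 - math.exp(-3.0 * h / rng))

def anneal(sites, k, objective, t0=1.0, cooling=0.995, steps=4000):
    """Simulated annealing over k-subsets of candidate gauge sites."""
    current = random.sample(sites, k)
    fcur = objective(current)
    best, fbest, t = current[:], fcur, t0
    for _ in range(steps):
        cand = current[:]
        cand[random.randrange(k)] = random.choice([s for s in sites if s not in cand])
        fc = objective(cand)
        if fc < fcur or random.random() < math.exp((fcur - fc) / t):  # Metropolis rule
            current, fcur = cand, fc
            if fc < fbest:
                best, fbest = cand[:], fc
        t *= cooling
    return best, fbest

random.seed(0)
sites = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(84)]
grid = [(x, y) for x in range(0, 101, 10) for y in range(0, 101, 10)]

def proxy_variance(subset):  # mean semivariogram to the nearest retained gauge
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    return sum(exp_semivariogram(min(dist(g, s) for s in subset)) for g in grid) / len(grid)

print(anneal(sites, k=64, objective=proxy_variance)[1])
```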

Keywords: geostatistics, simulated annealing, semivariogram, optimization

Procedia PDF Downloads 307
3033 Environmental Benefits of Corn Cob Ash in Lateritic Soil Cement Stabilization for Road Works in a Sub-Tropical Region

Authors: Ahmed O. Apampa, Yinusa A. Jimoh

Abstract:

The potential economic viability and environmental benefits of using a biomass waste, such as corn cob ash (CCA), as a pozzolan in stabilizing soils for road pavement construction in a sub-tropical region were investigated. Corn cob was obtained from Maya in South West Nigeria and processed into ash with characteristics similar to the Class C fly ash pozzolan specified in ASTM C618-12. This was then blended with ordinary Portland cement in CCA:OPC ratios of 1:1, 1:2, and 2:1. Each of these blends was mixed with a lateritic soil of AASHTO classification A-2-6(3) in percentages varying from 0-7.5% at 1.5% intervals. The soil-CCA-cement mixtures were thereafter tested for geotechnical index properties, including the BS Proctor compaction, California Bearing Ratio (CBR), and unconfined compression strength tests. The tests were repeated for a soil-cement mix without any CCA blending. The costs of the binder inputs and the optimal blends of CCA:OPC in the stabilized soil were thereafter analyzed by developing algorithms that relate the experimental strength data (unconfined compression strength, UCS, and California Bearing Ratio, CBR) to the bivariate independent variables of CCA and OPC content, using Matlab R2011b. An optimization problem was then set up to minimize the cost of chemical stabilization of laterite with CCA and OPC, subject to the constraints of minimum strength specifications. The Evolutionary engine and the Generalized Reduced Gradient option of the Solver in MS Excel 2010 were used separately to obtain the optimal CCA:OPC blend. The optimal blend attaining the required strength of 1800 kN/m² was determined as a 5.4% mix (OPC content 3.6%) for the 1:2 CCA:OPC blend, compared with 4.2% for the OPC-only option, and as a 6.2% mix (OPC content 3%) for the 1:1 blend. The 2:1 blend did not attain the required strength, though a gain of over 100% in UCS value was obtained over the control sample with 0% binder. Given that 0.97 tonne of CO₂ is released for every tonne of cement used (OEE, 2001), the reduced OPC requirement to attain the same result indicates the possibility of reducing the construction industry's net CO₂ contribution to the environment by 14-28.5% if CCA:OPC blends are widely used in soil stabilization, going by the results of this study. The paper concludes by recommending that Nigeria and other developing countries in the sub-tropics with an abundant stock of biomass waste intensify the use of biomass waste as fuel and of the derived ash for the production of pozzolans for road works, thereby reducing overall greenhouse gas emissions in compliance with the objectives of the United Nations Framework Convention on Climate Change.

Keywords: corn cob ash, biomass waste, lateritic soil, unconfined compression strength, CO2 emission

Procedia PDF Downloads 377
3032 Creation of S-Box in Blowfish Using AES

Authors: C. Rekha, G. N. Krishnamurthy

Abstract:

This paper attempts to develop a different approach to the key scheduling algorithm that uses both the Blowfish and AES algorithms. The main drawback of the Blowfish algorithm is that it takes considerable time to create the S-box entries. To overcome this, we replace the process of S-box creation in Blowfish with key-dependent S-box creation from AES, without affecting the basic operation of Blowfish. The method proposed in this paper combines the good features of Blowfish and AES, and the paper also demonstrates the performance of Blowfish and the new algorithm by considering different aspects of security, namely encryption quality, key sensitivity, and the correlation of horizontally adjacent pixels in an encrypted image.
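As an illustration of the last metric, the sketch below estimates the correlation coefficient of horizontally adjacent pixel pairs, which should be near 1 for a structured plain image and near 0 for a well-encrypted one; the images here are synthetic stand-ins, not the paper's test images.

```python
import numpy as np

def horizontal_correlation(img, n=5000, seed=0):
    """Correlation coefficient of randomly sampled horizontally adjacent pixel
    pairs; near 1 for structured images, near 0 for a well-encrypted image."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    ys = rng.integers(0, h, n)
    xs = rng.integers(0, w - 1, n)
    a = img[ys, xs].astype(float)
    b = img[ys, xs + 1].astype(float)
    return np.corrcoef(a, b)[0, 1]

plain = np.tile(np.arange(256, dtype=np.uint8), (256, 1))             # smooth gradient
cipher = np.random.default_rng(1).integers(0, 256, (256, 256), dtype=np.uint8)
print(horizontal_correlation(plain), horizontal_correlation(cipher))  # ~1 vs ~0
```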

Keywords: AES, blowfish, correlation coefficient, encryption quality, key sensitivity, s-box

Procedia PDF Downloads 227
3031 A Generalized Weighted Loss for Support Vector Classification and Multilayer Perceptron

Authors: Filippo Portera

Abstract:

In a regression task, standard algorithms usually employ a loss in which each error is simply the absolute difference between the true value and the prediction. In the present work, we describe several error-weighting schemes that generalize this consolidated routine. We study both a binary classification model for Support Vector Classification and a regression net for the Multilayer Perceptron. The results show that the error is never worse than with the standard procedure and is often better.
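A minimal sketch of the weighting idea: scikit-learn's SVC accepts per-sample weights in its fit method, and the weighted analogue of the plain absolute-error regression loss is one line; the weighting scheme below (weights growing with an invented notion of sample importance) is illustrative, not the paper's schemes.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Illustrative per-sample weights: samples far from the boundary count more.
w = 1.0 + np.abs(X[:, 0] + X[:, 1])
clf = SVC(kernel="rbf").fit(X, y, sample_weight=w)   # weighted hinge-type loss
print("train accuracy:", clf.score(X, y))

def weighted_regression_loss(y_true, y_pred, weights):
    """Weighted analogue of the plain absolute-error loss for a regression net."""
    return float(np.mean(weights * np.abs(y_true - y_pred)))
```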

Keywords: loss, binary-classification, MLP, weights, regression

Procedia PDF Downloads 101
3030 Service Flow in Multilayer Networks: A Method for Evaluating the Layout of Urban Medical Resources

Authors: Guanglin Song

Abstract:

(Objective) Situated within the context of China's tiered medical treatment system, this study analyzes the spatial causes of urban healthcare access difficulties from the perspective of the configuration of healthcare facilities. (Methods) A social network analysis approach is employed to construct a healthcare demand and supply flow network between major residential clusters and the various tiers of hospitals in the city. (Conclusion) The findings reveal that: 1. there exists an overall maldistribution and over-concentration of healthcare resources in the study area, characterized by structural imbalance; 2. the low rate of primary care utilization in the study area is a key factor contributing to congestion at higher-tier hospitals, as excessive reliance on these institutions by neighboring communities exacerbates the problem; 3. gradual optimization of the healthcare facility layout in the study area, encompassing the holistic, local, and individual institutional levels, can enhance systemic efficiency and resource balance. (Prospects) This research proposes a method for evaluating urban healthcare resource distribution structures based on service flows within hierarchical networks. It offers spatially targeted optimization suggestions for promoting the implementation of the tiered healthcare system and alleviating challenges related to accessibility and congestion in seeking medical care, and it provides new ideas for researchers and healthcare managers in countries and cities around the world facing similar challenges.

Keywords: flow of public services, urban networks, healthcare facilities, spatial planning

Procedia PDF Downloads 76
3029 Early Prediction of Diseases in a Cow for Cattle Industry

Authors: Ghufran Ahmed, Muhammad Osama Siddiqui, Shahbaz Siddiqui, Rauf Ahmad Shams Malick, Faisal Khan, Mubashir Khan

Abstract:

In this paper, a machine learning-based approach for the early prediction of diseases in cows is proposed. Different ML algorithms are applied to extract useful patterns from the available dataset. Technology has changed today's world in every aspect of life, and advanced technologies have likewise been developed in livestock and dairy farming to monitor dairy cows in various respects. Dairy cattle monitoring is crucial, as it plays a significant role in milk production around the globe. Moreover, it has become necessary for farmers to adopt the latest early-prediction technologies, as food demand is increasing with population growth; this highlights the importance of state-of-the-art technologies in analyzing dairy cows' activities. It is not easy to predict the activities of a large number of cows on a farm, so the system makes this very convenient for farmers, as it provides all the solutions under one roof. The cattle industry's productivity is boosted because any disease on a cattle farm is diagnosed early and hence treated early, based on the machine learning output. The learning models are already set up and interpret the data collected in a centralized system. Basically, different algorithms are run on the dataset received to analyze milk quality and track the cows' health, location, and safety. The deep learning algorithm draws patterns from the data, which makes it easier for farmers to study any animal's behavioral changes. With the emergence of machine learning algorithms and the Internet of Things, accurate tracking of animals is possible, as the rate of error is minimized. As a result, milk productivity is increased. IoT with ML capability has given a new phase to the cattle farming industry by increasing the yield in the most cost-effective and time-saving manner.

Keywords: IoT, machine learning, health care, dairy cows

Procedia PDF Downloads 77
3028 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support

Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz

Abstract:

The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that are not apparent by human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems, in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care. To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources including EHRs, clinical trial management systems, tumor registries, biospecimen repositories, radiology PACS, digital pathology archives, unstructured clinical documents, and next-generation sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors, individually or as part of large cohorts, to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer. The results of those studies show that integrating image biomarkers with genomic pathway scores is more strongly correlated with disease recurrence than using standard clinical markers.

Keywords: clinical data warehouse, decision support, data mining, intelligent databases, machine learning

Procedia PDF Downloads 136
3027 Brain-Computer Interfaces That Use Electroencephalography

Authors: Arda Ozkurt, Ozlem Bozkurt

Abstract:

Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measuring methods. Despite the fact that it has low spatial resolution (meaning it can only detect when a group of neurons fires at the same time), it is a non-invasive method, making it easy to use without posing any risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. EEG recordings include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, as a non-invasive method, it is subject to sources of noise that may affect the reliability of the signals. For instance, noise from the EEG equipment, the leads, and signals coming from the subject (such as the activity of the heart or muscle movements) affects the signals detected by the electrodes. New techniques have, however, been developed to differentiate between those signals and the intended ones. Furthermore, an EEG device alone is not enough to analyze the data from the brain for BCI applications. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices. Even though invasive BCIs are needed for neurological diseases that require highly precise data, non-invasive BCIs such as EEG-based ones are used in many cases to help disabled people or simply to ease people's lives by assisting with basic tasks. For example, EEG can be used to detect an oncoming seizure in epilepsy patients, and a BCI device can then help prevent the seizure. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed in the future.

Keywords: BCI, EEG, non-invasive, spatial resolution

Procedia PDF Downloads 78
3026 AI for Efficient Geothermal Exploration and Utilization

Authors: Velimir Monty Vesselinov, Trais Kliplhuis, Hope Jasperson

Abstract:

Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, geology, and more. Machine learning algorithms can identify subtle patterns and relationships within these data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a Science-Informed Machine Learning (SIML) technology has been developed. SIML methods differ from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity, etc.). Traditional ML relies on deep and wide neural networks (NNs) based on simple algebraic mappings to represent complex processes. In contrast, SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have a physical meaning and satisfy physical laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. Our software prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphic form.
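
As a hedged sketch of the general idea of informing ML with physics, the toy example below adds a physics residual to the training loss so that a network fitting sparse temperature-vs-depth measurements also satisfies a steady-state 1-D heat conduction equation. This penalty-based approach is only one way to inject physics; the SIML approach described above embeds constitutive and physics/chemistry mappings inside the neurons themselves, which this sketch does not attempt, and it is in no way the GeoTGO code. All values are synthetic.

```python
# Toy physics-informed training loop: fit sparse T(z) data while penalizing
# the residual of steady 1-D conduction, d2T/dz2 = 0 (no internal heat source).
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

# Sparse synthetic "well" measurements: linear 25 degC/km gradient, z in [0, 1] km
z_data = torch.tensor([[0.0], [0.2], [1.0]])
T_data = 10.0 + 25.0 * z_data

z_phys = torch.linspace(0, 1, 50).reshape(-1, 1).requires_grad_(True)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(3000):
    opt.zero_grad()
    data_loss = ((net(z_data) - T_data) ** 2).mean()
    T = net(z_phys)
    dT = torch.autograd.grad(T.sum(), z_phys, create_graph=True)[0]
    d2T = torch.autograd.grad(dT.sum(), z_phys, create_graph=True)[0]
    phys_loss = (d2T ** 2).mean()      # conduction residual at collocation points
    (data_loss + phys_loss).backward()
    opt.step()

print("predicted T at z = 0.5:", net(torch.tensor([[0.5]])).item())
```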

Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal

Procedia PDF Downloads 61
3025 Aerodynamic Analysis by Computational Fluid Dynamics in Buildings: Case Study

Authors: Javier Navarro Garcia, Narciso Vazquez Carretero

Abstract:

Eurocode 1, part 1-4, wind actions, includes in its article 1.5 the possibility of using numerical calculation methods to obtain information on the loads acting on a building. At the same time, analysis using computational fluid dynamics (CFD) is already in widespread use in aerospace, aeronautical, and industrial applications. Applying CFD-based techniques to buildings to study their aerodynamic behavior opens an entire alternative field of possibilities for civil engineering and architecture: optimization of the results with respect to those obtained by applying the regulations, access to pressures and velocities at any point of the model at any moment, analysis of turbulence, and the ability to model any geometry or configuration. The present work compares the aerodynamic behavior of a building obtained from a CFD-based mathematical model with the results obtained by applying Eurocode 1, part 1-4, wind actions. It is verified that the results obtained by CFD techniques represent an optimization of the wind action on the building with respect to the wind action obtained by applying Eurocode 1, part 1-4. For this verification, a 45 m tall truncated-pyramid building with a square base was studied. The finite-volume CFD model was computed with the commercial FLUENT application, using a scale-resolving simulation (SRS) of the large eddy simulation (LES) type as the turbulence model, for an atmospheric boundary layer wind with a turbulent component in the flow direction.
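
For context, the regulatory side of the comparison reduces to a short calculation. The sketch below evaluates the Eurocode 1, part 1-4 peak velocity pressure at the 45 m reference height; the basic wind velocity, terrain category, and pressure coefficient are illustrative assumptions, not the values used in the study.

```python
# Back-of-envelope EN 1991-1-4 peak velocity pressure; inputs are assumed.
import math

rho = 1.25              # air density, kg/m^3 (recommended value)
v_b = 26.0              # basic wind velocity, m/s (assumed)
z = 45.0                # reference height of the building, m
z0, z0_II = 0.3, 0.05   # roughness lengths: assumed terrain cat. III vs cat. II
c_0 = 1.0               # orography factor (flat terrain assumed)

k_r = 0.19 * (z0 / z0_II) ** 0.07        # terrain factor
c_r = k_r * math.log(z / z0)             # roughness factor at height z
v_m = c_r * c_0 * v_b                    # mean wind velocity, m/s
I_v = 1.0 / (c_0 * math.log(z / z0))     # turbulence intensity (k_I = 1.0)
q_p = (1 + 7 * I_v) * 0.5 * rho * v_m ** 2   # peak velocity pressure, Pa

c_pe = 0.8               # external pressure coefficient, windward wall (assumed)
print(f"q_p = {q_p:.0f} Pa, windward design pressure w_e = {c_pe * q_p:.0f} Pa")
```

A CFD model resolves the actual pressure field over the facade, which is what allows the paper to report lower (optimized) wind actions than this envelope value.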

Keywords: aerodynamic, CFD, computational fluid dynamics, computational mechanics

Procedia PDF Downloads 141
3024 Microwave Heating and Catalytic Activity of Iron/Carbon Materials for H₂ Production from the Decomposition of Plastic Wastes

Authors: Peng Zhang, Cai Liang

Abstract:

Non-biodegradable plastic wastes have caused severe environmental and ecological contamination. Numerous technologies, such as pyrolysis, incineration, and landfilling, have already been employed for the treatment of plastic waste. Compared with conventional methods, microwave heating has displayed unique advantages for the rapid production of hydrogen from plastic wastes. Understanding the interaction between microwave radiation and materials helps optimize the parameters of the microwave reaction system. In this work, various carbon materials were investigated to reveal their microwave heating performance and the ensuing catalytic activity. Results showed that the diversity in heating characteristics was mainly due to the dielectric properties and individual microstructures of the materials. Furthermore, gaps and steps on the surface of the carbon materials distort the local electromagnetic field, which correspondingly induces plasma discharges; the intensity and location of the local plasma were also studied. For high-yield H₂ production, iron nanoparticles were selected as the active sites, and a series of iron/carbon bifunctional catalysts were synthesized. Apart from their high catalytic activity, the nano-sized iron particles, with dimensions close to the microwave skin depth, convert microwave irradiation into heat, intensifying the decomposition of the plastics. Under microwave radiation, iron supported on activated carbon at 10 wt.% loading exhibited the best catalytic activity for H₂ production. Specifically, the plastics were rapidly heated and subsequently converted into H₂ with a hydrogen efficiency of 85%. This work provides a deeper understanding of microwave reaction systems and guidance for optimizing plastic waste treatment.
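
The remark about particle size and skin depth can be checked with the classical skin-depth formula, delta = sqrt(rho / (pi * f * mu0 * mu_r)). The sketch below evaluates it for iron at 2.45 GHz; the resistivity and especially the effective permeability of iron at microwave frequencies are rough assumptions, so the result is an order-of-magnitude estimate only.

```python
# Order-of-magnitude skin-depth estimate for iron under microwave irradiation.
import math

mu0 = 4 * math.pi * 1e-7     # vacuum permeability, H/m
f = 2.45e9                   # common microwave reactor frequency, Hz
rho_fe = 9.7e-8              # DC resistivity of iron, ohm*m (assumed valid here)

for mu_r in (1, 100, 5000):  # effective relative permeability (assumed range)
    delta = math.sqrt(rho_fe / (math.pi * f * mu0 * mu_r))
    print(f"mu_r = {mu_r:5d}: skin depth ~ {delta * 1e9:8.1f} nm")
```

Across this assumed permeability range the skin depth spans roughly tens of nanometers to a few microns, which is consistent with the abstract's point that nano-sized iron particles couple efficiently with the microwave field.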

Keywords: plastic waste, recycling, hydrogen, microwave

Procedia PDF Downloads 77
3023 Sintering of YNbO3:Eu3+ Compound: Correlation between Luminescence and Spark Plasma Sintering Effect

Authors: Veronique Jubera, Ka-Young Kim, U-Chan Chung, Amelie Veillere, Jean-Marc Heintz

Abstract:

Emitting materials and all-solid-state lasers are widely used in optical applications and materials science as excitation sources, and in instrumental measurement, medical applications, metal shaping, etc. Recently, promising optical efficiencies have been recorded in ceramics, which offer a cheaper and faster way to obtain crystallized materials. The choice and optimization of the sintering process is the key to fabricating transparent ceramics. It requires tight control of powder preparation, including an adequate synthesis route and a pre-heat-treatment, as well as a reproducible sintering cycle and the polishing and post-annealing of the ceramic. Densification is the main factor in reaching satisfactory transparency, and many technologies are now available for it. The symmetry of the unit cell plays a crucial role in the diffusion rate of the material; therefore, cubic-symmetry compounds, which have an isotropic refractive index, are preferred. The cubic Y3NbO7 matrix is an interesting host that can accept a high concentration of rare-earth doping elements, and it has been demonstrated that SPS is an efficient way to sinter this material. Minimizing diffusion losses requires a fine ceramic microstructure, with grains generally below one hundred nanometers; in this case, grain growth is not an obstacle to transparency. The ceramic properties are then isotropic, which removes the orientation step during shaping that is required for compounds of lower symmetry. After optimization of the synthesis route, several SPS parameters, such as heating rate, holding temperature, dwell time, and pressure, were adjusted in order to increase the densification of the Eu3+-doped Y3NbO7 pellets. The luminescence data, coupled with X-ray diffraction analysis and electron diffraction microscopy, highlight the existence of several distorted environments of the doping element in the studied defective fluorite-type host lattice. Indeed, the fast and high crystallization rate puts in evidence a lack of miscibility in the phase diagram, the final composition of the pellet being driven by the ratio between the niobium and yttrium elements. By following the luminescence properties, we demonstrate the direct impact of the SPS process on this material.

Keywords: emission, rare-earth niobate, spark plasma sintering, lack of miscibility

Procedia PDF Downloads 271