Search results for: H₂-optimal model reduction

19041 Interoperable Design Coordination Method for Sharing Communication Information Using Building Information Model Collaboration Format

Authors: Jin Gang Lee, Hyun-Soo Lee, Moonseo Park

Abstract:

The utilization of BIM and IFC allows project participants to collaborate across different areas by consistently sharing interoperable product information represented in a model. Comments or markups generated during the coordination process can be categorized as communication information, which tends to be shared in a less standardized manner. Such information can be difficult to manage and reuse compared to the product information in a model. The present study proposes an interoperable coordination method using the BIM Collaboration Format (BCF) for managing and sharing communication information during the BIM-based coordination process. A management function for coordination in the BIM collaboration system is developed to assess its ability to share communication information in BIM collaboration projects. This approach systematically links communication information generated during the coordination process to the building model and serves as a storage system for retrieving knowledge created during BIM collaboration projects.

Keywords: design coordination, building information model, BIM collaboration format, industry foundation classes

Procedia PDF Downloads 422
19040 Tabu Random Algorithm for Guiding Mobile Robots

Authors: Kevin Worrall, Euan McGookin

Abstract:

The use of optimization algorithms is common across a large number of diverse fields. This work presents a hybrid optimization algorithm applied to a mobile robot tasked with searching an unknown environment. The algorithm is then applied to the multiple-robot case, which reduces the time taken to carry out the search. The hybrid algorithm is a random search algorithm fused with a tabu mechanism. The work shows that the algorithm locates the desired points more quickly than a brute-force search. The Tabu Random algorithm is shown to work within a simulated environment using a validated mathematical model. The simulation was run using three different environments with varying numbers of targets. As an algorithm, the Tabu Random is small, clear and can be implemented with minimal resources. Its strengths are the speed at which it locates points of interest and its robustness to the number of robots involved: the number of robots can vary with no changes to the algorithm, making it flexible.
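
As an illustration of the idea of fusing random search with a tabu memory, the following is a minimal Python sketch (not the authors' implementation): a single agent performs a random walk over a grid while a short-term tabu list discourages revisiting recent cells. The grid size, target locations and tabu length are arbitrary assumptions.

```python
import random

def tabu_random_search(targets, grid=(20, 20), tabu_len=50, max_steps=2000, seed=0):
    """Random walk over a grid with a tabu memory of recently visited cells."""
    rng = random.Random(seed)
    pos = (0, 0)
    tabu = []                      # short-term memory of visited cells
    remaining = set(targets)
    for step in range(1, max_steps + 1):
        if not remaining:
            return step            # all points of interest located
        moves = [(pos[0] + dx, pos[1] + dy)
                 for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= pos[0] + dx < grid[0] and 0 <= pos[1] + dy < grid[1]]
        allowed = [m for m in moves if m not in tabu] or moves  # relax the tabu rule if stuck
        pos = rng.choice(allowed)  # random selection among non-tabu neighbours
        tabu.append(pos)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        remaining.discard(pos)
    return max_steps

# Example: three targets in a 20x20 environment
print(tabu_random_search({(5, 7), (12, 3), (18, 18)}))
```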

Keywords: algorithms, control, multi-agent, search and rescue

Procedia PDF Downloads 235
19039 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation

Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke

Abstract:

Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN. This is partly due to the large number of parameters and the large datasets needed in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed in the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, including initial loss, reduction factor, time of concentration and time lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in Gold Coast. For comparison, simulation outcomes for the same three catchments from the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE) and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff flow prediction, so the associated uncertainty in predictions can be obtained, whereas MIKE URBAN provides only a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
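
To make the core ABC step concrete, the sketch below shows rejection-sampling ABC in Python for a toy runoff model. The actual framework was implemented in R against the time-area model; the priors, the toy simulator and the tolerance here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_runoff(params, rainfall):
    """Toy stand-in for the time-area runoff model (not the actual MIKE URBAN / R code)."""
    initial_loss, reduction_factor = params
    effective = np.clip(rainfall - initial_loss, 0.0, None)
    return reduction_factor * effective

def abc_rejection(observed, rainfall, n_draws=5000, tol=1.0):
    """Draw parameters from priors, keep those whose simulated runoff is close to observations."""
    accepted = []
    for _ in range(n_draws):
        theta = (rng.uniform(0.0, 5.0),    # prior on initial loss (assumed range)
                 rng.uniform(0.1, 1.0))    # prior on reduction factor (assumed range)
        sim = simulate_runoff(theta, rainfall)
        distance = np.sqrt(np.mean((sim - observed) ** 2))  # RMSE as the discrepancy metric
        if distance < tol:
            accepted.append(theta)
    return np.array(accepted)              # samples from the approximate posterior

rain = rng.gamma(2.0, 2.0, size=100)
obs = simulate_runoff((2.0, 0.6), rain) + rng.normal(0, 0.3, size=100)
posterior = abc_rejection(obs, rain)
print(posterior.mean(axis=0))
print(np.percentile(posterior, [2.5, 97.5], axis=0))   # 95% credible intervals
```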

Keywords: automatic calibration framework, approximate bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform

Procedia PDF Downloads 296
19038 Economical Dependency Evolution and Complexity

Authors: Allé Dieng, Mamadou Bousso, Latif Dramani

Abstract:

The purpose of this work is to show the complexity behind economic interrelations in a country and to provide a linear dynamic model of the evolution of economic dependency in a country. The model is based on National Transfer Accounts (NTA), one of the most robust methodologies developed to measure the level of demographic dividend captured in a country. It is built upon three major factors: demography, economic dependency and migration. The established mathematical model has been simulated using the NetLogo software. The innovation of this study lies in describing economic dependency as a complex system and in simulating, using mathematical equations, the evolution of the two populations defined in the NTA methodology: the economically dependent and the non-economically dependent. It also allows us to see the interactions and behaviors of both populations. The model can track individual characteristics and examine the effect of birth and death rates on the evolution of these two populations. The developed model is useful for understanding how demographic and economic phenomena are related.
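
A minimal sketch of what such a two-population dynamic model can look like is given below (in Python rather than NetLogo); the linear ODE form and the birth, transition, retirement and death rates are illustrative assumptions, not the calibrated NTA-based model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# State: D = economically dependent, N = non-dependent (working / surplus-producing).
# Illustrative linear dynamics: births enter D, individuals mature from D to N,
# retirement moves N back to D, and each group has its own death rate.
b, m, r, dD, dN = 0.020, 0.045, 0.015, 0.010, 0.008   # assumed annual rates

def rhs(t, y):
    D, N = y
    return [b * (D + N) - m * D + r * N - dD * D,
            m * D - r * N - dN * N]

sol = solve_ivp(rhs, (0, 50), [6.0, 4.0], t_eval=np.linspace(0, 50, 11))
for t, D, N in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"year {t:4.0f}: dependent {D:6.2f} M, non-dependent {N:6.2f} M")
```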

Keywords: ABM, demographic dividend, National Transfer Accounts (NTA), ODE

Procedia PDF Downloads 193
19037 Analysis of the Contribution of Drude and Brendel Model Terms to the Dielectric Function

Authors: Christopher Mkirema Maghanga, Maurice Mghendi Mwamburi

Abstract:

Parametric modeling provides a means to gain a deeper understanding of the properties of materials. The Drude, Brendel, Lorentz and OJL models incorporated in the SCOUT® software are among those used to study dielectric films. In our work, we utilized the Brendel and Drude models to extract the optical constants from spectroscopic data of fabricated undoped and niobium-doped titanium oxide thin films. The individual contributions of the two models were studied to establish how they influence the dielectric function. The effect of dopants on these contributions was also analyzed. For the undoped films, the results indicate a minimal contribution from the Drude term due to the dielectric nature of the films. However, as doping levels increase, the rise in the concentration of free electrons favors the use of the Drude model. The Brendel model was confirmed to work well with dielectric films, in our case the undoped titanium oxide films.

Keywords: modeling, Brendel model, optical constants, titanium oxide, Drude Model

Procedia PDF Downloads 177
19036 A Multicriteria Mathematical Programming Model for Farm Planning in Greece

Authors: Basil Manos, Parthena Chatzinikolaou, Fedra Kiomourtzi

Abstract:

This paper presents a multicriteria mathematical programming model for farm planning and sustainable optimization of agricultural production. The model can be used as a tool for the analysis and simulation of agricultural production plans, as well as for studying the impacts of various measures of the Common Agricultural Policy in the member states of the European Union. The model can find the optimum production plan of a farm or an agricultural region by combining in one utility function different conflicting criteria, such as the maximization of gross margin and the minimization of fertilizer use, under a set of constraints on land, labor, available capital, the Common Agricultural Policy, etc. The proposed model was applied to the region of Larisa in central Greece. The optimum production plan achieves a greater gross return, lower fertilizer use, and lower irrigation water use than the existing production plan.
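
A hedged sketch of this kind of weighted-utility formulation is shown below using scipy.optimize.linprog; the crops, gross margins, fertilizer rates, criterion weights and resource limits are invented placeholders, not the Larisa data.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: hectares allocated to three illustrative crops.
gross_margin = np.array([420.0, 610.0, 350.0])   # EUR/ha (placeholder values)
fertilizer   = np.array([180.0, 260.0,  90.0])   # kg/ha  (placeholder values)

w1, w2 = 1.0, 0.5          # weights of the two conflicting criteria in one utility function
# Maximize w1*margin - w2*fertilizer  ->  minimize the negative of that utility
c = -(w1 * gross_margin - w2 * fertilizer)

A_ub = [[1.0, 1.0, 1.0],   # total land (ha)
        [8.0, 14.0, 5.0]]  # labour (h/ha)
b_ub = [1000.0, 9000.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3, method="highs")
print("optimal plan (ha):", np.round(res.x, 1))
print("gross margin:", gross_margin @ res.x, "fertilizer:", fertilizer @ res.x)
```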

Keywords: sustainable optimization, multicriteria analysis, agricultural production, farm planning

Procedia PDF Downloads 600
19035 A Comparative Analysis of E-Government Quality Models

Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri

Abstract:

Many quality models have been used to measure the quality of e-government portals. However, the absence of an international consensus on e-government portal quality models results in many differences in terms of quality attributes and measures. The aim of this paper is to compare and analyze the existing e-government quality models proposed in the literature (those that are based on ISO standards and those that are not) in order to propose guidelines for building a good and useful e-government portal quality model. Our findings show that there is no e-government portal quality model based on the new international standard ISO 25010. Moreover, the existing quality models are not based on a best-practice model that would allow agencies to both measure e-government portal quality and identify missing best practices for those portals.

Keywords: e-government, portal, best practices, quality model, ISO, standard, ISO 25010, ISO 9126

Procedia PDF Downloads 550
19034 Predicting Options Prices Using Machine Learning

Authors: Krishang Surapaneni

Abstract:

The goal of this project is to determine how to predict important aspects of options, including the ask price. We compare different machine learning models to find the best model, and the best hyperparameters for that model, for this purpose and data set. Option pricing is a relatively new field, and it can be complicated and intimidating, especially to inexperienced people, so we want to create a machine learning model that can predict important aspects of an option, which can aid future research. We tested multiple models and experimented with hyperparameter tuning, trying to find some of the best parameters for a machine learning model. We tested three different models: a random forest regressor, a linear regressor, and an MLP (multi-layer perceptron) regressor. The most important feature in this experiment is the ask price; this is what we were trying to predict. In the field of stock price prediction there is a large potential for error, so we cannot judge the models on whether they predict the price perfectly. For this reason, we measured the accuracy of each model as the average percentage difference between the predicted and actual values, comparing the actual results in the testing data with the predictions made by the models. The linear regression model performed worst, with an average percentage error of 17.46%. The MLP regressor had an average percentage error of 11.45%, and the random forest regressor had an average percentage error of 7.42%.
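
A minimal sketch of this kind of model comparison and of the average-percentage-error metric, written with scikit-learn on synthetic placeholder data (the study's real option features and prices are not listed in the abstract), might look as follows:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Placeholder option features; the study's actual feature set is not given in the abstract.
df = pd.DataFrame({
    "strike": rng.uniform(50, 150, n),
    "underlying": rng.uniform(50, 150, n),
    "days_to_expiry": rng.integers(1, 365, n),
    "implied_vol": rng.uniform(0.1, 0.8, n),
})
df["ask"] = (np.maximum(df.underlying - df.strike, 0)
             + df.implied_vol * np.sqrt(df.days_to_expiry / 365) * df.underlying * 0.4
             + rng.normal(0, 1, n) + 2.0)

X, y = df.drop(columns="ask"), df["ask"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "MLP regressor": make_pipeline(StandardScaler(),
                                   MLPRegressor(hidden_layer_sizes=(64, 32),
                                                max_iter=2000, random_state=0)),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    ape = np.abs(pred - y_te) / np.maximum(np.abs(y_te), 1e-6) * 100  # average percentage error
    print(f"{name}: average percentage error {ape.mean():.2f}%")
```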

Keywords: finance, linear regression model, machine learning model, neural network, stock price

Procedia PDF Downloads 72
19033 Didactics for Enhancing Balance in Adolescents: Core and Centering

Authors: A. Fogliata, L. Martiniello, A. Ambretti

Abstract:

Introduction: The significance of balance and stability in physical education among adolescents is well-established. This study aims to assess the efficacy of Centering (CENT), which employs intra-abdominal pressure (IAP) in line with the Synchrony Method, in optimizing balance and reducing perceived stress. Materials and Methods: A 6-week intervention was conducted on a sample of adolescents, divided into a control group and an experimental group that incorporated CENT into their physical education program. The Stork Balance Test and the Perceived Stress Scale (PSS) were used to measure changes. Results: Findings revealed a significant enhancement in the balance of both the dominant and non-dominant limbs in the experimental group compared to the control group. Moreover, the PSS test indicated a reduction in perceived stress within the experimental group. Conclusion: Integrating the centering technique into physical education programs can lead to substantial improvements in adolescents' balance and stability, in addition to a reduction in perceived stress levels. These findings suggest the need for further research on broader populations to solidify these pivotal outcomes.

Keywords: adolescents, physical education, balance, centering, intra-abdominal pressure

Procedia PDF Downloads 54
19032 Experimental and Numerical Analysis of Mustafa Paşa Mosque in Skopje

Authors: Ozden Saygili, Eser Cakti

Abstract:

The masonry building stock in Istanbul and in other cities of Turkey is exposed to significant earthquake hazard. Determining the safety of masonry structures against earthquakes is a complex challenge. This study deals with experimental tests and non-linear dynamic analysis of masonry structures modeled through the discrete element method. A 1:10 scale model of the Mustafa Paşa Mosque was constructed, and data were obtained from sensors on it during testing on the shake table. The results were used in the calibration/validation of the numerical model created on the basis of the 1:10 scale model built for shake table testing. A 3D distinct element model was developed to represent the linear and nonlinear behavior of the shake table model as closely as possible during the experimental tests. The results of the numerical analyses were compared with those from the experimental program and discussed.

Keywords: dynamic analysis, non-linear modeling, shake table tests, masonry

Procedia PDF Downloads 418
19031 The Effect of Sustainable Supply Chain Management on Performance of Agricultural Firms in Nigeria

Authors: Haruna Daddau

Abstract:

This study investigates the effect of sustainable supply chain management (SSCM) on the performance of agricultural firms in Nigeria. Green packaging, product design, waste reduction and supply chain design were examined. Ecological modernization theory, which posits economic benefits from environmental improvement, was used to underpin the study. The research is quantitative in nature, and a survey method was adopted in which information was obtained using questionnaires distributed directly to the top managers of 6 agricultural firms in Nigeria. STATA and SPSS were used for the data analysis, and regression analysis was used to examine the effects. Findings showed that SSCM positively improves the performance of the firms. Detailed information about the effect of the study's selected variables on performance is also provided, and the significant role of SSCM in accelerating the firms' performance is highlighted. It is recommended that SSCM be given serious attention by integrating it into the firm's overall business strategy.

Keywords: sustainable supply chain management, green packaging, product design, waste reduction, supply chain design and performance

Procedia PDF Downloads 21
19030 Tolerating Input Faults in Asynchronous Sequential Machines

Authors: Jung-Min Yang

Abstract:

A method of tolerating input faults in input/state asynchronous sequential machines is proposed. A corrective controller is placed in front of the considered asynchronous machine to realize model matching with a reference model. The value of the external input transmitted to the closed-loop system may be changed by a fault. We address the existence condition for a controller that can counteract the adverse effects of any input fault while maintaining the objective of model matching. A design procedure for constructing the controller is outlined. The proposed reachability condition for the controller design is validated in an illustrative example.

Keywords: asynchronous sequential machines, corrective control, fault tolerance, input faults, model matching

Procedia PDF Downloads 416
19029 Inhibition Theory: The Development of Subjective Happiness and Life Satisfaction After Experiencing Severe Traumatic Life Events (Paraplegia)

Authors: Tanja Ecken, Laura Fricke, Anika Wehling, Maren M. Michaelsen, Tobias Esch

Abstract:

Studies and applied experience show that severe and traumatic accidents not only require physical rehabilitation and recovery but also necessitate psychological adaptation and reorganization to the changed living conditions. Neurobiological models underpinning the experience of happiness and satisfaction postulate that life shocks can potentially enhance the experience of happiness and life satisfaction, i.e., posttraumatic growth (PTG). The present study aims to provide an in-depth understanding of the underlying psychological processes of PTG and to outline its consequences for subjective happiness and life satisfaction. To explore this, Esch's ABC Model was used as guidance for the development of a questionnaire assessing changes in happiness and life satisfaction and for a schematic model postulating the development of PTG in the context of paraplegia. Two-stage qualitative interview procedures explored participants' experiences of paraplegia. Specifically, narrative, semi-structured interviews (N=28) focused on the time before and after the accident, the availability of supportive resources, and potential changes in the perception of happiness and life satisfaction. Qualitative analysis (Grounded Theory) indicated that an initial phase of reorganization was followed by a gradual psychological adaptation to novel, albeit reduced, opportunities in life. Participants reportedly experienced a 'compelled' slowing down and elements of mindfulness, subsequently instilling a sense of gratitude and joy in relation to life's presumed trivialities. Despite physical limitations and difficulties, participants reported an enhanced ability to relate to themselves and others and a reduction in perceived everyday nuisances. In conclusion, PTG can be experienced in response to severe, traumatic life events and has the potential to enrich the lives of affected persons in numerous, unexpected and yet challenging ways. PTG appears to be a spectrum comprising an interplay of internal and external resources underpinned by neurobiological processes. Participants experienced PTG irrespective of age, gender, marital status, income or level of education.

Keywords: post traumatic growth, happiness, life satisfaction, traumatic life events, paraplegia, ABC model, trauma

Procedia PDF Downloads 60
19028 The Free Vibration Analysis of Honeycomb Sandwich Beam using 3D and Continuum Model

Authors: Gürkan Şakar, Fevzi Çakmak Bolat

Abstract:

In this study, free vibration analyses of aluminum honeycomb sandwich structures were carried out experimentally and numerically. The natural frequencies and mode shapes of sandwich structures fabricated with different configurations were determined for the clamped-free boundary condition. The effects of lower and upper face sheet thickness, core material thickness, cell diameter, cell angle and foil thickness on the vibration characteristics were examined. The numerical studies were performed with the ANSYS package, in which the sandwich structures were modeled using the continuum model. The numerical results were then compared with the experimental findings.

Keywords: sandwich structure, free vibration, numeric analysis, 3D model, continuum model

Procedia PDF Downloads 410
19027 QSAR Studies of Certain Novel Heterocycles Derived from bis-1,2,4-Triazoles as Anti-Tumor Agents

Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi

Abstract:

In this paper, we report a quantitative structure-activity relationship study of novel bis-triazole derivatives for predicting their activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used to conduct CoMSIA QSAR modeling. The partial least-squares (PLS) analysis method was used to conduct the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets and were evaluated with various CoMSIA parameters to arrive at the best QSAR model. An optimum number of components was first determined by cross-validated regression for the CoMSIA model and then applied in the final analysis. A series of parameters was explored, and the best-fit model was obtained using the donor, partition coefficient and steric parameters. The CoMSIA model demonstrated good statistical results, with a regression coefficient (r2) and a cross-validated coefficient (q2) of 0.575 and 0.830, respectively. The standard error for the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR model is consistent with the observation that more than half of the binding-site area is occupied by steric regions.
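
For readers unfamiliar with the r²/q² distinction, the following scikit-learn sketch shows how a PLS model's conventional r² and cross-validated q² can be computed. The descriptor matrix here is random placeholder data, not the CoMSIA fields, so the values will not match the paper's.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.normal(size=(46, 300))                      # placeholder for CoMSIA field descriptors
coef = rng.normal(size=300) * (rng.random(300) < 0.05)
y = X @ coef + rng.normal(scale=0.5, size=46)       # placeholder activity values

pls = PLSRegression(n_components=5)                 # number of components chosen by CV in practice
r2 = r2_score(y, pls.fit(X, y).predict(X))          # conventional r^2 (fit to the training data)
q2 = r2_score(y, cross_val_predict(pls, X, y, cv=5))  # cross-validated q^2 (left-out prediction)
print(f"r2 = {r2:.3f}, q2 = {q2:.3f}")
```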

Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles

Procedia PDF Downloads 439
19026 Spatio-Temporal Land Cover Changes Monitoring Using Remotely Sensed Techniques in Riyadh Region, KSA

Authors: Abdelrahman Elsehsah

Abstract:

Land Use and Land Cover (LULC) dynamics in Riyadh over a decade were comprehensively analyzed using the Google Earth Engine (GEE) platform. By harnessing the Landsat 8 image collection and a night-time light image collection from May to August for the years 2013 and 2023, we generated datasets capturing the changing landscape of the region. Our approach involved a Random Forest (RF) classification model that consistently displayed precision scores above 92% for both years. A notable discovery from the study was the pronounced urban expansion, particularly around Riyadh city. Within a mere ten-year span, urbanization surged noticeably, affecting the broader ecological environment of the region. Interestingly, the northeastern part of Riyadh emerged as a focal point of this growth, signaling rapid urban sprawl and development. A comparison between the two years indicates a 21.51% increase in built-up areas, revealing the transformative pace of urban sprawl. Contrastingly, vegetation cover patterns presented a more nuanced picture. While our initial hypothesis predicted a decline in vegetation, the actual findings depicted both vegetation reduction in certain pockets and new growth in others, resulting in an overall 25.89% increase. This intricate pattern might be attributed to shifting agricultural practices, afforestation efforts, or satellite image timings not aligning with seasonal vegetation growth. The bare soil, predominant in the desert landscape of Riyadh, saw a marginal reduction of 0.37% over the decade, challenging our initial expectations. Urban and agricultural advancements in Saudi Arabia appear to have slightly reduced the expanse of barren terrain. This study, underpinned by a rigorous methodological framework, reveals the multifaceted land cover changes in Riyadh in response to urban development and environmental factors. The precise, data-driven insights provided by our analysis serve as invaluable tools for understanding urban growth trajectories and guiding urban planning, policy formulation, and sustainable development endeavors in the region.
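
The classification step described above might be sketched with the Earth Engine Python API roughly as below; the training-point asset, the 'landcover' property, the region rectangle and the band list are placeholders/assumptions, and an authenticated Earth Engine session is assumed.

```python
import ee
ee.Initialize()

# Region of interest and a summer composite for one year (asset IDs and bands are illustrative).
riyadh = ee.Geometry.Rectangle([46.2, 24.2, 47.3, 25.2])
composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
             .filterBounds(riyadh)
             .filterDate("2023-05-01", "2023-08-31")
             .median()
             .select(["SR_B2", "SR_B3", "SR_B4", "SR_B5", "SR_B6", "SR_B7"]))

# 'training_points' is assumed to be an ee.FeatureCollection with a 'landcover' property
# (e.g. 0 = built-up, 1 = vegetation, 2 = bare soil) digitised by the analyst.
training_points = ee.FeatureCollection("users/example/riyadh_training")  # placeholder asset
samples = composite.sampleRegions(collection=training_points,
                                  properties=["landcover"], scale=30)

classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=samples, classProperty="landcover",
    inputProperties=composite.bandNames())

classified_2023 = composite.classify(classifier)  # repeat with the 2013 composite for change detection
```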

Keywords: remote sensing, KSA, ArcGIS, spatio-temporal

Procedia PDF Downloads 22
19025 AI-Driven Solutions for Optimizing Master Data Management

Authors: Srinivas Vangari

Abstract:

In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (quantitative and qualitative analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.

Keywords: artificial intelligence, master data management, data governance, data quality

Procedia PDF Downloads 7
19024 Fabrication Methodologies for Anti-Microbial Polypropylene Surfaces with Leachable and Non-leachable Anti-Microbial Agents

Authors: Saleh Alkarri, Dimple Sharma, Teresa M. Bergholz, Muhammad Rabnawaz

Abstract:

Aims: Develop a methodology for the fabrication of anti-microbial polypropylene (PP) surfaces with (i) leachable copper (II) chloride dihydrate (CuCl₂·2H₂O) and (ii) non-leachable magnesium hydroxide (Mg(OH)₂) biocides. Methods and Results: Two methodologies were used to develop anti-microbial PP surfaces. One method involves melt-blending and subsequent injection molding, where the biocide additives were compounded with PP and then injection-molded. The other method involves thermal embossing of the anti-microbial agents onto the surface of a PP substrate. The obtained biocide-bearing PP surfaces were evaluated against E. coli K-12 MG1655 for 0, 4, and 24 h to assess their anti-microbial properties. The injection-molded PP bearing 5% CuCl₂·2H₂O showed a 6-log reduction of E. coli K-12 MG1655 after 24 h, while only a 1-log reduction was observed for PP bearing 5% Mg(OH)₂. The thermally embossed PP surfaces bearing CuCl₂·2H₂O and Mg(OH)₂ particles (at a concentration of 10 mg/mL) showed 3-log and 4-log reductions, respectively, against E. coli K-12 MG1655 after 24 h. Conclusion: The results clearly demonstrate that CuCl₂·2H₂O conferred anti-microbial properties to PP surfaces prepared by both the injection molding and thermal embossing approaches, owing to the presence of leachable copper ions. In contrast, the non-leachable Mg(OH)₂ imparted anti-microbial properties only to the surface prepared via the thermal embossing technique. Significance and Impact of the Study: Plastics with leachable biocides are effective anti-microbial surfaces, but their toxicity is a major concern. This study provides a fabrication methodology for non-leachable PP-based anti-microbial surfaces that are potentially safer. In addition, this strategy can be extended to many other plastic substrates.

Keywords: anti-microbial activity, E. coli K-12 MG1655, copper (II) chloride dihydrate, magnesium hydroxide, leachable, non-leachable, compounding, thermal embossing

Procedia PDF Downloads 72
19022 Estimation of Structural Parameters in Time Domain Using One Dimensional Piezo Zirconium Titanium Patch Model

Authors: N. Jinesh, K. Shankar

Abstract:

This article presents a method of using a one-dimensional piezoelectric patch-on-beam model for structural identification. A hybrid element, constituted of a one-dimensional beam element and a PZT sensor, is used with reduced material properties. This model is convenient and simple for the identification of beams. The accuracy of this element is first verified against a corresponding 3D finite element model (FEM). The structural identification is carried out as an inverse problem, whereby parameters are identified by minimizing the deviation between the predicted and measured voltage responses of the patch when subjected to excitation. A non-classical optimization algorithm, Particle Swarm Optimization (PSO), is used to minimize this objective function. The signals are polluted with 5% Gaussian noise to simulate experimental noise. The proposed method is applied to a beam structure, and the identified parameters are stiffness and damping. The model is also validated experimentally.
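
A minimal PSO sketch for this kind of two-parameter identification (stiffness and damping) is given below; the damped single-degree-of-freedom surrogate response and the PSO coefficients are illustrative assumptions, not the hybrid PZT-beam element.

```python
import numpy as np

rng = np.random.default_rng(3)

def predicted_response(k, c, t):
    """Surrogate for the model response: a damped oscillation (illustrative only)."""
    wn = np.sqrt(k)
    zeta = c / (2.0 * np.sqrt(k))
    wd = wn * np.sqrt(max(1.0 - zeta ** 2, 1e-9))
    return np.exp(-zeta * wn * t) * np.sin(wd * t)

t = np.linspace(0, 2, 400)
measured = predicted_response(400.0, 4.0, t) + rng.normal(0, 0.01, t.size)  # noisy "measurement"

def objective(p):
    return np.sum((predicted_response(p[0], p[1], t) - measured) ** 2)

# Standard global-best PSO over [stiffness, damping]
lo, hi = np.array([100.0, 0.5]), np.array([900.0, 10.0])
n, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
x = rng.uniform(lo, hi, (n, 2)); v = np.zeros((n, 2))
pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
gbest = pbest[pbest_f.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([objective(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()]

print("identified stiffness, damping:", np.round(gbest, 2))
```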

Keywords: inverse problem, particle swarm optimization, PZT patches, structural identification

Procedia PDF Downloads 302
19021 Cascaded Neural Network for Internal Temperature Forecasting in Induction Motor

Authors: Hidir S. Nogay

Abstract:

In this study, two systems were created to predict the interior temperature of an induction motor. One of them consisted of a simple ANN model with two layers, ten input parameters and one output parameter. The other consisted of eight ANN models connected to each other in cascade; the cascaded ANN system has 17 inputs. The main reason for using a cascaded system in this study is to achieve more accurate estimation by increasing the number of inputs to the ANN system. The cascaded ANN system is compared with the simple conventional ANN model to demonstrate these advantages. The dataset, comprising 329 samples, was obtained from experimental applications; a small part of it was used to obtain more readable graphs. 30% of the data was used for testing and validation. Test and validation data were determined for each ANN model separately, and the reliability of each model was tested. As a result of this study, the cascaded ANN system was found to produce more accurate estimates than the conventional ANN model.
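
The cascade idea can be sketched with two scikit-learn MLPs, where the first stage's prediction is appended as an extra input to the next stage. The synthetic data, layer sizes and two-stage depth (versus eight stages in the study) are assumptions for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 329                                   # same sample count as the study
X = rng.normal(size=(n, 10))              # placeholder for the ten measured inputs
temp = 40 + 5 * X[:, 0] - 3 * X[:, 1] + 2 * X[:, 2] ** 2 + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, temp, test_size=0.3, random_state=7)

# Stage 1: a conventional ANN on the raw inputs.
stage1 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=7).fit(X_tr, y_tr)

# Stage 2: receives the original inputs plus stage 1's estimate (the "cascade").
X_tr2 = np.column_stack([X_tr, stage1.predict(X_tr)])
X_te2 = np.column_stack([X_te, stage1.predict(X_te)])
stage2 = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=7).fit(X_tr2, y_tr)

for name, pred in [("conventional ANN", stage1.predict(X_te)),
                   ("cascaded ANN", stage2.predict(X_te2))]:
    rmse = np.sqrt(np.mean((pred - y_te) ** 2))
    print(f"{name}: RMSE = {rmse:.2f} °C")
```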

Keywords: cascaded neural network, internal temperature, inverter, three-phase induction motor

Procedia PDF Downloads 339
19020 Coefficient of Performance (COP) Optimization of an R134a Cross Vane Expander Compressor Refrigeration System

Authors: Y. D. Lim, K. S. Yap, K. T. Ooi

Abstract:

The Cross Vane Expander Compressor (CVEC) is a newly invented combined expander-compressor unit, introduced to replace the compressor and the expansion valve in a traditional refrigeration system. A mathematical model of the CVEC has been developed to examine its performance, and it was found that the energy consumption of a conventional refrigeration system was reduced by as much as 18%. It is believed that energy consumption can be further reduced by optimizing the device. In this study, the coefficient of performance (COP) of the CVEC was optimized under predetermined operational parameters and constrained main design parameters. Several main design parameters of the CVEC were selected as the variables, and the optimization was carried out with the theoretical model in a simulation program. The theoretical model consists of a geometrical model, a dynamic model, a heat transfer model and a valve dynamics model. The complex optimization method, a constrained, multi-variable direct search method, was used in the study. As a result, the optimization study suggests that, with an appropriate combination of design parameters, a 58% COP improvement in the CVEC R134a refrigeration system is possible.
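
As a rough illustration of constrained direct-search optimization of a COP objective (using a simpler penalty-plus-Nelder-Mead scheme rather than the complex method, and a placeholder COP surface rather than the CVEC model), one could write:

```python
import numpy as np
from scipy.optimize import minimize

def cop(design):
    """Placeholder COP surface over two design variables (e.g. vane length, rotor radius)."""
    x1, x2 = design
    return 3.0 + 1.2 * np.exp(-((x1 - 0.6) ** 2 + (x2 - 0.4) ** 2) / 0.05)

def penalised_negative_cop(design):
    x1, x2 = design
    penalty = 0.0
    if x1 + x2 > 1.1:                      # illustrative geometric constraint
        penalty += 1e3 * (x1 + x2 - 1.1) ** 2
    if not (0.1 <= x1 <= 1.0 and 0.1 <= x2 <= 1.0):
        penalty += 1e3                     # keep the search inside assumed design bounds
    return -cop(design) + penalty          # maximise COP = minimise its negative

res = minimize(penalised_negative_cop, x0=[0.3, 0.3], method="Nelder-Mead")
print("optimal design:", np.round(res.x, 3), "COP:", round(cop(res.x), 3))
```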

Keywords: COP, cross vane expander-compressor, CVEC, design, simulation, refrigeration system, air-conditioning, R134a, multi variables

Procedia PDF Downloads 326
19019 Rainfall–Runoff Simulation Using WetSpa Model in Golestan Dam Basin, Iran

Authors: M. R. Dahmardeh Ghaleno, M. Nohtani, S. Khaledi

Abstract:

Flood simulation and prediction is one of the most active research areas in surface water management. WetSpa is a distributed, continuous, physically based model with a daily or hourly time step that describes precipitation, runoff, and evapotranspiration processes for both simple and complex contexts. The model uses a modified rational method for runoff calculation, and runoff is routed along the flow path using the diffusion-wave equation, which depends on the slope, velocity, and flow-route characteristics. The Golestan Dam Basin is located in Golestan Province, Iran, between coordinates 55° 16´ 50" to 56° 4´ 25" E and 37° 19´ 39" to 37° 49´ 28" N. The area of the catchment is about 224 km2, elevations range from 414 m at the outlet to 2856 m, and the average slope is 29.78%. Results of the simulations show a good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated daily hydrographs and the maximum flow rate with accuracies of up to 59% and 80.18%, respectively.
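
The Nash-Sutcliffe efficiency used above to assess calibration is straightforward to compute directly; a small sketch with placeholder discharge values:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations about their mean."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Placeholder daily discharges (m^3/s) at the basin outlet
q_obs = np.array([1.2, 1.5, 3.8, 9.6, 6.4, 4.1, 2.9, 2.2, 1.8, 1.6])
q_sim = np.array([1.1, 1.7, 3.2, 8.1, 6.9, 4.6, 3.1, 2.0, 1.9, 1.7])
print(f"NSE = {nash_sutcliffe(q_obs, q_sim):.2f}")
```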

Keywords: watershed simulation, WetSpa, stream flow, flood prediction

Procedia PDF Downloads 240
19018 Reinforcement Learning for Self Driving Racing Car Games

Authors: Adam Beaunoyer, Cory Beaunoyer, Mohammed Elmorsy, Hanan Saleh

Abstract:

This research aims to create a reinforcement learning agent capable of racing in challenging simulated environments with a low collision count. We present a reinforcement learning agent that can navigate challenging tracks using both a Deep Q-Network (DQN) and a Soft Actor-Critic (SAC) method. A challenging track includes curves, jumps, and varying road widths throughout. The environment used in this research is built from open-source code on GitHub and is based on the 1995 racing game WipeOut. The proposed reinforcement learning agent can navigate challenging tracks rapidly while maintaining a low race completion time and collision count. The results show that the SAC model outperforms the DQN model by a large margin. We also propose an alternative multiple-car model that can navigate the track without colliding with other vehicles on the track. The SAC model is the basis for the multiple-car model, which completes laps more quickly than the single-car model but has a higher collision rate with the track wall.
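
A hedged sketch of how such agents might be trained and evaluated with Stable-Baselines3 on a Gymnasium-style environment is given below; the environment id "WipeoutRace-v0", its registration, the info key for collisions and the timestep budget are all assumptions, and a DQN agent would additionally require a discretised-action variant of the environment.

```python
import gymnasium as gym
from stable_baselines3 import SAC

# "WipeoutRace-v0" is a placeholder id for the WipeOut-based track environment
# (observations: track geometry and velocity, reward: progress minus collision penalties).
env = gym.make("WipeoutRace-v0")

# SAC needs a continuous action space (steering, throttle); a DQN agent would instead be
# trained on a discretised-action variant of the same environment and compared afterwards.
sac_agent = SAC("MlpPolicy", env, verbose=1)
sac_agent.learn(total_timesteps=200_000)

# Evaluate one lap and count collision events.
obs, _ = env.reset()
done, collisions = False, 0
while not done:
    action, _ = sac_agent.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, info = env.step(action)
    done = terminated or truncated
    collisions += info.get("collision", 0)   # assumed info key for collision events
print("collisions in evaluation lap:", collisions)
```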

Keywords: reinforcement learning, soft actor-critic, deep q-network, self-driving cars, artificial intelligence, gaming

Procedia PDF Downloads 39
19017 Developing a Sustainable Business Model for Platform-Based Applications in Small and Medium-Sized Enterprise Sawmills: A Systematic Approach

Authors: Franziska Mais, Till Gramberg

Abstract:

The paper presents the development of a sustainable business model for a platform-based application tailored for sawing companies in small and medium-sized enterprises (SMEs). The focus is on the integration of sustainability principles into the design of the business model to ensure a technologically advanced, legally sound, and economically efficient solution. Easy2IoT is a research project that aims to enable companies in the prefabrication sheet metal and sheet metal processing industry to enter the Industrial Internet of Things (IIoT) with a low-threshold and cost-effective approach. The methodological approach of Easy2IoT includes an in-depth requirements analysis and customer interviews with stakeholders along the value chain. Based on these insights, actions, requirements, and potential solutions for smart services are derived. The structuring of the business ecosystem within the application plays a central role, whereby the roles of the partners, the management of the IT infrastructure and services, as well as the design of a sustainable operator model are considered. The business model is developed using the value proposition canvas, whereby a detailed analysis of the requirements for the business model is carried out, taking sustainability into account. This includes coordination with the business model patterns, according to Gassmann, and integration into a business model canvas for the Easy2IoT product. Potential obstacles and problems are identified and evaluated in order to formulate a comprehensive and sustainable business model. In addition, sustainable payment models and distribution channels are developed. In summary, the article offers a well-founded insight into the systematic development of a sustainable business model for platform-based applications in SME sawmills, with a particular focus on the synergy of ecological responsibility and economic efficiency.

Keywords: business model, sustainable business model, IIoT, IIoT-platform, industrie 4.0, big data

Procedia PDF Downloads 69
19016 Freight Time and Cost Optimization in Complex Logistics Networks, Using a Dimensional Reduction Method and K-Means Algorithm

Authors: Egemen Sert, Leila Hedayatifar, Rachel A. Rigg, Amir Akhavan, Olha Buchel, Dominic Elias Saadi, Aabir Abubaker Kar, Alfredo J. Morales, Yaneer Bar-Yam

Abstract:

The complexity of providing timely and cost-effective distribution of finished goods from industrial facilities to customers makes effective operational coordination difficult, yet effectiveness is crucial for maintaining customer service levels and sustaining a business. Logistics planning becomes increasingly complex with growing numbers of customers, varied geographical locations, the uncertainty of future orders, and sometimes extreme competitive pressure to reduce inventory costs. Linear optimization methods become cumbersome or intractable due to the large number of variables and nonlinear dependencies involved. Here we develop a complex-systems approach to optimizing logistics networks based upon dimensional reduction methods and apply our approach to a case study of a manufacturing company. In order to characterize the complexity in customer behavior, we define a “customer space” in which individual customer behavior is described by only the two most relevant dimensions: the distance to production facilities over current transportation routes and the customer's demand frequency. These dimensions provide essential insight into the domain of effective strategies for customers: direct and indirect strategies. In the direct strategy, goods are sent to the customer directly from a production facility using box or bulk trucks. In the indirect strategy, in advance of an order by the customer, goods are shipped to an external warehouse near the customer using trains and then "last-mile" shipped by trucks when orders are placed. Each strategy applies to an area of the customer space, with an indeterminate boundary between them whose location is generally determined by specific company policies. We then identify the optimal delivery strategy for each customer by constructing a detailed model of the costs of transportation and temporary storage in a set of specified external warehouses. Customer spaces give an aggregate view of customer behaviors and characteristics, allowing policymakers to compare customers and develop strategies based on the aggregate behavior of the system as a whole. In addition to optimizing over existing facilities, we propose additional warehouse locations using the customer logistics data and the k-means algorithm. We apply these methods to a medium-sized American manufacturing company with a particular logistics network consisting of multiple production facilities, external warehouses, and customers, along with three types of shipment methods (box truck, bulk truck and train). For the case study, our method forecasts 10.5% savings on yearly transportation costs and an additional 4.6% savings with three new warehouses.
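
The clustering step can be sketched as follows: customers are placed in the two-dimensional customer space (distance to a production facility, demand frequency) and grouped with k-means to suggest candidate warehouse placements. The customer coordinates, the number of clusters and the units are synthetic placeholders, not the case-study data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(11)
# Customer space: distance to the nearest production facility (km) and demand frequency (orders/yr).
distance_km = rng.gamma(shape=3.0, scale=200.0, size=500)
orders_per_year = rng.poisson(lam=12, size=500).astype(float)
customers = np.column_stack([distance_km, orders_per_year])

# Standardise so both dimensions contribute comparably, then cluster.
scaler = StandardScaler().fit(customers)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaler.transform(customers))

# Cluster centres, mapped back to physical units, suggest candidate warehouse placements and
# indicate which customer groups suit the direct vs. indirect strategy.
centres = scaler.inverse_transform(kmeans.cluster_centers_)
for i, (d, f) in enumerate(centres):
    size = int(np.sum(kmeans.labels_ == i))
    print(f"cluster {i}: {size} customers, ~{d:.0f} km away, ~{f:.1f} orders/yr")
```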

Keywords: logistics network optimization, direct and indirect strategies, K-means algorithm, dimensional reduction

Procedia PDF Downloads 133
19015 Two-Warehouse Inventory Model for Deteriorating Items with Inventory-Level-Dependent Demand under Two Dispatching Policies

Authors: Lei Zhao, Zhe Yuan, Wenyue Kuang

Abstract:

This paper studies two-warehouse inventory models for a deteriorating item, considering that demand is influenced by inventory levels. The problem mainly focuses on the optimal order policy and the optimal order cycle with inventory-level-dependent demand in a two-warehouse system for retailers. It considers the different deterioration rates and inventory holding costs in the owned warehouse (OW) and the rented warehouse (RW), as well as transportation costs, allowed shortages and partial backlogging. Two inventory models are formulated, a last-in, first-out (LIFO) model and a first-in, first-out (FIFO) model, based on the respective policy choices, and a comparative analysis of the two models is made. The study finds that the FIFO policy is more in line with realistic operating conditions; in particular, when the inventory holding cost of the OW is high and there is no difference, or a large difference, between the deterioration rates of the OW and the RW, the FIFO policy has better applicability. The paper also considers the differences between the effects of warehouse and shelf inventory levels on demand, builds a retailer inventory decision model, and studies the factors affecting the optimal order quantity, the optimal order cycle and the average inventory cost per unit time. To minimize the average total cost, optimal dispatching policies are provided for retailers' decisions.

Keywords: FIFO model, inventory-level-dependent, LIFO model, two-warehouse inventory

Procedia PDF Downloads 276
19014 Corrosion Characteristics and Electrochemical Treatment of Heritage Silver Alloys

Authors: Ahmad N. Abu-Baker

Abstract:

This study investigated the corrosion of a group of heritage silver-copper alloy coins and their conservation treatment by potentiostatic methods. The corrosion products of the coins were characterized by a combination of scanning electron microscopy/energy-dispersive X-ray spectroscopy (SEM/EDX) and X-ray diffraction (XRD) analyses. Cathodic polarization curves, measured by linear sweep voltammetry (LSV), also identified the corrosion products and the working conditions for treating the coins using a potentiostatic reduction method, which was monitored by chronoamperometry. The corrosion products showed that the decay mechanisms were dominated by selective attack on the copper-rich phases of the silver-copper alloys, consistent with an internal galvanic corrosion phenomenon that leads to the deposition of copper corrosion products on the surface of the coins. Silver chloride was also detected on the coins, reflecting selective corrosion of the silver-rich phases under different chemical environments. The potentiostatic treatment showed excellent effectiveness in determining treatment parameters and monitoring the reduction of the corrosion products on the coins, which helped to preserve surface details during cleaning and to prevent over-treatment.

Keywords: silver alloys, corrosion, conservation, heritage

Procedia PDF Downloads 132
19013 Carbon Footprint Reduction Using Cleaner Production Strategies in a Otoshimi Producing Plant

Authors: Razuana Rahim, Abdul Aziz Abdul Raman

Abstract:

In this work, a study was conducted to evaluate the feasibility of using a Cleaner Production (CP) strategy to reduce carbon dioxide (CO2) emissions in a plant that produces Otoshimi. The CP strategy aims to reduce CO2 emissions while taking the economic aspect into consideration. For this purpose, a CP audit was conducted, the information obtained was analyzed, and the major contributors to CO2 emissions inside the boundary of the production plant were identified. Electricity, water and fuel consumption and the generation of solid waste and wastewater were identified as the main contributors. The total CO2 emission generated was 0.27 kg CO2 per kg of Otoshimi produced, of which 68% was contributed by electricity consumption. Subsequently, a total of three CP options were generated, and implementation of these options is expected to reduce the CO2 emissions from electricity consumption to 0.16 kg CO2 per kg of Otoshimi produced, a reduction of about 14%. The study shows that the CP strategy can be implemented, even without any investment, to reduce CO2 emissions in a plant that produces Otoshimi.

Keywords: carbon dioxide emission, cleaner production audit, cleaner production options, otoshimi production

Procedia PDF Downloads 420
19012 Thermomechanical Damage Modeling of F114 Carbon Steel

Authors: A. El Amri, M. El Yakhloufi Haddou, A. Khamlichi

Abstract:

Numerical simulation based on the Finite Element Method (FEM) is widely used in academic institutes and in industry. It is a useful tool to predict many phenomena present in classical manufacturing forming processes, such as fracture. However, the results of such a numerical model depend strongly on the parameters of the constitutive behavior model. The influences of thermal and mechanical loads cause damage. The temperature- and strain-rate-dependent material properties and their modelling are discussed. A Johnson-Cook damage model has been selected for the numerical simulations, and the ABAQUS 6.11 software is used for the finite element analysis. This model was introduced in order to give information concerning crack initiation under thermal and mechanical loads.
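
For reference, the Johnson-Cook damage initiation criterion referred to above is commonly written as follows; the damage constants D1-D5 are material-specific and must be calibrated for F114 steel (no values are implied here):

\bar{\varepsilon}_f = \left[ D_1 + D_2 \exp\left( D_3\, \sigma^{*} \right) \right] \left[ 1 + D_4 \ln \dot{\varepsilon}^{*} \right] \left[ 1 + D_5\, T^{*} \right]

where \sigma^{*} = \sigma_m / \bar{\sigma} is the stress triaxiality, \dot{\varepsilon}^{*} is the equivalent plastic strain rate normalized by a reference rate, and T^{*} = (T - T_{room}) / (T_{melt} - T_{room}) is the homologous temperature. Damage accumulates as D = \sum \Delta\bar{\varepsilon}^{pl} / \bar{\varepsilon}_f, and crack initiation is assumed when D reaches unity.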

Keywords: thermo-mechanical fatigue, failure, numerical simulation, fracture, damage

Procedia PDF Downloads 386