Search results for: 3D Models
4355 A Numerical Study on Micromechanical Aspects in Short Fiber Composites
Authors: I. Ioannou, I. M. Gitman
Abstract:
This study focused on the contribution of micro-mechanical parameters to the macro-mechanical response of short fiber composites, namely a polypropylene matrix reinforced by glass fibers. Particular attention has been given to glass fiber length, a micromechanical parameter that influences the overall macroscopic material behavior. Three-dimensional numerical models were developed and analyzed through the concept of a Representative Volume Element (RVE). Results of the RVE-based approach were compared with the analytical Halpin-Tsai model.
Keywords: effective properties, homogenization, representative volume element, short fiber reinforced composites
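For reference, the Halpin-Tsai estimate used as the analytical benchmark is conventionally written as follows (a sketch of the standard short-fiber relations; the shape factor ζ ≈ 2l/d for the longitudinal modulus is part of the usual formulation, not a value taken from the abstract):

```latex
\frac{E_{11}}{E_m} = \frac{1 + \zeta\,\eta\,V_f}{1 - \eta\,V_f},
\qquad
\eta = \frac{E_f/E_m - 1}{E_f/E_m + \zeta},
\qquad
\zeta \approx 2\,\frac{l}{d},
```

where E_f and E_m are the fiber and matrix moduli, V_f the fiber volume fraction, and l/d the fiber aspect ratio.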
Procedia PDF Downloads 269
4354 Numerical Modeling of Phase Change Materials Walls under Reunion Island's Tropical Weather
Authors: Lionel Trovalet, Lisa Liu, Dimitri Bigot, Nadia Hammami, Jean-Pierre Habas, Bruno Malet-Damour
Abstract:
The MCP-iBAT¹ project is carried out to study the behavior of Phase Change Materials (PCM) integrated in building envelopes in a tropical environment. Through the phase transitions (melting and freezing) of the material, thermal energy can be absorbed or released. This process enables the regulation of indoor temperatures and the improvement of thermal comfort for the occupants. Most of the commercially available PCMs are more suitable to temperate climates than to tropical climates. The case of Reunion Island is noteworthy as there are multiple micro-climates. This leads to our key question: developing one or multiple bio-based PCMs that cover the thermal needs of the different locations of the island. The present paper focuses on the numerical approach to select the PCM properties relevant to tropical areas. Numerical simulations have been carried out with two software tools: EnergyPlus and Isolab. The latter has been developed in the laboratory, with the implicit Finite Difference Method, in order to evaluate different physical models. Both are Thermal Dynamic Simulation (TDS) tools that predict the building's thermal behavior with one-dimensional heat transfers. The parameters used in this study are the construction's characteristics (dimensions and materials) and the environment's description (meteorological data and building surroundings). The building is modeled in accordance with the experimental setup. It is divided into two rooms, cells A and B, with the same dimensions. Cell A is the reference, while in cell B, a layer of commercial PCM (Thermo Confort of MCI Technologies) has been applied to the inner surface of the North wall. Sensors are installed in each room to retrieve temperatures, heat flows, and humidity rates. The collected data are used for the comparison with the numerical results. Our strategy is to implement two similar buildings at different altitudes (Saint-Pierre: 70 m and Le Tampon: 520 m) to measure different temperature ranges. Therefore, we are able to collect data for various seasons during a condensed time period. The following methodology is used to validate the numerical models: calibration of the thermal and PCM models in EnergyPlus and Isolab based on experimental measures, then numerical testing with a sensitivity analysis of the parameters to reach the targeted indoor temperatures. The calibration relies on the past ten months' measures (from September 2020 to June 2021), with a focus on a one-week study in November (beginning of summer), when the effect of PCM on inner surface temperatures is more visible. A first simulation with the PCM model of EnergyPlus gave results approaching the measurements with a mean error of 5%. The property studied in this paper is the melting temperature of the PCM. By determining the representative temperatures of winter, summer and inter-seasons from past annual weather data, it is possible to build a numerical model of multi-layered PCM. Hence, the combined properties of the materials will provide an optimal scenario for the application of PCM in tropical areas. Future work will focus on the development of bio-based PCMs with the selected properties, followed by experimental and numerical validation of the materials. ¹Matériaux à Changement de Phase, une innovation pour le Bâti Tropical
Keywords: energyplus, multi-layer of PCM, phase changing materials, tropical area
Procedia PDF Downloads 95
4353 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus
Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert
Abstract:
Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase to the planning and execution phases, and on to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings do not have a BIM model. Creating a compatible BIM for existing buildings is very challenging. It requires special equipment for data capturing and effort to convert these data into a BIM model. The main difficulties for such projects are to define the data needed, the level of development (LOD), and the methodology to be adopted. In addition to managing information for an existing building, studying the impact of the built environment is a challenging topic. So, integrating the existing terrain that surrounds buildings into the digital model is essential in order to run several simulations, such as flood simulation, energy simulation, etc. Making a replica of the physical model and updating its information in real time to create its Digital Twin (DT) is very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points, based on a reference surface (e.g., mean sea level, geoid, and ellipsoid). In addition, information related to the type of pavement materials, the types and heights of vegetation, and damaged surfaces can be integrated. Our aim in this study is to define the methodology to be used in order to provide a 3D BIM model of the site and the existing buildings, based on the case study of the École Spéciale des Travaux Publics (ESTP Paris) school of engineering campus. The property is located on a hilly site of 5 hectares and is composed of more than 20 buildings with a total area of 32,000 square meters and a height between 50 and 68 meters. In this work, the campus precise levelling grid is computed according to the NGF-IGN69 altimetric system, and the grid control points according to the Réseau Géodésique Français (RGF93) – Lambert 93 French system, with different methods: (i) land topographic surveying using a robotic total station, (ii) a GNSS (Global Navigation Satellite System) levelling grid in NRTK (Network Real Time Kinematic) mode, (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters such as boundary limits, the number of floors, the georeferencing of the floors, the georeferencing of the 4 base corners of each building, etc. Once the input data are identified, the digital model of each building is produced. The DTM is also modeled. The process of altimetric determination is complex and requires effort in order to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC) and ReViT (RVT) will be generated. Checking the interoperability between BIM models is very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.
Keywords: building information modeling, digital terrain model, existing buildings, interoperability
Procedia PDF Downloads 114
4352 The Interdisciplinary Synergy Between Computer Engineering and Mathematics
Authors: Mitat Uysal, Aynur Uysal
Abstract:
Computer engineering and mathematics share a deep and symbiotic relationship, with mathematics providing the foundational theories and models for computer engineering advancements. From algorithm development to optimization techniques, mathematics plays a pivotal role in solving complex computational problems. This paper explores key mathematical principles that underpin computer engineering, illustrating their significance through a case study that demonstrates the application of optimization techniques using Python code. The case study addresses the well-known vehicle routing problem (VRP), an extension of the traveling salesman problem (TSP), and solves it using a genetic algorithm.
Keywords: VRP, TSP, genetic algorithm, computer engineering, optimization
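A minimal Python sketch of the kind of genetic algorithm the case study describes is shown below (not the authors' implementation: the customer coordinates, demands, vehicle capacity, and GA settings are illustrative assumptions).

```python
# Genetic algorithm for a small capacity-constrained VRP: a permutation of customers
# is greedily cut into capacity-feasible routes, and total route distance is minimized.
import math
import random

random.seed(42)

DEPOT = (0.0, 0.0)
CUSTOMERS = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(12)]
DEMANDS = [random.randint(1, 4) for _ in CUSTOMERS]
CAPACITY = 10

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def split_into_routes(perm):
    """Greedily cut a customer permutation into capacity-feasible routes."""
    routes, load, current = [], 0, []
    for c in perm:
        if load + DEMANDS[c] > CAPACITY:
            routes.append(current)
            current, load = [], 0
        current.append(c)
        load += DEMANDS[c]
    if current:
        routes.append(current)
    return routes

def total_distance(perm):
    total = 0.0
    for route in split_into_routes(perm):
        stops = [DEPOT] + [CUSTOMERS[c] for c in route] + [DEPOT]
        total += sum(dist(stops[i], stops[i + 1]) for i in range(len(stops) - 1))
    return total

def ordered_crossover(p1, p2):
    """OX crossover: copy a slice of parent 1, fill the rest in parent 2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(perm, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def evolve(pop_size=60, generations=200):
    pop = [random.sample(range(len(CUSTOMERS)), len(CUSTOMERS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=total_distance)
        elite = pop[: pop_size // 4]          # keep the best quarter as parents
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            children.append(mutate(ordered_crossover(p1, p2)))
        pop = elite + children
    best = min(pop, key=total_distance)
    return best, total_distance(best)

if __name__ == "__main__":
    best, cost = evolve()
    print("routes:", split_into_routes(best), "total distance: %.2f" % cost)
```

The same permutation encoding solves the plain TSP if the capacity constraint is removed, which is why the VRP is presented as an extension of the TSP.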
Procedia PDF Downloads 15
4351 Numerical Flow Simulation around HSP Propeller in Open Water and behind a Vessel Wake Using RANS CFD Code
Authors: Kadda Boumediene, Mohamed Bouzit
Abstract:
The prediction of the flow around marine propellers and of hull-propeller interaction is one of the challenges of Computational Fluid Dynamics (CFD). CFD has emerged as a potential tool in recent years and has promising applications. The objective of the current study is to predict the hydrodynamic performance of the HSP marine propeller in open water and behind a vessel. The unsteady 3-D flow was modeled numerically with, respectively, the standard K-ω and K-ω SST turbulence models for the steady and unsteady cases. The hydrodynamic performance measures, such as the torque and thrust coefficients and the efficiency, show good agreement with the experimental results.
Keywords: Seiun Maru propeller, steady, unsteady, CFD, HSP
Procedia PDF Downloads 307
4350 Thermal Contact Resistance of Nanoscale Rough Surfaces
Authors: Ravi Prasher
Abstract:
In nanostructured materials, thermal transport is dominated by contact resistance. Theoretical models describing thermal transport at interfaces assume a perfectly flat surface, whereas in reality surfaces can be rough, with roughness ranging from sub-nanoscale dimensions to the micron scale. Here we introduce a model which includes both nanoscale contact mechanics and nanoscale heat transfer for rough nanoscale surfaces. This comprehensive model accounts for the effect of phonon acoustic mismatch, mechanical properties, chemical properties and the randomness of the rough surface.
Keywords: adhesion and contact resistance, Kapitza resistance of rough surfaces, nanoscale thermal transport
Procedia PDF Downloads 372
4349 Effects of Viscoelastic and Viscous Links on Seismic Pounding Mitigation in Buildings
Authors: Ali Reza Mirzagoltabar Roshan, H. Ahmadi Taleshian, A. Eliasi
Abstract:
This paper examines the effects of viscous and viscoelastic dampers as an efficient technique for seismic pounding mitigation. To that aim, 15 steel frame models with different numbers of stories and bays, and also with different types of ductility, were analyzed under 10 different earthquake records for assigned values of link damping and stiffness, and the most suitable values of the damper parameters (damping and stiffness) are presented. Moreover, it is demonstrated that viscous dampers can perform as efficiently as the viscoelastic alternative while being more economical for pounding mitigation purposes.
Keywords: adjacent buildings, separation distance, seismic pounding mitigation, viscoelastic link
Procedia PDF Downloads 334
4348 Cr (VI) Adsorption on Ce0.25Zr0.75O2.nH2O: Kinetics and Thermodynamics
Authors: Carlos Alberto Rivera-corredor, Angie Dayana Vargas-Ceballos, Edison Gilpavas, Izabela Dobrosz-Gómez, Miguel Ángel Gómez-García
Abstract:
Hexavalent chromium, Cr (VI), is present in the effluents from different industries such as electroplating, mining, leather tanning, etc. This compound is of great academic and industrial concern because of its toxic and carcinogenic behavior. Its discharge into water sources causes serious environmental and public health problems for animals and humans. The amount of Cr (VI) in industrial wastewaters ranges from 0.5 to 270,000 mg L-1. According to the Colombian standard for water quality (NTC-813-2010), the maximum allowed concentration of Cr (VI) in drinking water is 0.05 mg L-1. To comply with this limit, it is essential that industries treat their effluents to reduce the Cr (VI) to acceptable levels. Numerous treatment methods have been reported for removing metal ions from aqueous solutions, such as reduction, ion exchange, electrodialysis, etc. Adsorption has become a promising method for the purification of metal ions in water, since its application corresponds to an economical and efficient technology. The adsorbent selection and the kinetic and thermodynamic study of the adsorption conditions are key to the development of a suitable adsorption technology. Ce0.25Zr0.75O2.nH2O presents the highest adsorption capacity among a series of hydrated mixed oxides Ce1-xZrxO2 (x = 0, 0.25, 0.5, 0.75, 1). This work presents the kinetic and thermodynamic study of Cr (VI) adsorption on Ce0.25Zr0.75O2.nH2O. Experiments were performed under the following experimental conditions: initial Cr (VI) concentration = 25, 50 and 100 mg L-1, pH = 2, adsorbent load = 4 g L-1, stirring time = 60 min, temperature = 20, 28 and 40 °C. The Cr (VI) concentration was estimated spectrophotometrically by the diphenylcarbazide method, monitoring the absorbance at 540 nm. The Cr (VI) adsorption on hydrated Ce0.25Zr0.75O2.nH2O was analyzed using pseudo-first and pseudo-second order kinetics. The Langmuir and Freundlich models were used to model the experimental data. The convergence between the experimental values and those predicted by the model, expressed as a linear regression correlation coefficient (R2), was employed as the model selection criterion. The adsorption process followed the pseudo-second order kinetic model and obeyed the Langmuir isotherm model. The thermodynamic parameters were calculated as ΔH° = 9.04 kJ mol-1, ΔS° = 0.03 kJ mol-1 K-1, ΔG° = -0.35 kJ mol-1, and indicated the endothermic and spontaneous nature of the adsorption process, governed by physisorption interactions.
Keywords: adsorption, hexavalent chromium, kinetics, thermodynamics
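For context, the pseudo-second-order kinetic model and the Langmuir isotherm referred to above are conventionally written as follows (standard textbook forms, not equations reproduced from the paper):

```latex
\frac{t}{q_t} = \frac{1}{k_2\,q_e^{2}} + \frac{t}{q_e},
\qquad
q_e = \frac{q_{\max}\,K_L\,C_e}{1 + K_L\,C_e},
\qquad
\Delta G^{\circ} = \Delta H^{\circ} - T\,\Delta S^{\circ},
```

where q_t and q_e are the amounts adsorbed at time t and at equilibrium, k_2 is the pseudo-second-order rate constant, C_e the equilibrium Cr (VI) concentration, and q_max and K_L the Langmuir capacity and constant.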
Procedia PDF Downloads 302
4347 Adaptive Control Approach for an Unmanned Aerial Manipulator
Authors: Samah Riache, Madjid Kidouche
Abstract:
In this paper, we propose a nonlinear controller for an Aerial Manipulator (AM) consisting of a quadrotor equipped with a two-degrees-of-freedom robotic arm. The kinematic and dynamic models were developed by considering the aerial manipulator as a coupled system. The proposed controller was designed using Nonsingular Terminal Sliding Mode Control. The objective of our approach is to improve performance and attenuate the chattering drawback by using an adaptive algorithm in the discontinuous control part. Simulation results prove the effectiveness of the proposed control strategy compared with a Sliding Mode Controller.
Keywords: adaptive algorithm, quadrotor, robotic arm, sliding mode control
Procedia PDF Downloads 186
4346 Flood Mapping Using Height above the Nearest Drainage Model: A Case Study in Fredericton, NB, Canada
Authors: Morteza Esfandiari, Shabnam Jabari, Heather MacGrath, David Coleman
Abstract:
Flooding is a severe issue in different places in the world, including the city of Fredericton, New Brunswick, Canada. The downtown area of Fredericton is close to the Saint John River, which is susceptible to flooding around May every year. Recently, the frequency of flooding seems to have increased, especially after the downtown area and surrounding urban/agricultural lands were flooded in two consecutive years, 2018 and 2019. In order to have an explicit vision of flood extent and damage to affected areas, it is necessary to use either flood inundation modelling or satellite data. Due to the contingent availability and weather dependency of optical satellites, and the limited existing data owing to the high cost of hydrodynamic models, it is not always feasible to rely on these sources of data to generate quality flood maps after or during a catastrophe. Height Above the Nearest Drainage (HAND), a state-of-the-art topo-hydrological index, normalizes the height of a basin based on the relative elevation along the stream network and specifies the gravitational, or relative drainage, potential of an area. HAND is the relative height difference between the stream network and each cell on a Digital Terrain Model (DTM). The stream layer is produced through a multi-step, time-consuming process which does not always result in an optimal representation of the river centerline, depending on the topographic complexity of the region. HAND has been used in numerous case studies with quite acceptable, and sometimes unexpected, results because of natural and human-made features on the surface of the earth. Some of these features might cause a disturbance in the generated model, and consequently, the model might not be able to predict the flow simulation accurately. We propose to include a previously existing stream layer generated by the Province of New Brunswick and to benefit from culvert maps to improve the water flow simulation and, accordingly, the accuracy of the HAND model. By considering these parameters in our processing, we were able to increase the accuracy of the model from nearly 74% to almost 92%. The improved model can be used for generating highly accurate flood maps, which is necessary for future urban planning and flood damage estimation, without any need for satellite imagery or hydrodynamic computations.
Keywords: HAND, DTM, rapid floodplain, simplified conceptual models
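A minimal sketch of the HAND idea on a raster DTM is shown below (an illustration only, not the authors' workflow). A full HAND computation follows flow directions to the nearest drainage cell; here the Euclidean nearest stream cell is used as a simplifying assumption, and the tiny DTM and stream arrays are made up for the example.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Toy 4x4 DTM (elevations in metres)
dtm = np.array([
    [12.0, 11.5, 11.0, 10.5],
    [11.0, 10.0,  9.0,  9.5],
    [10.5,  8.5,  7.0,  8.0],
    [10.0,  8.0,  6.5,  7.5],
])
# 1 where a stream (drainage) cell is located, e.g. from a provincial stream layer
streams = np.array([
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 1, 0],
])

# Indices of the nearest stream cell for every DTM cell
_, (rows, cols) = distance_transform_edt(streams == 0, return_indices=True)
hand = dtm - dtm[rows, cols]          # relative height above nearest drainage

flood_stage = 2.0                     # hypothetical water rise above the drainage
flood_map = hand <= flood_stage       # cells predicted to be inundated
print(np.round(hand, 1))
print(flood_map.astype(int))
```

Replacing the synthetic `streams` raster with the provincial stream layer (corrected with culvert locations, as the abstract describes) is what improves the realism of the resulting flood map.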
Procedia PDF Downloads 152
4345 Macroeconomic Reevaluation of CNY/USD Exchange Rate: Quantitative Impact on EUR/USD Exchange Rate
Authors: R. Henry, H. Andriamboavonjy, J. B. Paulin, S. Drahy, R. Gourichon
Abstract:
During the past decade, Chinese monetary policy has been to maintain the stability of the CNY/USD exchange rate by creating parity between the two currencies. This policy, which goes against market equilibrium, impacts the exchange rate by keeping the Yuan low and preserving the attractiveness of Chinese industries. Using a macroeconomic and statistical approach, the impact of such a policy on the CNY/USD exchange rate is quantitatively determined. It is also pointed out how Chinese banks respect the Basel III ratios, in particular the foreign exchange ratio. The main analysis focuses on how Chinese banks will respect these ratios in the future.
Keywords: macroeconomic models, yuan floating exchange rate, Basel III, China banking system
Procedia PDF Downloads 569
4344 Fractional Calculus into Structural Dynamics
Authors: Jorge Lopez
Abstract:
In this work, we introduce fractional calculus in order to study the dynamics of a damped multistory building with some symmetry. Initially, we review the dynamics of a free and damped multistory building. Then we introduce those concepts of fractional calculus that will be involved in our study. It has been noticed that fractional calculus provides models with fewer parameters than those based on classical calculus. In particular, a damped classical oscillator is more naturally described by using fractional derivatives. Accordingly, we model our multistory building as a set of coupled fractional oscillators and compare its dynamics with the results coming from traditional methods.
Keywords: coupled oscillators, fractional calculus, fractional oscillator, structural dynamics
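As an illustration of the modelling idea (a generic single-degree-of-freedom sketch, not the authors' specific building model), the damping term of a classical oscillator is replaced by a Caputo fractional derivative of order α:

```latex
m\,\ddot{x}(t) + c\,{}^{C}\!D_t^{\alpha}x(t) + k\,x(t) = f(t), \qquad 0 < \alpha \le 1.
```

For α = 1 the classical viscously damped oscillator is recovered; a multistory building is then represented as a set of such equations coupled through the stiffness (and damping) matrices.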
Procedia PDF Downloads 246
4343 Global Analysis of HIV Virus Models with Cell-to-Cell
Authors: Hossein Pourbashash
Abstract:
Recent experimental studies have shown that HIV can be transmitted directly from cell to cell when structures called virological synapses form during interactions between T cells. In this article, we describe a new within-host model of HIV infection that incorporates two mechanisms: infection by free virions and direct cell-to-cell transmission. We conduct the local and global stability analysis of the model. We show that if the basic reproduction number R0 ≤ 1, the virus is cleared and the disease dies out; if R0 > 1, the virus persists in the host. We also prove that the unique positive equilibrium attracts all positive solutions under additional assumptions on the parameters.
Keywords: HIV virus model, cell-to-cell transmission, global stability, Lyapunov function, second compound matrices
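A generic sketch of such a two-route within-host model (illustrative of the class of models described, not necessarily the exact system analyzed in the paper) is:

```latex
\begin{aligned}
\dot{T} &= \lambda - dT - \beta_1 T V - \beta_2 T I,\\
\dot{I} &= \beta_1 T V + \beta_2 T I - \delta I,\\
\dot{V} &= p I - c V,
\end{aligned}
\qquad
R_0 = \frac{\lambda}{d\,\delta}\left(\frac{\beta_1 p}{c} + \beta_2\right),
```

where β₁ captures infection by free virions and β₂ the direct cell-to-cell route; the threshold behavior in R₀ then follows from Lyapunov-function arguments of the kind mentioned in the keywords.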
Procedia PDF Downloads 518
4342 Different Receptions of Hygienic Architecture in Two Mexican Cities: Cuernavaca and Mexico
Authors: Marcela Dávalos López
Abstract:
In Mexico, the distribution of hygienist architecture during the 20th century had different rhythms. The culmination of the urban hygiene system (from sewers to showers, passing through garbage collection) forced neighbors and citizens to participate in the common welfare. This transformed urban references, dissociated ways of living, and led to comfort and health. However, the contrast between two Mexican cities, Cuernavaca and Mexico City, shows us very different cultural practices regarding the use of hygienic architectures: in the first, the nature of its deep ravines marked the destiny of residential architecture, while in Mexico City, state participation altered the landscape and homogenized the architectural models of domestic and intimate spaces.
Keywords: cultural practices, dissociated ways to comfort, hygiene architecture, Mexico
Procedia PDF Downloads 189
4341 Prediction of Anticancer Potential of Curcumin Nanoparticles by Means of Quasi-QSAR Analysis Using Monte Carlo Method
Authors: Ruchika Goyal, Ashwani Kumar, Sandeep Jain
Abstract:
The experimental data for the anticancer potential of curcumin nanoparticles were gathered from eclectic data sources. The optimal descriptors were examined using the Monte Carlo method based CORAL SEA software. The statistical quality of the model is as follows: n = 14, R² = 0.6809, Q² = 0.5943, s = 0.175, MAE = 0.114, F = 26 (sub-training set); n = 5, R² = 0.9529, Q² = 0.7982, s = 0.086, MAE = 0.068, F = 61, Av R²m = 0.7601, ΔR²m = 0.0840, k = 0.9856 and kk = 1.0146 (test set); and n = 5, R² = 0.6075 (validation set). These data can be used to build predictive QSAR models for anticancer activity.
Keywords: anticancer potential, curcumin, model, nanoparticles, optimal descriptors, QSAR
Procedia PDF Downloads 320
4340 Developing an Intonation Labeled Dataset for Hindi
Authors: Esha Banerjee, Atul Kumar Ojha, Girish Nath Jha
Abstract:
This study aims to develop an intonation-labeled database for Hindi. Although no single standard for prosody labeling exists in Hindi, researchers in the past have employed perceptual and statistical methods in the literature to draw inferences about the behavior of prosody patterns in Hindi. Based on such existing research and largely agreed-upon intonational theories in Hindi, this study attempts to develop a manually annotated prosodic corpus of Hindi speech data, which can be used for training speech models for natural-sounding speech in the future. 100 sentences (500 words) each for declarative and interrogative types have been labeled using Praat.
Keywords: speech dataset, Hindi, intonation, labeled corpus
Procedia PDF Downloads 202
4339 Integrated Best Worst PROMETHEE to Evaluate Public Transport Service Quality
Authors: Laila Oubahman, Duleba Szabolcs
Abstract:
Public transport stakeholders aim to increase the ridership ratio by encouraging citizens to use common transportation modes. To this end, improving service quality is a crucial option to reach the quality desired by users and to reduce the gap between desired and perceived quality. Multi-criteria decision aid has been applied in the literature in recent decades because it provides efficient models to assess the criteria that most impact the overall assessment. In this paper, the PROMETHEE method is combined with the best-worst approach to construct a consensual model that avoids rank reversal, in order to support stakeholders in improving service quality.
Keywords: best-worst method, MCDA, PROMETHEE, public transport
Procedia PDF Downloads 210
4338 ChatGPT 4.0 Demonstrates Strong Performance in Standardised Medical Licensing Examinations: Insights and Implications for Medical Educators
Authors: K. O'Malley
Abstract:
Background: The emergence and rapid evolution of large language models (LLMs) (i.e., models of generative artificial intelligence, or AI) has been unprecedented. ChatGPT is one of the most widely used LLM platforms. Using natural language processing technology, it generates customized responses to user prompts, enabling it to mimic human conversation. Responses are generated using predictive modeling of vast swathes of internet text and data and are further refined and reinforced through user feedback. The popularity of LLMs is increasing, with a growing number of students utilizing these platforms for study and revision purposes. Notwithstanding its many novel applications, LLM technology is inherently susceptible to bias and error. This poses a significant challenge in the educational setting, where academic integrity may be undermined. This study aims to evaluate the performance of the latest iteration of ChatGPT (ChatGPT 4.0) in standardized state medical licensing examinations. Methods: A considered search strategy was used to interrogate the PubMed electronic database. The keywords 'ChatGPT' AND 'medical education' OR 'medical school' OR 'medical licensing exam' were used to identify relevant literature. The search included all peer-reviewed literature published in the past five years. The search was limited to publications in the English language only. Eligibility was ascertained based on the study title and abstract and confirmed by consulting the full-text document. Data were extracted into a Microsoft Excel document for analysis. Results: The search yielded 345 publications that were screened. 225 original articles were identified, of which 11 met the pre-determined criteria for inclusion in a narrative synthesis. These studies included performance assessments in national medical licensing examinations from the United States, United Kingdom, Saudi Arabia, Poland, Taiwan, Japan and Germany. ChatGPT 4.0 achieved scores ranging from 67.1 to 88.6 percent. The mean score across all studies was 82.49 percent (SD = 5.95). In all studies, ChatGPT exceeded the threshold for a passing grade in the corresponding exam. Conclusion: The capabilities of ChatGPT in standardized academic assessment in medicine are robust. While this technology can potentially revolutionize higher education, it also presents several challenges with which educators have not had to contend before. The overall strong performance of ChatGPT, as outlined above, may lend itself to unfair use (such as the plagiarism of deliverable coursework) and pose unforeseen ethical challenges (arising from algorithmic bias). Conversely, it highlights potential pitfalls if users assume LLM-generated content to be entirely accurate. In the aforementioned studies, ChatGPT exhibits a margin of error between 11.4 and 32.9 percent, which resonates strongly with concerns regarding the quality and veracity of LLM-generated content. It is imperative to highlight these limitations, particularly to students in the early stages of their education, who are less likely to possess the requisite insight or knowledge to recognize errors, inaccuracies or false information. Educators must inform themselves of these emerging challenges to effectively address them and mitigate potential disruption in academic fora.
Keywords: artificial intelligence, ChatGPT, generative AI, large language models, licensing exam, medical education, medicine, university
Procedia PDF Downloads 34
4337 Production of Rhamnolipids from Different Resources and Estimating the Kinetic Parameters for Bioreactor Design
Authors: Olfat A. Mohamed
Abstract:
Rhamnolipid biosurfactants have distinct properties that give them importance in many industrial applications, especially their promising future applications in the cosmetic and pharmaceutical industries. These applications have encouraged the search for diverse and renewable resources to control the cost of production. The experimental results were then applied to find a suitable mathematical model for obtaining the design criteria of the batch bioreactor. This research aims to produce rhamnolipids from different oily wastewater sources, such as petroleum crude oil (PO) and vegetable oil (VO), by using Pseudomonas aeruginosa ATCC 9027. Different concentrations of PO and VO are added to the media broth separately, respectively (0.5, 1, 1.5, 2, 2.5% v/v) and (2, 4, 6, 8 and 10% v/v). The effect of the initial concentration of oil residues and of the addition of glycerol and palmitic acid as inducers on rhamnolipid production and the surface tension of the broth was investigated. It was found that 2% of the PO waste and 6% of the VO waste were the best initial substrate concentrations for the production of rhamnolipids (2.71 and 5.01 g rhamnolipid/l, respectively). Addition of glycerol (10-20% v glycerol/v PO) to the 2% PO fermentation broth increased rhamnolipid production (about 1.8-2 fold). However, the addition of palmitic acid (5 and 10 g/l) to fermentation broth containing 6% VO rarely enhanced the production rate. The experimental data for 2% initial PO were used to estimate the various kinetic parameters. The following results were obtained: maximum rate of reaction (Vmax) = 0.06417 g/l·hr, yield of cell weight per unit weight of substrate utilized (Yx/s) = 0.324 g Cx/g Cs, maximum specific growth rate (μmax) = 0.05791 hr⁻¹, yield of rhamnolipid weight per unit weight of substrate utilized (Yp/s) = 0.2571 g Cp/g Cs, maintenance coefficient (Ms) = 0.002419, Michaelis-Menten constant (Km) = 6.1237 g/l, and endogenous decay coefficient (Kd) = 0.002375 hr⁻¹. Predictive parameters and advanced mathematical models were applied to evaluate the batch bioreactor time. The results were 123.37, 129 and 139.3 hours with respect to microbial biomass, substrate and product concentration, respectively, compared with the experimental batch time of 120 hours in all cases. The predicted mathematical models are compatible with the laboratory results and can, therefore, be considered as tools for describing the actual system.
Keywords: batch bioreactor design, glycerol, kinetic parameters, petroleum crude oil, Pseudomonas aeruginosa, rhamnolipids biosurfactants, vegetable oil
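A minimal sketch of a Monod-type batch bioreactor simulation using the kinetic parameters reported above is shown below. The model structure (growth with endogenous decay, substrate use for growth plus maintenance, growth-associated product formation) and the initial concentrations X0/S0 are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp

mu_max = 0.05791   # 1/hr, maximum specific growth rate
Ks     = 6.1237    # g/l, half-saturation (Michaelis-Menten) constant
Yxs    = 0.324     # g biomass / g substrate
Yps    = 0.2571    # g rhamnolipid / g substrate
Ms     = 0.002419  # maintenance coefficient, g substrate / (g biomass * hr)
Kd     = 0.002375  # 1/hr, endogenous decay coefficient

def batch(t, y):
    X, S, P = y
    mu = mu_max * S / (Ks + S) if S > 0 else 0.0
    dX = (mu - Kd) * X                    # biomass growth minus decay
    dS = -(mu / Yxs) * X - Ms * X         # substrate used for growth + maintenance
    dP = Yps * (mu / Yxs) * X             # growth-associated product formation
    return [dX, dS, dP]

X0, S0, P0 = 0.1, 20.0, 0.0               # g/l, assumed initial conditions
sol = solve_ivp(batch, (0, 140), [X0, S0, P0], dense_output=True)

t = np.linspace(0, 140, 8)
X, S, P = sol.sol(t)
for ti, xi, si, pi in zip(t, X, S, P):
    print(f"t = {ti:5.1f} h  X = {xi:5.2f}  S = {si:5.2f}  P = {pi:5.2f} g/l")
```

Integrating such a model until the substrate (or product) target is reached is the kind of calculation that yields the predicted batch times quoted in the abstract.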
Procedia PDF Downloads 132
4336 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of, and approach to, providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied in order to improve patient care, treatment effectiveness, and overall healthcare delivery. This field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. To accomplish this, it uses advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. This enables doctors to intervene early to prevent problems or improve outcomes for patients. It also assists in early disease detection and customized treatment planning for every person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies. In this way, treatments can be more effective and have fewer negative consequences. Moreover, beyond helping patients, it improves the efficiency of hospitals. It helps them determine the number of beds or doctors they require with regard to the number of patients they expect. In this project, models such as logistic regression, random forests, and neural networks are used for predicting diseases and analyzing medical images. Patients were grouped by algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more efficiently, and work more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
Keywords: data mining, healthcare, big data, large amounts of data
Procedia PDF Downloads 78
4335 Kinetic Modelling of Drying Process of Jumbo Squid (Dosidicus gigas) Slices Subjected to an Osmotic Pretreatment under High Pressure
Authors: Mario Perez-Won, Roberto Lemus-Mondaca, Constanza Olivares-Rivera, Fernanda Marin-Monardez
Abstract:
This research presents the simultaneous application of high hydrostatic pressure (HHP) and osmotic dehydration (DO) as a pretreatment to hot-air drying of jumbo squid (Dosidicus gigas) cubes. The drying time was reduced to 2 hours at 60 °C and 5 hours at 40 °C as compared to the untreated jumbo squid samples. This was due to the osmotic pressure under high-pressure treatment, where increased salt saturation caused greater water loss; a shorter convective drying time was thus reached, and so effective water diffusion during drying plays an important role in this research. Different working conditions, such as pressure (350-550 MPa), pressure time (5-10 min), salt (NaCl) concentration (10 and 15%) and drying temperature (40-60 °C), were optimized according to the kinetic parameters of each mathematical model. The models used for the experimental drying curves were the Weibull, Page and Logarithmic models; the latter best fitted the experimental data. The values of the effective water diffusivity varied from 4.82 to 6.59x10⁻⁹ m²/s for the 16 (DO+HHP) curves, whereas the control samples gave values of 1.76 and 5.16×10⁻⁹ m²/s for 40 and 60 °C, respectively. On the other hand, quality characteristics such as color, texture, non-enzymatic browning, water holding capacity (WHC) and rehydration capacity (RC) were assessed. The L* (lightness) color parameter increased, whereas the b* (yellowish) and a* (reddish) parameters decreased for the DO+HHP treated samples, indicating that the treatment prevents sample browning. The texture parameters hardness and elasticity decreased, but chewiness increased with treatment, which resulted in a product with higher tenderness and less firmness compared to the untreated sample. Finally, WHC and RC values of most treatments increased owing to less damage to the cellular tissue compared to untreated samples. Therefore, knowledge of the drying kinetics as well as of the quality characteristics of dried jumbo squid samples subjected to a pretreatment of osmotic dehydration under high hydrostatic pressure is extremely important at an industrial level, so that the drying process can be successful at different pretreatment conditions and/or process variables.
Keywords: diffusion coefficient, drying process, high pressure, jumbo squid, modelling, quality aspects
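A minimal sketch of fitting thin-layer drying models (Page and Logarithmic) to a moisture-ratio curve, in the spirit of the kinetic modelling described above, is shown below. The moisture-ratio data points are illustrative values, not measurements from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])                 # drying time, h
mr = np.array([1.00, 0.72, 0.53, 0.40, 0.30, 0.18, 0.11, 0.07])      # moisture ratio

def page(t, k, n):
    # Page model: MR = exp(-k * t^n)
    return np.exp(-k * t**n)

def logarithmic(t, a, k, c):
    # Logarithmic model: MR = a * exp(-k * t) + c
    return a * np.exp(-k * t) + c

for name, model, p0 in [("Page", page, (0.5, 1.0)),
                        ("Logarithmic", logarithmic, (1.0, 0.5, 0.0))]:
    params, _ = curve_fit(model, t, mr, p0=p0)
    residuals = mr - model(t, *params)
    r2 = 1 - np.sum(residuals**2) / np.sum((mr - mr.mean())**2)
    print(f"{name}: parameters = {np.round(params, 4)}, R^2 = {r2:.4f}")
```

Comparing the R² of each candidate model on the measured curves is how the best-fitting model (here reported to be the Logarithmic one) is selected.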
Procedia PDF Downloads 246
4334 Hydrogen Production Using an Anion-Exchange Membrane Water Electrolyzer: Mathematical and Bond Graph Modeling
Authors: Hugo Daneluzzo, Christelle Rabbat, Alan Jean-Marie
Abstract:
Water electrolysis is one of the most advanced technologies for producing hydrogen and can be easily combined with electricity from different sources. Under the influence of electric current, water molecules can be split into oxygen and hydrogen. The production of hydrogen by water electrolysis favors the integration of renewable energy sources into the energy mix by compensating for their intermittence: the energy produced is stored when production exceeds demand and released during off-peak production periods. Among the various electrolysis technologies, anion exchange membrane (AEM) electrolyzer cells are emerging as a reliable technology for water electrolysis. Modeling and simulation are effective tools to save time, money, and effort during the optimization of operating conditions and the investigation of the design. Modeling and simulation become even more important when dealing with multiphysics dynamic systems. One of those systems is the AEM electrolysis cell, which involves complex physico-chemical reactions. Once developed, models may be utilized to comprehend the mechanisms, to control the systems, and to detect flaws in them. Several modeling methods have been initiated by scientists. These methods can be separated into two main approaches, namely equation-based modeling and graph-based modeling. The former approach is less user-friendly and difficult to update, as it is based on ordinary or partial differential equations to represent the systems. The latter approach is more user-friendly and allows a clear representation of physical phenomena. In this case, the system is depicted by connecting subsystems, so-called blocks, through ports based on their physical interactions, hence being suitable for multiphysics systems. Among the graphical modelling methods, the bond graph is receiving increasing attention as being domain-independent and relying on the energy exchange between the components of the system. At present, few studies have investigated the modelling of AEM systems. A mathematical model and a bond graph model were used in previous studies to model the electrolysis cell performance. In this study, experimental data from the literature were simulated in OpenModelica using bond graph and mathematical approaches. The polarization curves at different operating conditions obtained by both approaches were compared with experimental ones. Both models predicted the polarization curves satisfactorily, with error margins lower than 2% for the equation-based model and lower than 5% for the bond graph model. The activation polarization of the hydrogen evolution reaction (HER) and the oxygen evolution reaction (OER) was behind the voltage loss in the AEM electrolyzer, whereas ion conduction through the membrane resulted in the ohmic loss. Therefore, highly active electro-catalysts are required for both HER and OER, while high-conductivity AEMs are needed for effectively lowering the ohmic losses. The bond graph simulation of the polarization curve at various temperatures has illustrated that voltage increases with temperature owing to the technology of the membrane. Simulating the polarization curve allows designs to be tested virtually, hence reducing the cost and time involved in experimental testing and improving design optimization. Further improvements can be made by implementing the bond graph model in a real power-to-gas-to-power scenario.
Keywords: hydrogen production, anion-exchange membrane, electrolyzer, mathematical modeling, multiphysics modeling
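A minimal equation-based sketch of an electrolyzer polarization curve (reversible potential plus OER/HER activation overpotentials plus ohmic loss) is shown below. All parameter values (exchange current densities, transfer coefficients, membrane area-specific resistance) are illustrative assumptions, not the values used in the study.

```python
import numpy as np

F = 96485.0          # C/mol
R = 8.314            # J/(mol K)
T = 333.15           # K (60 °C)
E_rev = 1.23         # V, approximate reversible cell potential

j0_a, alpha_a = 1e-6, 0.5   # A/cm^2, anode (OER) exchange current density / transfer coeff.
j0_c, alpha_c = 1e-3, 0.5   # A/cm^2, cathode (HER)
r_ohm = 0.25                # ohm*cm^2, membrane + contact area-specific resistance

def cell_voltage(j):
    """Cell voltage as a function of current density j (A/cm^2)."""
    eta_a = (R * T) / (alpha_a * F) * np.arcsinh(j / (2 * j0_a))  # OER activation loss
    eta_c = (R * T) / (alpha_c * F) * np.arcsinh(j / (2 * j0_c))  # HER activation loss
    eta_ohm = j * r_ohm                                           # ohmic loss
    return E_rev + eta_a + eta_c + eta_ohm

for j in [0.1, 0.5, 1.0, 1.5, 2.0]:
    print(f"j = {j:4.1f} A/cm^2  ->  V_cell = {cell_voltage(j):.3f} V")
```

Calibrating such parameters against measured polarization curves is the equation-based counterpart of the bond graph calibration described in the abstract.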
Procedia PDF Downloads 93
4333 Determining a Sustainability Business Model Using Materiality Matrices in an Electricity Bus Factory
Authors: Ozcan Yavas, Berrak Erol Nalbur, Sermin Gunarslan
Abstract:
A materiality matrix is a tool that organizations use to prioritize their activities and adapt to the increasing sustainability requirements of recent years. For the materiality index to move from the business model to the sustainability business model stage, it must be developed with all partners in the raw material, supply, production, product, and end-of-life product stages. Within the scope of this study, the materiality matrix was used to transform the business model into a sustainability business model and to create a sustainability roadmap in a factory producing electric buses. This matrix determines the roadmap necessary for all stakeholders to participate in the process, especially in sectors that produce sustainable products, such as the electric vehicle sector, and to act together with the cradle-to-cradle approach of sustainability roadmaps. A Global Reporting Initiative analysis was used in the study, which was conducted with 1,150 stakeholders, and 43 questions were asked to the stakeholders under the main headings of 'Legal Compliance Level,' 'Environmental Strategies,' 'Risk Management Activities,' 'Impact of Sustainability Activities on Products and Services,' 'Corporate Culture,' 'Responsible and Profitable Business Model Practices' and 'Achievements in Leading the Sector,' grouped under the Economic, Governance, Environment, Social and Other dimensions. The results of the study aimed to include five first-priority issues and four second-priority issues in the sustainability strategies of the organization in the short and medium term. When the short-term activities are evaluated in terms of sustainability and environmental risk management, it is seen that they are still limited to the level of legal compliance (60%) and to individual studies in line with the strategies (20%). At the same time, the stakeholders expect the company to integrate sustainability activities into its business model within five years (35%) and to carry out projects to become the first company that comes to mind for its success in leading the sector (20%). Another result obtained within the study's scope is the identification of barriers to implementation. It is seen that the most critical obstacles identified by stakeholders in relation to climate change and environmental impacts are financial deficiency and the lack of infrastructure for the dissemination of sustainable products. These studies are critical for the electric vehicle sector's transition to sustainable business models in order to achieve the EU Green Deal and CBAM targets.
Keywords: sustainability business model, materiality matrix, electricity bus, carbon neutrality, sustainability management
Procedia PDF Downloads 63
4332 Early Prediction of Disposable Addresses in Ethereum Blockchain
Authors: Ahmad Saleem
Abstract:
Ethereum is the second largest cryptocurrency in the blockchain ecosystem. Along with standard transactions, it supports smart contracts and NFTs. Current research trends are focused on analyzing the overall structure of the network, its growth and behavior. Ethereum addresses are anonymous and can be created on the fly. The nature of the Ethereum network and its addresses makes it hard to predict their behavior. The activity period of an Ethereum address has not been much analyzed. Using machine learning, we can make early predictions about the disposability of an address. In this paper, we analyzed the lifetime of addresses. We also identified and predicted disposable addresses using machine learning models and compared the results.
Keywords: blockchain, Ethereum, cryptocurrency, prediction
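A minimal sketch of the kind of early-prediction setup described above is a classifier trained on simple per-address features to flag disposable addresses. The feature set (early transaction count, incoming/outgoing value, gas spent) and the synthetic data below are assumptions for illustration; the paper's actual features and models may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 2000
# Hypothetical features observed in an address's first hours of activity
X = np.column_stack([
    rng.poisson(3, n),          # number of early transactions
    rng.exponential(1.0, n),    # ETH received early on
    rng.exponential(0.5, n),    # ETH sent early on
    rng.exponential(0.01, n),   # gas spent
])
# Synthetic label: addresses with little early activity tend to be disposable
y = ((X[:, 0] < 3) & (X[:, 1] < 1.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```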
Procedia PDF Downloads 98
4331 Predicting Aggregation Propensity from Low-Temperature Conformational Fluctuations
Authors: Hamza Javar Magnier, Robin Curtis
Abstract:
There have been rapid advances in the upstream processing of protein therapeutics, which have shifted the bottleneck to downstream purification and formulation. Finding liquid formulations with shelf lives of up to two years is increasingly difficult for some of the newer therapeutics, which have been engineered for activity, but their formulations are often viscous, can phase separate, and have a high propensity for irreversible aggregation [1]. We explore means to develop improved predictive ability from a better understanding of how protein-protein interactions depend on formulation conditions (pH, ionic strength, buffer type, presence of excipients) and how these impact upon the initial steps in protein self-association and aggregation. In this work, we study the initial steps in the aggregation pathways using a minimal protein model based on square-well potentials and discontinuous molecular dynamics. The effect of model parameters, including range of interaction, stiffness, chain length, and chain sequence, implies that protein models fold according to various pathways. By reducing the range of interactions, the folding and collapse transitions come together and follow a single-step folding pathway from the denatured to the native state [2]. After parameterizing the model interaction parameters, we developed an understanding of low-temperature conformational properties and fluctuations, and their correlation to the folding transition of proteins in isolation. The model fluctuations increase with temperature. We observe a low-temperature point below which large fluctuations are frozen out. This implies that fluctuations at low temperature can be correlated to the folding transition at the melting temperature. Because proteins "breathe" at low temperatures, defining a native state as a single structure with conserved contacts and a fixed three-dimensional structure is misleading. Rather, we introduce a new definition of a native-state ensemble based on our understanding of core conservation, which takes into account the native fluctuations at low temperatures. This approach permits the study of the large range of length and time scales needed to link the molecular interactions to the macroscopically observed behaviour. In addition, the models studied are parameterized by fitting to experimentally observed protein-protein interactions characterized in terms of osmotic second virial coefficients.
Keywords: protein folding, native-ensemble, conformational fluctuation, aggregation
Procedia PDF Downloads 363
4330 Heat Transfer and Trajectory Models for a Cloud of Spray over a Marine Vessel
Authors: S. R. Dehghani, G. F. Naterer, Y. S. Muzychka
Abstract:
Wave-impact sea spray creates many droplets, which form a spray cloud traveling over marine objects such as marine vessels and offshore structures. In cold climates such as Arctic regions, sea spray icing, which is ice accretion on cold substrates, is strongly dependent on the wave-impact sea spray. The rate of cooling of droplets affects the process of icing, which can yield dry or wet ice accretion. Trajectories of droplets determine the potential places for ice accretion. Combining two models, of droplet trajectories and of droplet heat transfer, can predict the risk of ice accretion reasonably well. The majority of the cooling of droplets is due to droplet evaporation. In this study, a combined trajectory and heat transfer model evaluates the evolution of a cloud of spray from generation to impingement. The model uses known geometry and initial information from previous case studies. The 3D model is solved numerically using a standard numerical scheme. Droplets are generated in various size ranges from 7 mm to 0.07 mm, which is a suggested range for sea spray icing. The initial temperature of the droplets is taken to be the sea water temperature. Wind velocities are assumed to be the same as those of the field observations. Evaluations are conducted using several important heading angles and wind velocities. The characteristic size-velocity dependence is used to establish a relation between the initial sizes and velocities of droplets. Time intervals are chosen properly to maintain a stable and fast numerical solution. A statistical process is conducted to evaluate the probability of expected occurrences. The medium-size droplets can reach the highest heights. Very small and very large droplets are limited to lower heights. Results show that higher initial velocities create the most expanded cloud of spray. Wind velocities affect the extent of the spray cloud. The rate of droplet cooling at the start of spray formation is higher than during the rest of the process, because of higher relative velocities and also higher temperature differences. The amount of water delivery and the overall temperature for some sample surfaces over a marine vessel are calculated. Comparing the results with some field observations shows that the model works accurately. This model is suggested as a primary model for ice accretion on marine vessels.
Keywords: evaporation, sea spray, marine icing, numerical solution, trajectory
Procedia PDF Downloads 220
4329 Virtual Metrology for Copper Clad Laminate Manufacturing
Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho
Abstract:
In semiconductor manufacturing, virtual metrology (VM) refers to methods to predict properties of a wafer based on machine parameters and sensor data of the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include the avoidance of human bias and the identification of important factors affecting the quality of the process, which allow improving the process quality in the future. It is, however, rare to find VM applied to other areas of manufacturing. In this work, we propose to apply VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB), which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: treating, lay-up, and pressing. Treating, the most important process among the three, puts resin on glass cloth, heats it up in a drying oven, and then produces prepreg for the lay-up process. In this process, three important quality factors are inspected: treated weight (T/W), minimum viscosity (M/V), and gel time (G/T). They are manually inspected, incurring a heavy cost in terms of time and money, which makes this a good candidate for a VM application. We developed prediction models for the three quality factors T/W, M/V, and G/T, respectively, from process variables, raw material variables, and environment variables. The actual process data were obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models for M/V and G/T with a high enough accuracy. They also provided us with information on "important" predictor variables, some of which the process engineers had already been aware of, and the rest of which they had not. They were quite excited to find the new insights that the model revealed and set out to do further analysis on them to gain process control implications. T/W did not turn out to be predictable with reasonable accuracy from the given factors. This very fact indicates that the factors currently monitored may not affect T/W; thus, an effort has to be made to find other factors which are not currently monitored in order to understand the process better and improve its quality. In conclusion, the VM application to CCL's treating process was quite successful. The newly built quality prediction model allows one to reduce the cost associated with actual metrology as well as to reveal some insights into the factors affecting the important quality factors and into the level of our less than perfect understanding of the treating process.
Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology
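A minimal sketch of a virtual-metrology style workflow for the treating process described above follows: variable selection, then a regression model that predicts a quality factor (e.g., minimum viscosity, M/V) from process and environment variables. The feature layout and the synthetic data are assumptions for illustration, not the manufacturer's data.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 8))      # e.g., oven temperatures, line speed, resin ratio, humidity ...
coef = np.array([1.5, 0.0, -0.8, 0.0, 0.4, 0.0, 0.0, 0.0])
y = X @ coef + 0.1 * rng.normal(size=n)   # synthetic M/V-like target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# Step 1: variable selection with an L1-regularized linear model
lasso = LassoCV(cv=5).fit(X_tr, y_tr)
selected = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
print("selected predictor indices:", selected)

# Step 2: fit a nonlinear learner on the selected variables only
model = GradientBoostingRegressor(random_state=1).fit(X_tr[:, selected], y_tr)
print("test R^2:", round(r2_score(y_te, model.predict(X_te[:, selected])), 3))
```

The selected predictor indices play the role of the "important" variables that the process engineers examined for process control implications.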
Procedia PDF Downloads 351
4328 Employer Brand Image and Employee Engagement: An Exploratory Study in Britain
Authors: Melisa Mete, Gary Davies, Susan Whelan
Abstract:
Maintaining a good employer brand image is crucial for companies since it has numerous advantages, such as better recruitment, retention, and employee engagement and commitment. This study aims to understand the relationship between employer brand image and employee satisfaction and engagement in the British context. Panel survey data (N=228) are tested via regression models from the Hayes (2012) PROCESS macro in IBM SPSS 23.0. The results are statistically significant and show that the more positive the employer brand image, the greater the employees' engagement and satisfaction; and the greater the employee satisfaction, the greater their engagement.
Keywords: employer brand, employer brand image, employee engagement, employee satisfaction
Procedia PDF Downloads 337
4327 Fast Track to the Physical Internet: A Cross-Industry Project from Upper Austria
Authors: Laura Simmer, Maria Kalt, Oliver Schauer
Abstract:
Freight transport is growing fast, but many vehicles are empty or only partially loaded. The vision and concepts of the Physical Internet (PI) propose to eliminate these inefficiencies. Aiming for a radical sustainability improvement, the PI – inspired by the Digital Internet – is a hyperconnected global logistics system, enabling seamless asset sharing and flow consolidation. The implementation of a PI in its full expression will be a huge challenge: the industry needs innovation and implementation support, including change management approaches, awareness creation and good-practice diffusion, legislative actions to remove antitrust and international commerce barriers, standardization, and public incentive policies. In order to take a step closer to this future, the project 'Atropine - Fast Track to the Physical Internet', funded under the Strategic Economic and Research Program 'Innovative Upper Austria 2020', was set up. The two-year research project unites several research partners in this field, as well as industrial partners and logistics service providers. With Atropine, the consortium wants to actively shape the mobility landscape in Upper Austria and make an innovative contribution to an energy-efficient, environmentally sound and sustainable development of the transport area. This paper should, on the one hand, clarify what the project Atropine is about and, on the other hand, how a proof of concept will be reached. Awareness building plays an important role in the project, as the PI requires a reorganization of the supply chain and the design of completely new forms of inter-company co-operation. New business models have to be developed and should be verified by simulation. After the simulation process, one of these business models will be chosen and tested in real life with the partner companies. The developed results - a simulation model and a demonstrator - are used to determine how the concept of the PI can be applied in Upper Austria. Atropine shall pave the way for a full-scale development of the PI vision in the next few decades and provide the basis for pushing the industry toward a new level of co-operation with more shared resources and increased standardization.
Keywords: Atropine, inter-company co-operation, Physical Internet, shared resources, sustainable logistics
Procedia PDF Downloads 224
4326 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model
Authors: Amit R. Bhende, G. K. Awari
Abstract:
Remaining useful life (RUL) prediction is one of the key technologies to realize prognostics and health management, which is being widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing degradation data from run to failure at three different conditions are used. A RUL prediction model is built separately for each condition. Feed-forward back propagation neural network models are developed for prediction modeling.
Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis
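A minimal sketch of a data-driven RUL model in the spirit of the abstract above is shown below: a feed-forward back-propagation network mapping bearing degradation features to remaining useful life. The features (e.g., RMS and kurtosis of the vibration signal) and the synthetic run-to-failure data are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 600                                   # samples along a run-to-failure history
life = 600.0                              # total life, hours
t = np.linspace(0, life, n)
rms = 0.5 + 0.002 * t + 0.05 * rng.normal(size=n)        # vibration RMS grows with wear
kurtosis = 3.0 + 0.004 * t + 0.2 * rng.normal(size=n)    # impulsiveness grows near failure
X = np.column_stack([rms, kurtosis])
y = life - t                              # remaining useful life in hours

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=7)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=7),
)
model.fit(X_tr, y_tr)
print("test R^2:", round(model.score(X_te, y_te), 3))
print("predicted RUL for first 3 test samples:", np.round(model.predict(X_te[:3]), 1))
```

Training one such model per operating condition mirrors the per-condition modeling strategy described in the abstract.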
Procedia PDF Downloads 437