Search results for: computational chemistry
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2601

1551 Development of a Model Based on Wavelets and Matrices for the Treatment of Weakly Singular Partial Integro-Differential Equations

Authors: Somveer Singh, Vineet Kumar Singh

Abstract:

We present a new model based on viscoelasticity for non-Newtonian fluids. We use a matrix-formulated algorithm to approximate solutions of a class of partial integro-differential equations with given initial and boundary conditions. Some numerical results are presented to simplify the application of the operational matrix formulation and to reduce the computational cost. Convergence analysis, error estimation, and numerical stability of the method are also investigated. Finally, test examples are given to demonstrate the accuracy and efficiency of the proposed method.

Keywords: Legendre Wavelets, operational matrices, partial integro-differential equation, viscoelasticity

Procedia PDF Downloads 331
1550 Green Electrochemical Nitration of Bioactive Compounds: Biological Evaluation with Molecular Modelling

Authors: Sara Torabi, Sadegh Khazalpour, Mahdi Jamshidi

Abstract:

Nitro aromatic compounds are valuable materials because of their applications in the preparation of chemical intermediates for the synthesis of dyes, plastics, perfumes, energetic materials, and pharmaceuticals. Chemical and electrochemical procedures have been reported for the nitration of aromatic compounds. Flavonoid derivatives are present in many vegetables and fruits and are constituents of many common pharmaceuticals and dietary supplements. Electrochemistry provides very versatile means for electrosynthesis and for mechanistic and kinetic studies. To the best of our knowledge, and despite the importance of these compounds in numerous scientific fields, there are no reports on the electrochemical nitration of quercetin derivatives. Herein, we describe a green electrochemical synthesis of a nitro compound. In this work, the electrochemical oxidation of quercetin has been studied in the presence of nitrite ion as a nucleophile in acetate buffer solution (c = 0.2 M, pH = 6.0) by means of cyclic voltammetry and controlled-potential coulometry. The results indicate the participation of the produced o-benzoquinones in a Michael reaction with nitrite ion (in the divided cell) to form the corresponding nitro diol (EC mechanism). The product was purified and characterized using ¹H NMR, ¹³C NMR, and FTIR spectroscopic techniques. The presented strategies use a water/ethanol mixture as the solvent. Ethanol was also used as a cosolvent in previous studies because of its low cost, safety, easy availability, recyclability, bioproducibility, and biodegradability. These strategies represent a one-pot, facile process for the synthesis of the nitro compound in high yield and purity under green conditions.

Keywords: electrochemical synthesis, green chemistry, cyclic voltammetry, molecular docking

Procedia PDF Downloads 141
1549 Spectral Efficiency Improvement in 5G Systems by Polyphase Decomposition

Authors: Wilson Enríquez, Daniel Cardenas

Abstract:

This article proposes a filter bank format combined with the mathematical tool called polyphase decomposition and the discrete Fourier transform (DFT), with the purpose of improving the performance of fifth-generation (5G) communication systems. We began with a review of the literature and a study of filter bank theory and its combination with the DFT, since this combination reduces the computational complexity of such communication systems. With the proposed technique, several experiments were carried out to evaluate the structures in 5G systems. Finally, the results are presented graphically in terms of bit error rate versus the ratio of bit energy to noise power spectral density (BER vs. Eb/No).
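The savings the abstract appeals to rest on a standard identity: filtering a signal and then decimating by M is equivalent to running M short polyphase subfilters at the reduced rate. The sketch below illustrates only that identity; the prototype filter, decimation factor, and test signal are illustrative assumptions, not the authors' 5G parameters.

```python
def conv(a, b):
    """Plain direct-form convolution of two sequences."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

h = [1, 2, 3, 4, 5, 6, 7, 8]            # prototype FIR filter (assumed)
M = 4                                    # decimation factor (assumed)
x = [float(n % 7) for n in range(64)]    # arbitrary test signal

# Direct form: filter at the high rate, then keep every M-th sample.
direct = conv(x, h)[::M]

# Polyphase form: M short subfilters e_k[p] = h[pM + k] operate at the
# low rate on the delayed/downsampled branches x_k[q] = x[qM - k].
Q = len(direct)
poly = [0.0] * Q
for k in range(M):
    e_k = h[k::M]
    x_k = [x[q * M - k] if 0 <= q * M - k < len(x) else 0.0 for q in range(Q)]
    y_k = conv(x_k, e_k)
    for n in range(Q):
        poly[n] += y_k[n]

assert all(abs(a - b) < 1e-9 for a, b in zip(direct, poly))
print("first direct-form samples:", direct[:3])
```

Each subfilter has 1/M of the taps and runs at 1/M of the sample rate, which is the source of the complexity reduction exploited in DFT-based filter-bank multicarrier schemes.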

Keywords: multi-carrier system (5G), filter bank, polyphase decomposition, FIR equalizer

Procedia PDF Downloads 197
1548 Programming with Grammars

Authors: Peter M. Maurer

Abstract:

DGL is a context free grammar-based tool for generating random data. Many types of simulator input data require some computation to be placed in the proper format. For example, it might be necessary to generate ordered triples in which the third element is the sum of the first two elements, or it might be necessary to generate random numbers in some sorted order. Although DGL is universal in computational power, generating these types of data is extremely difficult. To overcome this problem, we have enhanced DGL to include features that permit direct computation within the structure of a context free grammar. The features have been implemented as special types of productions, preserving the context free flavor of DGL specifications.
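Since no DGL specification is shown, a rough Python analogue may help make the idea concrete. The grammar, symbols, and the "computation production" below are invented for illustration and do not reflect DGL's actual syntax or semantics.

```python
import random

# Rough Python analogue of a grammar-based random data generator with a
# "computation production" of the kind the enhancement describes.
GRAMMAR = {
    "digit": [str(d) for d in range(10)],
    "number": ["digit", "digit digit"],
}

def expand(symbol):
    """Expand a nonterminal by picking a random alternative; terminals pass through."""
    if symbol not in GRAMMAR:
        return symbol
    return "".join(expand(tok) for tok in random.choice(GRAMMAR[symbol]).split())

def ordered_triple():
    """A 'computed' production: the third element is the sum of the first two."""
    a, b = int(expand("number")), int(expand("number"))
    return (a, b, a + b)

print(ordered_triple())
```

The point of the enhancement is that a constraint such as "third = first + second", awkward to encode with pure context-free rewriting, becomes a single computed production while the rest of the specification stays an ordinary grammar.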

Keywords: DGL, enhanced context free grammars, programming constructs, random data generation

Procedia PDF Downloads 143
1547 Co-Creational Model for Blended Learning in a Flipped Classroom Environment Focusing on the Combination of Coding and Drone-Building

Authors: A. Schuchter, M. Promegger

Abstract:

The outbreak of the COVID-19 pandemic has shown us that online education is so much more than just a cool feature for teachers – it is an essential part of modern teaching. In online math teaching, it is common to use tools to share screens and to compute and calculate mathematical examples while the students watch the process. On the other hand, flipped classroom models are on the rise, with their focus on how students can gather knowledge by watching videos and on the teacher’s use of technological tools for information transfer. This paper proposes a co-creational teaching approach for coding and engineering subjects with the help of drone-building to spark interest in technology and create a platform for knowledge transfer. The project combines aspects from mathematics (matrices, vectors, shaders, trigonometry), physics (force, pressure, and rotation) and coding (computational thinking, block-based programming, JavaScript and Python) and makes use of collaborative shared 3D modeling with clara.io, where students build mathematical know-how. The instructor follows a problem-based learning approach and encourages the students to find solutions in their own time and in their own way, which will help them develop new skills intuitively and boost logically structured thinking. The collaborative aspect of working in groups will help the students develop communication skills as well as structural and computational thinking. Students are not just listeners, as in traditional classroom settings, but play an active part in creating content together by compiling a Handbook of Knowledge (called an “open book”) with examples and solutions. Before students start calculating, they have to write down all their ideas and working steps in full sentences so other students can easily follow their train of thought.
Therefore, students will learn to formulate goals, solve problems, and create a ready-to-use product with the help of “reverse engineering”, cross-referencing, and creative thinking. The work on drones gives the students the opportunity to create a real-life application with a practical purpose while going through all stages of product development.

Keywords: flipped classroom, co-creational education, coding, making, drones, co-education, ARCS-model, problem-based learning

Procedia PDF Downloads 117
1546 A Two Phase VNS Algorithm for the Combined Production Routing Problem

Authors: Nejah Ben Mabrouk, Bassem Jarboui, Habib Chabchoub

Abstract:

Production and distribution planning is the most important part of supply chain management. In this paper, an NP-hard production-distribution problem for one product over a multi-period horizon is investigated. The aim is to minimize the sum of the costs of three items: production setups, inventories, and distribution, while determining, for each period, the amount produced, the inventory levels, and the delivery trips. To solve this difficult problem, we propose a two-phase approach based on Variable Neighbourhood Search (VNS). The heuristic is tested on 90 randomly generated instances from the literature, with 20 periods and 50, 100, and 200 customers. Computational results show that our approach outperforms existing solution procedures available in the literature.
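The VNS template behind such heuristics can be sketched generically. The toy version below minimizes a one-dimensional integer function whose modulo term traps plain descent; the neighbourhood structure, shaking rule, and objective are illustrative assumptions, not the authors' two-phase production-routing algorithm.

```python
import random

# Generic Variable Neighbourhood Search skeleton: shake within
# neighbourhood k, run a local search, then move or widen the neighbourhood.
def vns(f, x0, k_max=3, iters=200, seed=0):
    rng = random.Random(seed)

    def shake(x, k):                      # random point within radius k
        return x + rng.randint(-k, k)

    def local_search(x):                  # steepest descent on unit moves
        while True:
            best = min((x - 1, x, x + 1), key=f)
            if best == x:
                return x
            x = best

    x = local_search(x0)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            candidate = local_search(shake(x, k))
            if f(candidate) < f(x):
                x, k = candidate, 1       # improvement: restart at k = 1
            else:
                k += 1                    # no improvement: widen neighbourhood
    return x

# Toy objective with local minima created by the modulo term.
f = lambda x: (x - 7) ** 2 + 5 * (x % 3)
print("best x:", vns(f, 50))
```

The systematic widening of the shake radius is what lets VNS escape local optima that a single fixed neighbourhood would never leave, which is the property combinatorial problems like production routing exploit.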

Keywords: logistics, production, distribution, variable neighbourhood search

Procedia PDF Downloads 333
1545 Blended Wing Body (BWB) Vertical Takeoff and Landing (VTOL) Hybrids: Bridging Urban Gaps Through Computational Design and Optimization, A Comparative Study

Authors: Sai Siddharth S., Prasanna Kumar G. M., Alagarsamy R.

Abstract:

This research introduces an alternative approach to urban road maintenance by utilizing Blended Wing Body (BWB) design and Vertical Takeoff and Landing (VTOL) drones. The integration of this aerospace innovation, combining blended wing efficiency with VTOL maneuverability, aims to optimize fuel consumption and explore versatile applications in solving urban problems. A few problems are discussed along with optimization of the design and comparative study with other drone configurations.

Keywords: design optimization, CFD, CAD, VTOL, blended wing body

Procedia PDF Downloads 89
1544 Effect of Acid-Basic Treatments of Lignocellulosic Material Forest Wastes Wild Carob on Ethyl Violet Dye Adsorption

Authors: Abdallah Bouguettoucha, Derradji Chebli, Tariq Yahyaoui, Hichem Attout

Abstract:

The effect of acid-basic treatment of a lignocellulosic material (forest waste wild carob) on Ethyl Violet adsorption was investigated. It was found that surface chemistry plays an important role in Ethyl Violet (EV) adsorption. HCl treatment produces more active acidic surface groups, such as carboxylic and lactone groups, resulting in an increase in the adsorption of EV dye. The adsorption efficiency was higher for the lignocellulosic material treated with HCl than for that treated with KOH. The maximum biosorption capacities at pH 6 were 170 and 130 mg/g for the HCl- and KOH-treated materials, respectively. It was also found that the time to reach equilibrium was less than 25 min for both treated materials. The adsorption of the basic dye (i.e., Ethyl Violet or Basic Violet 4) was carried out by varying process parameters such as initial concentration, pH, and temperature. The adsorption process can be well described by a pseudo-second-order reaction model, showing that boundary layer resistance was not the rate-limiting step, as confirmed by intraparticle diffusion, since the linear plot of Qt versus t^0.5 did not pass through the origin. In addition, the experimental data were accurately described by the Sips equation when compared with the Langmuir and Freundlich isotherms. The values of ΔG° and ΔH° confirmed that the adsorption of EV on the acid-basic treated forest waste wild carob was spontaneous and endothermic in nature. The positive values of ΔS° suggested an increase in randomness at the treated lignocellulosic material-solution interface during the adsorption process.
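The pseudo-second-order treatment mentioned above is usually applied through its linearization, t/Qt = 1/(k·Qe²) + t/Qe, so that the slope of t/Qt against t recovers the equilibrium capacity Qe. The sketch below demonstrates this on synthetic numbers; the rate constant and capacity are assumed for illustration and are not fitted to the paper's data.

```python
# Pseudo-second-order kinetics: Qt = Qe^2*k*t / (1 + Qe*k*t),
# linearized as t/Qt = 1/(k*Qe^2) + t/Qe.
k, Qe = 0.01, 170.0                      # assumed rate constant, capacity (mg/g)
ts = [1, 2, 5, 10, 15, 20, 25]
Qt = [Qe * Qe * k * t / (1 + Qe * k * t) for t in ts]

# Least-squares line through the points (t, t/Qt); its slope is 1/Qe.
xs, ys = ts, [t / q for t, q in zip(ts, Qt)]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print("fitted Qe (mg/g):", 1 / slope)
```

Because the synthetic points lie exactly on the PSO curve, the fit returns the assumed Qe; with real kinetic data, the quality of this linear plot is what justifies (or rejects) the PSO model.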

Keywords: adsorption, isotherm models, thermodynamic parameters, wild carob

Procedia PDF Downloads 273
1543 A Genetic Algorithm to Schedule the Flow Shop Problem under Preventive Maintenance Activities

Authors: J. Kaabi, Y. Harrath

Abstract:

This paper studies the flow shop scheduling problem under machine availability constraints. The machines are subject to flexible preventive maintenance activities. The non-resumable scenario for the jobs is considered; that is, when a job is interrupted by an unavailability period of a machine, it must be restarted from the beginning. The objective is to minimize the total tardiness of the jobs and the earliness/tardiness of the maintenance activities. To solve the problem, a genetic algorithm was developed and successfully tested and validated on many problem instances. The computational results show that the new genetic algorithm outperforms an earlier proposed algorithm.
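The genetic-algorithm machinery for permutation flow shop tardiness can be sketched compactly. The version below omits the maintenance windows and priority rules that are central to the paper; the instance data, operators, and parameters are invented for illustration.

```python
import random

# GA sketch for permutation flow shop total tardiness (no maintenance).
proc = [[3, 4], [2, 2], [5, 3], [1, 4]]   # proc[job][machine] (assumed)
due = [8, 5, 14, 6]                       # due dates (assumed)

def total_tardiness(seq):
    avail = [0] * len(proc[0])            # machine availability times
    tard = 0
    for j in seq:
        c = 0                             # completion on previous machine
        for m in range(len(avail)):
            c = max(avail[m], c) + proc[j][m]
            avail[m] = c
        tard += max(0, c - due[j])
    return tard

def order_crossover(a, b, rng):
    """OX: copy a slice of parent a, fill the rest in parent b's order."""
    i, k = sorted(rng.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:k] = a[i:k]
    fill = [g for g in b if g not in child]
    return [fill.pop(0) if g is None else g for g in child]

def ga(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    n = len(proc)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=total_tardiness)
        nxt = pop[:2]                     # elitism: keep the two best
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[:10], 2)
            child = order_crossover(a, b, rng)
            if rng.random() < 0.2:        # swap mutation
                i, k = rng.sample(range(n), 2)
                child[i], child[k] = child[k], child[i]
            nxt.append(child)
        pop = nxt
    best = min(pop, key=total_tardiness)
    return best, total_tardiness(best)

best, t = ga()
print("best sequence:", best, "total tardiness:", t)
```

Extending this to the paper's setting would mean inserting unavailability periods into `total_tardiness` and restarting any job that straddles one, per the non-resumable assumption.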

Keywords: flow shop scheduling, genetic algorithm, maintenance, priority rules

Procedia PDF Downloads 469
1542 Selective Oxidation of 6Mn-2Si Advanced High Strength Steels during Intercritical Annealing Treatment

Authors: Maedeh Pourmajidian, Joseph R. McDermid

Abstract:

Advanced high strength steels (AHSSs) are revolutionizing both the steel and automotive industries due to their high specific strength and ability to absorb energy during crash events. This allows manufacturers to design vehicles with significantly increased fuel efficiency without compromising passenger safety. To maintain the structural integrity of the fabricated parts, they must be protected from corrosion damage through the continuous hot-dip galvanizing process, which is challenging due to the selective oxidation of Mn and Si on the surface of these AHSSs. The effects of process atmosphere oxygen partial pressure and small additions of Sn on the selective oxidation of a medium-Mn C-6Mn-2Si advanced high strength steel were investigated. Intercritical annealing heat treatments were carried out at 690˚C in an N2-5%H2 process atmosphere under dew points ranging from –50˚C to +5˚C. Surface oxide chemistries, morphologies, and thicknesses were determined at a variety of length scales by several techniques, including SEM, TEM+EELS, and XPS. TEM observations of the sample cross-sections revealed the transition to internal oxidation at the +5˚C dew point. EELS results suggested that the internal oxide network was composed of a multi-layer oxide structure with chemistry varying from the oxide core towards the outer part. The combined effect of employing a known surface-active element as a function of process atmosphere on surface structure development, and the possible impact on reactive wetting of the steel substrates by the continuous galvanizing zinc bath, will be discussed.

Keywords: 3G AHSS, hot-dip galvanizing, oxygen partial pressure, selective oxidation

Procedia PDF Downloads 396
1541 Analysis of Waterjet Propulsion System for an Amphibious Vehicle

Authors: Nafsi K. Ashraf, C. V. Vipin, V. Anantha Subramanian

Abstract:

This paper reports the design of a waterjet propulsion system for an amphibious vehicle based on the circulation distribution over the camber line for the sections of the impeller and stator. In contrast with conventional waterjet designs, the inlet duct is straight, so that water enters parallel and in line with the nozzle exit. The extended nozzle after the stator bowl makes the flow more axial, further improving thrust delivery. A waterjet works on the principle of volume flow rate through the system and, unlike the propeller, it is an internal flow system. The major difference between the propeller and the waterjet occurs in the flow passing the actuator. Though a ducted propeller could constitute the equivalent of waterjet propulsion, in a realistic situation the nozzle area of the waterjet would be proportionately larger relative to the inlet area and propeller disc area. Moreover, the flow rate through the impeller disk is controlled by the nozzle area. For these reasons the waterjet design is based on pump systems rather than propellers, and it is therefore important to bring out the characteristics of the flow from this point of view. The analysis is carried out using computational fluid dynamics. The design of the waterjet propulsion is carried out by adapting axial flow pump design, and performance analysis was done with a three-dimensional computational fluid dynamics (CFD) code. With the varying environmental conditions, the necessity of high discharge and low head, and the space confinement of the given amphibious vehicle, an axial pump design is suitable. The major problem with the inlet velocity distribution is the large variation of velocity in the circumferential direction, which gives rise to heavy blade loading that varies with time. The cavitation criteria have also been taken into account as per hydrodynamic pump design practice. Generally, a waterjet propulsion system can be divided into the inlet, the pump, the nozzle, and the steering device.
The pump further comprises an impeller and a stator. Analytical and numerical approaches, such as a RANSE solver, have been undertaken to understand the performance of the designed waterjet propulsion system. Unlike in the case of propellers, the analysis was based on the head-flow curve together with efficiency and power curves. The modeling of the impeller is performed using a rigid body motion approach. The realizable k-ϵ model has been used for turbulence modeling. The appropriate boundary conditions are applied to the domain, and domain size and grid dependence studies are carried out.

Keywords: amphibious vehicle, CFD, impeller design, waterjet propulsion

Procedia PDF Downloads 221
1540 On-Road Text Detection Platform for Driver Assistance Systems

Authors: Guezouli Larbi, Belkacem Soundes

Abstract:

The automation of the text detection process can help drivers in their driving task. It can be very useful in giving drivers more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, pseudo-Zernike moments are used to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) that serves as the classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.

Keywords: text detection, CNN, PZM, deep learning

Procedia PDF Downloads 80
1539 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
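One building block of such a pipeline, extracting a planar element (e.g., a wall) from a raw cloud, is commonly done with RANSAC; the paper does not state its exact algorithm, so the sketch below, including the threshold, iteration count, and synthetic cloud, is purely illustrative.

```python
import random

def fit_plane(p, q, r):
    """Unit normal and offset of the plane through three points, or None if degenerate."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    mag = sum(c * c for c in n) ** 0.5
    if mag == 0:
        return None
    n = [c / mag for c in n]
    return n, -sum(n[i] * p[i] for i in range(3))

def ransac_plane(points, thresh=0.02, iters=200, seed=0):
    """Keep the plane (from random point triples) with the most inliers."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        model = fit_plane(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < thresh]
        if len(inliers) > len(best):
            best = inliers
    return best

# Synthetic cloud: a noisy wall near x = 1 plus scattered outliers.
rng = random.Random(1)
wall = [(1 + rng.gauss(0, 0.005), rng.random(), rng.random()) for _ in range(200)]
noise = [(rng.random() * 5, rng.random(), rng.random()) for _ in range(50)]
inliers = ransac_plane(wall + noise)
print("wall inliers found:", len(inliers))
```

In a full scan-to-BIM workflow, each extracted plane would then be classified (wall, floor, ceiling) and converted into a parametric BIM element carrying the semantic information the abstract describes.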

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 58
1538 Use of Chlorophyll Meters to Assess In-Season Wheat Nitrogen Fertilizer Requirements in the Southern San Joaquin Valley

Authors: Brian Marsh

Abstract:

Nitrogen fertilizer is the most used and often the most mismanaged nutrient input. Nitrogen management has tremendous implications for crop productivity, quality, and environmental stewardship. Sufficient nitrogen is needed for optimum yield and quality. Soil and in-season plant tissue testing for nitrogen status is a time-consuming and expensive process. Real-time sensing of plant nitrogen status can be a useful tool in managing nitrogen inputs. The objectives of this project were to assess the reliability of remotely sensed, non-destructive plant nitrogen measurements compared to wet chemistry data from sampled plant tissue, to develop in-season nitrogen recommendations based on remotely sensed data for improved nitrogen use efficiency, and to assess the potential for determining yield and quality from remotely sensed data. Very good correlations were observed between early-season remotely sensed crop nitrogen status and plant nitrogen concentrations and subsequent in-season fertilizer recommendations. The transmittance/absorbance type meters gave the most accurate readings. The early in-season fertilizer recommendation would be to apply 40 kg nitrogen per hectare plus 16 kg nitrogen per hectare for each unit difference measured with the SPAD meter between the crop and a reference area, or 25 kg plus 13 kg per hectare for each unit difference measured with the CCM 200. Once the crop was sufficiently fertilized, meter readings became inconclusive and were of no benefit for determining nitrogen status, silage yield and quality, or grain yield and protein.
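The two dose rules quoted above translate directly into small functions (a minimal sketch; units are kg N per hectare, and the `delta` argument is the meter-unit difference between the crop and the well-fertilized reference area):

```python
# In-season N recommendation rules as reported in the abstract.
def n_rate_spad(delta):
    """40 kg N/ha base plus 16 kg N/ha per SPAD unit below the reference."""
    return 40 + 16 * delta

def n_rate_ccm200(delta):
    """25 kg N/ha base plus 13 kg N/ha per CCM 200 unit below the reference."""
    return 25 + 13 * delta

# A crop reading 2.5 meter units below the reference area:
print(n_rate_spad(2.5), n_rate_ccm200(2.5))
```

Per the abstract, these rules apply only early in the season; once the crop is sufficiently fertilized, the meter readings no longer discriminate nitrogen status.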

Keywords: wheat, nitrogen fertilization, chlorophyll meter

Procedia PDF Downloads 390
1537 Identification and Characterization of Inhibitors of Epoxide Hydrolase from Trichoderma reesei

Authors: Gabriel S. De Oliveira, Patricia P. Adriani, Christophe Moriseau, Bruce D. Hammock, Felipe S. Chambergo

Abstract:

Epoxide hydrolases (EHs) are enzymes that are present in all living organisms and catalyze the hydrolysis of epoxides to the corresponding vicinal diols. EHs are of high biotechnological interest for drug design and for chemical transformations in industry. In this study, we describe the identification of substrates and inhibitors of the epoxide hydrolase enzyme from the filamentous fungus Trichoderma reesei (TrEH); these inhibitors also showed fungal growth inhibitory activity. We cloned the enzyme and expressed it in E. coli in order to screen a library of fluorescent substrates, with the objective of finding the best substrate to be used in the identification of good inhibitors of TrEH. The substrate (3-phenyloxiranyl)-acetic acid cyano-(6-methoxy-naphthalen-2-yl)-methyl ester showed the highest specific activity and was chosen for the next steps of the study. The inhibitor screening was performed on a library of more than three thousand molecules, and we identified the six best inhibitors. The IC50 values of these molecules were determined in nM, and all of the best inhibitors have urea or amide groups in their structure, because it has been recognized that these groups fit well in the catalytic pocket of epoxide hydrolases. The growth of T. reesei in PDA medium containing these TrEH inhibitors was then tested, and fungal growth inhibition was demonstrated, with more than 60% inhibition of fungal growth in the assay with the TrEH inhibitor with the lowest IC50. Understanding how this EH enzyme from T. reesei responds to inhibitors may contribute to the study of fungal metabolism and to drug design against pathogenic fungi.

Keywords: epoxide hydrolases, fungal growth inhibition, inhibitor, Trichoderma reesei

Procedia PDF Downloads 197
1536 Solving Stochastic Eigenvalue Problem of Wick Type

Authors: Hassan Manouzi, Taous-Meriem Laleg-Kirati

Abstract:

In this paper, we study mathematically the eigenvalue problem for a stochastic elliptic partial differential equation of Wick type. Using the Wick product and the Wiener-Ito chaos expansion, the stochastic eigenvalue problem is reformulated, via the Fredholm alternative, as a system consisting of an eigenvalue problem for a deterministic partial differential equation together with a set of elliptic partial differential equations. To reduce the computational complexity of this system, we use a decomposition-coordination method. Once this approximation is performed, the statistics of the numerical solution can be easily evaluated.

Keywords: eigenvalue problem, Wick product, SPDEs, finite element, Wiener-Ito chaos expansion

Procedia PDF Downloads 355
1535 Optimization of Reliability Test Plans: Increase Wafer Fabrication Equipment Uptime

Authors: Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta, Ahmed Zeouita

Abstract:

Semiconductor processing chambers tend to operate in controlled but aggressive conditions (chemistry, plasma, high temperature, etc.). Owing to this, the design of such equipment requires developing robust and reliable hardware and software. Any equipment downtime due to reliability issues can have cost implications both for customers, in terms of tool downtime (reduced throughput), and for equipment manufacturers, in terms of high warranty costs and a customer trust deficit. A thorough reliability assessment of critical parts and a plan for preventive maintenance/replacement schedules need to be completed before tool shipment. This helps to save significant warranty costs and tool downtime in the field. However, designing a proper reliability test plan that accurately demonstrates reliability targets with the proper sample size and test duration is quite challenging. This is mainly because components can fail in different failure modes that follow distributions with different Weibull beta values. Without an a priori Weibull beta for the failure mode under consideration, the test always leads to over- or under-utilization of resources, which eventually ends in false-positive or false-negative estimates. This paper proposes a methodology to design a reliability test plan with optimal sample size, test duration, or both, independent of an a priori Weibull beta. The methodology can be used in demonstration tests and can be extended to accelerated life tests to further decrease sample size or test duration.
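The sensitivity to the assumed Weibull beta can be illustrated with the standard zero-failure ("success run") demonstration test, which is not necessarily the authors' method: to demonstrate reliability R at mission time t with confidence C by testing n units each for duration T with no failures, one needs n ≥ ln(1 − C) / ((T/t)^β · ln R). The targets and test-length ratio below are illustrative.

```python
import math

def required_units(R, C, beta, T_over_t):
    """Zero-failure demonstration: units needed to show reliability R at
    mission time t with confidence C, testing each unit for T = T_over_t * t."""
    return math.log(1 - C) / (T_over_t ** beta * math.log(R))

# Demonstrate 90% reliability with 90% confidence, testing for 2x mission time:
for beta in (1.0, 2.0, 3.0):
    n = required_units(R=0.90, C=0.90, beta=beta, T_over_t=2.0)
    print(f"beta={beta}: test {math.ceil(n)} units with zero failures")
```

The required sample size changes severalfold as beta moves from 1 to 3, which is exactly why a test plan designed around the wrong beta over- or under-utilizes resources, as the abstract argues.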

Keywords: reliability, stochastics, preventive maintenance

Procedia PDF Downloads 3
1534 A Study of Cloud Computing Solution for Transportation Big Data Processing

Authors: Ilgin Gökaşar, Saman Ghaffarian

Abstract:

The need for rapid processing of transportation big data, such as ridership data (e.g., smart card data) and traffic operations data (e.g., traffic detector data), which requires substantial computational power, is incontrovertible in Intelligent Transportation Systems. Nowadays, cloud computing is an important subject and a popular information technology solution for data processing. It enables users to process enormous amounts of data without having their own computing infrastructure. Thus, it can be a good choice for transportation big data processing as well. This paper examines how cloud computing can enhance transportation big data processing by contrasting its advantages and disadvantages and discussing cloud computing features.

Keywords: big data, cloud computing, Intelligent Transportation Systems, ITS, traffic data processing

Procedia PDF Downloads 461
1533 Characterization of the Dispersion Phenomenon in an Optical Biosensor

Authors: An-Shik Yang, Chin-Ting Kuo, Yung-Chun Yang, Wen-Hsin Hsieh, Chiang-Ho Cheng

Abstract:

Optical biosensors have become a powerful detection and analysis tool for wide-ranging applications in biomedical research, pharmaceuticals, and environmental monitoring. This study carried out computational fluid dynamics (CFD)-based simulations to explore the dispersion phenomenon in the microchannel of an optical biosensor. The predicted time sequences of concentration contours were utilized to better understand the dispersion development in different geometric shapes of microchannels. The simulation results show the surface concentrations at the sensing probe (where a grating coupler performs best) with respect to time, to appraise the dispersion effect and thereby identify the design configurations resulting in minimum dispersion.

Keywords: CFD simulations, dispersion, microfluidic, optical waveguide sensors

Procedia PDF Downloads 542
1532 Element-Independent Implementation for Method of Lagrange Multipliers

Authors: Gil-Eon Jeong, Sung-Kie Youn, K. C. Park

Abstract:

Treatment of non-matching interfaces is an important computational issue. To handle this problem, the method of Lagrange multipliers, in both its classical and localized versions, is the most popular technique. It imposes the interface compatibility conditions by introducing Lagrange multipliers. However, the numerical system becomes unstable and inefficient due to the Lagrange multipliers. An interface element-independent formulation that does not include the Lagrange multipliers can be obtained by mathematically modifying the independent variables. Through this modification, a more efficient and stable system can be achieved while retaining accuracy equivalent to the conventional method. A numerical example is presented to verify the validity of the proposed method.
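The classical multiplier formulation the abstract starts from can be shown on a toy problem: two 2-DOF "subdomains" coupled by the interface condition u2 = u3, enforced through the saddle-point system [[K, Bᵀ], [B, 0]]. The stiffness values and load below are invented for illustration; the zero diagonal block introduced by the multiplier is the structural source of the conditioning issues the paper addresses.

```python
def solve(M, b):
    """Gauss-Jordan elimination with partial pivoting (small dense systems)."""
    n = len(b)
    T = [row[:] + [bi] for row, bi in zip(M, b)]   # augmented matrix
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(T[r][c]))
        T[c], T[piv] = T[piv], T[c]
        for r in range(n):
            if r != c:
                factor = T[r][c] / T[c][c]
                T[r] = [x - factor * y for x, y in zip(T[r], T[c])]
    return [T[r][n] / T[r][r] for r in range(n)]

# Two decoupled stiffness blocks, DOFs (u1, u2) and (u3, u4):
K = [[2.0, -1.0, 0.0, 0.0],
     [-1.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, -1.0],
     [0.0, 0.0, -1.0, 2.0]]
f = [0.0, 1.0, 0.0, 0.0]
B = [0.0, 1.0, -1.0, 0.0]            # constraint row: B.u = u2 - u3 = 0

# Saddle-point system [[K, B^T], [B, 0]] [u; lam] = [f; 0]
A = [row + [B[i]] for i, row in enumerate(K)] + [B + [0.0]]
sol = solve(A, f + [0.0])
u, lam = sol[:4], sol[4]
assert abs(u[1] - u[2]) < 1e-10      # interface compatibility holds
print("u =", u, "interface force lam =", lam)
```

The multiplier comes out as the interface force transmitted between the subdomains; eliminating the constraint instead (the element-independent route) yields a smaller, positive-definite system without the indefinite block.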

Keywords: element-independent formulation, interface coupling, methods of Lagrange multipliers, non-matching interface

Procedia PDF Downloads 402
1531 Assessing Future Isoprene Emissions in Southeast Asia: Climate Change Implications

Authors: Justin Sentian, Franky Herman, Maggie Chel Gee Ooi, Vivian Kong WAN Yee, Teo You Rou, Chin Jia Hui

Abstract:

Isoprene emission is known to depend heavily on temperature and radiation. Considering these environmental factors together is crucial for a comprehensive understanding of the impact of climate change on isoprene emissions and atmospheric chemistry. The aim of this study is therefore to investigate how isoprene emission responds to changing climate scenarios in Southeast Asia (SEA). Two climate change scenarios, RCP4.5 and RCP8.5, were used to simulate climate change with the Weather Research and Forecasting (WRF v3.9.1) model over three time periods: near future (2030-2039), mid-century (2050-2059), and far future (2090-2099), with 2010 (2005-2014) as the baseline period. The WRF output was then used to investigate how isoprene emission changes under a changing climate using the Model of Emissions of Gases and Aerosols from Nature (MEGAN v2.1). The results show that the overall isoprene emissions during the baseline period are 1.41 tons hr⁻¹ during DJF and 1.64 tons hr⁻¹ during JJA. The overall emissions for both RCPs increase slightly during DJF, ranging from 0.03 to 0.06 tons hr⁻¹ in the near future, 0.11 to 0.19 tons hr⁻¹ in the mid-century, and 0.24 to 0.52 tons hr⁻¹ in the far future. During the JJA season, environmental conditions often favour higher emission rates in MEGAN because they are close to optimal. The future emission rate of isoprene is strongly modulated by both temperature and photosynthetically active radiation (PAR), as indicated by a strong positive correlation (0.81-1.00) with each. This relationship underscores the fact that future warming will not be the sole driver of isoprene emissions; it is essential to consider the multifaceted effects of climate change in shaping future isoprene levels.

Keywords: isoprene, climate change, Southeast Asia, WRF, MEGAN

Procedia PDF Downloads 18
1530 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming

Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero

Abstract:

Tumor growth, from a single transformed cancer cell up to a clinically apparent mass, spans a wide range of spatial and temporal scales. Through computer simulations, cellular automata (CA) can accurately describe the complexity of tumor development. With appropriate CA-based software tools, tumor development prognosis could be made without subjecting patients to burdensome medical examinations or painful invasive procedures. In silico testing refers mainly to computational biology research applied to clinical practice in medicine. Sound computer-based models of cellular behavior reduce costs and save precious time compared with carrying out experiments in vitro in the lab or in vivo with living cells and organisms; they aim to produce scientifically relevant results comparable to traditional in vitro testing, which is slow, expensive, and generally lacks acceptable reproducibility under identical conditions. For speeding up computer simulations of cellular models, the literature contains recent CA-based proposals that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic and stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is only beginning to be developed by the research community. Stochastic cellular automata (SCA), whose parallel implementations can yield high computational performance, are of great interest to explore up to their computational limits. Some optimization-based approaches have advanced multiparadigm models of tumor growth, mainly by guaranteeing efficient memory accesses or by accounting for the dynamic evolution of the memory structures (grids, trees, …) that hold crucial simulation data.
In our opinion, the optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a multiplatform framework for new programming techniques that speed up simulation times has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speedup delivered by specific parallel constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is evaluated using Java and C++ implementations on two different platforms: an Intel Core iX chipset and an HPC cluster of processors at our university. The parallelization proposed here of the Poleszczuk and Enderling model (commonly used by researchers in mathematical oncology) is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, growth inhibition induced by chemotaxis, and the effect of therapies based on cytotoxic/cytostatic drugs.
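The executor-based decomposition described above can be sketched in a few lines: the grid is split into row stripes, each updated by a pool worker, with a barrier between generations. This is an illustrative Python analogue of the Java thread-pool pattern, not the authors' implementation; the division probability and growth rule are placeholders, not the Poleszczuk-Enderling parameters (and CPython's GIL limits the real speedup of pure-Python loops, though the structure carries over).

```python
import random
from concurrent.futures import ThreadPoolExecutor

EMPTY, CELL = 0, 1
P_DIVIDE = 0.3   # illustrative division probability, not model-calibrated
N = 32

def step_rows(grid, rng, lo, hi, nxt):
    """Synchronously update rows [lo, hi): a cell survives and may divide
    into a random empty von Neumann neighbour with probability P_DIVIDE."""
    n = len(grid)
    for i in range(lo, hi):
        for j in range(n):
            if grid[i][j] == CELL:
                nxt[i][j] = CELL
                if rng.random() < P_DIVIDE:
                    free = [(i + di, j + dj)
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                            if 0 <= i + di < n and 0 <= j + dj < n
                            and grid[i + di][j + dj] == EMPTY]
                    if free:
                        ni, nj = rng.choice(free)
                        # Concurrent writes are benign: every writer stores CELL.
                        nxt[ni][nj] = CELL

def step(grid, t, workers=4):
    """One synchronous CA generation, domain-decomposed over a thread pool
    (the Python analogue of Java executors)."""
    n = len(grid)
    nxt = [[EMPTY] * n for _ in range(n)]
    chunk = (n + workers - 1) // workers
    with ThreadPoolExecutor(max_workers=workers) as ex:
        futures = [ex.submit(step_rows, grid, random.Random(1000 * t + k),
                             lo, min(lo + chunk, n), nxt)
                   for k, lo in enumerate(range(0, n, chunk))]
        for f in futures:
            f.result()   # barrier: all stripes finish before the next step
    return nxt

grid = [[EMPTY] * N for _ in range(N)]
grid[N // 2][N // 2] = CELL          # seed a single transformed cell
for t in range(10):
    grid = step(grid, t)
total = sum(map(sum, grid))          # tumor mass after ten generations
```

The per-stripe RNGs make the run reproducible regardless of thread scheduling, since all racing writes store the same value; a GPU version would map each stripe (or each cell) to a kernel thread under the same double-buffering scheme.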

Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up

Procedia PDF Downloads 241
1529 Flow Characterization in Complex Terrain for Aviation Safety

Authors: Adil Rasheed, Mandar Tabib

Abstract:

The paper describes the ability of a high-resolution Computational Fluid Dynamics model to predict terrain-induced turbulence and wind shear close to the ground. Various sensitivity studies to choose the optimal simulation setup for modeling the flow characteristics in a complex terrain are presented. The capabilities of the model are demonstrated by applying it to the Sandnessjøen Airport, Stokka in Norway, an airport that is located in a mountainous area. The model is able to forecast turbulence in real time and trigger an alert when atmospheric conditions might result in high wind shear and turbulence.

Keywords: aviation safety, terrain-induced turbulence, atmospheric flow, alert system

Procedia PDF Downloads 412
1528 Alteration of Bone Strength in Osteoporosis of Mouse Femora: Computational Study Based on Micro CT Images

Authors: Changsoo Chon, Sangkuy Han, Donghyun Seo, Jihyung Park, Bokku Kang, Hansung Kim, Keyoungjin Chun, Cheolwoong Ko

Abstract:

The purpose of the study is to develop a finite element model based on 3D bone structural images from micro-CT and to analyze the stress distribution in osteoporotic mouse femora. The finite element analysis shows that early osteoporosis in the mouse model decreased bone density in the trabecular region, whereas bone density in the cortical region increased.

Keywords: micro-CT, finite element analysis, osteoporosis, bone strength

Procedia PDF Downloads 358
1527 Impact of Agriculture on the Groundwater Quality: Case of the Alluvial Plain of Nil River (North-Eastern Algeria)

Authors: S. Benessam, T. H. Debieche, A. Drouiche, F. Zahi, S. Mahdid

Abstract:

The intensive use of chemical fertilizers and pesticides in agriculture often contaminates groundwater with organic pollutants. Irrigation and/or rainwater transport these pollutants towards groundwater or surface water. Among them is nitrogen, observed in agricultural zones mostly in nitrate form. This study was conducted to understand the chemical forms and mobility of nitrogen in groundwater. Bimonthly monitoring of the physicochemical parameters and water chemistry of the alluvial plain of the Nil river (north-eastern Algeria) was carried out from November 2013 to January 2015, together with an in-situ survey of the various chemical products used by farmers. The results show elevated nitrate concentrations in the wells (depth < 20 m) of the plain, reaching 50 mg/L (the potable water standard). In boreholes (depth > 20 m), two behaviours are observed. In the upstream part, where the aquifer is unconfined and the medium oxidizing, nitrate concentrations are low, indicating absorption by the soil as water infiltrates towards the groundwater. In the central and downstream parts, where the groundwater is locally confined and the medium reducing, nitrates are absent and nitrites and ammonium appear, indicating the reduction of nitrates. Projecting the analyses onto Eh-pH diagrams of nitrogen enabled us to determine the intervals over which each nitrogen form varies. This study also highlighted the effects of rainfall, pumping, and the nature of the geological formations on the speciation and mobility of nitrogen in the plain.

Keywords: groundwater, nitrogen, mobility, speciation

Procedia PDF Downloads 245
1526 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

Grids of computing nodes have emerged as a representative means of connecting distributed computers and resources scattered all over the world for computing and distributed storage. Since fault tolerance becomes complex given the fluctuating availability of resources in a decentralized grid environment, checkpointing can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids through a data replication-driven model based on clustering. The performance of the protocol is evaluated with the OMNeT++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.
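The Chandy-Lamport snapshot algorithm named in the keywords underlies such checkpoint protocols: a process records its local state, floods markers on its outgoing channels, and records messages that arrive on a channel until that channel's marker is received. A minimal single-threaded sketch of the idea (scripted message order, illustrative only, not the paper's hierarchical protocol):

```python
from collections import deque

class Proc:
    """Minimal Chandy-Lamport snapshot participant (illustrative sketch)."""
    def __init__(self, name, balance, inc, out):
        self.name, self.balance = name, balance
        self.inc, self.out = inc, out      # dicts: peer name -> FIFO deque
        self.snap = None                   # recorded local state
        self.chan_rec = {}                 # peer -> in-flight messages recorded
        self.open = set()                  # in-channels still being recorded

    def _record(self, skip=None):
        self.snap = self.balance
        self.open = {p for p in self.inc if p != skip}
        self.chan_rec = {p: [] for p in self.open}
        for ch in self.out.values():       # flood markers on all out-channels
            ch.append("MARKER")

    def initiate(self):
        self._record()

    def deliver(self, peer):
        msg = self.inc[peer].popleft()
        if msg == "MARKER":
            if self.snap is None:
                self._record(skip=peer)    # first marker: record state; the
                                           # marker's channel is recorded empty
            else:
                self.open.discard(peer)    # stop recording this channel
        else:
            self.balance += msg
            if self.snap is not None and peer in self.open:
                self.chan_rec[peer].append(msg)   # message was in flight

# Two processes exchanging money over FIFO channels.
pq, qp = deque(), deque()
P = Proc("P", 100, {"Q": qp}, {"Q": pq})
Q = Proc("Q", 50, {"P": pq}, {"P": qp})

P.balance -= 10; pq.append(10)    # P sends 10 to Q
P.initiate()                      # P records 90; marker queued behind the 10
Q.balance -= 20; qp.append(20)    # Q sends 20 to P
Q.deliver("P")                    # Q receives 10 -> 40
Q.deliver("P")                    # Q receives MARKER -> records 40
P.deliver("Q")                    # P receives 20 -> logged as in-flight
P.deliver("Q")                    # P receives MARKER -> snapshot complete

snapshot_total = (P.snap + Q.snap
                  + sum(sum(v) for v in P.chan_rec.values())
                  + sum(sum(v) for v in Q.chan_rec.values()))
```

The recorded states plus recorded channel contents (90 + 40 + the in-flight 20) reconstruct a consistent global state whose total equals the live total, which is what makes such a snapshot usable as a recovery checkpoint.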

Keywords: data grids, fault tolerance, clustering, Chandy-Lamport

Procedia PDF Downloads 335
1525 Solving the Pseudo-Geometric Traveling Salesman Problem with the “Union Husk” Algorithm

Authors: Boris Melnikov, Ye Zhang, Dmitrii Chaikovskii

Abstract:

This study explores the pseudo-geometric version of the extensively researched Traveling Salesman Problem (TSP), proposing a novel generalization of existing algorithms which are traditionally confined to the geometric version. By adapting the "onion husk" method and introducing auxiliary algorithms, this research fills a notable gap in the existing literature. Through computational experiments using randomly generated data, several metrics were analyzed to validate the proposed approach's efficacy. Preliminary results align with expected outcomes, indicating a promising advancement in TSP solutions.
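The geometric "onion husk" idea starts from nested convex hull layers (convex-hull peeling), which are then merged into a tour. A minimal sketch of the layer decomposition (a generic construction, not the authors' generalization to the pseudo-geometric case):

```python
def convex_hull(pts):
    # Andrew's monotone chain; returns hull vertices in counter-clockwise order.
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return list(pts)
    def cross(o, a, b):
        return (a[0]-o[0]) * (b[1]-o[1]) - (a[1]-o[1]) * (b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]   # drop duplicated endpoints

def onion_layers(pts):
    # Repeatedly strip the convex hull to obtain the nested "husks".
    pts, layers = list(set(pts)), []
    while pts:
        hull = convex_hull(pts)
        layers.append(hull)
        hull_set = set(hull)
        pts = [p for p in pts if p not in hull_set]
    return layers

# Two nested squares and a centre point -> three husks.
pts = [(0, 0), (4, 0), (4, 4), (0, 4),
       (1, 1), (3, 1), (3, 3), (1, 3), (2, 2)]
layers = onion_layers(pts)
```

In the geometric TSP heuristic, a tour is then assembled by walking the outer husk and splicing in the inner layers; the pseudo-geometric version must replace the planar hull test with one driven by the perturbed distance matrix.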

Keywords: optimization problems, traveling salesman problem, heuristic algorithms, “onion husk” algorithm, pseudo-geometric version

Procedia PDF Downloads 202
1524 Green Approach to Anticorrosion Coating of Steel Based on Polybenzoxazine/Henna Nanocomposites

Authors: Salwa M. Elmesallamy, Ahmed A. Farag, Magd M. Badr, Dalia S. Fathy, Ahmed Bakry, Mona A. El-Etre

Abstract:

Going green is an international trend, and it has become imperative to protect steel from corrosion with a green coating that spares the environment the potential adverse effects of traditional materials. A series of polybenzoxazine/henna composites (PBZ/henna) with different henna loadings (3, 5, and 7 wt%) were prepared for corrosion protection of carbon steel. The structures of the prepared composites were verified using FTIR analysis. The mechanical properties of the resins, such as adhesion, hardness, binding, and tensile strength, were also measured. The tensile strength was found to increase with henna loading, up to 25% higher than the neat resin. Thermal stability was investigated by thermogravimetric analysis (TGA): loading lawsone (henna) molecules into the PBZ matrix increases the thermal stability of the composite. UV stability was tested with a UV weathering accelerator to examine whether henna can also act as an anti-aging UV stabilizer. The effect of henna content on the corrosion resistance of the composite coatings was tested using potentiostatic polarization and electrochemical impedance spectroscopy. The presence of henna in the coating matrix enhances the protection efficiency of the polybenzoxazine coats, and increasing the henna concentration increases the protection efficiency of the composites. Quantum chemical calculations for the polybenzoxazine/henna composites show that the composite with the highest corrosion inhibition efficiency has the highest EHOMO and the lowest ELUMO, in good agreement with the experimental results.
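The EHOMO/ELUMO screening rests on Koopmans-type relations: high EHOMO eases electron donation to the metal, low ELUMO eases back-donation, and a small gap marks a reactive inhibitor. A small sketch of the standard descriptors, using hypothetical orbital energies rather than values from this study:

```python
def reactivity_descriptors(e_homo, e_lumo):
    # Koopmans-type approximations (orbital energies in eV).
    ionization = -e_homo           # I  ≈ -E_HOMO
    affinity = -e_lumo             # A  ≈ -E_LUMO
    gap = e_lumo - e_homo          # ΔE: smaller gap -> more reactive inhibitor
    hardness = (ionization - affinity) / 2          # η
    softness = 1.0 / hardness if hardness else float("inf")  # σ = 1/η
    electronegativity = (ionization + affinity) / 2  # χ
    return {"I": ionization, "A": affinity, "gap": gap,
            "eta": hardness, "sigma": softness, "chi": electronegativity}

# Hypothetical eV values for illustration only:
d = reactivity_descriptors(e_homo=-5.2, e_lumo=-1.4)
```

Ranking candidate composites by gap and hardness computed this way is how the quantum chemical results are typically correlated with measured inhibition efficiencies.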

Keywords: polybenzoxazine, corrosion, green chemistry, carbon steel

Procedia PDF Downloads 92
1523 Application of Wavelet Based Approximation for the Solution of Partial Integro-Differential Equation Arising from Viscoelasticity

Authors: Somveer Singh, Vineet Kumar Singh

Abstract:

This work contributes a numerical method based on Legendre wavelet approximation for the treatment of partial integro-differential equations (PIDEs). Operational matrices of Legendre wavelets reduce the solution of a PIDE to a system of algebraic equations. Some useful results concerning the computational order of convergence and the error estimates associated with the suggested scheme are presented. Illustrative examples are provided to show the effectiveness and accuracy of the proposed numerical method.
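The operational-matrix idea rests on the fact that integrating a Legendre basis function returns a linear combination of the same basis, so integration acts as a constant matrix on the coefficient vector. A minimal sketch verifying the generating identity ∫₋₁ˣ Pₙ(t) dt = (Pₙ₊₁(x) − Pₙ₋₁(x))/(2n+1) for plain Legendre polynomials (not the wavelet construction itself, which tiles this basis over subintervals):

```python
def legendre(n, x):
    # Bonnet recurrence: (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

def integral_via_identity(n, x):
    # Row n of the operational matrix of integration, n >= 1:
    # coefficients -1/(2n+1) and +1/(2n+1) on P_{n-1} and P_{n+1}.
    return (legendre(n + 1, x) - legendre(n - 1, x)) / (2 * n + 1)

def integral_numeric(n, x, steps=20000):
    # Trapezoid-rule check of the same integral over [-1, x].
    h = (x + 1.0) / steps
    s = 0.5 * (legendre(n, -1.0) + legendre(n, x))
    for i in range(1, steps):
        s += legendre(n, -1.0 + i * h)
    return s * h

x = 0.3
err = max(abs(integral_via_identity(n, x) - integral_numeric(n, x))
          for n in (1, 2, 3))
```

Because each row of the resulting matrix is exact and sparse, applying the integral operator to a truncated expansion costs only a matrix-vector product, which is what lets the PIDE collapse to an algebraic system.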

Keywords: Legendre wavelets, operational matrices, partial integro-differential equation, viscoelasticity

Procedia PDF Downloads 445
1522 A Hebbian Neural Network Model of the Stroop Effect

Authors: Vadim Kulikov

Abstract:

The classical Stroop effect is the phenomenon that it takes longer to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, many variations of the experiment have revealed various mechanisms behind semantic, attentional, behavioral and perceptual processing. The Stroop task is known to exhibit asymmetry: reading the words out loud hardly depends on the ink color, but naming the ink color is significantly influenced by incongruent words. This asymmetry is reversed if, instead of naming the color, one has to point at a corresponding color patch. Other debated aspects are the notion of automaticity and how much of the effect is due to semantic versus response-stage interference. Is automaticity a continuous or an all-or-none phenomenon? Many models and theories in the literature tackle these questions and will be discussed in the presentation; none of them, however, seems to capture all the findings at once. A computational model is proposed, based on the philosophical idea developed by the author that the mind operates as a collection of different information-processing modalities, such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory, where ‘framework’ attempts to generalize the concepts of modality, perspective and ‘point of view’. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production and attention selection modalities. In experiments requiring button pressing or pointing, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero.
The network is trained using Hebbian learning to establish connections (corresponding to ‘coherence’ in framework theory) between these different modalities. The amount of data fed into the network is meant to mimic the amount of practice a human encounters; in particular, it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors into spoken color names. After training, the network performs the Stroop task. The RTs are measured in a canonical way, as these are continuous-time recurrent neural networks (CTRNN). The above-described aspects of the Stroop phenomenon, along with many others, are replicated. The model is similar to some existing connectionist models but, as will be discussed in the presentation, has several advantages: it predicts more data, and its architecture is simpler and biologically more plausible.
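The training mechanism described above can be reduced to plain Hebbian outer-product learning, with the word-to-name pathway given more practice than the color-to-name pathway. The following is an illustrative toy, not the author's CTRNN model; the epoch counts and learning rate are arbitrary stand-ins for "amount of practice":

```python
def hebbian_train(pairs, n_in, n_out, epochs, eta=0.1):
    # w[i][j] grows whenever input unit i and output unit j are co-active.
    w = [[0.0] * n_out for _ in range(n_in)]
    for _ in range(epochs):
        for x, y in pairs:
            for i in range(n_in):
                for j in range(n_out):
                    w[i][j] += eta * x[i] * y[j]
    return w

# One-hot codes: inputs RED/GREEN (as a word or as ink color),
# outputs the spoken responses "red"/"green".
RED, GREEN = [1, 0], [0, 1]
pairs = [(RED, RED), (GREEN, GREEN)]

# Reading is assumed far more practiced than color naming.
w_word = hebbian_train(pairs, 2, 2, epochs=100)
w_color = hebbian_train(pairs, 2, 2, epochs=10)

def drive(w, x):
    # Net input delivered to each response unit by one pathway.
    return [sum(x[i] * w[i][j] for i in range(2)) for j in range(2)]

# Incongruent stimulus: the word GREEN printed in red ink.
word_in = drive(w_word, GREEN)
color_in = drive(w_color, RED)
net = [a + b for a, b in zip(word_in, color_in)]   # competing responses
```

Because the over-practiced word pathway delivers the stronger drive, the "green" response dominates the "red" one for this stimulus, mimicking the interference (and hence longer RTs) seen when naming ink color but not when reading words.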

Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop

Procedia PDF Downloads 263