Search results for: continuum damage model
17329 Further Investigation of α+12C and α+16O Elastic Scattering
Authors: Sh. Hamada
Abstract:
The current work aims to study the rainbow-like structure observed in the elastic scattering of alpha particles on both 12C and 16O nuclei. We reanalyzed the experimental elastic scattering angular distribution data for the α+12C and α+16O nuclear systems at different energies using both the optical model and the double folding potential derived from different interaction models: CDM3Y1, DDM3Y1, CDM3Y6, and BDM3Y1. The potential generated by the BDM3Y1 interaction model has the shallowest depth, which necessitates the use of a higher renormalization factor (Nr). Both the optical model and the double folding potentials of the different interaction models fairly reproduce the experimental data.
Keywords: density distribution, double folding, elastic scattering, nuclear rainbow, optical model
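For context, the double folding potential mentioned above is conventionally built by folding the projectile and target density distributions with an effective nucleon-nucleon interaction and scaling by the renormalization factor Nr; the expression below is the standard textbook form, given for orientation and not quoted from the paper.

```latex
V_{DF}(R) = N_r \int \rho_{\alpha}(\mathbf{r}_1)\,\rho_{T}(\mathbf{r}_2)\,
            v_{NN}\!\left(\lvert \mathbf{R} + \mathbf{r}_2 - \mathbf{r}_1 \rvert\right)
            \, d^3r_1 \, d^3r_2
```

Here ρα and ρT are the alpha-particle and target (12C or 16O) density distributions, and v_NN is the effective interaction (CDM3Y1, DDM3Y1, CDM3Y6, or BDM3Y1).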
Procedia PDF Downloads 237
17328 Evaluation of the Gamma-H2AX Expression as a Biomarker of DNA Damage after X-Ray Radiation in Angiography Patients
Authors: Reza Fardid, Aliyeh Alipour
Abstract:
Introduction: Coronary heart disease (CHD) is among the most common and deadliest diseases. Coronary angiography is an important tool for the diagnosis and treatment of this disease. Because angiography involves exposure to ionizing radiation, it can lead to harmful effects. Ionizing radiation induces double-strand breaks in DNA, which are a potentially life-threatening injury. The purpose of the present study is to investigate the phosphorylation of histone H2AX at the sites of double-strand breaks in peripheral blood lymphocytes as an indicator of the biological effects of radiation on angiography patients. Materials and Methods: The method is based on measuring the level of histone phosphorylation (gamma-H2AX, γH2AX) at serine 139 following the formation of DNA double-strand breaks. Blood samples (5 cc) were taken from 24 angiography patients before and after irradiation. Blood lymphocytes were isolated, fixed, and stained with specific γH2AX antibodies. Finally, the γH2AX signal, as an indicator of double-strand breaks, was measured with the flow cytometry technique. Results and discussion: In all patients, an increase was observed in the number of DNA double-strand breaks after irradiation (20.15 ± 14.18) compared to before exposure (1.52 ± 0.34). The mean number of DNA double-strand breaks also showed a linear correlation with the dose-area product (DAP). However, although the induction of DNA double-strand breaks is associated with the radiation dose received by patients, the effect of individual factors such as radiosensitivity and repair capacity should not be ignored. If, in the future, the DNA damage response can be measured in every angiography patient and used as a biomarker of patient dose, the benefit at the public health level could be considerable. Conclusion: Using automated flow cytometry readings, it is possible to detect γH2AX in large numbers of blood cells. Therefore, the use of this technique could play a significant role in monitoring patients.
Keywords: coronary angiography, DSB of DNA, γH2AX, ionizing radiation
Procedia PDF Downloads 184
17327 Computational Model of Human Cardiopulmonary System
Authors: Julian Thrash, Douglas Folk, Michael Ciracy, Audrey C. Tseng, Kristen M. Stromsodt, Amber Younggren, Christopher Maciolek
Abstract:
The cardiopulmonary system comprises the heart, lungs, and many dynamic feedback mechanisms that control its function based on a multitude of variables. The next generation of cardiopulmonary medical devices will involve adaptive control and smart pacing techniques. However, testing these smart devices on living systems may be unethical and exceedingly expensive. As a solution, a comprehensive computational model of the cardiopulmonary system was implemented in Simulink. The model contains over 240 state variables and over 100 equations previously described in a series of published articles. Simulink was chosen because of its ease of introducing machine learning elements. Initial results indicate that physiologically correct waveforms of pressures and volumes were obtained in the simulation. With the development of a comprehensive computational model, we hope to pioneer the future of predictive medicine by applying our research towards the initial stages of smart devices. After validation, we will introduce and train reinforcement learning agents using the cardiopulmonary model to assist in adaptive control system design. With our cardiopulmonary model, we will accelerate the design and testing of smart and adaptive medical devices to better serve those with cardiovascular disease.
Keywords: adaptive control, cardiopulmonary, computational model, machine learning, predictive medicine
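The abstract describes a 240-state Simulink model; as a much smaller illustration of the kind of lumped-parameter compartment equations such cardiovascular models are built from, the sketch below integrates a two-element Windkessel model of arterial pressure. It is a toy stand-in under assumed parameter values, not the authors' model.

```python
# Minimal two-element Windkessel sketch of arterial pressure dynamics.
# Illustrative toy model only; all parameter values are assumed round numbers.
import numpy as np
from scipy.integrate import odeint

R = 1.0    # peripheral resistance [mmHg*s/mL] (assumed)
C = 1.5    # arterial compliance   [mL/mmHg]   (assumed)
HR = 75    # heart rate [beats/min]            (assumed)

def inflow(t):
    """Pulsatile aortic inflow: half-sine ejection during systole, zero in diastole."""
    period = 60.0 / HR
    t_sys = 0.3 * period                     # systolic fraction (assumed)
    phase = t % period
    return 420.0 * np.sin(np.pi * phase / t_sys) if phase < t_sys else 0.0  # mL/s

def dp_dt(p, t):
    """Windkessel ODE: C dP/dt = Q_in(t) - P/R."""
    return (inflow(t) - p / R) / C

t = np.linspace(0, 10, 5000)                 # simulate 10 s
p = odeint(dp_dt, 80.0, t)                   # start near diastolic pressure
print(f"pressure range: {p.min():.1f}-{p.max():.1f} mmHg")
```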
Procedia PDF Downloads 181
17326 Behaviour of RC Column under Biaxial Cyclic Loading - State of the Art
Authors: L. Pavithra, R. Sharmila, Shivani Sridhar
Abstract:
To protect columns from severe structural damage, members need to be proportioned so that a significant portion of the earthquake energy can be dissipated by yielding in the beams. The presence of axial load along with cyclic loading has a significant influence on column behaviour. The objective of this paper is to present the analytical results of columns subjected to biaxial cyclic loading.
Keywords: RC column, seismic behaviour, cyclic behaviour, biaxial testing, ductile behaviour
Procedia PDF Downloads 366
17325 Functional Instruction Set Simulator (ISS) of a Neural Network (NN) IP with Native BF-16 Generator
Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula
Abstract:
A functional model that mimics the functional correctness of a Neural Network Compute Accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of gcc compilers with the BF-16 datatype, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads, dense or sparse, and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to. The model gives considerable visibility and debug capability during DUT bring-up by exposing the micro-steps of execution.
Keywords: ISA (instruction set architecture), NN (neural network), TLM (transaction-level modeling), GEMM (general matrix multiplication)
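The native BF-16 generator mentioned above works around the lack of compiler support for the Brain Floating Point type. As a hedged illustration of the underlying conversion (the rounding rule is the standard one for bfloat16; the paper's generator is presumably implemented natively inside the SystemC model rather than in Python), the sketch below converts IEEE float32 values to bfloat16 with round-to-nearest-even.

```python
# Sketch of a BF-16 conversion: keep the upper 16 bits of a float32 with
# round-to-nearest-even on the 16 discarded mantissa bits.
import numpy as np

def float32_to_bf16_bits(x: np.ndarray) -> np.ndarray:
    """Return uint16 bfloat16 bit patterns for an array of float32 values."""
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    rounding_bias = np.uint32(0x7FFF) + ((bits >> 16) & 1)   # ties to even
    return ((bits + rounding_bias) >> 16).astype(np.uint16)

def bf16_bits_to_float32(b: np.ndarray) -> np.ndarray:
    """Expand bfloat16 bit patterns back to float32 (lossless)."""
    return (b.astype(np.uint32) << 16).view(np.float32)

vals = np.array([1.0, 3.14159, -0.00123, 65504.0], dtype=np.float32)
bf16 = float32_to_bf16_bits(vals)
print(bf16_bits_to_float32(bf16))   # the same values at bfloat16 precision
```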
Procedia PDF Downloads 86
17324 Prediction of Bubbly Plume Characteristics Using the Self-Similarity Model
Authors: Li Chen, Alex Skvortsov, Chris Norwood
Abstract:
Gas released into water is encountered in many industrial situations. This process results in the formation of bubbles and in acoustic emission, which depends upon the bubble characteristics. If the bubble creation rate (bubble volume flow rate) is of interest, an inverse method has to be used based on the measurement of the acoustic emission. However, there will be sound attenuation through the bubbly plume, which will influence the measurement and should be taken into consideration in the model. The sound transmission through the bubbly plume depends on the characteristics of the plume, such as its shape and the bubble distributions. In this study, the bubbly plume shape is modelled using a self-similarity model, which is normally applied to a single-phase buoyant plume. The prediction is compared with the experimental data. It has been found that the model can be applied to a buoyant plume of gas-liquid mixture. The influence of the gas flow rate and discharge nozzle size is studied.
Keywords: bubbly plume, buoyant plume, bubble acoustics, self-similarity model
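For orientation, the self-similarity model referred to above is, for a single-phase buoyant plume, usually stated as a Gaussian radial profile whose width grows linearly with height and whose centreline velocity follows the classical plume scaling. The expressions below are the standard single-phase forms, given here for context and not quoted from the paper, which extends the approach to a gas-liquid mixture.

```latex
w(r,z) = w_c(z)\, e^{-r^2 / b(z)^2}, \qquad
b(z) = \beta\, z, \qquad
w_c(z) \propto \left(\frac{B_0}{z}\right)^{1/3}
```

where w is the vertical velocity, b the plume half-width, β the spreading (entrainment) coefficient, and B0 the source buoyancy flux.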
Procedia PDF Downloads 287
17323 End-to-End Spanish-English Sequence Learning Translation Model
Authors: Vidhu Mitha Goutham, Ruma Mukherjee
Abstract:
The low availability of well-trained, unlimited, dynamic-access models for specific languages makes it hard for corporate users to adopt quick translation techniques and incorporate them into product solutions. As translation tasks increasingly require a dynamic sequence learning curve, stable, cost-free open-source models are scarce. We survey and compare current translation techniques and propose a modified sequence-to-sequence model repurposed with attention techniques. Sequence learning using an encoder-decoder model is now paving the path for higher precision levels in translation. Using a Convolutional Neural Network (CNN) encoder and a Recurrent Neural Network (RNN) decoder, we use Fairseq tools to produce an end-to-end, bilingually trained Spanish-English machine translation model that includes source language detection. We acquire competitive results using a duo-lingo-corpus-trained model, providing ready-made, plug-in use for compound sentences and document translations. Our model serves as a decent system for large, organizational data translation needs. While acknowledging its shortcomings and future scope, it also stands as a well-optimized deep neural network model and solution.
Keywords: attention, encoder-decoder, Fairseq, Seq2Seq, Spanish, translation
Procedia PDF Downloads 175
17322 Evaluation of the Heating Capability and in vitro Hemolysis of Nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) Ferrites Prepared by Sol-gel Method
Authors: Laura Elena De León Prado, Dora Alicia Cortés Hernández, Javier Sánchez
Abstract:
Among the different cancer treatments that are currently used, hyperthermia has promising potential due to the multiple benefits that are obtained by this technique. In general terms, hyperthermia is a method that takes advantage of the sensitivity of cancer cells to heat in order to damage or destroy them. Among the different ways of supplying heat to cancer cells and achieving their destruction or damage, the use of magnetic nanoparticles has attracted attention due to the capability of these particles to generate heat under the influence of an external magnetic field. In addition, these nanoparticles have a high surface area and sizes similar to or even smaller than biological entities, which allows them to approach and interact with a specific region of interest. The most commonly used magnetic nanoparticles for hyperthermia treatment are those based on iron oxides, mainly magnetite and maghemite, due to their biocompatibility, good magnetic properties, and chemical stability. However, in order to fulfil the requirements of magnetic hyperthermia treatment more efficiently, ferrites that incorporate different metallic ions, such as Mg, Mn, Co, Ca, Ni, Cu, Li, Gd, etc., into their structure have been investigated. This paper reports the synthesis of nanosized MgxMn1-xFe2O4 (x = 0.3 and 0.4) ferrites by the sol-gel method and their evaluation in terms of heating capability and in vitro hemolysis, to determine the potential use of these nanoparticles as thermoseeds for the treatment of cancer by magnetic hyperthermia. It was possible to obtain ferrites with nanometric sizes, a single crystalline phase with an inverse spinel structure, and a behavior close to that of superparamagnetic materials. Additionally, at concentrations of 10 mg of magnetic material per mL of water, it was possible to reach a temperature of approximately 45°C, which is within the range of temperatures used for the treatment of hyperthermia. The results of the in vitro hemolysis assay showed that, at the concentrations tested, these nanoparticles are non-hemolytic, as their percentage of hemolysis is close to zero. Therefore, these materials can be used as thermoseeds for the treatment of cancer by magnetic hyperthermia.
Keywords: ferrites, heating capability, hemolysis, nanoparticles, sol-gel
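As a point of reference, the hemolysis percentage reported in such assays is commonly computed from the sample absorbance against negative and positive controls; the formula below is the widely used definition and may differ in detail from the protocol followed in the paper.

```latex
\%\,\text{Hemolysis} =
\frac{A_{\text{sample}} - A_{\text{negative control}}}
     {A_{\text{positive control}} - A_{\text{negative control}}} \times 100
```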
Procedia PDF Downloads 342
17321 Interoperable Design Coordination Method for Sharing Communication Information Using Building Information Model Collaboration Format
Authors: Jin Gang Lee, Hyun-Soo Lee, Moonseo Park
Abstract:
The utilization of BIM and IFC allows project participants to collaborate across different areas by consistently sharing interoperable product information represented in a model. Comments or markups generated during the coordination process can be categorized as communication information, which tends to be shared in a less standardized manner. Such information can be difficult to manage and reuse compared to the product information in a model. The present study proposes an interoperable coordination method using BCF (the BIM Collaboration Format) for managing and sharing communication information during the BIM-based coordination process. A management function for coordination in the BIM collaboration system is developed to assess its ability to share communication information in BIM collaboration projects. This approach systematically links communication information generated during the coordination process to the building model and serves as a type of storage system for retrieving knowledge created during BIM collaboration projects.
Keywords: design coordination, building information model, BIM collaboration format, industry foundation classes
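As a small illustration of how BCF keeps coordination comments both shareable and machine-readable, the sketch below lists the topics stored in a BCF archive. The file layout (a ZIP with one folder per topic containing a markup.bcf XML file) follows the published BCF 2.x specification; the archive name is hypothetical and this is not code from the study.

```python
# Sketch: listing coordination topics stored in a BCF (BIM Collaboration Format) archive.
import zipfile
import xml.etree.ElementTree as ET

def list_bcf_topics(path: str):
    """Yield (topic_guid, title) pairs from every markup.bcf in the archive."""
    with zipfile.ZipFile(path) as bcf:
        for name in bcf.namelist():
            if not name.endswith("markup.bcf"):
                continue
            root = ET.fromstring(bcf.read(name))
            # Match tag names with or without an XML namespace prefix.
            topic = next(el for el in root.iter() if el.tag.endswith("Topic"))
            title = next((el.text for el in topic.iter() if el.tag.endswith("Title")), "")
            yield topic.attrib.get("Guid", ""), title

if __name__ == "__main__":
    for guid, title in list_bcf_topics("coordination_comments.bcfzip"):  # hypothetical file
        print(guid, "-", title)
```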
Procedia PDF Downloads 434
17320 Economical Dependency Evolution and Complexity
Authors: Allé Dieng, Mamadou Bousso, Latif Dramani
Abstract:
The purpose of this work is to show the complexity behind economic interrelations in a country and to provide a linear dynamic model of the evolution of economic dependency in a country. The model is based on National Transfer Accounts (NTA), one of the most robust methodologies developed to measure the level of demographic dividend captured in a country. It is built upon three major factors: demography, economic dependency, and migration. The established mathematical model has been simulated using NetLogo software. The innovation of this study lies in describing economic dependency as a complex system and in simulating, using mathematical equations, the evolution of two populations: the economically dependent and the non-economically dependent, as defined in the NTA methodology. It also allows us to see the interactions and behaviors of both populations. The model can track individual characteristics and look at the effect of birth and death rates on the evolution of these two populations. The developed model is useful for understanding how demographic and economic phenomena are related.
Keywords: ABM, demographic dividend, National Transfer Accounts (NTA), ODE
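As a hedged sketch of what a linear dynamic model of two interacting populations can look like, the code below integrates a pair of ODEs for the economically dependent and non-economically dependent groups. The structure and all rate values are illustrative assumptions, not the calibrated NTA-based model simulated in NetLogo by the authors.

```python
# Illustrative two-population dynamic model: D = economically dependent,
# N = non-economically dependent. All rates are assumed, not taken from the paper.
import numpy as np
from scipy.integrate import odeint

b  = 0.030   # birth rate (newborns enter the dependent group)       (assumed)
dD = 0.012   # death rate of the dependent population                (assumed)
dN = 0.008   # death rate of the non-dependent population            (assumed)
a  = 0.040   # rate of dependents becoming non-dependent             (assumed)
r  = 0.020   # rate of non-dependents becoming dependent, e.g. retirement (assumed)

def dydt(y, t):
    D, N = y
    dD_dt = b * (D + N) - a * D + r * N - dD * D
    dN_dt = a * D - r * N - dN * N
    return [dD_dt, dN_dt]

t = np.linspace(0, 50, 501)                    # 50 years
D, N = odeint(dydt, [4.0e6, 6.0e6], t).T       # initial populations (assumed)
print(f"dependency ratio after 50 years: {D[-1] / N[-1]:.2f}")
```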
Procedia PDF Downloads 205
17319 Indigenous Hair Treatment in Abyssinia
Authors: Makda Yeshitela Kifele
Abstract:
Hair treatment protects the hair from loss of volume, colour change, and damage to its properties. Hair is part of human beauty; it makes people attractive and earns them appreciation for the effort made to treat the hair and save it from damage. There are different methods to protect human hair from loss and damage, which benefit human psychology. Chemical products are available worldwide that keep the hair safe and give it beauty. But chemical products have side effects and are not cost-effective. Some of the chemicals even cause allergic reactions in users and leave changes in the hair. Indigenous hair treatment is an effective method that reduces the adverse effects and problems that chemicals leave in people's lives. Indigenous hair treatment can treat the hair safely and effectively, without major side effects or spots on the users' hair; rather, it improves attributes of the hair such as shine, quality, quantity, length, and flexibility. Rate is a local plant that plays a significant role in hair treatment; it is available everywhere in the country, and anybody can use it for hair treatment. For this research, 50 women with different hair characteristics were identified as the sample population. The treatments were collected from the fields and squeezed into pots to be prepared as specimens. The squeezed plants were kept in a refrigerator for three days with some salt to prevent bacterial growth. Chemical analysis was carried out to screen for detrimental substances. The results showed that there are no detrimental substances that affect the hair properties or the health of the users. The sample population used the oil for one month without any other oily cosmetics that could disturb the treatment. The outcome was very effective: the treatment brought shine to the hair, prevented greying, promoted fast growth, increased the volume of the hair, and made the hair flexible, thicker, and manageable whether curly or straight, with no allergic effects.
Keywords: indigenous, chemicals, curly, treatment
Procedia PDF Downloads 108
17318 Analysis of the Contribution of Drude and Brendel Model Terms to the Dielectric Function
Authors: Christopher Mkirema Maghanga, Maurice Mghendi Mwamburi
Abstract:
Parametric modeling provides a means to understand the properties of materials more deeply. The Drude, Brendel, Lorentz, and OJL models incorporated in the SCOUT® software are some of the models used to study dielectric films. In our work, we utilized the Brendel and Drude models to extract the optical constants from spectroscopic data of fabricated undoped and niobium-doped titanium oxide thin films. The individual contributions of the two models were studied to establish how they influence the dielectric function. The effect of dopants on their contributions was also analyzed. For the undoped films, results indicate minimal contribution from the Drude term due to the dielectric nature of the films. However, as doping levels increase, the rise in the concentration of free electrons favors the use of the Drude model. The Brendel model was confirmed to work well with dielectric films - the undoped titanium oxide films in our case.
Keywords: modeling, Brendel model, optical constants, titanium oxide, Drude model
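For orientation, the two terms discussed above are commonly written as a free-carrier Drude term and a Brendel term (a Lorentz oscillator with a Gaussian-distributed resonance frequency) added to the background permittivity. These are the standard forms used in SCOUT-type dispersion modeling and are reproduced here for context, not quoted from the paper.

```latex
\varepsilon(\omega) = \varepsilon_{\infty}
  \;-\; \frac{\omega_{p,D}^{2}}{\omega^{2} + i\,\Gamma_{D}\,\omega}
  \;+\; \chi_{B}(\omega),
\qquad
\chi_{B}(\omega) = \frac{1}{\sqrt{2\pi}\,\sigma}
  \int_{-\infty}^{\infty}
  e^{-\frac{(x-\omega_{0})^{2}}{2\sigma^{2}}}\,
  \frac{\omega_{p,B}^{2}}{x^{2} - \omega^{2} - i\,\Gamma_{B}\,\omega}\,dx
```

where the ωp are plasma (oscillator strength) frequencies, the Γ are damping constants, ω0 is the centre resonance frequency, and σ is the Gaussian broadening width.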
Procedia PDF Downloads 183
17317 Deconstructing the Niger-Delta Crises: In Esiaba Irobi's Cemetery Road and Hangmen Also Die
Authors: Chukwukelue Uzodinma Umenyilorah
Abstract:
The history of the crises in the Niger-Delta is readily traceable to the post-colonial oil boom of the early 70s. Prior to this time, it was widely believed that the people of the Niger-Delta, especially those in the present-day Rivers, Delta, and Bayelsa States, enjoyed a peaceful coexistence much like the rest of Nigerians. In the early 70s, however, crude oil was discovered in commercial quantities in these areas, and tranquility has been a far cry ever since. First, a number of multi-national oil explorers moved into the Niger-Delta for business, and then certain conditions resulted in sundry instances of oil spillage, which caused a lot of environmental damage, destroying nearly all of the people's sources of livelihood. The result was a multiple chain reaction ranging from incessant agitations from the natives to institutionalized dialogue between the oil business owners, the natives, and the government, and then to a proposition of compensation packages for the affected communities. The said compensation, which was meant to bring peace, seems to have brought even more crises instead. Corruption and greed crept in, money changed hands, suffering increased, and so did the agitation from the people. The whole turn of events gradually snowballed into the formation of various militant groups who are now fingered as responsible for the sundry cases of violence in the Niger-Delta. The oil boom can, therefore, be said to be the immediate cause of the Niger-Delta crises, but there are other remote causes as well, including poverty, neglect, and illiteracy, to mention but a few. This study is therefore aimed at examining the various reasons behind the seemingly unending crises in the Niger-Delta. It will also take a critical look at the roles played by the various parties in the Niger-Delta crises from the 70s to date, as well as the human and environmental devastation done in the area, with a view to making informed suggestions on how to stop further damage and start fixing what has already been done. Esiaba Irobi's Cemetery Road and Hangmen Also Die seem to vividly capture the realities of the Niger-Delta situation and shall, therefore, be reviewed in this study.
Keywords: corruption, Niger-Delta, oil boom, post-colonial
Procedia PDF Downloads 314
17316 A Multicriteria Mathematical Programming Model for Farm Planning in Greece
Authors: Basil Manos, Parthena Chatzinikolaou, Fedra Kiomourtzi
Abstract:
This paper presents a Multicriteria Mathematical Programming model for farm planning and sustainable optimization of agricultural production. The model can be used as a tool for the analysis and simulation of agricultural production plans, as well as for the study of the impacts of various measures of the Common Agricultural Policy in the member states of the European Union. The model can achieve the optimum production plan of a farm or an agricultural region by combining in one utility function different conflicting criteria, such as the maximization of gross margin and the minimization of fertilizer use, under a set of constraints on land, labor, available capital, the Common Agricultural Policy, etc. The proposed model was applied to the region of Larisa in central Greece. The optimum production plan achieves a greater gross return, lower fertilizer use, and lower irrigation water use than the existing production plan.
Keywords: sustainable optimization, multicriteria analysis, agricultural production, farm planning
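As a toy illustration of the weighted-sum utility idea described above (conflicting criteria combined into one objective under land and labor constraints), the sketch below solves a two-crop linear program with SciPy. Crop names, coefficients, weights, and resource limits are invented for illustration and are not taken from the Larisa application.

```python
# Weighted-sum toy farm plan: maximize gross margin while penalizing fertilizer use,
# subject to land and labor limits. All numbers are hypothetical.
from scipy.optimize import linprog

# Two activities: x0 = hectares of wheat, x1 = hectares of cotton (hypothetical).
gross_margin = [450.0, 900.0]      # euro/ha
fertilizer   = [120.0, 300.0]      # kg/ha
w_margin, w_fert = 1.0, 0.8        # criterion weights (assumed)

# linprog minimizes, so negate the utility U = w_margin*margin - w_fert*fertilizer.
c = [-(w_margin * gross_margin[i] - w_fert * fertilizer[i]) for i in range(2)]

A_ub = [[1.0, 1.0],        # land:  x0 + x1 <= 100 ha
        [12.0, 30.0]]      # labor: hours per ha, total <= 2000 h
b_ub = [100.0, 2000.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal plan (ha):", res.x, "utility:", -res.fun)
```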
Procedia PDF Downloads 604
17315 A Comparative Analysis of E-Government Quality Models
Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri
Abstract:
Many quality models have been used to measure the quality of e-government portals. However, the absence of an international consensus on e-government portal quality models results in many differences in terms of quality attributes and measures. The aim of this paper is to compare and analyze the existing e-government quality models proposed in the literature (those that are based on ISO standards and those that are not) in order to propose guidelines for building a good and useful e-government portal quality model. Our findings show that there is no e-government portal quality model based on the new international standard ISO 25010. Besides that, the quality models are not based on a best-practice model that would allow agencies to both measure e-government portal quality and identify missing best practices for those portals.
Keywords: e-government, portal, best practices, quality model, ISO, standard, ISO 25010, ISO 9126
Procedia PDF Downloads 560
17314 Human Wildlife Conflict Outside Protected Areas of Nepal: Causes, Consequences and Mitigation Strategies
Authors: Kedar Baral
Abstract:
This study was carried out in the Mustang, Kaski, Tanahun, Baitadi, and Jhapa districts of Nepal. The study explored the spatial and temporal patterns of HWC, the socio-economic factors associated with it, the impacts of conflict on people's lives and livelihoods and on the survival of wildlife species, and the impact of climate change and forest fire on HWC. The study also evaluated people's attitudes towards wildlife conservation and assessed relevant policies and programs. A questionnaire survey was carried out with 250 respondents, and both socio-demographic and HWC-related information were collected. Secondary information was collected from Divisional Forest Offices and the Annapurna Conservation Area Project. HWC events were grouped by season, month, and site (forest type, distance from forest, and settlement), and the coordinates of the events were exported to ArcGIS. Collected data were analyzed using descriptive statistics in Excel and the R program. A total of 1465 events were recorded in the 5 districts between 2015 and 2019. Of these, livestock killing, crop damage, human attack, and cattle shed damage accounted for 70%, 12%, 11%, and 7% of events, respectively. Among 151 human attack cases, 23 people were killed and 128 were injured. Elephants in the Terai, common leopards and monkeys in the Middle Mountains, and snow leopards in the high mountains were found to be the major problem animals. Common leopard attacks occurred more often in autumn, in the evening, and in human settlement areas, whereas elephant attacks were higher in winter, in the daytime, and on farmland. Poor farmers were found to be highly victimized, losing 26% of their income to crop raiding and livestock depredation. On the other hand, people are killing many wild animals in revenge, and this number is increasing every year. Based on people's perception, climate change is causing increased temperatures and forest fire events and decreased water sources within the forest. Due to the scarcity of food and water within forests, wildlife are compelled to dwell in human settlement areas; hence HWC events are increasing. Nevertheless, more than half of the respondents were positive about conserving all wildlife species. Forests outside PAs are under the community forestry (CF) system, which has restored the forest, improved the habitat, and increased wildlife. However, CF policies and programs were found to be focused mainly on forest management, with the least priority given to wildlife conservation and HWC mitigation. The government's compensation/relief scheme for wildlife damage was found to be somewhat effective in managing HWC, but the lengthy process, its applicability to damage by only a few wildlife species, and the rapidly increasing number of events make it necessary to revisit the scheme. Based on these facts, the study suggests carrying out awareness-raising activities for poor farmers, linking people's property to insurance schemes, conducting habitat management activities within CF, promoting unpalatable crops, improving livestock sheds, simplifying the compensation scheme and establishing a fund at the district level, and incorporating wildlife conservation and HWC mitigation programs into CF. Finally, the study suggests carrying out rigorous research to understand the impacts of current forest management practices on forests, biodiversity, wildlife, and HWC.
Keywords: community forest, conflict mitigation, wildlife conservation, climate change
Procedia PDF Downloads 117
17313 Predicting Options Prices Using Machine Learning
Authors: Krishang Surapaneni
Abstract:
The goal of this project is to determine how to predict important aspects of options, including the ask price. We want to compare different machine learning models to learn the best model and the best hyperparameters for that model for this purpose and data set. Option pricing is a relatively new field, and it can be very complicated and intimidating, especially to inexperienced people, so we want to create a machine learning model that can predict important aspects of an option, which can aid in future research. We tested multiple models and experimented with hyperparameter tuning, trying to find some of the best parameters for a machine learning model. We tested three different models: a random forest regressor, a linear regressor, and an MLP (multi-layer perceptron) regressor. The key quantity in this experiment is the ask price; this is what we were trying to predict. In the field of stock price prediction, there is a large potential for error, so we are unable to determine the accuracy of the models based on whether they predict the price perfectly. Because of this, we determined the accuracy of each model by finding the average percentage difference between the predicted and actual values. We tested the accuracy of the machine learning models by comparing the actual results in the testing data with the predictions made by the models. The linear regression model performed worst, with an average percentage error of 17.46%. The MLP regressor had an average percentage error of 11.45%, and the random forest regressor had an average percentage error of 7.42%.
Keywords: finance, linear regression model, machine learning model, neural network, stock price
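A minimal sketch of the comparison described above is given below: three scikit-learn regressors are fitted to predict an ask price and scored by the average percentage difference between predicted and actual values. The data are synthetic stand-ins generated for illustration; the paper works with a real options dataset.

```python
# Compare a linear regressor, an MLP, and a random forest on a synthetic ask-price target.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = rng.uniform([50, 40, 1, 0.1], [150, 160, 365, 0.6], size=(n, 4))  # spot, strike, days, vol (synthetic)
intrinsic = np.maximum(X[:, 0] - X[:, 1], 0.0)
time_value = 0.4 * X[:, 0] * X[:, 3] * np.sqrt(X[:, 2] / 365.0)
y = np.clip(intrinsic + time_value + rng.normal(0, 0.5, n), 0.05, None)  # synthetic "ask price"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "linear regression": LinearRegression(),
    "MLP regressor": make_pipeline(StandardScaler(),
                                   MLPRegressor(hidden_layer_sizes=(64, 64),
                                                max_iter=2000, random_state=0)),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    ape = np.abs(pred - y_te) / np.abs(y_te) * 100.0   # average percentage difference
    print(f"{name}: {ape.mean():.2f}% mean error")
```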
Procedia PDF Downloads 76
17312 Experimental and Numerical Analysis of Mustafa Paşa Mosque in Skopje
Authors: Ozden Saygili, Eser Cakti
Abstract:
The masonry building stock in Istanbul and in other cities of Turkey is exposed to significant earthquake hazard. Determination of the safety of masonry structures against earthquakes is a complex challenge. This study deals with experimental tests and non-linear dynamic analysis of masonry structures modeled through the discrete element method. A 1:10 scale model of the Mustafa Paşa Mosque was constructed, and data were obtained from the sensors on it during its testing on the shake table. The results were used in the calibration/validation of the numerical model created on the basis of the 1:10 scale model built for shake table testing. A 3D distinct element model was developed that represents the linear and nonlinear behavior of the shake table model as closely as possible during the experimental tests. Results of the numerical analyses were compared with those from the experimental program and discussed.
Keywords: dynamic analysis, non-linear modeling, shake table tests, masonry
Procedia PDF Downloads 426
17311 Strategic Shear Wall Arrangement in Buildings under Seismic Loads
Authors: Akram Khelaifia, Salah Guettala, Nesreddine Djafar Henni, Rachid Chebili
Abstract:
Reinforced concrete shear walls are pivotal in protecting buildings from seismic forces by providing strength and stiffness. This study highlights the importance of strategically placing shear walls and optimizing the shear wall-to-floor area ratio in building design. Nonlinear analyses were conducted on an eight-story building situated in a high seismic zone, exploring various scenarios of shear wall positioning and ratios to floor area. Employing the performance-based seismic design (PBSD) approach, the study aims to meet acceptance criteria such as inter-story drift ratio and damage levels. The results indicate that concentrating shear walls in the middle of the structure during the design phase yields superior performance compared to peripheral distributions. Utilizing shear walls that fully infill the frame and adopting compound shapes (e.g., Box, U, and L) enhances reliability in terms of inter-story drift. Conversely, the absence of complete shear walls within the frame leads to decreased stiffness and degradation of shorter beams. Increasing the shear wall-to-floor area ratio in building design enhances structural rigidity and reliability regarding inter-story drift, facilitating the attainment of desired performance levels. The study suggests that a shear wall ratio of 1.0% is necessary to meet validation criteria for inter-story drift and structural damage, as exceeding this percentage leads to excessive performance levels, proving uneconomical as structural elements operate near the elastic range.
Keywords: nonlinear analyses, pushover analysis, shear wall, plastic hinge, performance level
Procedia PDF Downloads 50
17310 Tolerating Input Faults in Asynchronous Sequential Machines
Authors: Jung-Min Yang
Abstract:
A method of tolerating input faults for input/state asynchronous sequential machines is proposed. A corrective controller is placed in front of the considered asynchronous machine to realize model matching with a reference model. The value of the external input transmitted to the closed-loop system may be changed by a fault. We address the existence condition for a controller that can counteract the adverse effects of any input fault while maintaining the objective of model matching. A design procedure for constructing the controller is outlined. The proposed reachability condition for the controller design is validated in an illustrative example.
Keywords: asynchronous sequential machines, corrective control, fault tolerance, input faults, model matching
Procedia PDF Downloads 424
17309 Role of Microplastics on Reducing Heavy Metal Pollution from Wastewater
Authors: Derin Ureten
Abstract:
Plastic pollution does not disappear; it breaks down into smaller and smaller pieces through photolysis (caused mainly by solar radiation), thermal oxidation, thermal degradation, and biodegradation, which is the action of organisms digesting larger plastics. All plastic pollutants have exceedingly harmful effects on the environment. Together with the COVID-19 pandemic, the number of plastic products such as masks and gloves flowing into the environment has increased more than ever. However, microplastics are not the only pollutants in water; heavy metals are among the most tenacious and toxic pollutants. Heavy metal solutions are also capable of causing a variety of health problems in organisms, such as cancer, organ damage, nervous system damage, and even death. The aim of this research is to show that microplastics can be used in wastewater treatment systems by demonstrating that they can adsorb heavy metals in solution. The experiment will include two solutions: one containing microplastics in heavy-metal-contaminated water, and one containing only the heavy metal solution. After sieving, the absorbance of both media will be measured with a spectrometer. Iron(III) chloride (FeCl3) will be used as the heavy metal solution, since the solution becomes darker as the concentration of this substance increases. The experiment will be supported by Pure Nile Red powder in order to observe whether there are any visible differences under the microscope. Pure Nile Red powder is a chemical that binds to hydrophobic materials such as plastics and lipids. If adsorption can be demonstrated from the solutions' final absorbance values and the visuals provided by the Pure Nile Red powder, the experiment will be repeated at different temperature levels in order to identify the most suitable temperature for removing heavy metals from water. New wastewater treatment systems for water contaminated with heavy metals could then be developed with the help of microplastics.
Keywords: microplastics, heavy metal, pollution, adsorption, wastewater treatment
Procedia PDF Downloads 87
17308 Qsar Studies of Certain Novel Heterocycles Derived From bis-1, 2, 4 Triazoles as Anti-Tumor Agents
Authors: Madhusudan Purohit, Stephen Philip, Bharathkumar Inturi
Abstract:
In this paper, we report quantitative structure-activity relationship studies of novel bis-triazole derivatives for predicting their activity profile. The full model encompassed a dataset of 46 bis-triazoles. The Tripos Sybyl X 2.0 program was used to conduct CoMSIA QSAR modeling. The Partial Least-Squares (PLS) analysis method was used to conduct the statistical analysis and to derive a QSAR model based on the field values of the CoMSIA descriptors. The compounds were divided into test and training sets. The compounds were evaluated by various CoMSIA parameters to predict the best QSAR model. An optimum number of components was first determined separately by cross-validated regression for the CoMSIA model and then applied in the final analysis. A series of parameters was used for the study, and the best-fit model was obtained using donor, partition coefficient, and steric parameters. The CoMSIA model demonstrated good statistical results, with a regression coefficient (r2) and a cross-validated coefficient (q2) of 0.575 and 0.830, respectively. The standard error for the predicted model was 0.16322. In the CoMSIA model, the steric descriptors make a marginally larger contribution than the electrostatic descriptors. The finding that the steric descriptor is the largest contributor to the CoMSIA QSAR models is consistent with the observation that more than half of the binding site area is occupied by steric regions.
Keywords: 3D QSAR, CoMSIA, triazoles, novel heterocycles
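For reference, the cross-validated coefficient q2 quoted above is conventionally computed in PLS-based CoMSIA studies from the predictive residual sum of squares (PRESS) of the cross-validated predictions; the standard definition is shown below and is not quoted from the paper.

```latex
q^{2} = 1 - \frac{\mathrm{PRESS}}{\mathrm{SS}}
      = 1 - \frac{\sum_{i}\left(y_{i} - \hat{y}_{i,\mathrm{cv}}\right)^{2}}
                 {\sum_{i}\left(y_{i} - \bar{y}\right)^{2}}
```

where ŷ_{i,cv} is the cross-validated prediction for compound i and ȳ is the mean observed activity.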
Procedia PDF Downloads 444
17307 Estimation of Structural Parameters in Time Domain Using One Dimensional Piezo Zirconium Titanium Patch Model
Authors: N. Jinesh, K. Shankar
Abstract:
This article presents a method of using a one-dimensional piezoelectric patch-on-beam model for structural identification. A hybrid element consisting of a one-dimensional beam element and a PZT sensor is used with reduced material properties. This model is convenient and simple for the identification of beams. The accuracy of this element is first verified against a corresponding 3D finite element model (FEM). The structural identification is carried out as an inverse problem, whereby parameters are identified by minimizing the deviation between the predicted and measured voltage responses of the patch when subjected to excitation. A non-classical optimization algorithm, Particle Swarm Optimization, is used to minimize this objective function. The signals are polluted with 5% Gaussian noise to simulate experimental noise. The proposed method is applied to a beam structure, and the identified parameters are stiffness and damping. The model is also validated experimentally.
Keywords: inverse problem, particle swarm optimization, PZT patches, structural identification
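As a hedged, self-contained illustration of the inverse-identification idea (minimizing the deviation between measured and predicted responses with Particle Swarm Optimization), the sketch below identifies the stiffness and damping of a single-degree-of-freedom oscillator from a response polluted with 5% Gaussian noise. It is a 1-DOF stand-in for the PZT-patch beam model; all numerical values are assumed.

```python
# Hand-rolled PSO recovering stiffness k and damping c of a 1-DOF oscillator.
import numpy as np

rng = np.random.default_rng(1)
m, k_true, c_true = 1.0, 400.0, 1.2          # mass [kg], stiffness [N/m], damping [N*s/m]
t = np.linspace(0, 2.0, 400)

def response(k, c):
    """Free-vibration displacement of an underdamped 1-DOF system, unit initial displacement."""
    wn = np.sqrt(k / m)
    zeta = c / (2.0 * np.sqrt(k * m))
    wd = wn * np.sqrt(1.0 - zeta**2)
    return np.exp(-zeta * wn * t) * np.cos(wd * t)

measured = response(k_true, c_true)
measured += 0.05 * np.std(measured) * rng.standard_normal(t.size)   # 5% Gaussian noise

def objective(p):
    return np.sum((response(p[0], p[1]) - measured) ** 2)

# Plain PSO: 30 particles, inertia + cognitive + social terms, box-constrained search.
lo, hi = np.array([100.0, 0.1]), np.array([1000.0, 5.0])
pos = rng.uniform(lo, hi, size=(30, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((30, 2)), rng.random((30, 2))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([objective(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(f"identified k = {gbest[0]:.1f} N/m, c = {gbest[1]:.2f} N*s/m")
```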
Procedia PDF Downloads 309
17306 Cascaded Neural Network for Internal Temperature Forecasting in Induction Motor
Authors: Hidir S. Nogay
Abstract:
In this study, two systems were created to predict the internal temperature of an induction motor. One of them consisted of a simple ANN model with two layers, ten input parameters, and one output parameter. The other consisted of eight ANN models connected to each other in cascade. The cascaded ANN system has 17 inputs. The main reason for using the cascaded system in this study is to achieve more accurate estimation by increasing the number of inputs in the ANN system. The cascaded ANN system is compared with the simple conventional ANN model to demonstrate the mentioned advantages. The dataset was obtained from experimental applications. A small part of the dataset was used to obtain more understandable graphs. The number of data points is 329. 30% of the data was used for testing and validation. Test data and validation data were determined for each ANN model separately, and the reliability of each model was tested. As a result of this study, it has been found that the cascaded ANN system produced more accurate estimates than the conventional ANN model.
Keywords: cascaded neural network, internal temperature, inverter, three-phase induction motor
Procedia PDF Downloads 345
17305 Coefficient of Performance (COP) Optimization of an R134a Cross Vane Expander Compressor Refrigeration System
Authors: Y. D. Lim, K. S. Yap, K. T. Ooi
Abstract:
The Cross Vane Expander Compressor (CVEC) is a newly invented combined expander-compressor unit, introduced to replace the compressor and the expansion valve in a traditional refrigeration system. A mathematical model of the CVEC has been developed to examine its performance, and it was found that the energy consumption of a conventional refrigeration system was reduced by as much as 18%. It is believed that energy consumption can be further reduced by optimizing the device. In this study, the coefficient of performance (COP) of the CVEC has been optimized under predetermined operational parameters and constrained main design parameters. Several main design parameters of the CVEC were selected as the variables, and the optimization was done with the theoretical model in a simulation program. The theoretical model consists of a geometrical model, a dynamic model, a heat transfer model, and a valve dynamics model. The complex optimization method, which is a constrained, direct-search, multi-variable method, was used in the study. As a result, the optimization study suggested that, with an appropriate combination of design parameters, a 58% COP improvement in the CVEC R134a refrigeration system is possible.
Keywords: COP, cross vane expander-compressor, CVEC, design, simulation, refrigeration system, air-conditioning, R134a, multi variables
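For context, the coefficient of performance being optimized is conventionally defined as the ratio of the refrigerating effect to the net work input; for an expander-compressor unit such as the CVEC, the recovered expansion work reduces that net input. The expression below is the textbook definition, shown only for orientation and not quoted from the paper.

```latex
\mathrm{COP} = \frac{\dot{Q}_{\mathrm{evap}}}{\dot{W}_{\mathrm{net}}},
\qquad
\dot{W}_{\mathrm{net}} = \dot{W}_{\mathrm{comp}} - \dot{W}_{\mathrm{exp}}
```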
Procedia PDF Downloads 334
17303 Rainfall–Runoff Simulation Using WetSpa Model in Golestan Dam Basin, Iran
Authors: M. R. Dahmardeh Ghaleno, M. Nohtani, S. Khaledi
Abstract:
Flood simulation and prediction is one of the most active research areas in surface water management. WetSpa is a distributed, continuous, physically based model with a daily or hourly time step that describes precipitation, runoff, and evapotranspiration processes for both simple and complex contexts. The model uses a modified rational method for runoff calculation. In this model, runoff is routed along the flow path using the diffusion-wave equation, which depends on the slope, velocity, and flow route characteristics. The Golestan Dam Basin is located in Golestan Province, Iran, between coordinates 55° 16´ 50" to 56° 4´ 25" E and 37° 19´ 39" to 37° 49´ 28" N. The area of the catchment is about 224 km2, and elevations range from 414 m at the outlet to 2856 m, with an average slope of 29.78%. Results of the simulations show good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated daily hydrographs and the maximum flow rate with accuracies of up to 59% and 80.18%, respectively.
Keywords: watershed simulation, WetSpa, stream flow, flood prediction
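For reference, the Nash-Sutcliffe efficiency used to judge the calibration is defined as below (standard form, not quoted from the paper), where Q_obs and Q_sim are the observed and simulated discharges and the overbar denotes the mean of the observations.

```latex
\mathrm{NSE} = 1 - \frac{\sum_{i=1}^{n}\left(Q_{\mathrm{obs},i} - Q_{\mathrm{sim},i}\right)^{2}}
                        {\sum_{i=1}^{n}\left(Q_{\mathrm{obs},i} - \overline{Q}_{\mathrm{obs}}\right)^{2}}
```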
Procedia PDF Downloads 244
17302 Reinforcement Learning for Self Driving Racing Car Games
Authors: Adam Beaunoyer, Cory Beaunoyer, Mohammed Elmorsy, Hanan Saleh
Abstract:
This research aims to create a reinforcement learning agent capable of racing in challenging simulated environments with a low collision count. We present a reinforcement learning agent that can navigate challenging tracks using both a Deep Q-Network (DQN) and a Soft Actor-Critic (SAC) method. A challenging track includes curves, jumps, and varying road widths throughout. Using open-source code from GitHub, the environment used in this research is based on the 1995 racing game WipeOut. The proposed reinforcement learning agent can navigate challenging tracks rapidly while maintaining a low race completion time and collision count. The results show that the SAC model outperforms the DQN model by a large margin. We also propose an alternative multiple-car model that can navigate the track without colliding with other vehicles. The SAC model is the basis for the multiple-car model, which can complete laps more quickly than the single-car model but has a higher collision rate with the track wall.
Keywords: reinforcement learning, soft actor-critic, deep q-network, self-driving cars, artificial intelligence, gaming
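For orientation, the two agents compared above optimize different objectives: a DQN regresses its Q-network toward a bootstrapped target, while SAC maximizes an entropy-regularized return. The standard forms are shown below for context; they are not taken from the paper.

```latex
y = r + \gamma \max_{a'} Q\!\left(s', a'; \theta^{-}\right),
\qquad
J(\pi) = \sum_{t} \mathbb{E}_{(s_t, a_t) \sim \rho_{\pi}}
         \Big[ r(s_t, a_t) + \alpha\, \mathcal{H}\!\big(\pi(\cdot \mid s_t)\big) \Big]
```

where θ⁻ denotes the target network parameters, α the temperature coefficient, and H the policy entropy.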
Procedia PDF Downloads 47
17301 Developing a Sustainable Business Model for Platform-Based Applications in Small and Medium-Sized Enterprise Sawmills: A Systematic Approach
Authors: Franziska Mais, Till Gramberg
Abstract:
The paper presents the development of a sustainable business model for a platform-based application tailored to sawmills in small and medium-sized enterprises (SMEs). The focus is on the integration of sustainability principles into the design of the business model to ensure a technologically advanced, legally sound, and economically efficient solution. Easy2IoT is a research project that aims to enable companies in the prefabrication sheet metal and sheet metal processing industry to enter the Industrial Internet of Things (IIoT) with a low-threshold and cost-effective approach. The methodological approach of Easy2IoT includes an in-depth requirements analysis and customer interviews with stakeholders along the value chain. Based on these insights, actions, requirements, and potential solutions for smart services are derived. The structuring of the business ecosystem within the application plays a central role, whereby the roles of the partners, the management of the IT infrastructure and services, and the design of a sustainable operator model are considered. The business model is developed using the value proposition canvas, whereby a detailed analysis of the requirements for the business model is carried out, taking sustainability into account. This includes alignment with the business model patterns according to Gassmann and integration into a business model canvas for the Easy2IoT product. Potential obstacles and problems are identified and evaluated in order to formulate a comprehensive and sustainable business model. In addition, sustainable payment models and distribution channels are developed. In summary, the article offers a well-founded insight into the systematic development of a sustainable business model for platform-based applications in SME sawmills, with a particular focus on the synergy of ecological responsibility and economic efficiency.
Keywords: business model, sustainable business model, IIoT, IIoT-platform, Industrie 4.0, big data
Procedia PDF Downloads 82
17300 Two-Warehouse Inventory Model for Deteriorating Items with Inventory-Level-Dependent Demand under Two Dispatching Policies
Authors: Lei Zhao, Zhe Yuan, Wenyue Kuang
Abstract:
This paper studies two-warehouse inventory models for a deteriorating item, considering that demand is influenced by inventory levels. The problem mainly focuses on the optimal order policy and the optimal order cycle with inventory-level-dependent demand in a two-warehouse system for retailers. It considers different deterioration rates and inventory holding costs in the owned warehouse (OW) and the rented warehouse (RW), as well as transportation costs, allowed shortages, and partial backlogging. Two inventory models are formulated, a last-in-first-out (LIFO) model and a first-in-first-out (FIFO) model, based on the policy choices of LIFO and FIFO, and a comparative analysis of the LIFO and FIFO models is made. The study finds that the FIFO policy is more in line with realistic operating conditions. Especially when the inventory holding cost of the OW is high and there is either no difference or a large difference between the deterioration rates of the OW and the RW, the FIFO policy has better applicability. Meanwhile, this paper considers the differences between the effects of warehouse and shelf inventory levels on demand, then builds the retailer's inventory decision model and studies the factors that determine the optimal order quantity, the optimal order cycle, and the average inventory cost per unit time. To minimize the average total cost, the optimal dispatching policies are provided for retailers' decisions.
Keywords: FIFO model, inventory-level-dependent, LIFO model, two-warehouse inventory
Procedia PDF Downloads 279
17299 Motor Controller Implementation Using Model Based Design
Authors: Cau Tran, Tu Nguyen, Tien Pham
Abstract:
Model-based design (MBD) is a mathematical and visual technique for addressing design issues in the fields of communications, signal processing, and complex control systems. It is utilized in several automotive, aerospace, industrial, and motion control applications. With model-based design, virtual models are at the center of the embedded software development process. In this study, the LAT (limited angle torque) motor is modeled in a simulation environment, and the LAT motor control is designed with a cascade structure comprising a speed control loop and a current control loop, together with the controller used in the next part. A PID structure serves as this controller. Based on techniques and motor parameters that match the design goals, the PID controller is created for the model using traditional design principles. The MBD approach is then used to build the embedded software for motor control. The paper is divided into three distinct sections. The first section introduces the design process and the benefits and drawbacks of the MBD technique. The design of control software for LAT motors is the main topic of the second section. The experimental results are the subject of the last section.
Keywords: model based design, limited angle torque, intellectual property core, hardware description language, controller area network, user datagram protocol
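As a generic illustration of the PID structure named above, the sketch below implements a discrete-time PID controller and closes the loop around a toy first-order plant standing in for the motor speed loop. Gains, sample time, and plant dynamics are assumed values; the paper's controller is designed and deployed through the model-based design toolchain.

```python
# Generic discrete-time PID controller closed around a toy first-order "motor" plant.
class PID:
    def __init__(self, kp, ki, kd, dt, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, out))   # clamp the actuator command

if __name__ == "__main__":
    dt, speed, setpoint = 0.001, 0.0, 1.0
    pid = PID(kp=2.0, ki=20.0, kd=0.0, dt=dt)                # gains assumed for the toy plant
    for _ in range(3000):                                    # simulate 3 s
        u = pid.update(setpoint, speed)
        speed += dt * (-5.0 * speed + 10.0 * u)              # dx/dt = -5x + 10u (assumed dynamics)
    print(f"speed after 3 s: {speed:.3f} (target {setpoint})")
```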
Procedia PDF Downloads 94