Search results for: exponential smoothing methods

14179 Artificial Intelligence in Bioscience: The Next Frontier

Authors: Parthiban Srinivasan

Abstract:

With recent advances in computational power and access to sufficient data in the biosciences, artificial intelligence methods are increasingly being used in drug discovery research. These methods are essentially a series of advanced statistics-based exercises that review the past to indicate the likely future. Our goal is to develop a model that accurately predicts biological activity and toxicity parameters for novel compounds. We have compiled a robust library of over 150,000 chemical compounds with different pharmacological properties from the literature and public-domain databases. The compounds are stored in the simplified molecular-input line-entry system (SMILES), a commonly used text encoding for organic molecules. We utilize an automated process to generate an array of numerical descriptors (features) for each molecule. Redundant and irrelevant descriptors are eliminated iteratively. Our prediction engine is based on a portfolio of machine learning algorithms. We found the Random Forest algorithm to be the better choice for this analysis. We captured the non-linear relationships in the data and formed a prediction model with reasonable accuracy by averaging across a large number of randomized decision trees. Our next step is to apply a deep neural network (DNN) algorithm to predict the biological activity and toxicity properties. We expect the DNN algorithm to give better results and improve the accuracy of the prediction. This presentation will review these prominent machine learning and deep learning methods and our implementation protocols, and discuss their usefulness in biomedical and health informatics.
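
As a hedged illustration of the workflow described above (SMILES in, numerical descriptors out, a Random Forest averaged over many randomized trees on top), a minimal sketch using RDKit and scikit-learn is shown below; the compounds, activity values and descriptor choice are placeholders, not the authors' 150,000-compound library or descriptor set.

```python
# Minimal sketch (not the authors' pipeline): predict an activity value from SMILES
# strings using RDKit descriptors and a Random Forest. All data below are illustrative.
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestRegressor
import numpy as np

def featurize(smiles):
    """Turn a SMILES string into a small vector of numerical descriptors."""
    mol = Chem.MolFromSmiles(smiles)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumHDonors(mol)]

smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]   # toy compounds
activity = [0.3, 0.8, 0.5]                              # toy activity labels

X = np.array([featurize(s) for s in smiles])
model = RandomForestRegressor(n_estimators=500, random_state=0)  # average many randomized trees
model.fit(X, activity)
print(model.predict([featurize("CCN")]))                # predict for a novel compound
```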

Keywords: deep learning, drug discovery, health informatics, machine learning, toxicity prediction

Procedia PDF Downloads 357
14178 An Improved Heat Transfer Prediction Model for Film Condensation inside a Tube with Interfacial Shear Effect

Authors: V. G. Rifert, V. V. Gorin, V. V. Sereda, V. V. Treputnev

Abstract:

An analysis of heat transfer design methods for condensation inside plain tubes under the influence of shear stress is presented in this paper. A discrepancy of more than 30-50% between rated heat transfer coefficients and experimental data has been noted. An analysis of existing theoretical and semi-empirical methods of heat transfer prediction is given. The influence on the heat transfer design of a precise definition of the phase-flow boundaries (especially important for condensation inside horizontal tubes), of the shear stress (friction coefficient) and of the heat flux is shown. The boundary conditions on the values of the parameters influencing the accuracy of the rating relationships are substantiated. More accurate relationships for heat transfer prediction, which showed good agreement with experiments made by different authors, are substantiated in this work.
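
For context, the classical Chato (Nusselt-type) relationship for film condensation inside a horizontal tube at low vapor velocity, i.e. neglecting the interfacial shear this paper accounts for, is the kind of baseline rating relationship referred to above; the sketch below evaluates it for illustrative water/steam properties and is not the improved correlation derived in this work.

```python
# Classical Chato correlation for laminar film condensation inside a horizontal tube
# (low vapor velocity, no interfacial shear). Property values below are illustrative.
g = 9.81            # m/s^2
rho_l = 958.0       # liquid density, kg/m^3
rho_v = 0.60        # vapor density, kg/m^3
k_l = 0.68          # liquid thermal conductivity, W/(m K)
mu_l = 2.8e-4       # liquid viscosity, Pa s
cp_l = 4216.0       # liquid specific heat, J/(kg K)
h_fg = 2.257e6      # latent heat, J/kg
dT = 10.0           # T_sat - T_wall, K
D = 0.025           # tube inner diameter, m

h_fg_mod = h_fg + 0.375 * cp_l * dT          # corrected latent heat
h = 0.555 * (g * rho_l * (rho_l - rho_v) * k_l**3 * h_fg_mod
             / (mu_l * dT * D)) ** 0.25      # mean heat transfer coefficient, W/(m^2 K)
print(round(h, 1), "W/m2K")
```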

Keywords: film condensation, heat transfer, plain tube, shear stress

Procedia PDF Downloads 245
14177 Effects of Sensory Integration Techniques in Science Education of Autistic Students

Authors: Joanna Estkowska

Abstract:

Sensory integration methods are very useful and improve the daily functioning of autistic and mentally disabled children. Autism is a neurobiological disorder that impairs a person's ability to communicate with and relate to others, as well as their sensory system. Children with autism, even highly functioning kids, can find it difficult to process language with surrounding noise or smells. They are hypersensitive to things we can ignore, such as sights, sounds and touch. Adolescents with highly functioning autism or Asperger Syndrome can study science and mathematics, but the social aspect is difficult for them. Nature science is an area of study that attracts many of these kids. It is a systematic field in which the children can focus on a small aspect: if they follow set procedures, they can arrive at an expected result. A sensory integration program and systematic classroom observation are quantitative methods of measuring classroom functioning and behaviors from direct observations. These methods specify both the events and behaviors that are to be observed and how they are to be recorded. Our students with and without autism attended the lessons in the nature science classroom at the school and in the laboratory of the University of Science and Technology in Bydgoszcz. The aim of this study is to investigate the effects of sensory integration methods in teaching students with autism. They were observed during experimental lessons in the classroom and in the laboratory. Their physical characteristics, sensory dysfunction, and behavior in class were taken into consideration by comparing their similarities and differences. In the chemistry classroom, every autistic student is paired with a mentor from their school. In the laboratory, the children are expected to wear goggles, gloves and a lab coat. The chemistry classes in the laboratory were held for four hours with a lunch break, and according to the assistants, the children were engaged the whole time. In the nature science classroom, the students are encouraged to use the interactive exhibition of chemical, physical and mathematical models constructed by the author of this paper. Our students with and without autism attended the lessons in those laboratories. The teacher's goals are to assist the child in inhibiting and modulating sensory information and to support the child in processing a response to sensory stimulation.

Keywords: autism spectrum disorder, science education, sensory integration techniques, student with special educational needs

Procedia PDF Downloads 192
14176 The Effectiveness of Pretreatment Methods on COD and Ammonia Removal from Landfill Leachate

Authors: M. Poveda, S. Lozecznik, J. Oleszkiewicz, Q. Yuan

Abstract:

The goal of this experiment is to evaluate the effectiveness of different leachate pre-treatment options in terms of COD and ammonia removal. This research focused on the evaluation of physical-chemical methods for pre-treatment of leachate that would be effective and rapid enough to satisfy the requirements of the sewer discharge by-laws. The four pre-treatment options evaluated were: air stripping, chemical coagulation, electro-coagulation and advanced oxidation with sodium ferrate. Chemical coagulation reported the best COD removal rate at 43%, compared to 18% for both air stripping and electro-coagulation, and 20% for oxidation with sodium ferrate. On the other hand, air stripping was far superior to the other treatment options in terms of ammonia removal, at 86%. Oxidation with sodium ferrate reached only 16%, while chemical coagulation and electro-coagulation removed less than 10%. When combined, air stripping and chemical coagulation removed up to 50% of COD and 85% of ammonia.

Keywords: leachate pretreatment, air stripping, chemical coagulation, electro-coagulation, oxidation

Procedia PDF Downloads 843
14175 Eco-Index for Assessing Ecological Disturbances at Downstream of a Hydropower Project

Authors: Chandra Upadhyaya, Arup Kumar Sarma

Abstract:

In the North Eastern part of India several hydropower projects are being proposed, and execution of some of them has already been initiated. There are controversies surrounding these constructions. The impact of these dams on the downstream part of the rivers needs to be assessed so that the ecosystem and the people living downstream are protected by redesigning the projects if it becomes necessary. This may reduce the stresses on the affected ecosystem and the people living downstream. At present, many index-based ecological methods exist to assess impacts on ecology. However, none of these methods is capable of assessing the effect of dam-induced diurnal variation of flow in the downstream reach. We need an environmental flow methodology based on a hydrological index which can address the effect of dam-induced diurnal variation of flow, play an important role in riverine ecosystem management, and provide a qualitative idea about changes in the habitat for aquatic and riparian species.

Keywords: ecosystem, environmental flow assessment, entropy, IHA, TNC

Procedia PDF Downloads 384
14174 Contraceptives: Experiences of Agency and Coercion of Young People Living in Colombia

Authors: Paola Montenegro, Maria de los Angeles Balaguera Villa

Abstract:

Contraceptive methods play a fundamental role in preventing unwanted pregnancies and protecting users from sexually transmitted infections (STIs). Despite being known to almost the entire population of reproductive age living in Colombia, there are barriers, practices and complex notions about contraceptives that affect their desired mass use and effectiveness. This work aims to analyse some of the perceptions and practices discussed with young people (13-28 years old) living in Colombia regarding the use of contraceptives in their daily lives, their preferences, needs and perceived side effects. This research also examines the perceived paradox in autonomy that young people experience regarding contraceptive use: on the one hand, its use (or lack of it) is interpreted as an act of self-determination and a primary example of reproductive agency; on the other hand, it was frequently associated with coercion and limited autonomy derived from the gaps in reliable information available to young people, the difficulty of accessing certain preferred methods, and sometimes the coercion exercised by doctors, partners and/or family members. The data and analysis discussed in this work stem from a research project whose objective was to provide information about the needs and preferences in sexual and reproductive health of young people living in Colombia in relation to a possible telehealth service that could close the gap in access to quality care and safe information. Through a mixed methods approach, this study collected 5,736 responses to a virtual survey disseminated nationwide in Colombia and 47 in-person interviews (24 of them with people who were assigned female at birth and 21 with local key stakeholders in the abortion ecosystem). Quantitative data was analyzed using Stata SE Version 16.0, and qualitative analysis was completed through NVivo using thematic analysis. Key findings on contraception use among young people living in Colombia reveal that 85.8% of participants had used a contraceptive method in the last two years, and that the most commonly used methods were condoms, contraceptive pills, the morning-after pill and the withdrawal (interruption) method. The remaining 14.2% of respondents who declared not to have used contraceptives in the last two years expressed that the main four barriers to access were: "Lack of knowledge about contraceptive methods and where to obtain information and/or access them" (13.9%), "Have had sex with people who have vaginas" (10.2%), "Cost of contraceptive method" (8.4%) and "Difficulties in obtaining medical authorisations" (7.6%). These barriers coincided with the ones used to explain the non-use of contraceptives among young people, which reveals that limitations in information, cost, and quality care represent structural issues that need to be addressed in programmes, services, and public policy. Finally, interviews showed that young people perceive contraceptive use and non-use as an example of reaffirming reproductive agency, and that limitations to this can be explained by the widespread incomplete knowledge about how methods work and the prevalence of other social representations of contraception associated with trust, fidelity, and partner preferences, which in the end limit young people’s autonomy.

Keywords: contraception, family planning, premarital fertility, unplanned pregnancy

Procedia PDF Downloads 76
14173 Utilizing Temporal and Frequency Features in Fault Detection of Electric Motor Bearings with Advanced Methods

Authors: Mohammad Arabi

Abstract:

The development of advanced technologies in the field of signal processing and vibration analysis has enabled more accurate analysis and fault detection in electrical systems. This research investigates the application of temporal and frequency features in detecting faults in electric motor bearings, aiming to enhance fault detection accuracy and prevent unexpected failures. The use of methods such as deep learning algorithms and neural networks in this process can yield better results. The main objective of this research is to evaluate the efficiency and accuracy of methods based on temporal and frequency features in identifying faults in electric motor bearings to prevent sudden breakdowns and operational issues. Additionally, the feasibility of using techniques such as machine learning and optimization algorithms to improve the fault detection process is also considered. This research employed an experimental method and random sampling. Vibration signals were collected from electric motors under normal and faulty conditions. After standardizing the data, temporal and frequency features were extracted. These features were then analyzed using statistical methods such as analysis of variance (ANOVA) and t-tests, as well as machine learning algorithms like artificial neural networks and support vector machines (SVM). The results showed that using temporal and frequency features significantly improves the accuracy of fault detection in electric motor bearings. ANOVA indicated significant differences between normal and faulty signals. Additionally, t-tests confirmed statistically significant differences between the features extracted from normal and faulty signals. Machine learning algorithms such as neural networks and SVM also significantly increased detection accuracy, demonstrating high effectiveness in timely and accurate fault detection. This study demonstrates that using temporal and frequency features combined with machine learning algorithms can serve as an effective tool for detecting faults in electric motor bearings. This approach not only enhances fault detection accuracy but also simplifies and streamlines the detection process. However, challenges such as data standardization and the cost of implementing advanced monitoring systems must also be considered. Utilizing temporal and frequency features in fault detection of electric motor bearings, along with advanced machine learning methods, offers an effective solution for preventing failures and ensuring the operational health of electric motors. Given the promising results of this research, it is recommended that this technology be more widely adopted in industrial maintenance processes.
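
A minimal sketch of the feature-extraction-plus-SVM pipeline outlined above, with a synthetic vibration signal standing in for real accelerometer data; the sampling rate, feature set and fault signature are illustrative assumptions, not the authors' setup.

```python
# Illustrative sketch (not the authors' code): extract simple temporal and frequency
# features from a vibration signal and train an SVM classifier.
import numpy as np
from scipy.stats import kurtosis
from sklearn.svm import SVC

def extract_features(signal, fs):
    """Return a few time- and frequency-domain features for one vibration segment."""
    rms = np.sqrt(np.mean(signal**2))                # temporal: RMS level
    kurt = kurtosis(signal)                          # temporal: impulsiveness
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)              # spectral centroid
    band_energy = np.sum(spectrum[(freqs > 1000) & (freqs < 3000)]**2)  # band energy
    return [rms, kurt, centroid, band_energy]

fs = 12_000                                          # assumed sampling rate
rng = np.random.default_rng(0)
healthy = [rng.normal(0, 1, fs) for _ in range(20)]
faulty = [rng.normal(0, 1, fs) + 3 * (rng.random(fs) < 0.001) for _ in range(20)]  # added impulses

X = np.array([extract_features(s, fs) for s in healthy + faulty])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))                               # training accuracy, for illustration only
```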

Keywords: electric motor, fault detection, frequency features, temporal features

Procedia PDF Downloads 47
14172 Virtual and Augmented Reality Based Heritage Gamification: Basilica of Smyrna in Turkey

Authors: Tugba Saricaoglu

Abstract:

This study discusses the potential representation and interpretation of the Basilica of Smyrna through gamification. Representation can be understood as a key that acts as a converter, providing an interpretation of something according to the person who perceives it. Representation of cultural heritage is both a conceptual and a practical approach to its sustainable conservation. Today, both the site interpreters and the public of cultural heritage have varying perspectives due to their different demographic, social, and even cultural backgrounds. Additionally, gamification offers a variety of methods, such as video games, to improve user perspectives on non-game platforms, contexts, and issues. Hence, cultural heritage and video games were chosen for analysis. Moreover, there are basically different ways of representing cultural heritage, such as digital, physical, and virtual methods of conservation. Virtual reality (VR) and augmented reality (AR) technologies are two of the contemporary digital methods of heritage conservation. In this study, the 3D-documented ruins of the Basilica will be presented through virtual and augmented reality based technology as a theoretical gamification sample. This paper will also focus on two sub-topics: first, the evaluation of video-game platforms applied to cultural heritage sites, and second, the potential of cultural heritage to be represented in video game platforms. The former will cover the analysis of some case(s) with regard to the concepts and representational aspects of cultural heritage. The latter will include the investigation of cultural heritage sites which carry such a potential and their sustainable conservation. Consequently, after the mutual collection of information from cultural heritage and video game platforms, a perspective on the interpretation of the representation of cultural heritage will be provided by sampling it on the Basilica of Smyrna using VR and AR based technologies.

Keywords: Basilica of Smyrna, cultural heritage, digital heritage, gamification

Procedia PDF Downloads 466
14171 A Survey of Feature Selection and Feature Extraction Techniques in Machine Learning

Authors: Samina Khalid, Shamila Nasreen

Abstract:

Dimensionality reduction as a preprocessing step to machine learning is effective in removing irrelevant and redundant data, increasing learning accuracy, and improving result comprehensibility. However, the recent increase in the dimensionality of data poses a severe challenge to many existing feature selection and feature extraction methods with respect to efficiency and effectiveness. In the field of machine learning and pattern recognition, dimensionality reduction is an important area in which many approaches have been proposed. In this paper, some widely used feature selection and feature extraction techniques are analyzed to show how effectively these techniques can be used to achieve high performance of learning algorithms, which ultimately improves the predictive accuracy of the classifier. A brief analysis of dimensionality reduction techniques, with the purpose of investigating the strengths and weaknesses of some widely used dimensionality reduction methods, is presented.
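
A brief sketch contrasting the two families surveyed: feature selection (keeping a subset of the original features) versus feature extraction (building new features such as principal components); the dataset and classifier are illustrative choices, not those analyzed in the paper.

```python
# Illustrative comparison: feature selection vs. feature extraction on the same data,
# using scikit-learn's SelectKBest and PCA in front of a simple classifier.
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Feature selection keeps a subset of the original features (here, the 10 most relevant).
selection = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10),
                          LogisticRegression(max_iter=5000))
# Feature extraction builds new features as combinations of the originals (here, 10 components).
extraction = make_pipeline(StandardScaler(), PCA(n_components=10),
                           LogisticRegression(max_iter=5000))

for name, model in [("selection", selection), ("extraction", extraction)]:
    print(name, cross_val_score(model, X, y, cv=5).mean())
```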

Keywords: age related macular degeneration, feature selection, feature subset selection, feature extraction/transformation, FSAs, relief, correlation based method, PCA, ICA

Procedia PDF Downloads 496
14170 Radiochemical Purity of 68Ga-BCA-Peptides: Separation of All 68Ga Species with a Single iTLC Strip

Authors: Anton A. Larenkov, Alesya Ya Maruk

Abstract:

In the present study, a highly effective single-strip iTLC method for the determination of the radiochemical purity (RCP) of 68Ga-BCA-peptides was developed (with no double-developing, changing of eluents or other additional manipulation). In this method, iTLC-SG strips and the commonly used eluent TFAaq. (3-5% (v/v)) are used. The method allows each of the key radiochemical forms of 68Ga (colloidal, bound, ionic) to be determined separately, with peak separation of no less than 4 σ: Rf = 0.0-0.1 for 68Ga-colloid; Rf = 0.5-0.6 for 68Ga-BCA-peptides; Rf = 0.9-1.0 for ionic 68Ga. The method is simple and fast: for a developing length of 75 mm, only 4-6 min is required (versus 18-20 min for the pharmacopoeial method). The method has been tested on various compounds (including 68Ga-DOTA-TOC, 68Ga-DOTA-TATE, 68Ga-NODAGA-RGD2, etc.). The cross-validation work for every specific form of 68Ga showed good correlation between the developed method and the control (pharmacopoeial) methods. The method can become a convenient and much more informative replacement for pharmacopoeial methods, including HPLC.
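
As a hedged illustration of how RCP would be computed from a developed strip using the Rf windows quoted above, a small sketch follows; the integrated counts are invented for illustration.

```python
# Illustrative only: integrate strip counts in the three Rf windows reported in the
# abstract (colloid 0.0-0.1, 68Ga-BCA-peptide 0.5-0.6, ionic 68Ga 0.9-1.0) and report RCP.
regions = {                      # hypothetical integrated counts per region
    "colloidal 68Ga (Rf 0.0-0.1)": 1200,
    "68Ga-BCA-peptide (Rf 0.5-0.6)": 96000,
    "ionic 68Ga (Rf 0.9-1.0)": 1800,
}
total = sum(regions.values())
rcp = 100.0 * regions["68Ga-BCA-peptide (Rf 0.5-0.6)"] / total
print(f"RCP = {rcp:.1f} %")      # ~97 % in this made-up example
```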

Keywords: DOTA-TATE, 68Ga, quality control, radiochemical purity, radiopharmaceuticals, TLC

Procedia PDF Downloads 290
14169 A Study on Selection Issues of an Integrated Service Provider Using Analytical Hierarchy Process

Authors: M. Pramila Devi, J. Praveena

Abstract:

In today’s industrial scenario, the expectations and demands of customers are reaching great heights. In order to satisfy customer requirements, users are increasingly turning towards fourth party logistics (4PL) service providers to manage their total supply chain operations. In the present research, the criteria for the selection of integrated service providers were first identified, and an integrated model based on their inter-relationships was developed with the help of shippers, capturing which factors are to be considered and how they inter-relate when selecting an integrated service provider. Subsequently, the Analytical Hierarchy Process (AHP) was employed to derive the priority weights for 4PL service provider selection. The derived priorities of the 4PL alternatives were critically analyzed and compared for effective selection. The use of the model indicates that the computed quantitative evaluation can be applied to improve the precision of the selection.
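
A minimal AHP sketch of how the priority weights referred to above are derived from a pairwise comparison matrix via the principal eigenvector, with a consistency check; the criteria and judgments are invented, not those elicited from the shippers in this study.

```python
# Minimal AHP sketch: derive priority weights for selection criteria from a pairwise
# comparison matrix. Criteria and judgments below are hypothetical.
import numpy as np

# Pairwise comparisons of three criteria (e.g. cost, service quality, IT capability)
# on Saaty's 1-9 scale; A[i, j] says how much more important criterion i is than j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()           # normalized priority weights

# Consistency check: consistency index relative to the random index for n = 3.
n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)
cr = ci / 0.58                                  # 0.58 is Saaty's random index for n = 3
print(weights, cr)
```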

Keywords: analytical hierarchy process, fourth party logistics, priority weight, criteria selection

Procedia PDF Downloads 432
14168 Determining Full Stage Creep Properties from Miniature Specimen Creep Test

Authors: W. Sun, W. Wen, J. Lu, A. A. Becker

Abstract:

In this work, methods based on analytical solutions are presented for determining creep properties that can represent the full life until failure from miniature specimen creep tests. Examples used to demonstrate the application of the methods include a miniature rectangular thin-beam specimen creep test under three-point bending and a miniature two-material tensile specimen creep test subjected to a steady load. Mathematical expressions for the deflection and creep strain rate of the two specimens are presented for the Kachanov-Rabotnov creep damage model. On this basis, an inverse procedure was developed which has potential applications for deriving full-life creep damage constitutive properties from a very small volume of material, in particular for various microstructural constitutive regions, e.g. within the heat-affected zones of power plant pipe weldments. Further work on validation and improvement of the method is addressed.
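
A hedged sketch of the Kachanov-Rabotnov creep damage model named above, integrated for a simple uniaxial constant-stress case; the material constants are illustrative placeholders, not the paper's fitted values, and the paper's beam and two-material solutions are not reproduced here.

```python
# Kachanov-Rabotnov sketch: strain rate A*sigma^n/(1-omega)^n, damage rate
# M*sigma^chi/(1-omega)^phi, integrated with a simple explicit time-stepping loop.
A, n = 1.0e-20, 8.0              # Norton creep constants (illustrative)
M, chi, phi = 1.0e-19, 7.5, 8.5  # damage constants (illustrative)
sigma = 100.0                    # applied stress (illustrative units)
dt = 0.01                        # time step
eps, omega, t = 0.0, 0.0, 0.0

while omega < 0.99:              # treat omega -> 1 as rupture
    eps += dt * A * sigma**n / (1.0 - omega)**n        # creep strain increment
    omega += dt * M * sigma**chi / (1.0 - omega)**phi  # damage increment
    t += dt

print(f"time to rupture ~ {t:.0f}, creep strain at rupture ~ {eps:.3f}")
```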

Keywords: creep damage property, miniature specimen, inverse approach, finite element modeling

Procedia PDF Downloads 231
14167 Slosh Investigations on a Spacecraft Propellant Tank for Control Stability Studies

Authors: Sarath Chandran Nair S, Srinivas Kodati, Vasudevan R, Asraff A. K

Abstract:

Spacecraft generally employ liquid propulsion for their attitude and orbital maneuvers or for raising them from geo-transfer orbit to geosynchronous orbit. Liquid propulsion systems use either mono-propellants or bi-propellants for generating thrust. These propellants are generally stored in either spherical tanks or cylindrical tanks with spherical end domes. The propellant tanks are provided with a propellant acquisition system/propellant management device, along with vanes and their conical mounting structure, to ensure propellant availability at the outlet for thrust generation even under a low/zero-gravity environment. Slosh refers to the free-surface oscillations in partially filled containers under external disturbances. In a spacecraft, these can be due to control forces and to varying acceleration. Knowledge of slosh and of the effect of internals on it is essential for understanding its stability through control stability studies. It is mathematically represented by a pendulum-mass model, which requires parameters such as slosh frequency, damping, slosh mass and its location, etc. This paper enumerates various numerical and experimental methods used for evaluating the slosh parameters required for representing slosh. Numerical methods such as finite element methods based on linear velocity potential theory, and computational fluid dynamics based on the Reynolds-averaged Navier-Stokes equations, are used for the detailed evaluation of slosh behavior in one of the spacecraft propellant tanks used in an Indian space mission. Experimental studies carried out on a scaled-down model are also discussed. Slosh parameters evaluated by the different methods matched very well, and their dispersion bands were finalized based on the experimental studies. It is observed that the presence of internals such as propellant management devices, including the conical support structure, alters the slosh parameters. These internals also offer an order of magnitude higher damping compared to viscous/smooth-wall damping, which is an advantage for slosh stability. These slosh parameters are provided for establishing slosh margins through control stability studies and for finalizing the spacecraft control system design.
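
As a hedged illustration of the pendulum-model parameters mentioned above, the classical first-mode lateral slosh frequency for an upright cylindrical tank with smooth walls and no internals can be written down directly; the tank dimensions below are invented, and the internals discussed in the abstract would shift both frequency and damping.

```python
# First lateral slosh mode of a partially filled upright cylindrical tank
# (classical smooth-wall result; internals such as PMDs alter frequency and damping).
import math

g = 9.81          # axial acceleration, m/s^2
R = 0.6           # tank radius, m (illustrative)
h = 0.8           # liquid fill height, m (illustrative)

xi_11 = 1.8412    # first root of the derivative of the Bessel function J1
omega_sq = (xi_11 * g / R) * math.tanh(xi_11 * h / R)
f1 = math.sqrt(omega_sq) / (2 * math.pi)
print(f"first slosh mode: {f1:.2f} Hz")
```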

Keywords: control stability, propellant tanks, slosh, spacecraft

Procedia PDF Downloads 245
14166 Exploring Multi-Feature Based Action Recognition Using Multi-Dimensional Dynamic Time Warping

Authors: Guoliang Lu, Changhou Lu, Xueyong Li

Abstract:

In action recognition, previous studies have demonstrated the effectiveness of using multiple features to improve recognition performance. We focus on two practical issues: i) most studies use a direct way of concatenating/accumulating multiple features to evaluate the similarity between two actions; this can be too crude, since each kind of feature can have different dimensions, quantities, etc.; ii) in many studies, the employed classification methods lack a flexible and effective mechanism for adding new feature(s) to the classification. In this paper, we explore a unified scheme based on the recently proposed multi-dimensional dynamic time warping (MD-DTW). Experiments demonstrated the scheme's effectiveness in combining multiple features and its flexibility in adding new feature(s) to increase recognition performance. In addition, the explored scheme also provides an open architecture for using new, advanced classification methods in the future to enhance action recognition.
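
A compact sketch of the multi-dimensional DTW idea the scheme builds on: each frame is a vector that concatenates several features, and the frame-to-frame cost is computed over all dimensions at once; the sequences here are random stand-ins for real action features.

```python
# Minimal multi-dimensional DTW (MD-DTW) sketch: align two sequences whose frames are
# multi-feature vectors, using a Euclidean frame cost over all feature dimensions.
import numpy as np

def md_dtw(a, b):
    """a, b: arrays of shape (time, features). Returns the accumulated alignment cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])       # distance over all dimensions
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(1)
template = rng.normal(size=(40, 6))                         # 40 frames, 6 concatenated features
query = template[::2] + 0.05 * rng.normal(size=(20, 6))     # temporally warped, noisy copy
other = rng.normal(size=(30, 6))
print(md_dtw(template, query), md_dtw(template, other))     # smaller cost => better match
```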

Keywords: action recognition, multi features, dynamic time warping, feature combination

Procedia PDF Downloads 437
14165 Impact of Climate Change on Sea Level Rise along the Coastline of Mumbai City, India

Authors: Chakraborty Sudipta, A. R. Kambekar, Sarma Arnab

Abstract:

Sea-level rise is one of the most important impacts of anthropogenically induced climate change, resulting from global warming and the melting of ice at the Arctic and Antarctic; investigations by various researchers, both on the Indian coast and elsewhere, during the last decade are reviewed in this paper. The paper aims to assess the consistency of the different suggested methods for predicting near-accurate future sea level rise along the coast of Mumbai. Case studies on the east coast, the southern tip, and the west and south-west coasts of India have been reviewed. The Coastal Vulnerability Index of several important international locations has been compared and found to match Intergovernmental Panel on Climate Change forecasts. The application of Geographic Information System mapping and remote sensing technology, using both Multispectral Scanner and Thematic Mapper data from Landsat classified through the Iterative Self-Organizing Data Analysis Technique to arrive at high, moderate and low Coastal Vulnerability Index values for various important coastal cities, has been observed. Instead of a purely data-driven, hindcast-based forecast for significant wave height, accounting for the additional impact of sea level rise has been suggested. The efficacy and limitations of numerical methods vis-à-vis Artificial Neural Networks have been assessed, and the importance of the root mean square error of numerical results is mentioned. Among the various computational methods compared, forecast results obtained from MIKE 21 have been opined to be more reliable than those of the Delft 3D model.

Keywords: climate change, Coastal Vulnerability Index, global warming, sea level rise

Procedia PDF Downloads 132
14164 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling. They describe crop growth in interaction with the environment as dynamical systems. But the calibration process of the dynamical system is very difficult, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data, final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process, but it has some strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, the root mean square error of prediction (RMSEP) and the mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method to calibrate the mechanistic model from easily accessible datasets offers several side perspectives. The mechanistic model can potentially help to underline the stresses suffered by the crop or to identify the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
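
As a hedged sketch of the data-driven side of the comparison, the snippet below runs a Random Forest with 5-fold cross-validation and reports RMSEP/MAEP-style errors; the yield and weather data are synthetic stand-ins for the USDA county records, and the MAEP expression (error as a percentage of mean yield) is one common convention, assumed here.

```python
# Data-driven yield prediction sketch: Random Forest with 5-fold CV, scored with
# RMSEP and MAEP as in the paper. The 720 "records" below are simulated, not USDA data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 720
weather = rng.normal(size=(n, 12))                      # e.g. monthly climate summaries
yield_t = 10 + 0.8 * weather[:, 3] - 0.3 * weather[:, 7] ** 2 + rng.normal(0, 0.5, n)

model = RandomForestRegressor(n_estimators=300, random_state=0)
pred = cross_val_predict(model, weather, yield_t, cv=5)

rmsep = np.sqrt(np.mean((pred - yield_t) ** 2))
maep = np.mean(np.abs(pred - yield_t)) / np.mean(yield_t) * 100   # % of mean yield (assumed convention)
print(f"RMSEP = {rmsep:.2f}, MAEP = {maep:.2f} %")
```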

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 231
14163 Best Option for Countercyclical Capital Buffer Implementation: Scenarios for Baltic States

Authors: Ģirts Brasliņš, Ilja Arefjevs, Nadežda Tarakanova

Abstract:

The objective of the countercyclical capital buffer is to encourage banks to build up buffers in good times that can be drawn down in bad times. The aim of the report is to assess such decisions by banks as derived from three approaches: the aggregate credit-to-GDP ratio, credit growth, and banking sector profits. The approaches are implemented for Estonia, Latvia and Lithuania for the period 2000-2012. The report compares the three approaches and analyses their relevance to the Baltic states by testing the correlation between growth in the studied variables and growth of the corresponding gaps. Methods used in the empirical part of the report include econometric analysis as well as economic analysis, development indicators, and relative and absolute indicators. The research outcome is a cross-Baltic comparison of the alternative approaches for banks to establish or release a countercyclical capital buffer and their implications for each Baltic country.
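
As a hedged sketch of the first approach (the aggregate credit-to-GDP gap), the snippet below computes the gap with an HP filter and maps it to a buffer rate using the Basel III 2-10 percentage-point corridor; the series is simulated, and for brevity the ordinary two-sided filter is used rather than the one-sided, lambda = 400,000 variant recommended by the BCBS.

```python
# Credit-to-GDP gap sketch for the countercyclical capital buffer (Basel III style).
# Simulated quarterly data; a simplified two-sided HP filter stands in for the
# one-sided filter used in practice.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(0)
quarters = 52                                            # roughly 2000-2012, quarterly
ratio = 60 + np.cumsum(rng.normal(0.5, 1.5, quarters))   # credit-to-GDP ratio, %

gap, trend = hpfilter(ratio, lamb=400_000)               # gap = ratio minus its HP trend

# Basel III guidance: buffer is 0 below a 2 pp gap, 2.5 % above 10 pp, linear in between.
buffer = np.clip((gap - 2.0) / (10.0 - 2.0), 0.0, 1.0) * 2.5
print(f"latest gap: {gap[-1]:.1f} pp -> buffer guide: {buffer[-1]:.2f} %")
```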

Keywords: basel III, countercyclical capital buffer, banks, credit growth, baltic states

Procedia PDF Downloads 396
14162 A Guide for Using Viscoelasticity in ANSYS

Authors: A. Fettahoglu

Abstract:

The theory of viscoelasticity is used by many researchers to represent the behavior of materials such as pavements on roads or bridges. Several studies have used analytical methods and rheology to predict the material behavior of simple models. Today, more complex engineering structures are analyzed using the Finite Element Method, in which the material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures with irregular geometries and domains can be analyzed by means of the Finite Element Method and three-dimensional viscoelastic equations. In the scope of this study, the rheological models embedded in ANSYS, namely the generalized Maxwell model and the Prony series, which are two methods used by ANSYS to represent viscoelastic material behavior, are presented explicitly. Afterwards, a guide is presented to ease the use of the viscoelasticity tools in ANSYS.
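
For reference, the Prony-series form of the relaxation modulus that underlies the generalized Maxwell model can be evaluated directly; the sketch below uses invented moduli and relaxation times rather than a fitted material curve, and it illustrates the series itself, not the ANSYS input format.

```python
# Prony series / generalized Maxwell relaxation modulus:
#   G(t) = G_inf + sum_i G_i * exp(-t / tau_i)
# The pairs (G_i, tau_i) and the long-term modulus below are illustrative only.
import numpy as np

G_inf = 2.0e8                                             # long-term shear modulus, Pa
prony = [(5.0e8, 0.1), (3.0e8, 10.0), (1.5e8, 1000.0)]    # (G_i in Pa, tau_i in s)

def relaxation_modulus(t):
    return G_inf + sum(Gi * np.exp(-t / taui) for Gi, taui in prony)

for t in (0.01, 1.0, 100.0, 1.0e5):
    print(f"G({t:g} s) = {relaxation_modulus(t):.3e} Pa")
```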

Keywords: ANSYS, generalized Maxwell model, finite element method, Prony series, viscoelasticity, viscoelastic material curve fitting

Procedia PDF Downloads 604
14161 Research and Application of the Three-Dimensional Visualization Geological Modeling of Mine

Authors: Bin Wang, Yong Xu, Honggang Qu, Rongmei Liu, Zhenji Gao

Abstract:

Today's mining industry is advancing gradually in a digital and visual direction. Three-dimensional visual geological modeling of a mine is the digital characterization of a mineral deposit and is one of the key technologies of the digital mine. Three-dimensional geological modeling is a technology that combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis and graphic visualization in a three-dimensional computer environment, and it is used in geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac, the difference in weighting between the inverse distance (distance power inverse ratio) method and ordinary kriging is studied, and the ore body volume and reserves are simulated and calculated using these two methods. Compared with the actual mine reserves, the results are relatively accurate, providing a scientific basis for mine resource assessment, reserve calculation, mining design and so on.
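
As a hedged sketch of the inverse distance weighting estimator compared against ordinary kriging in the paper, the snippet below estimates a grade at an unsampled block from nearby drillhole composites; the coordinates and grades are invented, and kriging itself would additionally require a fitted variogram model, which is not shown here.

```python
# Inverse distance weighting (power 2) grade estimate at one block centroid.
# Sample coordinates and grades are illustrative, not the iron mine's drillhole data.
import numpy as np

samples = np.array([[10.0, 20.0, -50.0],     # x, y, z of drillhole composites
                    [14.0, 25.0, -55.0],
                    [ 8.0, 18.0, -60.0],
                    [20.0, 30.0, -52.0]])
grades = np.array([32.5, 35.1, 30.8, 36.4])  # Fe %, illustrative
block = np.array([12.0, 22.0, -54.0])        # block centroid to be estimated

d = np.linalg.norm(samples - block, axis=1)
w = 1.0 / d**2                               # inverse distance squared weights
w /= w.sum()
print(f"IDW estimate: {w @ grades:.2f} % Fe")
```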

Keywords: three-dimensional geological modeling, geological database, geostatistics, block model

Procedia PDF Downloads 70
14160 Introduction of Microbial Symbiosis in Genus of Tridacna and Kiwaidae with Insights into Aquaculture

Authors: Jincao Guo

Abstract:

Aquaculture plays a significant role in the diet of people in many regions. However, problems such as bioaccumulation have arisen with the rapidly growing industry due to a lack of control in the feeding process, which brings uncertainty to the quality of the products. The paper tackles the problem by introducing the symbiosis of the giant clam (Tridacna) with photosynthetic algae and of the yeti crab (Kiwaidae) with chemosynthetic bacteria in molecular and developmental detail. By combining the knowledge gained from the two models and from past studies, innovative ideas such as using mass selection methods to domesticate and farm these symbiotic species, as well as improvements to current farming methods, such as introducing algae feeding, are discussed. Further studies are needed, but the experiments are worth conducting since they would increase the variety of choices for consumers and could potentially improve the quality and efficiency of aquaculture.

Keywords: the giant clam Tridacna, yeti crab Kiwaidae, autotroph microbes, microbial symbiosis, aquaculture, bivalves, crustaceans, mollusk, photosynthesis, chemosynthesis

Procedia PDF Downloads 75
14159 Applications of AI, Machine Learning, and Deep Learning in Cyber Security

Authors: Hailyie Tekleselase

Abstract:

Deep learning is increasingly used as a building block of security systems. However, neural networks are hard to interpret and typically opaque to the practitioner. This paper presents a detailed survey of computing methods in cyber security and analyzes the prospects of enhancing cyber security capabilities by accelerating the intelligence of security systems. There are many AI-based applications used in industrial scenarios such as the Internet of Things (IoT), smart grids, and edge computing. Machine learning technologies require a training process, which introduces protection problems in the training data and algorithms. We present machine learning techniques currently applied to the detection of intrusion, malware, and spam. Our conclusions are based on an extensive review of the literature as well as on experiments performed on real enterprise systems and network traffic. We conclude that these problems can be solved successfully only when methods of artificial intelligence are used alongside human experts or operators.

Keywords: artificial intelligence, machine learning, deep learning, cyber security, big data

Procedia PDF Downloads 126
14158 Aerodynamic Design of a UAV for Agricultural Spraying Using a Genetic Algorithm Optimization Method

Authors: Saul A. Torres Z., Eduardo Liceaga C., Alfredo Arias M.

Abstract:

Agriculture is one of the world's main sources of economic activity and of meeting global needs, so crop care is extremely important for owners and workers; one of the major causes of product loss is pest infestation by different types of organisms. We seek to develop a UAV for agricultural spraying at a maximum altitude of 5000 meters above sea level, with a payload of 100 liters of fumigant. The aerodynamic design of the aircraft is being developed using computational tools such as the Athena Vortex Lattice software, MATLAB, ANSYS FLUENT and the XFoil package, among others. Structured programming, exhaustive analysis of optimization methods and search methods are also being used. The results have a very low margin of error, and the multi-objective problems can be helpful for future developments. The program has 10 functions developed in MATLAB; these functions are related to each other to enable the development of the design, and all of them are controlled by the principal code "Master.m".
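
A minimal sketch of the genetic-algorithm optimization loop the design workflow relies on; the two-variable "drag" objective and the GA settings here are placeholders for the actual aerodynamic objectives evaluated through the tools listed above, and the sketch is written in Python rather than the MATLAB functions of the actual program.

```python
# Toy genetic algorithm: minimize a placeholder aerodynamic objective over two
# design variables (e.g. aspect ratio, taper ratio). Real objectives would come from
# the aerodynamic analysis tools named in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                       # placeholder "drag" surrogate, minimum near (8, 0.45)
    return (x[0] - 8.0) ** 2 + 10 * (x[1] - 0.45) ** 2

bounds = np.array([[4.0, 12.0], [0.2, 1.0]])
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))

for generation in range(100):
    fitness = np.array([objective(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:20]]                       # selection: keep best half
    idx = rng.integers(0, 20, size=(40, 2))
    alpha = rng.random((40, 1))
    children = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]  # blend crossover
    children += rng.normal(0, 0.05, children.shape)               # mutation
    pop = np.clip(children, bounds[:, 0], bounds[:, 1])

best = pop[np.argmin([objective(ind) for ind in pop])]
print("best design variables:", best.round(3))
```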

Keywords: aerodynamic design, optimization, genetic algorithm, multi-objective problem, stability, vortex

Procedia PDF Downloads 532
14157 Evaluation of Different Liquid Scintillation Counting Methods for 222Rn Determination in Waters

Authors: Jovana Nikolov, Natasa Todorovic, Ivana Stojkovic

Abstract:

Monitoring of 222Rn in drinking or surface waters, as well as in groundwater, has been performed in connection with geological, hydrogeological and hydrological surveys and health hazard studies. Liquid scintillation counting (LSC) is often the preferred analytical method for 222Rn measurements in waters because it allows automatic analysis of multiple samples. The LSC method involves mixing water samples with an organic scintillation cocktail, which drives radon diffusion from the aqueous into the organic phase, for which it has a much greater affinity, thereby eliminating the possibility of radon emanation. Two direct LSC methods that assume different sample compositions are presented, optimized and evaluated in this study. The one-phase method assumes direct mixing of a 10 ml sample with 10 ml of an emulsifying cocktail (the Ultima Gold AB scintillation cocktail is used). The two-phase method involves the use of water-immiscible cocktails (in this study, High Efficiency Mineral Oil Scintillator, Opti-Fluor O and Ultima Gold F are used). Calibration samples were prepared with an aqueous 226Ra standard in 20 ml glass vials and counted on the ultra-low-background spectrometer Quantulus 1220TM, equipped with a PSA (Pulse Shape Analysis) circuit which discriminates alpha/beta spectra. Since the calibration procedure is carried out with a 226Ra standard, which has both alpha and beta progenies, the PSA discriminator is of vital importance in order to provide reliable and precise spectrum separation. Consequently, the calibration procedure was carried out by investigating the influence of the PSA discriminator level on the 222Rn detection efficiency, using the 226Ra calibration standard over a wide range of activity concentrations. Evaluation of the presented methods was based on the obtained detection efficiencies and the achieved minimum detectable activity (MDA). The comparison of the presented methods, their accuracy and precision, as well as the performance of the different scintillation cocktails, was based on the results of measurements of 226Ra-spiked water samples with known activity and of environmental samples.
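
For context on the MDA figure of merit used in the evaluation, the Currie formula is the usual way to compute it from a blank measurement; the background, counting time, efficiency and sample volume below are invented, and the paper's own MDA values should be taken from its text.

```python
# Currie minimum detectable activity (MDA) for an LSC radon measurement.
# Background counts, counting time, efficiency and sample volume are illustrative.
import math

B = 120            # background counts accumulated in the counting window
t = 3600.0         # counting time, s
eff = 0.9          # assumed detection efficiency for 222Rn in the window
V = 0.010          # aqueous sample volume, L

L_D = 2.71 + 4.65 * math.sqrt(B)            # Currie detection limit, counts
mda = L_D / (eff * t * V)                   # Bq/L
print(f"MDA = {mda:.2f} Bq/L")
```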

Keywords: 222Rn in water, Quantulus1220TM, scintillation cocktail, PSA parameter

Procedia PDF Downloads 201
14156 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge

Authors: K. Eryilmaz, G. Mercanoglu

Abstract:

Aim: Purification of the final product, which is the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. Among these, the two most commonly used are C18 solid phase extraction (SPE) and elution from a weak cation exchanger cartridge. The SPE C18 solid phase extraction method yields a high-purity final product, while elution from the weak cation exchanger cartridge is pH-dependent and ineffective in removing colloidal impurities. The aim of this work is to develop an additional purification method for lanthanide-labeled peptide compounds for cases in which the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of a pH problem or colloidal impurity. Material and Methods: For colloidal impurity formation, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using the EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped on an SPE-C18 cartridge. The cartridge was washed with 10 ml of saline to remove impurities to the waste vial. The product trapped in the cartridge was eluted with 2 ml of 50% ethanol and collected into the final product vial by passing through a 0.22 μm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analysed by the HPLC method (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: The UV and radioactivity detector results in the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method. Conclusion: The improved purification method can be used as an additional method to remove impurities that may result from lanthanide-peptide syntheses in which the weak cation exchange purification technique is used as the last step. The purification of the final product and the GMP compliance (the final aseptic filtration and the sterile disposable system components) are two major advantages.

Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis

Procedia PDF Downloads 162
14155 Pose-Dependency of Machine Tool Structures: Appearance, Consequences, and Challenges for Lightweight Large-Scale Machines

Authors: S. Apprich, F. Wulle, A. Lechler, A. Pott, A. Verl

Abstract:

Large-scale machine tools for the manufacturing of large work pieces, e.g. blades, casings or gears for wind turbines, feature pose-dependent dynamic behavior. Small structural damping coefficients lead to long decay times for structural vibrations that have negative impacts on the production process. Typically, these vibrations are handled by increasing the stiffness of the structure by adding mass. That is counterproductive to the needs of sustainable manufacturing, as it leads to higher resource consumption both in material and in energy. Recent research activities have achieved higher resource efficiency through radical mass reduction that relies on control-integrated active vibration avoidance and damping methods. These control methods depend on information describing the dynamic behavior of the controlled machine tools in order to tune the avoidance or reduction method parameters according to the current state of the machine. This paper presents the appearance, consequences and challenges of the pose-dependent dynamic behavior of lightweight large-scale machine tool structures in production. The paper starts with a theoretical introduction to the challenges of lightweight machine tool structures resulting from reduced stiffness. The statement of the pose-dependent dynamic behavior is corroborated by the results of an experimental modal analysis of a lightweight test structure. Afterwards, the consequences of the pose-dependent dynamic behavior of lightweight machine tool structures for the use of active control and vibration reduction methods are explained. Based on the state of the art on pose-dependent dynamic machine tool models and the modal investigation of an FE model of the lightweight test structure, the criteria for a pose-dependent model for use in vibration reduction are derived. The outlook of the paper is a description of the approach for a general pose-dependent model of the dynamic behavior of large lightweight machine tools that provides the necessary input to the aforementioned vibration avoidance and reduction methods to properly tackle machine vibrations.

Keywords: dynamic behavior, lightweight, machine tool, pose-dependency

Procedia PDF Downloads 459
14154 Development of a Geomechanical Risk Assessment Model for Underground Openings

Authors: Ali Mortazavi

Abstract:

The main objective of this research project is to delve into the multitude of geomechanical risks associated with the various mining methods employed within the underground mining industry. Controlling the geotechnical design parameters and operational factors affecting the selection of suitable mining techniques for a given underground mining condition will be considered from a risk assessment point of view. Important geomechanical challenges will be investigated as appropriate and relevant to the commonly used underground mining methods. Given the complicated nature of the in-situ rock mass, the complicated boundary conditions, and the operational complexities associated with various underground mining methods, the selection of a safe and economic mining operation is of paramount significance. Rock failure at varying scales within underground mining openings is always a threat to mining operations and causes human and capital losses worldwide. Geotechnical design is a major design component of all underground mines and basically dominates the safety of an underground mine. With regard to the uncertainties that exist in rock characterization prior to mine development, there are always risks associated with inappropriate design as a function of the mining conditions and the selected mining method. Uncertainty often results from the inherent variability of rock masses, which in turn is a function of both the geological materials and the in-situ rock mass conditions. The focus of this research is on developing a methodology which enables a geomechanical risk assessment of given underground mining conditions. The outcome of this research is a geotechnical risk analysis algorithm, which can be used as an aid in selecting the appropriate mining method as a function of mine design parameters (e.g., in-situ rock properties, design method, governing boundary conditions such as in-situ stress and groundwater, etc.).

Keywords: geomechanical risk assessment, rock mechanics, underground mining, rock engineering

Procedia PDF Downloads 145
14153 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models

Authors: Manisha Mukherjee, Diptarka Saha

Abstract:

Reliable forecasts of univariate time series data are often necessary in several contexts. ARIMA models are quite popular among practitioners in this regard. Hence, choosing correct parameter values for ARIMA is a challenging yet imperative task. Thus, a stepwise algorithm is introduced to provide automatic and robust estimates of the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. This process focuses on improving the overall quality of the estimates, and it alleviates the problems induced by the unidimensional nature of the methods that are currently used, such as auto.arima. The fast and automated search of the parameter space also ensures reliable estimates of the parameters that possess several desirable qualities, consequently resulting in higher test accuracy, especially in the case of noisy data. After vigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods, it also completely obviates the need for human intervention due to its automated nature.
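
As a hedged sketch of the kind of search the algorithm automates (not the authors' stepwise procedure itself), the snippet below scores a small grid of seasonal ARIMA orders by AIC with statsmodels; the series is simulated.

```python
# Brute-force illustration of choosing (p, d, q)(P, D, Q)s for a seasonal ARIMA by AIC.
# The stepwise algorithm in the paper searches this space more cleverly; data are simulated.
import itertools
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
t = np.arange(120)
y = 10 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 120)  # monthly-like series

best = None
for p, q, P, Q in itertools.product(range(2), range(2), range(2), range(2)):
    try:
        fit = SARIMAX(y, order=(p, 1, q), seasonal_order=(P, 1, Q, 12)).fit(disp=False)
        if best is None or fit.aic < best[0]:
            best = (fit.aic, (p, 1, q), (P, 1, Q, 12))
    except Exception:
        continue            # some orders may fail to converge; skip them

print("best AIC:", round(best[0], 1), "order:", best[1], "seasonal order:", best[2])
```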

Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function

Procedia PDF Downloads 165
14152 Integrating RAG with Prompt Engineering for Dynamic Log Parsing and Anomaly Detections

Authors: Liu Lin Xin

Abstract:

With the increasing complexity of systems, log parsing and anomaly detection have become crucial for maintaining system stability. However, traditional methods often struggle with adaptability and accuracy, especially when dealing with rapidly evolving log content and unfamiliar domains. To address these challenges, this paper proposes an approach that integrates Retrieval Augmented Generation (RAG) technology with prompt engineering for large language models, applied specifically in LogPrompt. This approach enables dynamic log parsing and intelligent anomaly detection by combining real-time information retrieval with prompt optimization. The proposed method significantly enhances the adaptability of log analysis and improves the interpretability of results. Experimental results on several public datasets demonstrate the method's superior performance, particularly in scenarios lacking training data, where it significantly outperforms traditional methods. This paper introduces a novel technical pathway for log parsing and anomaly detection, showcasing substantial theoretical value and practical potential.
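
A heavily simplified sketch of the retrieval step in such a pipeline: similar historical log templates are retrieved with TF-IDF similarity and folded into the prompt handed to the language model. The call_llm function is a hypothetical stand-in, not an API of LogPrompt or of any specific LLM provider, and the example logs are invented.

```python
# Retrieval-augmented prompt construction for log analysis (simplified illustration).
# call_llm() is a hypothetical placeholder for whatever LLM endpoint the system uses.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

knowledge_base = [                      # invented historical log templates with known labels
    "Connection to db-<*> timed out after <*> ms -- anomaly: database outage",
    "User <*> logged in from <*> -- normal",
    "Disk usage on /dev/<*> at <*>% -- anomaly when above 90%",
]
new_log = "Connection to db-prod-3 timed out after 30000 ms"

vec = TfidfVectorizer().fit(knowledge_base + [new_log])
sims = cosine_similarity(vec.transform([new_log]), vec.transform(knowledge_base))[0]
retrieved = [knowledge_base[i] for i in sims.argsort()[::-1][:2]]   # top-2 similar templates

prompt = (
    "You are a log analysis assistant.\n"
    "Reference templates:\n" + "\n".join(retrieved) +
    f"\n\nParse the following log line and say whether it is anomalous:\n{new_log}"
)

def call_llm(prompt_text):              # placeholder: replace with a real LLM call
    return "<model response>"

print(call_llm(prompt))
```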

Keywords: log parsing, anomaly detection, RAG, prompt engineering, LLMs

Procedia PDF Downloads 34
14151 Improved Pitch Detection Using Fourier Approximation Method

Authors: Balachandra Kumaraswamy, P. G. Poonacha

Abstract:

Automatic music information retrieval has been one of the challenging topics of research for a few decades now, with several interesting approaches reported in the literature. In this paper, we have developed a pitch extraction method based on a finite Fourier series approximation to the given window of samples, estimating pitch as the fundamental period of that approximation. The method uses an analysis of the strength of the harmonics present in the signal to reduce octave as well as harmonic errors. The performance of our method is compared with three of the best known methods for pitch extraction, namely the Yin, Windowed Special Normalization of the Auto-Correlation Function and Harmonic Product Spectrum methods. Our study with artificially created signals as well as music files shows that the Fourier approximation method gives a much better estimate of pitch with fewer octave and harmonic errors.
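
For comparison, one of the baseline methods mentioned above, the Harmonic Product Spectrum, can be sketched in a few lines; the test signal is a synthetic harmonic tone, and this is not the Fourier-approximation method proposed in the paper.

```python
# Harmonic Product Spectrum (HPS) pitch estimate -- one of the baselines cited above.
# The input is a synthetic 220 Hz tone with a few harmonics.
import numpy as np

fs = 16000
t = np.arange(0, 0.5, 1 / fs)
signal = sum((1.0 / k) * np.sin(2 * np.pi * 220 * k * t) for k in range(1, 5))

spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
hps = spectrum.copy()
for k in (2, 3, 4):                         # multiply in down-sampled copies of the spectrum
    hps[: len(spectrum[::k])] *= spectrum[::k]

freqs = np.fft.rfftfreq(len(signal), 1 / fs)
print(f"estimated pitch: {freqs[np.argmax(hps)]:.1f} Hz")   # ~220 Hz
```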

Keywords: pitch, fourier series, yin, normalization of the auto- correlation function, harmonic product, mean square error

Procedia PDF Downloads 412
14150 A Comparative Study of Malware Detection Techniques Using Machine Learning Methods

Authors: Cristina Vatamanu, Doina Cosovan, Dragos Gavrilut, Henri Luchian

Abstract:

In the past few years, the amount of malicious software has increased exponentially and, therefore, machine learning algorithms have become instrumental in identifying clean and malware files through semi-automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time needed for the machine learning algorithm to do so. This paper presents a comparative study of different machine learning techniques such as linear classifiers, ensembles, decision trees and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, which is a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.

Keywords: ensembles, false positives, feature selection, one side class algorithm

Procedia PDF Downloads 292