Search results for: derivative and convoluted derivative methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15636

14166 Utilizing Temporal and Frequency Features in Fault Detection of Electric Motor Bearings with Advanced Methods

Authors: Mohammad Arabi

Abstract:

The development of advanced technologies in the field of signal processing and vibration analysis has enabled more accurate analysis and fault detection in electrical systems. This research investigates the application of temporal and frequency features in detecting faults in electric motor bearings, aiming to enhance fault detection accuracy and prevent unexpected failures. The use of methods such as deep learning algorithms and neural networks in this process can yield better results. The main objective of this research is to evaluate the efficiency and accuracy of methods based on temporal and frequency features in identifying faults in electric motor bearings to prevent sudden breakdowns and operational issues. Additionally, the feasibility of using techniques such as machine learning and optimization algorithms to improve the fault detection process is also considered. This research employed an experimental method and random sampling. Vibration signals were collected from electric motors under normal and faulty conditions. After standardizing the data, temporal and frequency features were extracted. These features were then analyzed using statistical methods such as analysis of variance (ANOVA) and t-tests, as well as machine learning algorithms like artificial neural networks and support vector machines (SVM). The results showed that using temporal and frequency features significantly improves the accuracy of fault detection in electric motor bearings. ANOVA indicated significant differences between normal and faulty signals. Additionally, t-tests confirmed statistically significant differences between the features extracted from normal and faulty signals. Machine learning algorithms such as neural networks and SVM also significantly increased detection accuracy, demonstrating high effectiveness in timely and accurate fault detection. 
This study demonstrates that using temporal and frequency features combined with machine learning algorithms can serve as an effective tool for detecting faults in electric motor bearings. This approach not only enhances fault detection accuracy but also simplifies and streamlines the detection process. However, challenges such as data standardization and the cost of implementing advanced monitoring systems must also be considered. Utilizing temporal and frequency features in fault detection of electric motor bearings, along with advanced machine learning methods, offers an effective solution for preventing failures and ensuring the operational health of electric motors. Given the promising results of this research, it is recommended that this technology be more widely adopted in industrial maintenance processes.
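As an illustration of the kind of temporal and frequency features the abstract describes, the sketch below computes RMS, kurtosis, and crest factor in the time domain and a dominant frequency via a direct DFT. The test signal and sampling rate are hypothetical, and the ANOVA/t-test stage and the SVM and neural-network classifiers used in the study are not reproduced here.

```python
import math

def temporal_features(x):
    """Simple time-domain descriptors of a vibration snippet."""
    n = len(x)
    mean = sum(x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)
    var = sum((v - mean) ** 2 for v in x) / n
    kurt = (sum((v - mean) ** 4 for v in x) / n) / (var ** 2) if var else 0.0
    return {"rms": rms, "kurtosis": kurt, "crest": max(abs(v) for v in x) / rms}

def dominant_frequency(x, fs):
    """Frequency (Hz) of the strongest DFT bin, via a direct O(n^2) transform."""
    n = len(x)
    best_k, best_p = 0, -1.0
    for k in range(1, n // 2):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        if p > best_p:
            best_k, best_p = k, p
    return best_k * fs / n

# hypothetical vibration snippet: a 50 Hz tone sampled at 1 kHz
fs = 1000
signal = [math.sin(2 * math.pi * 50 * t / fs) for t in range(200)]
feats = temporal_features(signal)
dom = dominant_frequency(signal, fs)
```

In a real pipeline these feature vectors, extracted from normal and faulty recordings, would feed the statistical tests and classifiers the abstract mentions.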

Keywords: electric motor, fault detection, frequency features, temporal features

Procedia PDF Downloads 48
14165 Virtual and Augmented Reality Based Heritage Gamification: Basilica of Smyrna in Turkey

Authors: Tugba Saricaoglu

Abstract:

This study examines the potential representation and interpretation of the Basilica of Smyrna through gamification. Representation can be defined as a key that converts meaning, enabling interpretation that varies with the person who perceives it. Representation of cultural heritage is both a hypothetical and a factual approach to its sustainable conservation. Today, both site interpreters and the public bring varying perspectives to cultural heritage, shaped by their different demographic, social, and cultural backgrounds. Gamification, in turn, applies methods from video games to improve user engagement with non-game platforms, contexts, and issues. Hence, cultural heritage and video games were chosen for joint analysis. There are several ways of representing cultural heritage for conservation, including digital, physical, and virtual methods; virtual reality (VR) and augmented reality (AR) are two contemporary digital methods of heritage conservation. In this study, the 3D-documented ruins of the Basilica are presented with VR- and AR-based technology as a theoretical gamification sample. The paper focuses on two sub-topics: first, the evaluation of video-game platforms applied to cultural heritage sites, and second, the potential of cultural heritage to be represented on video-game platforms. The former covers the analysis of one or more cases with regard to the concepts and representational aspects of cultural heritage; the latter investigates cultural heritage sites that carry such potential and their sustainable conservation. Consequently, after gathering information from both cultural heritage and video-game platforms, a perspective on interpreting the representation of cultural heritage is provided, sampled on the Basilica of Smyrna using VR- and AR-based technologies.

Keywords: Basilica of Smyrna, cultural heritage, digital heritage, gamification

Procedia PDF Downloads 466
14164 A Survey of Feature Selection and Feature Extraction Techniques in Machine Learning

Authors: Samina Khalid, Shamila Nasreen

Abstract:

Dimensionality reduction as a preprocessing step for machine learning is effective in removing irrelevant and redundant data, increasing learning accuracy, and improving the comprehensibility of results. However, the recent growth in data dimensionality poses a severe challenge to many existing feature selection and feature extraction methods with respect to both efficiency and effectiveness. Dimensionality reduction is an important area of machine learning and pattern recognition, and many approaches have been proposed. In this paper, several widely used feature selection and feature extraction techniques are analyzed to assess how effectively they can achieve high performance of learning algorithms and ultimately improve the predictive accuracy of classifiers. A brief analysis of dimensionality reduction techniques is presented, with the aim of investigating the strengths and weaknesses of the most widely used methods.
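Among the extraction techniques the survey covers, PCA is the simplest to sketch: project the data onto the directions of maximal variance. The toy implementation below finds the first principal component by power iteration on the sample covariance matrix; the data points are invented for illustration.

```python
def first_principal_component(X, iters=200):
    """Leading eigenvector of the sample covariance matrix, by power iteration."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]   # center the data
    C = [[sum(Xc[k][i] * Xc[k][j] for k in range(n)) / (n - 1)
          for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# toy 2-D data lying roughly along the line y = x
data = [(1.0, 1.0), (2.0, 2.1), (3.0, 2.9), (4.0, 4.0)]
pc1 = first_principal_component(data)
```

For this data the first component points roughly along (0.71, 0.71), the diagonal that carries nearly all the variance; projecting onto it reduces the two features to one.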

Keywords: age-related macular degeneration, feature selection, feature subset selection, feature extraction/transformation, FSAs, Relief, correlation-based method, PCA, ICA

Procedia PDF Downloads 497
14163 Radiochemical Purity of 68Ga-BCA-Peptides: Separation of All 68Ga Species with a Single iTLC Strip

Authors: Anton A. Larenkov, Alesya Ya Maruk

Abstract:

In the present study, a highly effective single-strip iTLC method for determining the radiochemical purity (RCP) of 68Ga-BCA-peptides was developed, with no double development, change of eluents, or other additional manipulation. The method uses iTLC-SG strips and the commonly used eluent TFAaq. (3-5 % (v/v)). It allows each of the key radiochemical forms of 68Ga (colloidal, bound, ionic) to be determined separately, with peak separation of no less than 4 σ: Rf = 0.0-0.1 for 68Ga-colloid; Rf = 0.5-0.6 for 68Ga-BCA-peptides; Rf = 0.9-1.0 for ionic 68Ga. The method is simple and fast: for a developing length of 75 mm, only 4-6 min is required (versus 18-20 min for the pharmacopoeial method). The method has been tested on various compounds (including 68Ga-DOTA-TOC, 68Ga-DOTA-TATE, and 68Ga-NODAGA-RGD2). Cross-validation for every specific form of 68Ga showed good correlation between the developed method and the control (pharmacopoeial) methods. The method can become a convenient and much more informative replacement for pharmacopoeial methods, including HPLC.

Keywords: DOTA-TATE, 68Ga, quality control, radiochemical purity, radiopharmaceuticals, TLC

Procedia PDF Downloads 290
14162 A Study on Selection Issues of an Integrated Service Provider Using Analytical Hierarchy Process

Authors: M. Pramila Devi, J. Praveena

Abstract:

In today’s industrial scenario, customer expectations and demand are reaching great heights. To satisfy customer requirements, users are increasingly turning to fourth-party logistics (4PL) service providers to manage their total supply chain operations. In the present research, the criteria for selecting integrated service providers were first identified, and an integrated model based on their inter-relationships was developed with the help of shippers, clarifying which factors should be considered, and how they interact, when selecting an integrated service provider. The Analytical Hierarchy Process (AHP) was then employed to derive priority weights for 4PL service provider selection. The derived priorities of the 4PL alternatives were critically analyzed and compared for effective selection. The use of the model indicates that the computed quantitative evaluation can be applied to improve the precision of the selection.
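The AHP step can be sketched as follows: a pairwise comparison matrix over the selection criteria is reduced to priority weights via its principal eigenvector (here by power iteration), with a consistency index as a sanity check. The three criteria and the judgment values below are hypothetical, not taken from the study.

```python
def ahp_weights(M, iters=100):
    """Priority weights and consistency index from a pairwise comparison matrix."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):                       # power iteration on M
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]                   # normalize to sum to 1
    # principal eigenvalue estimate and Saaty consistency index
    lam_max = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam_max - n) / (n - 1)
    return w, ci

# hypothetical 4PL criteria: cost vs. service quality vs. IT capability
M = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 3.0],
     [1.0 / 5.0, 1.0 / 3.0, 1.0]]
weights, ci = ahp_weights(M)
```

For judgments this consistent, the consistency index stays well under the usual 0.1 acceptance threshold; each 4PL alternative would then be scored per criterion and ranked by the weighted sum.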

Keywords: analytical hierarchy process, fourth party logistics, priority weight, criteria selection

Procedia PDF Downloads 432
14161 Determining Full Stage Creep Properties from Miniature Specimen Creep Test

Authors: W. Sun, W. Wen, J. Lu, A. A. Becker

Abstract:

In this work, methods for determining creep properties that can represent the full life until failure from miniature specimen creep tests, based on analytical solutions, are presented. Examples used to demonstrate the methods include a miniature rectangular thin-beam specimen creep test under three-point bending and a miniature two-material tensile specimen creep test subjected to a steady load. Mathematical expressions for the deflection and creep strain rate of the two specimens are presented for the Kachanov-Rabotnov creep damage model. On this basis, an inverse procedure was developed that has potential applications for deriving full-life creep damage constitutive properties from a very small volume of material, in particular for various microstructural constitutive regions, e.g. within heat-affected zones of power plant pipe weldments. Further work on validation and improvement of the method is addressed.
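For reference, a commonly used form of the Kachanov-Rabotnov equations couples the uniaxial creep strain rate to a scalar damage variable; this is the general textbook form, not the specimen-specific deflection expressions derived in the paper:

```latex
\dot{\varepsilon}^{c} = \frac{A\,\sigma^{n}}{(1-\omega)^{n}}, \qquad
\dot{\omega} = \frac{M\,\sigma^{\chi}}{(1-\omega)^{\phi}}, \qquad 0 \le \omega < 1,
```

where $A$, $n$, $M$, $\chi$, and $\phi$ are the material constants to be identified by the inverse procedure, and failure corresponds to $\omega \to 1$.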

Keywords: creep damage property, miniature specimen, inverse approach, finite element modeling

Procedia PDF Downloads 231
14160 Slosh Investigations on a Spacecraft Propellant Tank for Control Stability Studies

Authors: Sarath Chandran Nair S, Srinivas Kodati, Vasudevan R, Asraff A. K

Abstract:

Spacecraft generally employ liquid propulsion for attitude and orbital maneuvers or for raising themselves from geo-transfer orbit to geosynchronous orbit. Liquid propulsion systems use either mono-propellants or bi-propellants to generate thrust. These propellants are generally stored in spherical tanks or in cylindrical tanks with spherical end domes. The propellant tanks are provided with a propellant acquisition system/propellant management device, along with vanes and their conical mounting structure, to ensure propellant availability at the outlet for thrust generation even under a low- or zero-gravity environment. Slosh is the free-surface oscillation in partially filled containers under external disturbances; in a spacecraft, these disturbances can arise from control forces and from varying acceleration. Knowledge of slosh, and of the effect of tank internals on it, is essential for understanding its stability through control stability studies. Slosh is mathematically represented by a pendulum-mass model, which requires parameters such as the slosh frequency, damping, slosh mass, and its location. This paper enumerates various numerical and experimental methods used for evaluating the slosh parameters required to represent slosh. Numerical methods, such as finite element methods based on linear velocity potential theory and computational fluid dynamics based on the Reynolds-averaged Navier-Stokes equations, are used for a detailed evaluation of slosh behavior in one of the spacecraft propellant tanks used in an Indian space mission. Experimental studies carried out on a scaled-down model are also discussed. The slosh parameters evaluated by the different methods matched very well, and their dispersion bands were finalized based on the experimental studies. It is observed that the presence of internals such as propellant management devices, including the conical support structure, alters the slosh parameters. These internals also offer one order higher damping compared with viscous/smooth-wall damping, which is advantageous for slosh stability. The slosh parameters are provided for establishing slosh margins through control stability studies and for finalizing the spacecraft control system design.
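The pendulum-mass representation mentioned above needs the slosh frequency as an input. For an upright cylindrical tank, linear potential theory gives a closed-form estimate of the first lateral mode, sketched below; the tank dimensions are illustrative, not those of the mission tank, and the formula ignores internals, which the abstract notes shift the real parameters.

```python
import math

BESSEL_ROOT_1 = 1.8412  # first root of J1'(x): lowest lateral slosh mode

def slosh_frequency_hz(radius_m, fill_height_m, g=9.81):
    """First lateral slosh frequency of an upright cylindrical tank (linear theory)."""
    xi = BESSEL_ROOT_1
    omega_sq = (g * xi / radius_m) * math.tanh(xi * fill_height_m / radius_m)
    return math.sqrt(omega_sq) / (2.0 * math.pi)

# illustrative tank: 0.5 m radius, 1.0 m liquid fill height
f1 = slosh_frequency_hz(0.5, 1.0)
```

For deep fills the tanh factor saturates near 1, so the frequency depends mainly on the tank radius; under thrust, g would be replaced by the axial acceleration.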

Keywords: control stability, propellant tanks, slosh, spacecraft

Procedia PDF Downloads 245
14159 Exploring Multi-Feature Based Action Recognition Using Multi-Dimensional Dynamic Time Warping

Authors: Guoliang Lu, Changhou Lu, Xueyong Li

Abstract:

In action recognition, previous studies have demonstrated the effectiveness of using multiple features to improve recognition performance. We focus on two practical issues: (i) most studies evaluate the similarity between two actions by directly concatenating/accumulating multiple features, which can be too rigid since each kind of feature may differ in dimensionality, quantity, etc.; (ii) in many studies, the employed classification methods lack a flexible and effective mechanism for adding new features into the classification. In this paper, we explore a unified scheme based on the recently proposed multi-dimensional dynamic time warping (MD-DTW). Experiments demonstrated the scheme's effectiveness in combining multiple features and its flexibility in adding new features to increase recognition performance. In addition, the explored scheme provides an open architecture for adopting new, advanced classification methods in the future to enhance action recognition.
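A minimal sketch of the MD-DTW core: the only change from ordinary DTW is that the local cost is a distance over all feature dimensions at once, so adding a new feature simply extends the per-frame vectors. The trajectories below are toy data, not action-recognition features.

```python
import math

def md_dtw(seq_a, seq_b):
    """Multi-dimensional DTW distance between two sequences of feature vectors."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]   # accumulated-cost matrix
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(seq_a[i - 1], seq_b[j - 1])  # Euclidean over all dims
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# toy 2-D feature trajectories: same motion, different pacing
walk_a = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
walk_b = [(0.0, 0.0), (1.0, 0.5), (1.0, 0.5), (2.0, 1.0)]
d_same = md_dtw(walk_a, walk_a)
d_paced = md_dtw(walk_a, walk_b)
```

The warping absorbs the repeated frame in walk_b, so the two pacings match at zero cost, while a genuinely different trajectory yields a positive distance.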

Keywords: action recognition, multi features, dynamic time warping, feature combination

Procedia PDF Downloads 437
14158 Impact of Climate Change on Sea Level Rise along the Coastline of Mumbai City, India

Authors: Chakraborty Sudipta, A. R. Kambekar, Sarma Arnab

Abstract:

Sea-level rise is one of the most important impacts of anthropogenically induced climate change, resulting from global warming and the melting of ice at the Arctic and Antarctic. This paper reviews investigations carried out over the last decade by various researchers, both on the Indian coast and elsewhere, and aims to ascertain how consistently the different suggested methods predict near-accurate future sea-level rise along the coast of Mumbai. Case studies on the east coast, the southern tip, and the west and south-west coasts of India are reviewed. The Coastal Vulnerability Index of several important international locations is compared and found to match the forecasts of the Intergovernmental Panel on Climate Change. The reviewed work applies Geographic Information System mapping and remote sensing, with both Multi Spectral Scanner and Thematic Mapper data from Landsat classified through the Iterative Self-Organizing Data Analysis Technique to derive high, moderate, and low Coastal Vulnerability Index values for various important coastal cities. Instead of purely data-driven, hindcast-based forecasts of significant wave height, inclusion of the additional impact of sea-level rise is suggested. The efficacy and limitations of numerical methods relative to Artificial Neural Networks are assessed, and the importance of the root mean square error of the numerical results is noted. Among the computerized methods compared, forecast results obtained from MIKE 21 are considered more reliable than those from the Delft3D model.

Keywords: climate change, Coastal Vulnerability Index, global warming, sea level rise

Procedia PDF Downloads 132
14157 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict corn yield from meteorological records. The prediction models used in this paper can be classified into model-driven and data-driven approaches, according to their modeling methodologies. The model-driven approaches are based on mechanistic crop modeling: they describe crop growth in interaction with the environment as a dynamical system. However, calibrating such a dynamical system is difficult, because it amounts to a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenario Parameter Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. The data-driven approach to yield prediction, on the other hand, is free of the complex biophysical process but places strict requirements on the dataset. A second contribution of the paper is the comparison of the model-driven method with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso regression, principal components regression, and partial least squares regression) and machine learning methods (Random Forest, k-nearest neighbors, artificial neural networks, and SVM regression). The dataset consists of 720 records of county-scale corn yield provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, the root mean square error of prediction (RMSEP) and the mean absolute error of prediction (MAEP), were used to evaluate predictive capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method of calibrating the mechanistic model from easily accessible datasets offers several side perspectives: the mechanistic model can potentially help to identify the stresses suffered by the crop or the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine the two types of approaches.
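The evaluation loop the paper describes pairs 5-fold cross-validation with RMSEP and MAEP. A stdlib-only sketch of that loop follows, using a 1-nearest-neighbor regressor as a stand-in for the actual models (Random Forest etc.) and made-up records in place of the USDA data.

```python
import math

def rmsep(y_true, y_pred):
    """Root mean square error of prediction."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true))

def maep(y_true, y_pred):
    """Mean absolute error of prediction."""
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)

def kfold(n, k=5):
    """Yield (train_indices, test_indices) pairs for k folds."""
    folds = [i % k for i in range(n)]
    for f in range(k):
        yield ([i for i in range(n) if folds[i] != f],
               [i for i in range(n) if folds[i] == f])

def knn_predict(X_train, y_train, x, k=1):
    """k-nearest-neighbor regression: average the targets of the closest points."""
    order = sorted(range(len(X_train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(X_train[i], x)))
    return sum(y_train[i] for i in order[:k]) / k

# toy stand-in for (climate features -> yield) records
X = [[float(i), float(i % 3)] for i in range(20)]
y = [2.0 * i for i in range(20)]
fold_errors = []
for train_idx, test_idx in kfold(len(X), 5):
    Xtr = [X[i] for i in train_idx]
    ytr = [y[i] for i in train_idx]
    preds = [knn_predict(Xtr, ytr, X[j]) for j in test_idx]
    fold_errors.append(maep([y[j] for j in test_idx], preds))
mean_maep = sum(fold_errors) / len(fold_errors)
```

Swapping in Ridge, Random Forest, or an SVM would only change the `knn_predict` call; the fold structure and metrics stay the same.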

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 231
14156 Best Option for Countercyclical Capital Buffer Implementation: Scenarios for Baltic States

Authors: Ģirts Brasliņš, Ilja Arefjevs, Nadežda Tarakanova

Abstract:

The objective of the countercyclical capital buffer is to encourage banks to build up buffers in good times that can be drawn down in bad times. The aim of this report is to assess such decisions by banks as derived from three approaches: the aggregate credit-to-GDP ratio, credit growth, and banking sector profits. The approaches are applied to Estonia, Latvia, and Lithuania for the period 2000-2012. The report compares the three approaches and analyses their relevance to the Baltic states by testing the correlation between growth in the studied variables and growth of the corresponding gaps. The methods used in the empirical part of the report include econometric and economic analysis, development indicators, and relative and absolute indicators. The research outcome is a cross-Baltic comparison of the alternative approaches for banks to establish or release a countercyclical capital buffer, and their implications for each Baltic country.
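The credit-to-GDP approach can be illustrated as follows: the buffer guide compares the current credit-to-GDP ratio with its long-run trend, and the gap drives the buffer decision. The Basel III guidance uses a one-sided HP filter for the trend; the sketch below substitutes a simple trailing moving average for brevity, and the quarterly figures are invented, not Baltic data.

```python
def credit_to_gdp_gap(credit, gdp, window=4):
    """Credit-to-GDP ratio (%) and its gap versus a trailing moving-average trend.

    The moving average is a simplified stand-in for the one-sided HP filter
    used in the Basel III buffer guidance; only past data enters the trend,
    mirroring the real-time nature of the official calculation.
    """
    ratio = [100.0 * c / g for c, g in zip(credit, gdp)]
    gaps = []
    for t in range(len(ratio)):
        lo = max(0, t - window + 1)
        trend = sum(ratio[lo:t + 1]) / (t + 1 - lo)
        gaps.append(ratio[t] - trend)
    return ratio, gaps

# invented quarterly series with accelerating credit growth
credit = [40.0, 44.0, 50.0, 58.0, 68.0, 80.0]
gdp = [100.0, 102.0, 104.0, 106.0, 108.0, 110.0]
ratio, gaps = credit_to_gdp_gap(credit, gdp)
```

A persistently positive gap (as at the end of this toy boom) is the signal to build the buffer; a closing or negative gap argues for release.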

Keywords: Basel III, countercyclical capital buffer, banks, credit growth, Baltic states

Procedia PDF Downloads 396
14155 A Guide for Using Viscoelasticity in ANSYS

Authors: A. Fettahoglu

Abstract:

The theory of viscoelasticity is used by many researchers to represent the behavior of materials such as pavements on roads or bridges. Several earlier studies used analytical methods and rheology to predict the material behavior of simple models. Today, more complex engineering structures are analyzed using the finite element method, in which material behavior is embedded by means of three-dimensional viscoelastic material laws. As a result, structures of unusual geometry and domain can be analyzed by means of the finite element method and three-dimensional viscoelastic equations. Within the scope of this study, the rheological models embedded in ANSYS, namely the generalized Maxwell model and the Prony series, which are the two formulations ANSYS uses to represent viscoelastic material behavior, are presented explicitly. A guide is then given to ease the use of the viscoelasticity tools in ANSYS.
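The Prony-series form that ANSYS curve-fits can be evaluated directly: the relaxation modulus is a constant long-term term plus one decaying exponential per Maxwell branch. The coefficients below are illustrative, not taken from any material database.

```python
import math

def relaxation_modulus(t, g_inf, prony_terms):
    """G(t) = G_inf + sum_i G_i * exp(-t / tau_i)  (generalized Maxwell model).

    prony_terms is a list of (G_i, tau_i) pairs: branch modulus and
    relaxation time for each Maxwell element.
    """
    return g_inf + sum(g_i * math.exp(-t / tau_i) for g_i, tau_i in prony_terms)

# illustrative two-branch fit (moduli in GPa, times in seconds)
prony_terms = [(2.0, 0.5), (1.0, 5.0)]
g_instant = relaxation_modulus(0.0, 1.0, prony_terms)    # G(0) = G_inf + sum G_i
g_late = relaxation_modulus(1000.0, 1.0, prony_terms)    # decays toward G_inf
```

At t = 0 the full stiffness is recovered, and as t grows the modulus relaxes toward the long-term value G_inf, which is exactly the behavior ANSYS reproduces once the fitted Prony pairs are entered.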

Keywords: ANSYS, generalized Maxwell model, finite element method, Prony series, viscoelasticity, viscoelastic material curve fitting

Procedia PDF Downloads 607
14154 Research and Application of the Three-Dimensional Visualization Geological Modeling of Mine

Authors: Bin Wang, Yong Xu, Honggang Qu, Rongmei Liu, Zhenji Gao

Abstract:

Today's mining industry is advancing steadily toward digital and visual methods. Three-dimensional visualization geological modeling of mines is the digital characterization of a mineral deposit and one of the key technologies of the digital mine. Three-dimensional geological modeling combines geological spatial information management, geological interpretation, geological spatial analysis and prediction, geostatistical analysis, entity content analysis, and graphic visualization in a three-dimensional computer environment, and is used in geological analysis. In this paper, a three-dimensional geological model of an iron mine is constructed using Surpac, the weighting differences between the inverse distance weighting method and ordinary kriging are studied, and the ore body volume and reserves are simulated and calculated with both methods. Compared with the actual mine reserves, the results are relatively accurate, providing a scientific basis for mine resource assessment, reserve calculation, mining design, and so on.
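The inverse distance weighting estimator compared against kriging in the paper can be sketched in a few lines: each sample grade is weighted by 1/d^p toward the estimation point. The sample points and grades below are hypothetical; the kriging side, which additionally requires a fitted variogram, is omitted.

```python
def idw_estimate(samples, x0, y0, power=2.0, eps=1e-12):
    """Inverse distance weighted grade estimate at (x0, y0).

    samples: list of (x, y, grade) drill-hole composites (hypothetical here).
    power: the distance exponent p in the weight 1/d^p.
    """
    num = den = 0.0
    for x, y, grade in samples:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        if d2 < eps:                    # estimation point coincides with a sample
            return grade
        w = 1.0 / d2 ** (power / 2.0)   # weight = 1 / d^power
        num += w * grade
        den += w
    return num / den

# hypothetical Fe-grade composites (%) at two collar positions
samples = [(0.0, 0.0, 1.0), (2.0, 0.0, 3.0)]
est_mid = idw_estimate(samples, 1.0, 0.0)   # midpoint: equal weights
```

In a block model this estimate is evaluated at every block centroid; kriging replaces the 1/d^p weights with variogram-derived weights that also account for sample clustering.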

Keywords: three-dimensional geological modeling, geological database, geostatistics, block model

Procedia PDF Downloads 70
14153 Introduction of Microbial Symbiosis in Genus of Tridacna and Kiwaidae with Insights into Aquaculture

Authors: Jincao Guo

Abstract:

Aquaculture plays a significant role in the diet of people in many regions. However, problems such as bioaccumulation have arisen with the rapidly growing industry due to a lack of control in the feeding process, which brings uncertainty to the quality of the products. This paper tackles the problem by introducing the symbiosis of the giant clam (Tridacna) with photosynthetic algae and of the yeti crab (Kiwaidae) with chemosynthetic bacteria, in molecular and developmental detail. By combining the knowledge gained from the two models with past studies, innovative ideas are discussed, such as using mass selection methods to domesticate and farm these symbiotic species, as well as improvements to current farming methods, such as introducing algae feeding. Further studies are needed, but such experiments are worth conducting, since they would increase the variety of choices for consumers and could potentially improve the quality and efficiency of aquaculture.

Keywords: the giant clam Tridacna, yeti crab Kiwaidae, autotroph microbes, microbial symbiosis, aquaculture, bivalves, crustaceans, mollusk, photosynthesis, chemosynthesis

Procedia PDF Downloads 76
14152 Applications of AI, Machine Learning, and Deep Learning in Cyber Security

Authors: Hailyie Tekleselase

Abstract:

Deep learning is increasingly used as a building block of security systems. However, neural networks are hard to interpret and typically opaque to the practitioner. This paper presents a detailed survey of computing methods in cyber security and analyzes the prospects of enhancing cyber security capabilities by increasing the intelligence of security systems. There are many AI-based applications used in industrial scenarios such as the Internet of Things (IoT), smart grids, and edge computing. Machine learning technologies require a training process, which introduces protection problems in the training data and algorithms. We present machine learning techniques currently applied to the detection of intrusions, malware, and spam. Our conclusions are based on an extensive review of the literature as well as on experiments performed on real enterprise systems and network traffic. We conclude that these problems can be solved successfully only when methods of artificial intelligence are used alongside human experts or operators.

Keywords: artificial intelligence, machine learning, deep learning, cyber security, big data

Procedia PDF Downloads 126
14151 Aerodynamic Design an UAV with Application on the Spraying Agricola with Method of Genetic Algorithm Optimization

Authors: Saul A. Torres Z., Eduardo Liceaga C., Alfredo Arias M.

Abstract:

Agriculture is among the main sources of economic activity and global needs, so crop care is extremely important for owners and workers; one of the major causes of product loss is pest infection by different types of organisms. We seek to develop a UAV for agricultural spraying at a maximum altitude of 5000 meters above sea level, with a payload of 100 liters of fumigant. The aerodynamic design of the aircraft is developed using computational tools such as the Athena Vortex Lattice software, MATLAB, ANSYS FLUENT, and the XFoil package, among others. Structured programming and an exhaustive analysis of optimization and search methods are also employed. The results have a very low margin of error, and the multi-objective formulation can be helpful for future developments. The program comprises 10 functions developed in MATLAB; these functions are related to each other to enable the development of the design, and all of them are controlled by the principal code "Master.m".
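A minimal genetic-algorithm loop of the kind used for such design optimization is sketched below, minimizing a one-variable test function with truncation selection, arithmetic crossover, and Gaussian mutation. The real problem is multi-variable and multi-objective (and the authors work in MATLAB), so this is only a structural sketch.

```python
import random

def genetic_minimize(f, bounds, pop_size=30, gens=60, mut=0.05, seed=1):
    """Minimize f on a 1-D interval with a simple real-coded genetic algorithm."""
    random.seed(seed)
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)                          # best (lowest cost) first
        elite = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = (a + b) / 2.0                            # arithmetic crossover
            child += random.gauss(0.0, mut * (hi - lo))      # Gaussian mutation
            children.append(min(hi, max(lo, child)))         # clamp to bounds
        pop = elite + children                   # elitism: best survivors kept
    return min(pop, key=f)

# toy objective standing in for an aerodynamic cost (drag, stability penalty, ...)
best = genetic_minimize(lambda x: (x - 2.0) ** 2, (-10.0, 10.0))
```

For the UAV case the chromosome would be a vector of geometry parameters and the objective a weighted combination of aerodynamic figures of merit returned by the analysis tools.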

Keywords: aerodynamic design, optimization, genetic algorithm, multi-objective problem, stability, vortex

Procedia PDF Downloads 532
14150 Evaluation of Different Liquid Scintillation Counting Methods for 222Rn Determination in Waters

Authors: Jovana Nikolov, Natasa Todorovic, Ivana Stojkovic

Abstract:

Monitoring of 222Rn in drinking, surface, and ground waters is performed in connection with geological, hydrogeological, and hydrological surveys and health hazard studies. Liquid scintillation counting (LSC) is often the preferred analytical method for 222Rn measurements in waters because it allows automatic analysis of multiple samples. The LSC method involves mixing water samples with an organic scintillation cocktail, which triggers radon diffusion from the aqueous into the organic phase, for which it has a much greater affinity, thereby eliminating the possibility of radon emanation. Two direct LSC methods that assume different sample compositions are presented, optimized, and evaluated in this study. The one-phase method involves directly mixing a 10 ml sample with 10 ml of emulsifying cocktail (Ultima Gold AB). The two-phase method involves water-immiscible cocktails (High Efficiency Mineral Oil Scintillator, Opti-Fluor O, and Ultima Gold F are used in this study). Calibration samples were prepared with an aqueous 226Ra standard in 20 ml glass vials and counted on an ultra-low-background Quantulus 1220TM spectrometer equipped with a PSA (Pulse Shape Analysis) circuit that discriminates alpha/beta spectra. Since the calibration procedure is carried out with a 226Ra standard, which has both alpha- and beta-emitting progeny, the PSA discriminator is of vital importance in providing reliable and precise spectral separation. Consequently, calibration was done by investigating the influence of the PSA discriminator level on the 222Rn detection efficiency, using the 226Ra calibration standard over a wide range of activity concentrations. Evaluation of the presented methods was based on the obtained detection efficiencies and the achieved minimum detectable activity (MDA). The accuracy and precision of the methods, as well as the performance of the different scintillation cocktails, were compared using measurements of 226Ra-spiked water samples of known activity and environmental samples.
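The MDA figure used to evaluate the methods is typically a Currie-type detection limit. A sketch follows, assuming the common form MDA = (2.71 + 4.65·√B) / (ε·t·V) with background counts B, counting efficiency ε, counting time t, and sample volume V; the numbers are illustrative, not the study's.

```python
import math

def minimal_detectable_activity(bkg_counts, count_time_s, efficiency, volume_l):
    """Currie-type MDA in Bq/L for a liquid scintillation measurement.

    bkg_counts: counts accumulated in the background window over count_time_s.
    efficiency: counts per decay in the analysis window (0..1).
    volume_l:   water sample volume in litres.
    """
    ld = 2.71 + 4.65 * math.sqrt(bkg_counts)   # detection limit in counts
    return ld / (efficiency * count_time_s * volume_l)

# illustrative run: 100 background counts in 600 s, unit efficiency, 10 ml sample
mda = minimal_detectable_activity(100, 600, 1.0, 0.01)
```

Lower backgrounds, longer counting, higher efficiency, or larger sample volumes all drive the MDA down, which is exactly the trade space the cocktail comparison explores.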

Keywords: 222Rn in water, Quantulus 1220TM, scintillation cocktail, PSA parameter

Procedia PDF Downloads 201
14149 Additional Method for the Purification of Lanthanide-Labeled Peptide Compounds Pre-Purified by Weak Cation Exchange Cartridge

Authors: K. Eryilmaz, G. Mercanoglu

Abstract:

Aim: Purification of the final product, the last step in the synthesis of lanthanide-labeled peptide compounds, can be accomplished by different methods. The two most commonly used are C18 solid phase extraction (SPE) and elution from a weak cation exchange cartridge. SPE C18 extraction yields a final product of high purity, while elution from a weak cation exchange cartridge is pH-dependent and ineffective at removing colloidal impurities. The aim of this work is to develop an additional purification method for lanthanide-labeled peptide compounds for cases in which the desired radionuclidic and radiochemical purity of the final product cannot be achieved because of pH problems or colloidal impurities. Material and Methods: To form colloidal impurities, 3 mL of water for injection (WFI) was added to 30 mCi of 177LuCl3 solution and allowed to stand for 1 day. 177Lu-DOTATATE was synthesized using an EZAG ML-EAZY module (10 mCi/mL). After synthesis, the final product was mixed with the colloidal impurity solution (total volume: 13 mL, total activity: 40 mCi). The resulting mixture was trapped on an SPE C18 cartridge. The cartridge was washed with 10 mL of saline to remove impurities to the waste vial. The product trapped in the cartridge was eluted with 2 mL of 50% ethanol and collected in the final product vial through a 0.22 µm filter. The final product was diluted with 10 mL of saline. Radiochemical purity before and after purification was analyzed by HPLC (column: ACE C18-100A, 3 µm, 150 x 3.0 mm; mobile phase: water-acetonitrile-trifluoroacetic acid (75:25:1); flow rate: 0.6 mL/min). Results: The UV and radioactivity detector results of the HPLC analysis showed that colloidal impurities were completely removed from the 177Lu-DOTATATE/colloidal impurity mixture by the purification method. Conclusion: The improved purification method can be used as an additional step to remove impurities that may result from lanthanide-peptide syntheses in which weak cation exchange purification is used as the last step. The purification of the final product and GMP compliance (final aseptic filtration and sterile disposable system components) are two major advantages.

Keywords: lanthanide, peptide, labeling, purification, radionuclide, radiopharmaceutical, synthesis

Procedia PDF Downloads 163
14148 Pose-Dependency of Machine Tool Structures: Appearance, Consequences, and Challenges for Lightweight Large-Scale Machines

Authors: S. Apprich, F. Wulle, A. Lechler, A. Pott, A. Verl

Abstract:

Large-scale machine tools for the manufacturing of large workpieces, e.g., blades, casings or gears for wind turbines, feature pose-dependent dynamic behavior. Small structural damping coefficients lead to long decay times for structural vibrations that have negative impacts on the production process. Typically, these vibrations are handled by increasing the stiffness of the structure by adding mass. That is counterproductive to the needs of sustainable manufacturing, as it leads to higher resource consumption in both material and energy. Recent research activities have achieved higher resource efficiency through radical mass reduction combined with control-integrated active vibration avoidance and damping methods. These control methods depend on information describing the dynamic behavior of the controlled machine tools in order to tune the avoidance or reduction parameters according to the current state of the machine. The paper presents the appearance, consequences and challenges of the pose-dependent dynamic behavior of lightweight large-scale machine tool structures in production. It starts with a theoretical introduction to the challenges of lightweight machine tool structures resulting from reduced stiffness. The statement of pose-dependent dynamic behavior is corroborated by the results of an experimental modal analysis of a lightweight test structure. Afterwards, the consequences of this pose-dependent behavior for the use of active control and vibration reduction methods are explained. Based on the state of the art on pose-dependent dynamic machine tool models and the modal investigation of an FE model of the lightweight test structure, the criteria for a pose-dependent model for use in vibration reduction are derived.
The paper closes with an outlook on a general pose-dependent model of the dynamic behavior of large lightweight machine tools that provides the input the aforementioned vibration avoidance and reduction methods need to properly tackle machine vibrations.
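The kind of pose-dependent model the paper calls for can be illustrated with a minimal sketch, under assumptions not taken from the paper: a modal parameter (here a first natural frequency) measured at a few reference poses is bilinearly interpolated over the workspace, so a controller can look up the dynamics at the current pose.

```python
# Sketch: pose-dependent lookup of a modal parameter (hypothetical values).
# A measured first natural frequency is bilinearly interpolated over a
# 2-D grid of tool-center-point poses; the corner values are assumptions.

def bilinear(x, y, x0, x1, y0, y1, f00, f10, f01, f11):
    """Bilinear interpolation of f over the cell [x0,x1] x [y0,y1]."""
    tx = (x - x0) / (x1 - x0)
    ty = (y - y0) / (y1 - y0)
    return (f00 * (1 - tx) * (1 - ty) + f10 * tx * (1 - ty)
            + f01 * (1 - tx) * ty + f11 * tx * ty)

# Hypothetical first eigenfrequencies (Hz) measured at four corner poses.
f_hz = bilinear(0.5, 0.5, 0.0, 1.0, 0.0, 1.0, 12.0, 10.0, 11.0, 9.0)
print(f_hz)  # centre pose -> mean of the corners, 10.5
```

A real model would interpolate full modal data (frequencies, damping, mode shapes) over all axes, but the lookup principle is the same.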

Keywords: dynamic behavior, lightweight, machine tool, pose-dependency

Procedia PDF Downloads 459
14147 Development of a Geomechanical Risk Assessment Model for Underground Openings

Authors: Ali Mortazavi

Abstract:

The main objective of this research project is to examine the multitude of geomechanical risks associated with the various mining methods employed in the underground mining industry. Controlling geotechnical design parameters and operational factors that affect the selection of a suitable mining technique for a given underground mining condition will be considered from a risk assessment point of view. Important geomechanical challenges will be investigated as appropriate and relevant to the commonly used underground mining methods. Given the complicated nature of in-situ rock masses, complicated boundary conditions, and the operational complexities associated with various underground mining methods, the selection of a safe and economic mining operation is of paramount significance. Rock failure at varying scales within underground openings is a constant threat to mining operations and causes human and capital losses worldwide. Geotechnical design is a major component of all underground mines and essentially governs the safety of an underground mine. Given the uncertainties that exist in rock characterization prior to mine development, there are always risks associated with inappropriate design as a function of mining conditions and the selected mining method. Uncertainty often results from the inherent variability of rock masses, which in turn is a function of both the geological materials and the in-situ conditions of the rock mass. The focus of this research is on developing a methodology that enables a geomechanical risk assessment of given underground mining conditions. The outcome is a geotechnical risk analysis algorithm, which can be used as an aid in selecting the appropriate mining method as a function of mine design parameters (e.g., in-situ rock properties, design method, and governing boundary conditions such as in-situ stress and groundwater).
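The core of many geotechnical risk assessment schemes is a likelihood-consequence rating, which a risk analysis algorithm of the kind described could build on. The following is a minimal sketch of that idea; the 5x5 scale and the thresholds are illustrative assumptions, not the paper's algorithm.

```python
# Sketch of a likelihood-consequence risk rating, a common core of
# geotechnical risk assessment schemes. The scale and thresholds below
# are illustrative assumptions, not the paper's algorithm.

def risk_rating(likelihood, consequence):
    """Qualitative rating from a 5x5 likelihood x consequence matrix."""
    score = likelihood * consequence          # both on a 1..5 scale
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Example: likely rock failure with severe consequences near an opening.
print(risk_rating(likelihood=4, consequence=4))  # -> high
```

In a full methodology, each candidate mining method would receive such ratings per hazard (stress, groundwater, structure), which are then aggregated to rank the methods.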

Keywords: geomechanical risk assessment, rock mechanics, underground mining, rock engineering

Procedia PDF Downloads 145
14146 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models

Authors: Manisha Mukherjee, Diptarka Saha

Abstract:

Reliable forecasts of univariate time series data are often necessary in several contexts, and ARIMA models are quite popular among practitioners in this regard. Choosing correct parameter values for ARIMA is therefore a challenging yet imperative task. Thus, a stepwise algorithm is introduced to provide automatic and robust estimates of the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. The process focuses on improving the overall quality of the estimates and alleviates the problems induced by the unidimensional nature of currently used methods such as auto.arima. The fast and automated search of the parameter space also ensures reliable parameter estimates with several desirable qualities, consequently resulting in higher test accuracy, especially on noisy data. After rigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods but also completely obviates the need for human intervention owing to its automated nature.
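The stepwise idea itself can be sketched independently of the model fitting: start from an initial order, evaluate all neighboring orders, move to the best one, and stop when no neighbor improves the criterion. In this minimal sketch the objective is a stand-in for a fitted model's AIC (the actual seasonal ARIMA fitting step is omitted, and the search here covers only (p, d, q), not the seasonal part).

```python
# Sketch of a stepwise order search: greedily move through (p, d, q)
# space toward lower values of a criterion. The objective below is a
# stand-in for a fitted model's AIC; real use would fit a seasonal
# ARIMA at each candidate order.
import itertools

def stepwise_search(objective, start=(1, 0, 1), max_order=5):
    best, best_val = start, objective(start)
    while True:
        neighbors = [
            tuple(c + d for c, d in zip(best, delta))
            for delta in itertools.product((-1, 0, 1), repeat=3)
            if delta != (0, 0, 0)
        ]
        neighbors = [n for n in neighbors
                     if all(0 <= v <= max_order for v in n)]
        cand = min(neighbors, key=objective)
        if objective(cand) >= best_val:
            return best, best_val
        best, best_val = cand, objective(cand)

# Stand-in objective with a known minimum at (2, 1, 2).
aic = lambda o: (o[0] - 2) ** 2 + (o[1] - 1) ** 2 + (o[2] - 2) ** 2
order, val = stepwise_search(aic)
print(order)  # -> (2, 1, 2)
```

Unlike a unidimensional search that tunes one parameter at a time, each step here considers joint moves in all three dimensions.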

Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function

Procedia PDF Downloads 166
14145 Integrating RAG with Prompt Engineering for Dynamic Log Parsing and Anomaly Detections

Authors: Liu Lin Xin

Abstract:

With the increasing complexity of systems, log parsing and anomaly detection have become crucial for maintaining system stability. However, traditional methods often struggle with adaptability and accuracy, especially when dealing with rapidly evolving log content and unfamiliar domains. To address these challenges, this paper proposes an approach that integrates Retrieval-Augmented Generation (RAG) technology with prompt engineering for large language models, applied specifically in LogPrompt. The approach enables dynamic log parsing and intelligent anomaly detection by combining real-time information retrieval with prompt optimization. The proposed method significantly enhances the adaptability of log analysis and improves the interpretability of results. Experimental results on several public datasets demonstrate the method's superior performance, particularly in scenarios lacking training data, where it significantly outperforms traditional methods. The paper thus introduces a novel technical pathway for log parsing and anomaly detection with substantial theoretical value and practical potential.
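The RAG-plus-prompt pattern can be sketched in a few lines: retrieve the stored log templates most similar to an incoming line and splice them into the prompt sent to the LLM. In this sketch a token-overlap score stands in for a real embedding retriever, and the template texts and prompt wording are illustrative assumptions, not LogPrompt's actual prompts.

```python
# Minimal sketch of the RAG + prompt-engineering idea: retrieve the log
# templates most similar to an incoming line (token overlap stands in
# for a real embedding retriever) and splice them into an LLM prompt.
# Templates and prompt wording are illustrative assumptions.

TEMPLATES = [
    "Connection from <IP> closed by remote host",
    "Failed password for user <USER> from <IP>",
    "Disk usage on <PATH> exceeded <NUM> percent",
]

def retrieve(query, templates, k=2):
    q = set(query.lower().split())
    scored = sorted(templates,
                    key=lambda t: -len(q & set(t.lower().split())))
    return scored[:k]

def build_prompt(log_line, templates):
    context = "\n".join(f"- {t}" for t in retrieve(log_line, templates))
    return (f"Known log templates:\n{context}\n\n"
            f"Parse this log line into a template and variables, and "
            f"state whether it looks anomalous:\n{log_line}")

prompt = build_prompt("Failed password for user root from 10.0.0.5",
                      TEMPLATES)
print(prompt)
```

Because the retrieved context is rebuilt per query, new templates can be added at any time without retraining, which is the source of the adaptability the abstract emphasizes.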

Keywords: log parsing, anomaly detection, RAG, prompt engineering, LLMs

Procedia PDF Downloads 35
14144 Improved Pitch Detection Using Fourier Approximation Method

Authors: Balachandra Kumaraswamy, P. G. Poonacha

Abstract:

Automatic music information retrieval has been a challenging research topic for a few decades now, with several interesting approaches reported in the literature. In this paper, we develop a pitch extraction method based on a finite Fourier series approximation to the given window of samples, estimating pitch from the fundamental period of this approximation. The method analyzes the strength of the harmonics present in the signal to reduce octave as well as harmonic errors. Its performance is compared with three of the best-known pitch extraction methods: YIN, the windowed special normalization of the autocorrelation function, and the harmonic product spectrum. Our study with artificially created signals as well as music files shows that the Fourier approximation method gives a much better estimate of pitch, with fewer octave and harmonic errors.
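To make the harmonic-strength idea concrete, here is a minimal sketch of one of the compared baselines, the harmonic product spectrum, over a naive DFT. This is not the paper's Fourier series method, only a simplified illustration of how combining harmonic magnitudes suppresses octave errors; the sampling rate and test tone are assumptions.

```python
# Sketch of harmonic product spectrum (HPS) pitch estimation, one of
# the baselines the paper compares against; a naive O(n^2) DFT keeps
# the example dependency-free. Not the paper's own Fourier series fit.
import cmath
import math

def dft_mag(x):
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def hps_pitch(x, fs, harmonics=3):
    mag = dft_mag(x)
    lo, hi = 2, len(mag) // harmonics      # keep h*k inside the spectrum
    best_k = max(range(lo, hi),
                 key=lambda k: math.prod(mag[h * k]
                                         for h in range(1, harmonics + 1)))
    return best_k * fs / len(x)

# Harmonic-rich test tone: 200 Hz fundamental plus two harmonics.
fs, n = 4000, 400                          # 10 Hz bin resolution
sig = [sum(math.sin(2 * math.pi * f * t / fs) for f in (200, 400, 600))
       for t in range(n)]
print(hps_pitch(sig, fs))  # -> 200.0
```

Multiplying the magnitudes at k, 2k and 3k rewards candidates whose harmonics are all strong, which is why an octave-up error (whose odd "harmonics" are missing) scores poorly.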

Keywords: pitch, Fourier series, YIN, normalization of the auto-correlation function, harmonic product, mean square error

Procedia PDF Downloads 412
14143 A Comparative Study of Malware Detection Techniques Using Machine Learning Methods

Authors: Cristina Vatamanu, Doina Cosovan, Dragos Gavrilut, Henri Luchian

Abstract:

In the past few years, the amount of malicious software has increased exponentially; machine learning algorithms have therefore become instrumental in separating clean from malware files through semi-automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time the machine learning algorithm needs to do so. This paper presents a comparative study of different machine learning techniques, such as linear classifiers, ensembles, decision trees, and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, which is a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.
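The two figures of merit the study optimizes are straightforward to compute from a labeled evaluation set; the following sketch shows the definitions on illustrative toy labels.

```python
# Sketch of the two figures of merit in the study: detection rate
# (fraction of malware caught) and false positive rate (fraction of
# clean files wrongly flagged). The labels below are illustrative.

def detection_metrics(y_true, y_pred):
    """y_true / y_pred: 1 = malware, 0 = clean."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp / (tp + fn), fp / (fp + tn)   # detection rate, FPR

truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
preds = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
dr, fpr = detection_metrics(truth, preds)
print(dr, fpr)  # -> 0.75 and ~0.167
```

With millions of clean files, even a tiny FPR translates into many false alarms, which is why the paper treats the two rates as a joint target rather than optimizing accuracy alone.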

Keywords: ensembles, false positives, feature selection, one side class algorithm

Procedia PDF Downloads 292
14142 The Sustainable Development for Coastal Tourist Building

Authors: D. Avila

Abstract:

The tourism industry has become a growing presence in international socio-economic dynamics and, in most cases, exceeds the control parameters of the various environmental and resource-sustainability regulations. Because of this, its effects on the natural environment at the regional and national levels represent a challenge, and a number of strategies are necessary to minimize the environmental impact generated by the occupation of the territory. Hotel tourist buildings and sustainable development in the coastal zone have an important impact on the environment and on the physical and psychological health of the inhabitants. Environmental quality links human comfort to the sustainable development of natural resources; applied to hotel architecture, this concept imposes new demands on the entire construction process of a building, changing the habits of developers and users alike. The methodology developed provides an initial analysis to classify and rank the different tourist buildings, which makes it feasible to establish methods of study and environmental impact assessment. Finally, an overall view is needed of the best way to implement tourism development on the coast, with guidelines to improve and protect the natural environment. This paper analyzes the parameters and strategies for reducing the environmental impacts of tourism developments on the coast, through a series of recommendations towards sustainability, in the context of Bahia de Banderas, Puerto Vallarta, Jalisco. Assessing the environmental impact caused by tourism development in a coastal environment requires a series of processes, ranging from the identification of impacts to their prediction and evaluation. For this purpose, different techniques and valuation procedures are described below, beginning with the identification of impacts.
Methods for identifying damage caused to the environment pursue the general purpose of obtaining a group of negative indicators that are subsequently used in the environmental impact study. There are several systematic methods for identifying the impacts caused by human activities. The present work develops a procedure based on and adapted from the Ministry of Public Works' urban reference for environmental impact studies; the representative methods are checklists, matrices and networks, and the method of transparencies and map overlays.

Keywords: environmental impact, physical health, sustainability, tourist building

Procedia PDF Downloads 329
14141 Instruct Students Effective Ways to Reach an Advanced Level after Graduation

Authors: Huynh Tan Hoi

Abstract:

Considered one of the hardest languages in the world, Japanese is still a language that many young people choose to learn. Today, with the development of technology, learning foreign languages in general, and Japanese in particular, is no longer an impossible barrier. Learning materials come not only from paper books and songs but also from software on smartphones or computers. In particular, students who are beginning to explore effective ways to study this language need access to modern technologies to improve their learning. When first using such software, some students may feel embarrassed and challenged, but everything goes smoothly after a few days. After completing the course, students gain more knowledge and can reach a higher level, such as the N2 or N1 Japanese Language Proficiency Test certificate. For this research paper, 35 students studying at FPT University, Ho Chi Minh City, were asked to complete a questionnaire between early July and August 2018. Through this research, we find that, together with the guidance of lecturers, modern software and a number of effective methods are indispensable for improving the quality of the teaching and learning process.

Keywords: higher knowledge, Japanese, methods, software, students

Procedia PDF Downloads 226
14140 Scheduling of Repetitive Activities for High-Rise Buildings: Optimisation by Genetic Algorithms

Authors: Mohammed Aljoma

Abstract:

In this paper, a developed prototype for the scheduling of repetitive activities in high-rise buildings is presented. The activities that describe the behavior of most activities in multi-storey buildings are scheduled using the developed approach. The prototype combines three methods to attain an optimized plan: the Critical Path Method (CPM), Gantt charts and the Line of Balance (LOB). The developed prototype, POTER, is used to schedule repetitive and non-repetitive activities with respect to all constraints, which can be generated automatically from a generic database. The prototype uses genetic algorithms to optimize the planning process. As a result, this approach enables contracting organizations to evaluate various planning solutions that are calculated, tested and classified by POTER to attain an optimal time-cost equilibrium according to their own criteria of time or cost.
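The genetic-algorithm step can be sketched on a toy version of the time-cost trade-off: each gene selects a crew option (duration, cost) for one repetitive activity, and fitness is a weighted sum of total duration and total cost. The option tables and weights are illustrative assumptions, not POTER's actual model, and precedence constraints are omitted.

```python
# Minimal genetic-algorithm sketch of a time-cost trade-off for
# repetitive activities. Each gene picks a crew option (duration, cost)
# for one activity; fitness is a weighted sum to be minimized. Options
# and weights are illustrative assumptions, not POTER's actual model.
import random

OPTIONS = [  # per activity: list of (duration_days, cost) alternatives
    [(10, 5), (7, 8), (5, 12)],
    [(8, 4), (6, 7)],
    [(12, 6), (9, 9), (7, 13)],
]
W_TIME, W_COST = 1.0, 0.5

def fitness(chrom):
    dur = sum(OPTIONS[i][g][0] for i, g in enumerate(chrom))
    cost = sum(OPTIONS[i][g][1] for i, g in enumerate(chrom))
    return W_TIME * dur + W_COST * cost

def evolve(pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    rand_chrom = lambda: tuple(rng.randrange(len(o)) for o in OPTIONS)
    pop = [rand_chrom() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]            # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, len(OPTIONS))  # one-point crossover
            child = list(a[:cut] + b[cut:])
            if rng.random() < 0.2:                # mutation
                i = rng.randrange(len(OPTIONS))
                child[i] = rng.randrange(len(OPTIONS[i]))
            children.append(tuple(child))
        pop = elite + children
    best = min(pop, key=fitness)
    return best, fitness(best)

best, score = evolve()
print(best, score)
```

A full scheduler would evaluate each chromosome through CPM/LOB to obtain durations under precedence and crew-continuity constraints, but the evolve-select-crossover-mutate loop is the same.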

Keywords: planning, scheduling, genetic algorithms, repetitive activity, construction management, risk management, project duration

Procedia PDF Downloads 308
14139 Microfluidic Impedimetric Biochip and Related Methods for Measurement Chip Manufacture and Counting Cells

Authors: Amina Farooq, Nauman Zafar Butt

Abstract:

This paper is about methods and tools for counting particles of interest, such as cells. It presents a microfluidic system with interconnected electronics on a flexible substrate, inlet-outlet ports and interface schemes, sensitive and selective detection of cells, and processing of cell counts at polymer interfaces in a microscale biosensor for the detection of target biological and non-biological cells. The fabrication process covers the development of fluidic channels, planar fluidic contact ports, integrated metal electrodes on a flexible substrate for impedance measurements, and a surface-modification plasma treatment as an intermediate bonding layer. Magnetron DC sputtering is used to deposit a double metal layer (Ti/Pt) over the polypropylene film, and specified zones are defined and etched using a photoresist layer. Small fluid volumes, a reduced detection region, and electrical impedance measurements over a range of frequencies improve the sensitivity and specificity of cell counting. The procedure involves flowing fluid samples that contain the particles of interest continuously through the microfluidic channels, counting all particle types in a portion of the sample with the electrical differential counter, which generates a bipolar pulse for each passing cell, and then calculating the total number of particles of interest originally in the fluid sample using a MATLAB program and signal processing. With these methods and similar devices, it is potentially feasible to develop a robust and economical kit for cell counting in whole-blood samples.
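The counting step can be sketched as a small state machine over the impedance trace: each passing cell produces a bipolar pulse, counted when a positive excursion is followed by a matching negative one. The thresholds and the synthetic trace below are illustrative assumptions (the paper performs this processing in MATLAB).

```python
# Sketch of the differential-counter signal step: each passing cell
# yields a bipolar pulse, counted when a positive excursion is followed
# by a negative one. Thresholds and the synthetic trace are
# illustrative; the paper implements this in MATLAB.

def count_bipolar_pulses(signal, thresh=0.5):
    count, state = 0, "idle"
    for v in signal:
        if state == "idle" and v > thresh:
            state = "positive"        # entered the positive lobe
        elif state == "positive" and v < -thresh:
            count += 1                # matching negative lobe -> one cell
            state = "idle"
    return count

# Synthetic impedance trace: three bipolar pulses over a quiet baseline.
trace = ([0.0] * 5 + [1.0, -1.0] + [0.1] * 5 + [0.9, -0.8]
         + [0.0] * 5 + [1.2, -1.1] + [0.0] * 5)
print(count_bipolar_pulses(trace))  # -> 3
```

Requiring both lobes before counting makes the detector reject unipolar baseline drift, which a simple single-threshold counter would misread as cells.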

Keywords: impedance, biochip, cell counting, microfluidics

Procedia PDF Downloads 162
14138 Geophysical Methods of Mapping Groundwater Aquifer System: Perspectives and Inferences From Lisana Area, Western Margin of the Central Main Ethiopian Rift

Authors: Esubalew Yehualaw Melaku, Tigistu Haile Eritro

Abstract:

In this study, two basic geophysical methods are applied to map the groundwater aquifer system in the Lisana area along the Guder River, northeast of Hosanna town, near the western margin of the Central Main Ethiopian Rift. The main target of the study is to map the potential aquifer zone and investigate the groundwater potential for current and future development of the resource in the Gode area. The geophysical methods employed are vertical electrical sounding (VES) and magnetic survey techniques. Electrical sounding was used to examine and map the depth to the potential aquifer zone and its distribution over the area, while the magnetic survey was used to delineate contacts between lithologic units and geological structures. The 2D magnetic models and the geoelectric sections are used to identify weak zones, which control the groundwater flow and storage system. The geophysical survey comprises twelve VES readings collected using a Schlumberger array along six profile lines and more than four hundred magnetic readings at about 10 m station intervals along four profiles and 20 m intervals along three random profiles. The results reveal that the potential aquifer in the area lies at depths ranging from 45 m to 92 m. This corresponds to a highly weathered/fractured ignimbrite and pumice layer with sandy soil, which is the main water-bearing horizon. Overall, the neighborhoods of four VES points (VES-2, VES-3, VES-10, and VES-11) show good water-bearing zones in the study area.
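The basic computation behind each Schlumberger VES reading is the conversion of the measured voltage-to-current ratio into an apparent resistivity via the array's geometric factor, K = pi * (L^2 - l^2) / MN with L = AB/2 and l = MN/2. The spacings and measured resistance in this sketch are illustrative numbers, not values from the survey.

```python
# Sketch of the basic VES computation for a Schlumberger array:
# apparent resistivity from the geometric factor. Spacings and the
# measured V/I ratio below are illustrative numbers.
import math

def schlumberger_rho_a(ab, mn, delta_v_over_i):
    """Apparent resistivity (ohm-m) for current-electrode spacing AB
    and potential-electrode spacing MN (both in meters)."""
    L, l = ab / 2.0, mn / 2.0
    k = math.pi * (L ** 2 - l ** 2) / mn      # geometric factor
    return k * delta_v_over_i

# Example: AB = 20 m, MN = 2 m, measured V/I = 0.5 ohm.
print(schlumberger_rho_a(20.0, 2.0, 0.5))  # ~77.75 ohm-m
```

A sounding curve is built by repeating this at progressively larger AB spacings, which probe progressively greater depths; inverting that curve yields the layered resistivity model from which aquifer depths are read.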

Keywords: vertical electrical sounding, magnetic survey, aquifer, groundwater potential

Procedia PDF Downloads 79
14137 TransDrift: Modeling Word-Embedding Drift Using Transformer

Authors: Nishtha Madaan, Prateek Chaudhury, Nishant Kumar, Srikanta Bedathur

Abstract:

In modern NLP applications, word embeddings are a crucial backbone that can be readily shared across a number of tasks. However, as text distributions change and word semantics evolve over time, downstream applications using the embeddings can suffer if the word representations do not conform to the data drift. Maintaining word embeddings that are consistent with the underlying data distribution is thus a key problem. In this work, we tackle this problem and propose TransDrift, a transformer-based prediction model for word embeddings. Leveraging the flexibility of the transformer, our model accurately learns the dynamics of the embedding drift and predicts future embeddings. In experiments, we compare with existing methods and show that our model makes significantly more accurate predictions of word embeddings than the baselines. Crucially, by applying the predicted embeddings as a backbone for downstream classification tasks, we show that our embeddings lead to superior performance compared to previous methods.
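The prediction task itself is easy to state: given past snapshots of a word's embedding, predict the next snapshot. The paper's model is a transformer; as a plainly simpler stand-in, this sketch uses naive linear-drift extrapolation, the kind of baseline such a model would be compared against. The vectors are toy values.

```python
# The paper's model is a transformer; this sketch only illustrates the
# prediction task with a naive linear-drift baseline: predict the next
# embedding as last + (last - previous). Vectors are toy values.

def extrapolate_drift(snapshots):
    """Predict the next embedding snapshot by linear extrapolation."""
    prev, last = snapshots[-2], snapshots[-1]
    return [2 * b - a for a, b in zip(prev, last)]

history = [
    [0.10, 0.50, -0.20],   # embedding of a word at time t-2
    [0.12, 0.48, -0.18],   # embedding of the same word at time t-1
]
print(extrapolate_drift(history))  # approximately [0.14, 0.46, -0.16]
```

A learned model like TransDrift replaces this fixed rule with drift dynamics inferred from many words' embedding histories, which is what allows it to beat such baselines.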

Keywords: NLP applications, transformers, Word2vec, drift, word embeddings

Procedia PDF Downloads 91