Search results for: deep vibro techniques
6739 A Nucleic Acid Extraction Method for High-Viscosity Floricultural Samples
Authors: Harunori Kawabe, Hideyuki Aoshima, Koji Murakami, Minoru Kawakami, Yuka Nakano, David D. Ordinario, C. W. Crawford, Iri Sato-Baran
Abstract:
With the recent advances in gene editing technologies allowing the rewriting of genetic sequences, additional market growth in the global floriculture market beyond previous trends is anticipated through increasingly sophisticated plant breeding techniques. As a prerequisite for gene editing, the gene sequence of the target plant must first be identified. This necessitates the genetic analysis of plants with unknown gene sequences, the extraction of RNA, and comprehensive expression analysis. Consequently, a technology capable of consistently and effectively extracting high-purity DNA and RNA from plants is of paramount importance. Although model plants, such as Arabidopsis and tobacco, have established methods for DNA and RNA extraction, floricultural species such as roses present unique challenges. Different techniques to extract DNA and RNA from various floricultural species were investigated. Upon sampling and grinding the petals of several floricultural species, it was observed that nucleic acid extraction from ground petal solutions of low viscosity was straightforward, whereas solutions of high viscosity presented a significant challenge. It is postulated that the presence of substantial quantities of polysaccharides and polyphenols in the plant tissue was responsible for the inhibition of nucleic acid extraction. Consequently, attempts were made to extract high-purity DNA and RNA by improving the CTAB method and combining it with commercially available nucleic acid extraction kits. The quality of the total extracted DNA and RNA was evaluated using standard methods. Finally, the effectiveness of the extraction method was assessed by determining whether it was possible to create a library that could serve as a suitable template for a next-generation sequencer. In conclusion, a method was developed for consistent and accurate nucleic acid extraction from high-viscosity floricultural samples. These results demonstrate improved techniques for DNA and RNA extraction from flowers, help facilitate gene editing of floricultural species, and expand the boundaries of research and commercial opportunities.
Keywords: floriculture, gene editing, next-generation sequencing, nucleic acid extraction
Procedia PDF Downloads 29
6738 Nanofluidic Cell for Resolution Improvement of Liquid Transmission Electron Microscopy
Authors: Deybith Venegas-Rojas, Sercan Keskin, Svenja Riekeberg, Sana Azim, Stephanie Manz, R. J. Dwayne Miller, Hoc Khiem Trieu
Abstract:
Liquid Transmission Electron Microscopy (TEM) is a growing area with a broad range of applications from physics and chemistry to material engineering and biology, in which it is possible to image unseen phenomena in situ. For this, a nanofluidic device is used to insert the nanoflow with the sample inside the microscope in order to keep the liquid encapsulated because of the high vacuum. In recent years, Si3N4 windows have been widely used because of their mechanical stability and low imaging contrast. Nevertheless, the pressure difference between the inside fluid and the outside vacuum in the TEM generates bulging in the windows. This increases the imaged fluid volume, which decreases the signal-to-noise ratio (SNR), limiting the achievable spatial resolution. With the proposed device, the membrane is reinforced with a microstructure capable of withstanding higher pressure differences and almost completely removing the bulging. A theoretical study is presented with Finite Element Method (FEM) simulations, which provide a deep understanding of the membrane's mechanical conditions and prove the effectiveness of this novel concept. Bulging and von Mises stress were studied for different membrane dimensions, geometries, materials, and thicknesses. The microfabrication of the device was carried out with a thin wafer coated with thin layers of SiO2 and Si3N4. After the lithography process, these layers were etched (reactive ion etching and buffered oxide etch (BOE), respectively). After that, the microstructure was etched (deep reactive ion etching). Then the back-side SiO2 was etched (BOE) and the array of free-standing micro-windows was obtained. Additionally, a Pyrex wafer was patterned with windows and inlets/outlets, and bonded (anodic bonding) to the Si side to facilitate handling of the thin wafer. Later, a thin spacer is sputtered and patterned with microchannels and trenches to guide the nanoflow with the samples. This approach considerably reduces the common bulging problem of the window, improving the SNR, contrast and spatial resolution, substantially increasing the mechanical stability of the windows, and allowing a larger viewing area. These developments lead to a wider range of applications of liquid TEM, expanding the spectrum of possible experiments in the field.
Keywords: liquid cell, liquid transmission electron microscopy, nanofluidics, nanofluidic cell, thin films
Procedia PDF Downloads 255
6737 Multiobjective Optimization of a Pharmaceutical Formulation Using Regression Method
Authors: J. Satya Eswari, Ch. Venkateswarlu
Abstract:
The formulation of a commercial pharmaceutical product involves several composition factors and response characteristics. When the formulation is required to satisfy multiple response characteristics which are conflicting, an optimal solution calls for an efficient multiobjective optimization technique. In this work, a regression model is combined with a non-dominated sorting differential evolution (NSDE) involving Naïve & Slow and ε-constraint techniques to derive different multiobjective optimization strategies, which are then evaluated by means of a trapidil pharmaceutical formulation. The analysis of the results shows the effectiveness of the strategy that combines the regression model and NSDE with the integration of both the Naïve & Slow and ε-constraint techniques for Pareto optimization of the trapidil formulation. With this strategy, the optimal formulation at pH 6.8 is obtained with the decision variables of microcrystalline cellulose, hydroxypropyl methylcellulose and compression pressure. The corresponding response characteristics of rate constant and release order are also reported. The comparison of these results with the experimental data and with those of other multiple-regression-model-based multiobjective evolutionary optimization strategies signifies the better performance for optimal trapidil formulation.
Keywords: pharmaceutical formulation, multiple regression model, response surface method, radial basis function network, differential evolution, multiobjective optimization
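The core of such a strategy is Pareto (non-dominated) sorting of candidate formulations whose responses are predicted by a fitted regression model. The following Python sketch is a hypothetical illustration of that step only; the surrogate response functions, variable ranges and candidate set are invented placeholders, not the paper's actual trapidil models.

```python
import numpy as np

# Hypothetical candidate formulations: columns stand in for the decision variables
# (MCC fraction, HPMC fraction, compression pressure) -- ranges are assumed.
rng = np.random.default_rng(0)
X = rng.uniform([0.1, 0.1, 50.0], [0.5, 0.4, 200.0], size=(200, 3))

# Two conflicting responses from placeholder regression surrogates
# (not the fitted models of the study).
f1 = 0.02 + 0.10 * X[:, 0] - 0.05 * X[:, 1] + 1e-4 * X[:, 2]   # rate constant (maximize)
f2 = 0.60 + 0.30 * X[:, 1] - 0.20 * X[:, 0]                    # release order (minimize)

def pareto_front(objs):
    """Boolean mask of non-dominated points; objs is (n, m), all objectives minimized."""
    nd = np.ones(objs.shape[0], dtype=bool)
    for i in range(objs.shape[0]):
        if not nd[i]:
            continue
        # i is dominated if some point is <= in every objective and < in at least one.
        dominated_by = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        if dominated_by.any():
            nd[i] = False
    return nd

objs = np.column_stack([-f1, f2])        # maximizing f1 == minimizing -f1
mask = pareto_front(objs)
print(f"{mask.sum()} non-dominated formulations out of {len(X)}")
```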
Procedia PDF Downloads 409
6736 Synthesis and Characterization of Functionalized Carbon Nanorods/Polystyrene Nanocomposites
Authors: M. A. Karakassides, M. Baikousi, A. Kouloumpis, D. Gournis
Abstract:
Nanocomposites of carbon nanorods (CNRs) with polystyrene (PS) have been successfully synthesized by means of an in situ polymerization process and characterized. Firstly, carbon nanorods with a graphitic structure were prepared by the standard synthetic procedure of CMK-3 using MCM-41 as the template, instead of SBA-15, and sucrose as the carbon source. In order to create an organophilic surface on the CNRs, two modification steps were carried out: surface chemical oxidation (CNRs-ox) according to Staudenmaier's method and the attachment of octadecylamine molecules on the functional groups of CNRs-ox (CNRs-ODA). The nanocomposite materials of polystyrene with CNRs-ODA were prepared by a solution-precipitation method at three nanoadditive-to-polymer loadings (1, 3 and 5 wt. %). The as-derived nanocomposites were studied with a combination of characterization and analytical techniques. In particular, Fourier-transform infrared (FT-IR) and Raman spectroscopies were used for the chemical and structural characterization of the pristine materials and the derived nanocomposites, while the morphology of the nanocomposites and the dispersion of the carbon nanorods were analyzed by atomic force and scanning electron microscopy techniques. Tensile testing and thermogravimetric analysis (TGA), along with differential scanning calorimetry (DSC), were also used to examine the mechanical properties, thermal stability and glass transition temperature of PS after the incorporation of CNRs-ODA nanorods. The results showed that the thermal and mechanical properties of the PS/CNRs-ODA nanocomposites gradually improved with increasing CNRs-ODA loading.
Keywords: nanocomposites, polystyrene, carbon, nanorods
Procedia PDF Downloads 352
6735 Diagnosis of Rotavirus Infection among Egyptian Children by Using Different Laboratory Techniques
Authors: Mohamed A. Alhammad, Hadia A. Abou-Donia, Mona H. Hashish, Mohamed N. Massoud
Abstract:
Background: Rotavirus is the leading etiologic agent of severe diarrheal disease in infants and young children worldwide. The present study aimed (1) to detect rotavirus infection as a cause of diarrhoea among children under 5 years of age using two serological methods (ELISA and LA) and the PCR technique, and (2) to evaluate the three methodologies used for human RV detection in stool samples. Materials and Methods: This study was carried out on 247 children less than 5 years old, diagnosed clinically as acute gastroenteritis and attending Alexandria University Children Hospital at EL-Shatby. Rotavirus antigen was screened by ELISA and LA tests in all stool samples, whereas only 100 samples were subjected to the RT-PCR method for detection of rotavirus RNA. Results: Out of the 247 studied cases with diarrhoea, rotavirus antigen was detected in 83 (33.6%) by ELISA and 73 (29.6%) by LA, while the 100 cases tested by RT-PCR showed that 44% of them had rotavirus RNA. Rotavirus diarrhoea showed a marked seasonal peak during autumn and winter (61.4%). Conclusion: The present study confirms the huge burden of rotavirus as a major cause of acute diarrhoea in Egyptian infants and young children. It was concluded that LA is equal in sensitivity to ELISA, that ELISA is more specific than LA, and that RT-PCR is more specific than both ELISA and LA in the diagnosis of rotavirus infection.
Keywords: rotavirus, diarrhea, immunoenzyme techniques, latex fixation tests, RT-PCR
Procedia PDF Downloads 370
6734 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection
Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay
Abstract:
With the increase in credit card usage, the volume of credit card misuse has also significantly increased, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on some static physical characteristics, such as what the user knows (knowledge-based method) or what he/she has access to (object-based method). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while he/she is interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable and scalable models of credit card fraud detection.
Keywords: credit card fraud detection, user authentication, behavioral biometrics, machine learning, literature survey
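Because fraud is a rare event, the classical machine-learning pipelines surveyed here typically combine transactional features with class weighting or resampling and evaluate with rank-based metrics rather than plain accuracy. The sketch below is a minimal, hypothetical illustration of that pattern with scikit-learn on synthetic data; it does not reproduce any specific study covered by the survey.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, average_precision_score

# Synthetic stand-in for transactional features (amount, time gaps, merchant codes, ...)
# with a 1% fraud rate -- the class imbalance is the point of the example.
X, y = make_classification(n_samples=20000, n_features=20, n_informative=8,
                           weights=[0.99, 0.01], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# class_weight="balanced" re-weights the rare fraud class during training.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=42).fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
print("ROC-AUC:", round(roc_auc_score(y_te, scores), 3))
print("PR-AUC :", round(average_precision_score(y_te, scores), 3))
```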
Procedia PDF Downloads 121
6733 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement
Authors: Hadi Ardiny, Amir Mohammad Beigzadeh
Abstract:
Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, with one of them equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY plane coordinate system, and the detector system records the gamma-ray count. The position of the robots and the corresponding count of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems
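One simple way to realize such a correlation step is to compare the measured count-rate time series against the count rate expected from each robot's camera-derived trajectory (roughly an inverse-square law from the detector) and pick the best-matching robot. The following Python sketch is a hypothetical illustration of that idea on synthetic trajectories; it is not the authors' actual algorithm or their MCNPX model.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200                                      # number of camera/detector time steps
detector_pos = np.array([0.0, 0.0])

# Camera-derived XY trajectories for three robots (synthetic random walks).
trajectories = [np.cumsum(rng.normal(0, 0.05, size=(T, 2)), axis=0) + start
                for start in ([2.0, 1.0], [1.5, -2.0], [-2.5, 0.5])]

def expected_counts(track, source_strength=1e4):
    """Expected gamma count rate ~ inverse-square law from the detector."""
    r2 = np.sum((track - detector_pos) ** 2, axis=1)
    return source_strength / np.maximum(r2, 1e-6)

# Simulated NaI measurement: robot 1 actually carries the source (Poisson noise).
measured = rng.poisson(expected_counts(trajectories[1]))

# Correlate the measurement with each robot's expected count profile.
corr = [np.corrcoef(measured, expected_counts(t))[0, 1] for t in trajectories]
print("correlations:", np.round(corr, 3))
print("contaminated robot:", int(np.argmax(corr)))
```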
Procedia PDF Downloads 123
6732 Electric Vehicle Fleet Operators in the Energy Market - Feasibility and Effects on the Electricity Grid
Authors: Benjamin Blat Belmonte, Stephan Rinderknecht
Abstract:
The transition to electric vehicles (EVs) stands at the forefront of innovative strategies designed to address environmental concerns and reduce fossil fuel dependency. As the number of EVs on the roads increases, so too does the potential for their integration into energy markets. This research dives deep into the transformative possibilities of using electric vehicle fleets, specifically electric bus fleets, not just as consumers but as active participants in the energy market. This paper investigates the feasibility and grid effects of electric vehicle fleet operators in the energy market. Our objective centers around a comprehensive exploration of the sector coupling domain, with an emphasis on the economic potential in both electricity and balancing markets. Methodologically, our approach combines data mining techniques with thorough pre-processing, pulling from a rich repository of electricity and balancing market data. Our findings are grounded in the actual operational realities of the bus fleet operator in Darmstadt, Germany. We employ a Mixed Integer Linear Programming (MILP) approach, with the bulk of the computations being processed on the High-Performance Computing (HPC) platform ‘Lichtenbergcluster’. Our findings underscore the compelling economic potential of EV fleets in the energy market. With electric buses becoming more prevalent, the considerable size of these fleets, paired with their substantial battery capacity, opens up new horizons for energy market participation. Notably, our research reveals that economic viability is not the sole advantage. Participating actively in the energy market also translates into pronounced positive effects on grid stabilization. Essentially, EV fleet operators can serve a dual purpose: facilitating transport while simultaneously playing an instrumental role in enhancing grid reliability and resilience. This research highlights the symbiotic relationship between the growth of EV fleets and the stabilization of the energy grid. Such systems could lead to both commercial and ecological advantages, reinforcing the value of electric bus fleets in the broader landscape of sustainable energy solutions. In conclusion, the electrification of transport offers more than just a means to reduce local greenhouse gas emissions. By positioning electric vehicle fleet operators as active participants in the energy market, there lies a powerful opportunity to drive forward the energy transition. This study serves as a testament to the synergistic potential of EV fleets in bolstering both economic viability and grid stabilization, signaling a promising trajectory for future sector coupling endeavors.
Keywords: electric vehicle fleet, sector coupling, optimization, electricity market, balancing market
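The optimization at the heart of such a study schedules fleet charging and discharging against market prices under battery constraints. The sketch below shows a deliberately simplified, hypothetical version of that scheduling problem as a linear program (the LP relaxation of a MILP) using SciPy; the prices, efficiency, capacity and power limits are made-up illustrative numbers, not the Darmstadt fleet data.

```python
import numpy as np
from scipy.optimize import linprog

T = 24                                                        # hourly steps for one day
prices = 50 + 20 * np.sin(np.linspace(0, 2 * np.pi, T))      # EUR/MWh, synthetic profile
cap, soc0, p_max, eta = 10.0, 5.0, 2.0, 0.95                  # MWh, MWh, MW, charge/discharge efficiency

# Decision variables x = [charge_0..charge_{T-1}, discharge_0..discharge_{T-1}] in MW.
# Objective: minimize energy cost = sum(p*charge) - sum(p*discharge).
c = np.concatenate([prices, -prices])

# State of charge after hour t: soc0 + sum_{i<=t}(eta*ch_i - dis_i/eta), kept in [0, cap].
L = np.tril(np.ones((T, T)))
A_soc = np.hstack([eta * L, -L / eta])
A_ub = np.vstack([A_soc, -A_soc])                             # SOC <= cap and SOC >= 0
b_ub = np.concatenate([np.full(T, cap - soc0), np.full(T, soc0)])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, p_max)] * (2 * T), method="highs")
charge, discharge = res.x[:T], res.x[T:]
print("net daily market revenue (EUR):", round(-res.fun, 2))
```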
Procedia PDF Downloads 74
6731 Bacterial Flora of the Anopheles Fluviatilis S. L. in an Endemic Malaria Area in Southeastern Iran for Candidate Paratransgenesis Strains
Authors: Seyed Hassan Moosa-kazemi, Jalal Mohammadi Soleimani, Hassan Vatandoost, Mohammad Hassan Shirazi, Sara Hajikhani, Roonak Bakhtiari, Morteza Akbari, Siamak Hydarzadeh
Abstract:
Malaria is an infectious disease and is considered one of the most important health problems in the southeast of Iran. Iran is in the malaria elimination phase, and new tools are needed for vector control. Paratransgenesis is a new way to interrupt the life cycle of the malaria parasite. In this study, the microflora of the surface and gut of various stages of Anopheles fluviatilis James, one of the important malaria vectors, was studied using biochemical and molecular techniques during 2013-2014. Twelve bacterial species were found, including Providencia rettgeri, Morganella morganii, Enterobacter aerogenes, Pseudomonas oryzihabitans, Citrobacter braakii, Citrobacter freundii, Aeromonas hydrophila, Klebsiella oxytoca, Citrobacter koseri, Serratia fonticola, Enterobacter sakazakii and Yersinia pseudotuberculosis. The species Alcaligenes faecalis, Providencia vermicola and Enterobacter hormaechei were identified in various stages of the vector and confirmed by biochemical and molecular techniques. We found Providencia rettgeri to be a proper candidate for paratransgenesis.
Keywords: Anopheles fluviatilis, bacteria, malaria, paratransgenesis, Southern Iran
Procedia PDF Downloads 491
6730 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms
Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee
Abstract:
Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. In order to reduce the failure probability of composites in service, techniques to assess the condition of the composites and prevent continual growth of fiber damage are required. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures have been developed in the last two decades. These methods, based on analytical approaches, are limited in dealing with complex systems, primarily because of their difficulties in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristic techniques and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA) methods, and neural networks (NN), and have promisingly applied these methods to the field of structural identification. Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems and can make a global solution search possible, as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect the fiber property variation of laminated composite plates from the micromechanical point of view. The finite element model is used to study the free vibrations of laminated composite plates for fiber stiffness degradation. In order to solve the inverse problem using the combined method, this study uses only the first mode shapes of the structure for the measured frequency data. In particular, this study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences
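The inverse problem described here is typically posed as a GA search for the stiffness distribution whose predicted natural frequencies best match the measured ones. The Python sketch below illustrates that loop on a toy spring-mass chain standing in for the ABAQUS/FEM model; the forward model, GA settings and "measured" data are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6                                      # number of element stiffnesses to identify

def natural_frequencies(k):
    """Toy forward model: fixed-free spring-mass chain with unit masses,
    standing in for the FEM modal analysis of the laminated plate."""
    K = np.zeros((n, n))
    for j, kj in enumerate(k):             # spring j connects node j-1 (or ground) to node j
        K[j, j] += kj
        if j > 0:
            K[j - 1, j - 1] += kj
            K[j - 1, j] -= kj
            K[j, j - 1] -= kj
    return np.sqrt(np.linalg.eigvalsh(K))  # rad/s for unit masses

# "Measured" frequencies from a damaged state (element 2 degraded to 40%).
k_true = np.full(n, 100.0); k_true[2] *= 0.4
f_meas = natural_frequencies(k_true)

def fitness(k):
    return -np.linalg.norm(natural_frequencies(k) - f_meas)

def tournament(pop, fit):
    i, j = rng.integers(0, len(pop), 2)
    return pop[i] if fit[i] > fit[j] else pop[j]

# Minimal real-coded GA: elitism, tournament selection, blend crossover, sparse mutation.
pop = rng.uniform(20.0, 120.0, size=(60, n))
for gen in range(200):
    fit = np.array([fitness(ind) for ind in pop])
    new = [pop[np.argmax(fit)].copy()]
    while len(new) < len(pop):
        p1, p2 = tournament(pop, fit), tournament(pop, fit)
        w = rng.uniform(0.0, 1.0, n)
        child = w * p1 + (1.0 - w) * p2
        child += rng.normal(0.0, 3.0, n) * (rng.random(n) < 0.2)
        new.append(np.clip(child, 1.0, 200.0))
    pop = np.array(new)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("identified element stiffnesses:", np.round(best, 1))
print("true element stiffnesses      :", k_true)
```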
Procedia PDF Downloads 273
6729 Algorithmic Skills Transferred from Secondary CSI Studies into Tertiary Education
Authors: Piroska Biró, Mária Csernoch, János Máth, Kálmán Abari
Abstract:
Testing the first year students of Informatics at the University of Debrecen revealed that students start their tertiary studies in programming with a low level of programming knowledge and algorithmic skills. The possible reasons which lead the students to this very unfortunate result were examined. The results of the test were compared to the students’ results in the school leaving exams and to their self-assessment values. It was found that there is only a slight connection between the students’ results in the test and in the school leaving exams, especially at intermediate level. Beyond this, the school leaving exams do not seem to enable students to evaluate their own abilities.
Keywords: deep and surface approaches, metacognitive abilities, programming and algorithmic skills, school leaving exams, tracking code
Procedia PDF Downloads 384
6728 Sustainable Tourism Management in Taiwan: Using Certification and KPI Indicators to Develop Sustainable Tourism Experiences
Authors: Shirley Kuo
Abstract:
The main purpose of this study is to develop sustainability indicators for Taiwan and to use the Delphi method to determine how our tourist areas can progress in a sustainable way. A great deal of infrastructure and policy is needed to develop tourist areas, and proper KPI indicators can reduce the destruction of the natural and ecological environment. This study first reviews foreign certification experiences, because Taiwan is currently in the development stage; the methodology section then explains the in-depth interviews conducted using the Delphi method, followed by a discussion of which KPI indicators Taiwan currently needs. The current progress of this study is a deep understanding of national sustainable tourism certification and KPI indicators.
Keywords: sustainable tourism, certification, KPI indicators, Delphi method
Procedia PDF Downloads 332
6727 Comparing Deep Architectures for Selecting Optimal Machine Translation
Authors: Despoina Mouratidis, Katia Lida Kermanidis
Abstract:
Machine translation (MT) is a very important task in Natural Language Processing (NLP). MT evaluation is crucial in MT development, as it constitutes the means to assess the success of an MT system and also helps improve its performance. Several methods have been proposed for the evaluation of MT systems. Some of the most popular ones in automatic MT evaluation are score-based, such as the BLEU score, and others are based on lexical or syntactic similarity between the MT outputs and the reference, involving higher-level information like part-of-speech (POS) tagging. This paper presents a language-independent machine learning framework for classifying pairwise translations. This framework uses vector representations of two machine-produced translations, one from a statistical machine translation model (SMT) and one from a neural machine translation model (NMT). The vector representations consist of automatically extracted word embeddings and string-like language-independent features. These vector representations are used as input to a multi-layer neural network (NN) that models the similarity between each MT output and the reference, as well as between the two MT outputs. To evaluate the proposed approach, a professional translation and a "ground-truth" annotation are used. The parallel corpora used are English-Greek (EN-GR) and English-Italian (EN-IT), in the educational domain and of informal genres (video lecture subtitles, course forum text, etc.) that are difficult to translate reliably. Three basic deep learning (DL) architectures were tested in this schema: (i) fully-connected dense, (ii) Convolutional Neural Network (CNN), and (iii) Long Short-Term Memory (LSTM). Experiments show that all tested architectures achieved better results when compared against those of some well-known basic approaches, such as Random Forest (RF) and Support Vector Machine (SVM). Better accuracy results are obtained when LSTM layers are used in our schema. In terms of a balance between the results, better results are obtained when dense layers are used, because the model then correctly classifies more sentences of the minority class (SMT). For a more integrated analysis of the accuracy results, a qualitative linguistic analysis is carried out. In this context, problems have been identified with some figures of speech, such as metaphors, and with certain linguistic phenomena, such as paronyms. It is quite interesting to find out why all the classifiers led to worse accuracy results in Italian as compared to Greek, taking into account that the linguistic features employed are language-independent.
Keywords: machine learning, machine translation evaluation, neural network architecture, pairwise classification
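The pairwise set-up can be reproduced in a few lines: each training example is a feature vector built from the two competing MT outputs and the reference, and the label says which system produced the better translation. The sketch below is a hypothetical illustration using scikit-learn stand-ins (an MLP for the dense network plus the RF and SVM baselines); the features and labels are random placeholders, not the embeddings, string features or annotations described above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_pairs, dim = 1000, 64

# Placeholder pairwise features: e.g. [emb(NMT output), emb(SMT output), emb(reference),
# string-similarity features] concatenated per sentence pair.
X = rng.normal(size=(n_pairs, dim))
y = rng.integers(0, 2, size=n_pairs)   # 1 = NMT judged better, 0 = SMT (the minority class in practice)

models = {
    "dense (MLP)": MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0),
    "RF baseline": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM baseline": SVC(kernel="rbf"),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name:12s} 5-fold accuracy: {acc:.3f}")
```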
Procedia PDF Downloads 132
6726 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management
Authors: Chokri Slim
Abstract:
The objective of this research is to answer the following question: Is there a significant difference between a regression model and statistical learning models in predicting credit risk management? A Multiple Linear Regression (MLR) model was compared with neural networks, including a Multi-Layer Perceptron (MLP), and a Support Vector Regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of the loan portfolios of banks and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of the empirical tests confirm our claims.
Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines
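A comparison of this kind can be prototyped directly in scikit-learn: fit the MLR, MLP and SVR models on the same bank-level predictors and compare cross-validated errors. The sketch below is a minimal, hypothetical version on synthetic data; the predictors and target are assumptions, not the Tunisian bank dataset.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for bank-level predictors (capital ratio, NPL history, governance scores, ...)
# and a loan-portfolio-quality target.
X, y = make_regression(n_samples=300, n_features=8, noise=10.0, random_state=1)

models = {
    "MLR": LinearRegression(),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=1)),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.3f}")
```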
Procedia PDF Downloads 150
6725 A Process FMEA in Aero Fuel Pump Manufacturing and Conduct the Corrective Actions
Authors: Zohre Soleymani, Meisam Amirzadeh
Abstract:
Many products are safety-critical, so proactive analysis techniques are vital for them because these techniques try to identify potential failures before the products are produced. Failure Mode and Effects Analysis (FMEA) is an effective tool for identifying probable problems of a product or process, prioritizing them and planning for their elimination. This paper shows the implementation of the FMEA process to identify and remove potential troubles of the aero fuel pump manufacturing process and improve the reliability of subsystems. The different possible causes of failure and their effects, along with the recommended actions, are discussed. FMEA uses the Risk Priority Number (RPN) to determine the risk level. The RPN value depends on the Severity (S), Occurrence (O) and Detection (D) parameters, so these parameters need to be determined. After calculating the RPN for the identified potential failure modes, corrective actions are defined to reduce the risk level according to the assessment strategy and the determined acceptable risk level. Then the FMEA process is performed again and the revised RPN is calculated. The results are presented in the format of a case study. They show the improvement in the manufacturing process and a considerable reduction in the aero fuel pump production risk level.
Keywords: FMEA, risk priority number, aero pump, corrective action
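The RPN calculation itself is simple arithmetic: RPN = S x O x D, with each parameter typically rated on a 1-10 scale and re-scored after corrective actions. A minimal sketch follows; the failure modes, ratings and threshold are invented for illustration only, not the study's actual FMEA worksheet.

```python
# Minimal RPN bookkeeping for a process FMEA.
# Each entry: (failure mode, Severity, Occurrence, Detection), all rated 1-10.
failure_modes = [
    ("impeller bore out of tolerance", 8, 5, 4),
    ("porosity in pump housing casting", 9, 3, 6),
    ("seal assembled with wrong torque", 6, 4, 3),
]
RPN_THRESHOLD = 120   # assumed acceptable-risk limit

for name, s, o, d in failure_modes:
    rpn = s * o * d
    action = "corrective action required" if rpn > RPN_THRESHOLD else "acceptable"
    print(f"{name:38s} RPN = {s}x{o}x{d} = {rpn:3d} -> {action}")

# After corrective actions, Occurrence and Detection are re-scored and the
# revised RPN is recomputed in exactly the same way.
```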
Procedia PDF Downloads 286
6724 Numerical Investigation of Multiphase Flow Structure for the Flue Gas Desulfurization
Authors: Cheng-Jui Li, Chien-Chou Tseng
Abstract:
This study adopts the Computational Fluid Dynamics (CFD) technique to build a multiphase flow numerical model in which the interface between the flue gas and the desulfurization liquid can be traced by the Eulerian-Eulerian model. Inside the tower, the contact of the desulfurization liquid flowing from the spray nozzles with the flue gas flow triggers chemical reactions that remove the sulfur dioxide from the exhaust gas. From experimental observations of the industrial-scale plant, the desulfurization mechanism depends on the mixing level between the flue gas and the desulfurization liquid. In order to significantly improve the desulfurization efficiency, the mixing efficiency and the residence time can be increased by perforated sieve trays. Hence, the purpose of this research is to investigate the flow structure of sieve trays for flue gas desulfurization by numerical simulation. In this study, there is an outlet at the top of the FGD tower to discharge the clean gas, and the FGD tower has a deep tank at the bottom, which is used to collect the slurry liquid. In the major desulfurization zone, the desulfurization liquid and the flue gas have a complex mixing flow. There are four perforated plates in the major desulfurization zone, spaced 0.4 m from each other, and the spray array, which includes 33 nozzles, is placed above the top sieve tray. Each nozzle injects desulfurization liquid that consists of the Mg(OH)2 solution. For each sieve tray, the outside diameter, the hole diameter, and the porosity are 0.6 m, 20 mm and 34.3%. The flue gas that flows into the FGD tower through the space between the major desulfurization zone and the deep tank can finally become clean. The desulfurization liquid and the liquid slurry go to the bottom tank and are discharged as waste. When the desulfurization solution flow impacts the sieve tray, the downward momentum is converted at the upper surface of the sieve tray. As a result, a thin liquid layer can develop above the sieve tray, the so-called slurry layer, and the volume fraction value within the slurry layer is around 0.3-0.7. Therefore, the liquid phase cannot be considered a discrete phase under the Eulerian-Lagrangian framework. Besides, there is a liquid column through the sieve trays. The downward liquid column becomes narrow as it interacts with the upward gas flow. After the flue gas flows into the major desulfurization zone, the flow direction of the flue gas is upward (+y) in the tube between the liquid column and the solid boundary of the FGD tower. As a result, the flue gas near the liquid column may be rolled down to the slurry layer, which develops a vortex or a circulation zone between any two sieve trays. The vortex structure between two sieve trays results in a sufficiently large two-phase contact area. It also increases the number of times that the flue gas interacts with the desulfurization liquid. On the other hand, the sieve trays improve the two-phase mixing, which may improve the SO2 removal efficiency.
Keywords: Computational Fluid Dynamics (CFD), Eulerian-Eulerian Model, Flue Gas Desulfurization (FGD), perforated sieve tray
Procedia PDF Downloads 284
6723 Fuzzy Neuro Approach for Integrated Water Management System
Authors: Stuti Modi, Aditi Kambli
Abstract:
This paper addresses the need for an intelligent water management and distribution system in smart cities to ensure optimal consumption and distribution of water for drinking and sanitation purposes. Water, being a limited resource in cities, requires an effective system for collection, storage and distribution. In this paper, applications of two of the most widely used types of data-driven models, namely artificial neural networks (ANN) and fuzzy logic-based models, to modelling in the water resources management field are considered. The objective of this paper is to review the principles of various types and architectures of neural networks and fuzzy adaptive systems and their applications to integrated water resources management. The final goal of the review is to outline promising directions for the application of, and further research on, AI-related and data-driven techniques, and to demonstrate the applicability of neural networks, fuzzy systems and other machine learning techniques to the practical issues of regional water management. Apart from this, the paper deals with water storage, using ANNs to find the optimum reservoir level and to predict peak daily demands.
Keywords: artificial neural networks, fuzzy systems, peak daily demand prediction, water management and distribution
Procedia PDF Downloads 186
6722 Federated Knowledge Distillation with Collaborative Model Compression for Privacy-Preserving Distributed Learning
Authors: Shayan Mohajer Hamidi
Abstract:
Federated learning has emerged as a promising approach for distributed model training while preserving data privacy. However, the challenges of communication overhead, limited network resources, and slow convergence hinder its widespread adoption. On the other hand, knowledge distillation has shown great potential in compressing large models into smaller ones without significant loss in performance. In this paper, we propose an innovative framework that combines federated learning and knowledge distillation to address these challenges and enhance the efficiency of distributed learning. Our approach, called Federated Knowledge Distillation (FKD), enables multiple clients in a federated learning setting to collaboratively distill knowledge from a teacher model. By leveraging the collaborative nature of federated learning, FKD aims to improve model compression while maintaining privacy. The proposed framework utilizes a coded teacher model that acts as a reference for distilling knowledge to the client models. To demonstrate the effectiveness of FKD, we conduct extensive experiments on various datasets and models. We compare FKD with baseline federated learning methods and standalone knowledge distillation techniques. The results show that FKD achieves superior model compression, faster convergence, and improved performance compared to traditional federated learning approaches. Furthermore, FKD effectively preserves privacy by ensuring that sensitive data remains on the client devices and only distilled knowledge is shared during the training process. In our experiments, we explore different knowledge transfer methods within the FKD framework, including Fine-Tuning (FT), FitNet, Correlation Congruence (CC), Similarity-Preserving (SP), and Relational Knowledge Distillation (RKD). We analyze the impact of these methods on model compression and convergence speed, shedding light on the trade-offs between size reduction and performance. Moreover, we address the challenges of communication efficiency and network resource utilization in federated learning by leveraging the knowledge distillation process. FKD reduces the amount of data transmitted across the network, minimizing communication overhead and improving resource utilization. This makes FKD particularly suitable for resource-constrained environments such as edge computing and IoT devices. The proposed FKD framework opens up new avenues for collaborative and privacy-preserving distributed learning. By combining the strengths of federated learning and knowledge distillation, it offers an efficient solution for model compression and convergence speed enhancement. Future research can explore further extensions and optimizations of FKD, as well as its applications in domains such as healthcare, finance, and smart cities, where privacy and distributed learning are of paramount importance.
Keywords: federated learning, knowledge distillation, knowledge transfer, deep learning
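The distillation step at the core of such a framework is commonly implemented as a weighted sum of a soft-label KL term against the teacher and a hard-label cross-entropy term. The PyTorch sketch below is a generic, hypothetical version of that client-side loss, not the FKD implementation itself; the temperature and weighting are illustrative.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Generic knowledge-distillation loss: soft targets from the teacher
    (KL divergence at temperature T) blended with hard-label cross-entropy."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: a small client model distilling from (frozen) teacher outputs.
student = torch.nn.Linear(32, 10)
x = torch.randn(16, 32)
labels = torch.randint(0, 10, (16,))
teacher_logits = torch.randn(16, 10)          # stand-in for the shared teacher's outputs

loss = distillation_loss(student(x), teacher_logits, labels)
loss.backward()
print("distillation loss:", float(loss))
```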
Procedia PDF Downloads 75
6721 Modelling and Simulation of Cascaded H-Bridge Multilevel Single Source Inverter Using PSIM
Authors: Gaddafi Sani Shehu, Tankut Yalcınoz, Abdullahi Bala Kunya
Abstract:
Multilevel inverters, such as flying capacitor, diode-clamped, and cascaded H-bridge inverters, are very popular, particularly in medium- and high-power applications. This paper focuses on a cascaded H-bridge module using a single direct current (DC) source in order to generate an 11-level output voltage. The novel approach reduces the number of switches and gate drivers in comparison with a conventional method. The anticipated topology produces more accurate results with an isolation transformer at high switching frequency. Different modulation techniques can be used for the multilevel inverter, but this work features the modulation technique known as selective harmonic elimination (SHE). This modulation approach reduces the number of carriers, with a reduction in switching losses and Total Harmonic Distortion (THD), thereby increasing Power Quality (PQ). Based on the simulation results obtained, it appears that SHE has the ability to eliminate selected harmonics by chopping off the fundamental output component. The performance evaluation of the proposed cascaded multilevel inverter is performed using the PSIM simulation package, and a THD of 0.94% is obtained.
Keywords: cascaded H-bridge multilevel inverter, power quality, selective harmonic elimination
Procedia PDF Downloads 419
6720 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze and propose alternative design methodologies for a futuristic Black Box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe diagnostic tool for accident investigation and location of debris in real time. In this paper, an attempt is made to improve the existing flight data recording techniques and improve upon design considerations for a futuristic FDR to overcome the trauma of not being able to locate the black box. Since modern-day communication and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the attempt made in this paper to develop a failsafe recording technique is feasible. Further, data fusion/data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
6719 3D Model Completion Based on Similarity Search with Slim-Tree
Authors: Alexis Aldo Mendoza Villarroel, Ademir Clemente Villena Zevallos, Cristian Jose Lopez Del Alamo
Abstract:
With the advancement of technology, it is now possible to scan entire objects and obtain their digital representation using point clouds or polygon meshes. However, some objects may be broken or have missing parts; thus, several methods focused on this problem have been proposed based on Geometric Deep Learning, such as GCNN, ACNN and PointNet, among others. In this article, an approach from a different paradigm is proposed, using metric data structures to index global descriptors in the spectral domain and allow the recovery of a set of similar models in polynomial time, and then using the Iterative Closest Point algorithm to recover the parts of the incomplete model from the geometry and topology of the retrieved model with the smallest Hausdorff distance.
Keywords: 3D reconstruction method, point cloud completion, shape completion, similarity search
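Once the Slim-Tree similarity search returns candidate models, the candidate that best matches the partial scan can be selected by symmetric Hausdorff distance before the ICP alignment and completion step. The sketch below is a hypothetical illustration of that ranking with SciPy on synthetic point clouds; it omits the descriptor indexing and ICP itself.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(3)

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point clouds of shape (n, 3) and (m, 3)."""
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

# Partial scan: a unit-sphere cloud with a cap missing, simulating a broken model.
sphere = rng.normal(size=(2000, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)
partial = sphere[sphere[:, 2] < 0.6]

# Candidate models returned by the similarity search (synthetic stand-ins).
candidates = {
    "sphere": sphere,
    "ellipsoid": sphere * np.array([1.0, 1.0, 1.6]),
    "cube": rng.uniform(-1, 1, size=(2000, 3)),
}

ranked = sorted(candidates, key=lambda k: hausdorff(partial, candidates[k]))
for name in ranked:
    print(f"{name:10s} Hausdorff = {hausdorff(partial, candidates[name]):.3f}")
print("best match for completion:", ranked[0])
```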
Procedia PDF Downloads 121
6718 Debate, Discontent and National Identity in a Secular State
Authors: Man Bahadur Shahu
Abstract:
Secularism has been a controversial, debated and misinterpreted issue since its endorsement in the 2007 constitution of Nepal. Unprecedented acts have been seen favoring and disfavoring secularism within the public domain, which creates fallacies and suspicions about the rationalization and modernization process. This paper highlights three important points: first, secularization suddenly ruptures the silence and institutional decline of religion within the state. Second, state efforts on secularism simultaneously foster state neutrality and state separation from religious institutions, which amplify the recognition of all religious groups through the equal treatment of their festivities, rituals, and practices. Third, no state would be completely secular because of the deep-rooted mindsets and dispositions tied to people's own religious faiths and beliefs, which largely fuel intergroup conflict, dispute, riot and turbulence in the post-secular period in the name of proselytizing and conversion.
Keywords: conflict, proselytizing, religion, secular
Procedia PDF Downloads 153
6717 Modeling and Simulation of Underwater Flexible Manipulator as Rayleigh Beam Using Bond Graph
Authors: Sumit Kumar, Sunil Kumar, Chandan Deep Singh
Abstract:
This paper presents the modeling and simulation of a flexible robot in an underwater environment. The underwater environment contrasts completely with the ground or space environment. A robot in an underwater situation is subjected to various dynamic forces such as buoyancy forces and hydrostatic and hydrodynamic forces. The underwater robot is modeled as a Rayleigh beam. The developed model further allows estimating the deflection of the tip in two directions. The complete dynamics of the underwater robot is analyzed, which is the main focus of this investigation. The control of the robot trajectory is not discussed in this paper. Simulation is performed using the Symbol Shakti software.
Keywords: bond graph modeling, dynamics modeling, Rayleigh beam, underwater robot
Procedia PDF Downloads 587
6716 Electrochemical Corrosion and Mechanical Properties of Structural Materials for Oil and Gas Applications in Simulated Deep-Sea Well Environments
Authors: Turin Datta, Kisor K. Sahu
Abstract:
Structural materials used in today's oil and gas exploration and drilling of both onshore and offshore wells must possess superior tensile properties and excellent resistance to corrosive degradation, which includes general corrosion, localized corrosion (pitting and crevice) and environment-assisted cracking such as stress corrosion cracking and hydrogen embrittlement. High Pressure and High Temperature (HPHT) wells are typically operated at temperatures and pressures that can exceed 300-350 °F and 10,000 psi (69 MPa), respectively, which necessitates the use of exotic materials in these exotic sources of natural resources. This research investigation is focused on the evaluation of the tensile properties and corrosion behavior of AISI 4140 High-Strength Low Alloy (HSLA) steel, possessing a tempered martensitic microstructure, and Duplex 2205 Stainless Steel (DSS), having austenitic and ferritic phases. The selection of these two alloys is primarily based on economic considerations, as 4140 HSLA is cheaper when compared to DSS 2205. Due to the harsh, aggressive chemical species encountered in deep oil and gas wells, such as chloride ions (Cl-), carbon dioxide (CO2) and hydrogen sulphide (H2S), along with other mineral organic acids, DSS 2205, having a dual-phase microstructure, can mitigate the degradation resulting from the presence of both chloride ions (Cl-) and hydrogen simultaneously. Tensile property evaluation indicates a ductile failure of DSS 2205, whereas 4140 HSLA exhibits quasi-cleavage fracture due to the phenomenon of 'tempered martensite embrittlement'. From the potentiodynamic polarization testing, it is observed that DSS 2205 has higher corrosion resistance than 4140 HSLA; the former exhibits passivity, signifying resistance to localized corrosion, while the latter exhibits active dissolution over the entire environmental parameter space that was tested. From the Scanning Electron Microscopy (SEM) evaluation, it is understood that stable pits appear in DSS 2205 only when the temperature exceeds the critical pitting temperature (CPT). SEM observation of the corroded 4140 HSLA specimen tested in aqueous 3.5 wt.% NaCl solution reveals intergranular cracking, which appears due to the adsorption and diffusion of hydrogen during polarization, thus causing hydrogen-induced cracking/hydrogen embrittlement. General corrosion testing of DSS 2205 in acidic brine (pH~3.0) solution at ambient temperature using coupons indicates no weight loss even after three months, whereas the corrosion rate of AISI 4140 HSLA is significantly higher after one month of testing.
Keywords: DSS 2205, polarization, pitting, SEM
Procedia PDF Downloads 264
6715 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection techniques, including human and outer shape detection. Next, we extract valuable information in terms of cues. We extract two distinct features: fuzzy local binary patterns and sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification, we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH Multi-view Football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% on the AAMAZ and KTH Multi-view Football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
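The feature-plus-classifier pipeline outlined above (local binary pattern histograms fed to a random forest) can be prototyped as below with scikit-image and scikit-learn. This is a generic, hypothetical illustration on random frames; it does not include the fuzzy LBP variant, the GRASP feature selection, or the benchmark datasets.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
P, R, n_bins = 8, 1.0, 10          # uniform LBP with 8 neighbours -> P + 2 code values

def lbp_histogram(frame):
    """Histogram of uniform LBP codes as a fixed-length texture descriptor."""
    codes = local_binary_pattern(frame, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Stand-in data: 200 grayscale 'frames' with 4 fake action classes.
frames = rng.integers(0, 256, size=(200, 64, 64)).astype(np.uint8)
labels = rng.integers(0, 4, size=200)

X = np.array([lbp_histogram(f) for f in frames])
clf = RandomForestClassifier(n_estimators=300, random_state=7)
print("5-fold accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```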
Procedia PDF Downloads 36
6714 Design of Transmit Beamspace and DOA Estimation in MIMO Radar
Authors: S. Ilakkiya, A. Merline
Abstract:
Multiple-input multiple-output (MIMO) radar systems use modulated waveforms and directive antennas to transmit electromagnetic energy into a specific volume in space to search for targets. This paper deals with the design of the transmit beamspace matrix and DOA estimation for MIMO radar with collocated antennas. The design of the transmit beamspace matrix is based on minimizing the difference between a desired transmit beampattern and the actual one while enforcing the constraint of uniform power distribution across the transmit array elements. The rotational invariance property is established at the transmit array by imposing a specific structure on the beamspace matrix. Designs based on semidefinite programming and on the spatial-division based design (SDD) are also developed separately. In MIMO radar systems, DOA estimation is an essential process to determine the direction of incoming signals and thus to direct the beam of the antenna array towards the estimated direction. This estimation deals with non-adaptive and adaptive spectral estimation techniques. The design of the transmit beamspace matrix and the spectral estimation techniques are studied through simulation.
Keywords: adaptive and non-adaptive spectral estimation, direction of arrival estimation, MIMO radar, rotational invariance property, transmit, receive beamforming
Procedia PDF Downloads 519
6713 Combining Corpus Linguistics and Critical Discourse Analysis to Study Power Relations in Hindi Newspapers
Authors: Vandana Mishra, Niladri Sekhar Dash, Jayshree Charkraborty
Abstract:
This paper focuses on the application of corpus linguistics techniques to the critical discourse analysis (CDA) of Hindi newspapers. While corpus linguistics is the study of language as expressed in corpora (samples) of 'real world' text, CDA is an interdisciplinary approach to the study of discourse that views language as a form of social practice. CDA has mainly been studied from a qualitative perspective. However, recent studies have begun combining corpus linguistics with CDA in analyzing large volumes of text for the study of existing power relations in society. The corpus under study is also of a sizable amount (1 million words of Hindi newspaper texts), and its analysis requires an alternative analytical procedure. So, we have combined the quantitative approach, i.e. the use of corpus techniques, with CDA's traditional qualitative analysis. In this context, we have focused on keyword analysis, sorting concordance lines of the selected keywords, and calculating collocates of the keywords. We have made use of the Wordsmith Tool for all these analyses. The analysis starts with identifying the keywords in the political news corpus when compared with the main news corpus. The keywords are extracted from the corpus based on their keyness, calculated through statistical tests such as the chi-squared test and the log-likelihood test on the frequent words of the corpus. Some of the top occurring keywords are मोदी (Modi), भाजपा (BJP), कांग्रेस (Congress), सरकार (Government) and पार्टी (Political party). This is followed by the concordance analysis of these keywords, which generates thousands of lines, but we have to select a few lines and examine them based on our objective. We have also calculated the collocates of the keywords based on their Mutual Information (MI) score. Both concordance and collocation help to identify lexical patterns in the political texts. Finally, all these quantitative results derived from the corpus techniques will be subjectively interpreted in accordance with CDA theory to examine the ways in which political news discourse produces social and political inequality, power abuse or domination.
Keywords: critical discourse analysis, corpus linguistics, Hindi newspapers, power relations
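Keyness by the log-likelihood (G2) test compares a word's frequency in the study corpus against a reference corpus relative to the corpus sizes. The Python sketch below is a minimal implementation of that calculation with made-up frequencies; it is not tied to the actual Hindi corpus or to WordSmith's output.

```python
import math

def log_likelihood(freq_study, freq_ref, size_study, size_ref):
    """Dunning's G2 keyness statistic for one word across two corpora."""
    total = freq_study + freq_ref
    e1 = size_study * total / (size_study + size_ref)   # expected count in study corpus
    e2 = size_ref * total / (size_study + size_ref)     # expected count in reference corpus
    g2 = 0.0
    if freq_study > 0:
        g2 += freq_study * math.log(freq_study / e1)
    if freq_ref > 0:
        g2 += freq_ref * math.log(freq_ref / e2)
    return 2.0 * g2

# Hypothetical counts: word frequency in the 1M-word political sub-corpus
# versus a 5M-word reference corpus.
size_study, size_ref = 1_000_000, 5_000_000
for word, f_study, f_ref in [("मोदी", 1200, 900), ("सरकार", 2500, 9000), ("पार्टी", 1800, 4000)]:
    ll = log_likelihood(f_study, f_ref, size_study, size_ref)
    print(f"{word}\tG2 = {ll:.1f}")   # G2 > 3.84 is roughly significant at p < 0.05 (1 d.f.)
```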
Procedia PDF Downloads 224
6712 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry with them a large amount of data that is often not used to inform the final customer. Some of these data, if fittingly identified and used, can enhance the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, buying models have changed: customers are attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability and food information needs. The Internet of Things (IoT) and Analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the many actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support the companies involved in a traceability path, starting from business model analysis and the related business processes, a Framework to Manage Product Data in the Agri-Food Supply Chain for Voluntary Traceability was conceived. Studying each process task and leveraging modeling techniques leads to identifying the information held by different actors along the agri-food supply chain. IoT technologies for data collection and Analytics techniques for data processing supply information useful to increase intra-company efficiency and competitiveness in the market. All the information recovered can be shown through IT solutions and mobile applications, making it accessible to the company, the entire supply chain and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
Procedia PDF Downloads 147
6711 GRCNN: Graph Recognition Convolutional Neural Network for Synthesizing Programs from Flow Charts
Authors: Lin Cheng, Zijiang Yang
Abstract:
Program synthesis is the task of automatically generating programs based on a user specification. In this paper, we present a framework that synthesizes programs from flow charts, which serve as an accurate and intuitive specification. In order to do so, we propose a deep neural network called GRCNN that recognizes the graph structure from its image. GRCNN is trained end-to-end and can predict edge and node information of the flow chart simultaneously. Experiments show that the accuracy rate to synthesize a program is 66.4%, and the accuracy rates to recognize edges and nodes are 94.1% and 67.9%, respectively. On average, it takes about 60 milliseconds to synthesize a program.
Keywords: program synthesis, flow chart, specification, graph recognition, CNN
Procedia PDF Downloads 119
6710 Statistical Discrimination of Blue Ballpoint Pen Inks by Diamond Attenuated Total Reflectance (ATR) FTIR
Authors: Mohamed Izzharif Abdul Halim, Niamh Nic Daeid
Abstract:
Determining the source of pen inks used on a variety of documents is important for forensic document examiners. The examination of inks is often performed to differentiate between inks in order to evaluate the authenticity of a document. A ballpoint pen ink consists of synthetic dyes (acidic and/or basic), pigments (organic and/or inorganic) and a range of additives. Inks of similar color may have different compositions and are frequently the subjects of forensic examinations. This study emphasizes blue ballpoint pen inks available on the market, because it is reported that approximately 80% of questioned document analyses involve ballpoint pen ink. Analytical techniques such as thin layer chromatography, high-performance liquid chromatography, UV-vis spectroscopy, luminescence spectroscopy and infrared spectroscopy have been used in the analysis of ink samples. In this study, the application of Diamond Attenuated Total Reflectance (ATR) FTIR is straightforward and preferable in forensic science, as it requires no sample preparation and minimal analysis time. The data obtained from these techniques were further analyzed using multivariate chemometric methods, which enable the extraction of more information based on the similarities and differences among samples in a dataset. It was indicated that some pens from the same manufacturer can be similar in composition; however, discrete types can be significantly different.
Keywords: ATR FTIR, ballpoint, multivariate chemometric, PCA
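A typical chemometric workflow for such spectra is to stack the ATR-FTIR absorbance vectors, standardize them and project them onto a few principal components, where inks of different formulation separate into clusters. The sketch below is a hypothetical illustration with synthetic spectra using scikit-learn; it does not use real ink data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
wavenumbers = np.linspace(600, 4000, 850)        # cm^-1 axis (illustrative)

def fake_spectrum(peaks):
    """Synthetic absorbance spectrum: sum of Gaussian bands plus noise."""
    s = sum(h * np.exp(-((wavenumbers - c) / w) ** 2) for c, h, w in peaks)
    return s + rng.normal(0, 0.01, wavenumbers.size)

# Two hypothetical blue-ink formulations, 15 replicate spectra each.
ink_a = [fake_spectrum([(1580, 1.0, 40), (1170, 0.6, 30)]) for _ in range(15)]
ink_b = [fake_spectrum([(1600, 0.9, 45), (1370, 0.7, 25)]) for _ in range(15)]
X = np.vstack(ink_a + ink_b)
labels = ["A"] * 15 + ["B"] * 15

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for lab in ("A", "B"):
    pts = scores[[i for i, l in enumerate(labels) if l == lab]]
    print(f"ink {lab}: mean PC1 = {pts[:, 0].mean():.2f}, mean PC2 = {pts[:, 1].mean():.2f}")
```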
Procedia PDF Downloads 457