Search results for: optimization methods
15489 Reactive Power Cost Evaluation with FACTS Devices in Restructured Power System
Authors: A. S. Walkey, N. P. Patidar
Abstract:
It is not always economical to provide reactive power using synchronous alternators. The cost of reactive power can be minimized by optimally placing FACTS devices in power systems. In this paper, a hybrid Particle Swarm Optimization–Sequential Quadratic Programming (PSO-SQP) algorithm is applied to minimize the cost of reactive power generation along with real power generation while alleviating bus voltage violations. The effectiveness of the proposed approach is tested on the IEEE 14-bus system. In addition to synchronous generators, the use of FACTS devices is also proposed to meet the reactive power demands of the power system.
Keywords: reactive power, reactive power cost, voltage security margins, capability curve, FACTS devices
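The hybrid PSO-SQP idea described in the abstract — a global particle swarm search whose best solution seeds a local gradient-based refinement — can be sketched on a toy cost surface. This is an illustrative reconstruction, not the authors' implementation: the quadratic cost, the plain gradient-descent stand-in for the SQP stage, and all parameter values are assumptions.

```python
import random

def pso_then_refine(f, grad, dim=2, n_particles=20, iters=60, seed=1):
    """Global PSO search followed by a local gradient refinement
    (a simple stand-in for the SQP stage)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-10, 10) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive pull
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social pull
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    # local refinement: plain gradient descent here, standing in for SQP
    x = gbest[:]
    for _ in range(200):
        g = grad(x)
        x = [xi - 0.1 * gi for xi, gi in zip(x, g)]
    return x

# hypothetical "generation cost" surface with known minimum at (3, -1)
cost = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
cost_grad = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
```

In the paper's setting the cost function would instead encode real and reactive generation costs with voltage-violation penalties, and SQP would handle the power-flow constraints that this unconstrained sketch omits.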
Procedia PDF Downloads 506
15488 Cross-Layer Design of Event-Triggered Adaptive OFDMA Resource Allocation Protocols with Application to Vehicle Clusters
Authors: Shaban Guma, Naim Bajcinca
Abstract:
We propose an event-triggered algorithm for the solution of a distributed optimization problem by means of the projected subgradient method. Thereby, we invoke an OFDMA resource allocation scheme by applying an event-triggered sensitivity analysis at the access point. The optimal resource assignment of the subcarriers to the involved wireless nodes is carried out by considering the sensitivity analysis of the overall objective function, as defined by the control of vehicle clusters, with respect to the information exchange between the nodes.
Keywords: consensus, cross-layer, distributed, event-triggered, multi-vehicle, protocol, resource, OFDMA, wireless
Procedia PDF Downloads 331
15487 Patents as Indicators of Innovative Environment
Authors: S. Karklina, I. Erins
Abstract:
The main problem addressed is Latvia's very low innovation performance. Since Latvia is a Member State of the European Union, it must fulfill the targets that have been set and improve its innovation results. Universities are among the main actors providing a country's innovative capacity, and university, industry, and government need to cooperate to achieve the best results. Intellectual property is one of the indicators used to determine the innovation level of a country or organization, and patents are one characteristic of intellectual property. The objective of the article is to determine the indicators characterizing the innovative environment in Latvia and the influence of the development of universities on them. The methods used in the article to achieve these objectives are quantitative and qualitative analysis of the literature, statistical data analysis, and graphical analysis methods.
Keywords: HEI, innovations, Latvia, patents
Procedia PDF Downloads 315
15486 Solving the Pseudo-Geometric Traveling Salesman Problem with the “Onion Husk” Algorithm
Authors: Boris Melnikov, Ye Zhang, Dmitrii Chaikovskii
Abstract:
This study explores the pseudo-geometric version of the extensively researched Traveling Salesman Problem (TSP), proposing a novel generalization of existing algorithms which are traditionally confined to the geometric version. By adapting the "onion husk" method and introducing auxiliary algorithms, this research fills a notable gap in the existing literature. Through computational experiments using randomly generated data, several metrics were analyzed to validate the proposed approach's efficacy. Preliminary results align with expected outcomes, indicating a promising advancement in TSP solutions.
Keywords: optimization problems, traveling salesman problem, heuristic algorithms, “onion husk” algorithm, pseudo-geometric version
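The "onion husk" idea — peeling successive convex hulls off the point set and stitching the layers into a tour — can be sketched for the plain geometric case. This is an illustrative sketch only: the naive layer concatenation below is not the authors' merge rule, and none of their auxiliary algorithms or the pseudo-geometric generalization are reproduced.

```python
def convex_hull(pts):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return list(pts)
    cross = lambda o, a, b: (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def onion_husk_tour(pts):
    """Peel nested convex hulls ('husks') off the point set and
    concatenate the layers into a crude closed tour that visits
    every point exactly once."""
    remaining, tour = list(pts), []
    while remaining:
        hull = convex_hull(remaining)
        tour.extend(hull)
        remaining = [p for p in remaining if p not in hull]
    return tour
```

A practical solver would splice adjacent husks by a cheapest-insertion rule rather than simple concatenation, but the peeling structure is the same.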
Procedia PDF Downloads 207
15485 The Effect of Acquisition and Reconstruction Parameters on the Quality of SPECT Tomographic Images with Attenuation and Scatter Correction
Authors: N. Boutaghane, F. Z. Tounsi
Abstract:
Many physical and technological factors degrade SPECT images, both qualitatively and quantitatively. Technological advances alone do not guarantee improved performance of the tomographic gamma camera in terms of detection, collimation, reconstruction, and correction of tomographic images. One must first master the choice of the various acquisition and reconstruction parameters accessible in clinical practice, and use attenuation and scatter correction methods, to optimize image quality while minimizing the dose received by the patient. In this work, a qualitative and quantitative evaluation of tomographic images is performed based on the acquisition parameters (counts per projection) and reconstruction parameters (filter type and associated cutoff frequency). In addition, methods for correcting physical effects such as attenuation and scatter, which degrade image quality and prevent precise quantification of the reconstructed slices, are presented. Two approaches to attenuation and scatter correction are implemented: attenuation correction by the Chang method with a filtered back-projection reconstruction algorithm, and scatter correction by the Jaszczak subtraction method. Our results serve as recommendations that make it possible to determine the origin of the different artifacts observed both in quality-control tests and in clinical images.
Keywords: attenuation, scatter, reconstruction filter, image quality, acquisition and reconstruction parameters, SPECT
Procedia PDF Downloads 453
15484 Evaluating the Performance of Color Constancy Algorithm
Authors: Damanjit Kaur, Avani Bhatia
Abstract:
Color constancy is significant for human vision since color is a pictorial cue that helps in solving different vision tasks such as tracking, object recognition, or categorization. Therefore, several computational methods have tried to simulate human color constancy abilities to stabilize machine color representations. Two different kinds of methods have been used, i.e., normalization and constancy. While color normalization creates a new representation of the image by canceling illuminant effects, color constancy directly estimates the color of the illuminant in order to map the image colors to a canonical version. Color constancy is the capability to determine the colors of objects independent of the color of the light source. This research work studies the best-known color constancy algorithms, such as white patch and gray world.
Keywords: color constancy, gray world, white patch, modified white patch
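The two classical algorithms named in the abstract are simple enough to state directly. The sketch below follows the standard formulations (gray world: each channel's mean is mapped to the global mean; white patch/max-RGB: each channel's maximum is mapped to white) and is not taken from the paper itself; the 8-bit range is an assumption.

```python
import numpy as np

def gray_world(img):
    """Gray-world correction: scale each channel so its mean
    matches the global mean intensity (the assumed 'gray' scene)."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gray = channel_means.mean()
    return np.clip(img * (gray / channel_means), 0, 255)

def white_patch(img):
    """White-patch (max-RGB) correction: scale each channel so its
    maximum maps to white (255), assuming the brightest patch is white."""
    img = img.astype(np.float64)
    return np.clip(img * (255.0 / img.reshape(-1, 3).max(axis=0)), 0, 255)
```

On an image with a reddish illuminant cast, gray world equalizes the channel means while white patch anchors the brightest values to white; the two disagree whenever the scene violates their respective assumptions.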
Procedia PDF Downloads 319
15483 CO2 Emissions Quantification of the Modular Bridge Superstructure
Authors: Chanhyuck Jeon, Jongho Park, Jinwoong Choi, Sungnam Hong, Sun-Kyu Park
Abstract:
Many industries put emphasis on environmental friendliness as environmental problems are on the rise all over the world. Research on the modular bridge is part of this trend. By performing cross-section optimization and reducing construction duration, this research aims at developing a modular bridge that is both environmentally friendly and economically feasible. However, the difficulty lies in verifying its environmental effectiveness because there have been no field applications of the modular bridge so far. Therefore, this study categorizes the modular bridge by superstructure form and quantifies CO₂ emissions per work type and material for each form to verify the environmental effectiveness of the modular bridge.
Keywords: modular bridge, CO2 emission, environmentally friendly, quantification, carbon emission factor, LCA (Life Cycle Assessment)
Procedia PDF Downloads 555
15482 Variable Selection in a Data Envelopment Analysis Model by Multiple Proportions Comparison
Authors: Jirawan Jitthavech, Vichit Lorchirachoonkul
Abstract:
A statistical procedure using multiple comparisons tests for proportions is proposed for variable selection in a data envelopment analysis (DEA) model. The test statistic in the multiple comparisons is the proportion of efficient decision making units (DMUs) in a DEA model. Three methods of multiple comparisons tests for proportions are used in the proposed statistical procedure of iteratively eliminating the variables in a backward manner: multiple Z tests with Bonferroni correction, multiple tests in a 2×c crosstabulation, and the Marascuilo procedure. Two simulated populations of moderately and lowly correlated variables are used to compare the results of the statistical procedure using the three methods of multiple comparisons with the hypothesis testing of the efficiency contribution measure. From the simulation results, it can be concluded that the proposed statistical procedure using multiple Z tests for proportions with Bonferroni correction clearly outperforms the procedure using the remaining two methods of multiple comparisons as well as the hypothesis testing of the efficiency contribution measure.
Keywords: Bonferroni correction, efficient DMUs, Marascuilo procedure, Pastor et al. method, 2×c crosstabulation
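The best-performing ingredient of the procedure, multiple Z tests for proportions with Bonferroni correction, can be sketched in isolation. The counts below are hypothetical (they stand in for "number of efficient DMUs out of n" under different variable subsets) and the sketch is a generic pairwise test, not the authors' full backward-elimination procedure.

```python
import math

def z_test_two_proportions(x1, n1, x2, n2):
    """Two-sided pooled z test for equality of two proportions; returns the p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF (via erf)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def bonferroni_pairwise(counts, trials, alpha=0.05):
    """All pairwise proportion comparisons tested at the
    Bonferroni-corrected level alpha / m, with m pairs.
    Returns [(pair, p_value, significant), ...]."""
    m = len(counts) * (len(counts) - 1) // 2
    results = []
    for i in range(len(counts)):
        for j in range(i + 1, len(counts)):
            p = z_test_two_proportions(counts[i], trials[i], counts[j], trials[j])
            results.append(((i, j), p, p < alpha / m))
    return results
```

In the paper's procedure, a variable whose removal leaves the proportion of efficient DMUs statistically unchanged would be a candidate for elimination in the next backward step.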
Procedia PDF Downloads 310
15481 Field Scale Simulation Study of Miscible Water Alternating CO2 Injection Process in Fractured Reservoirs
Authors: Hooman Fallah
Abstract:
A vast share of the world's oil reserves lies in naturally fractured reservoirs. There are different methods for increasing recovery from fractured reservoirs, and miscible water-alternating-CO2 injection is a good choice among them. In this method, water and CO2 slugs are injected alternately into the reservoir, with CO2 acting as the miscible agent. This paper studies a water injection scenario and miscible injection of water and CO2 in a two-dimensional, inhomogeneous fractured reservoir. The results show that miscible water-alternating-CO2 injection leads to a 3.95% increase in final oil recovery and a 3.89% decrease in total water production compared to the water injection scenario.
Keywords: simulation study, CO2, water alternating gas injection, fractured reservoirs
Procedia PDF Downloads 291
15480 Optimization of Cutting Parameters during Machining of Fine Grained Cemented Carbides
Authors: Josef Brychta, Jiri Kratochvil, Marek Pagac
Abstract:
The group of progressive cutting materials includes non-traditional, emerging and less-used materials whose efficient use in cutting can lead to a quantum leap in the field of machining. These are essentially "superhard" materials (STM) based on polycrystalline diamond (PCD) and polycrystalline cubic boron nitride (PCBN), high-performance cutting ceramics, and constantly improving fine-grained coated cemented carbides. The latter cutting materials are characterized by two parameters, toughness and hardness. Varying the alloying elements can usually improve only one of these parameters. Reducing the grain size, on the other hand, achieves the seemingly contradictory goal of increasing both hardness and toughness.
Keywords: fine-grained cutting materials, difficult-to-machine materials, optimum utilization, mechanics, manufacturing
Procedia PDF Downloads 299
15479 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models and algorithms for reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process included the following stages, which were reflected in the application: 1) Construction of a graphical scheme of the structural reliability of the system; 2) Transformation of the graphic scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) Description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) Transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) Replacement of logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantitative reliability value; 6) Calculation of the weights of elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems of complex structure is produced. As a result, structural analysis of systems and the research and design of systems with optimal structure are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
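The quantity the ODNF reliability polynomial evaluates — the probability that at least one shortest path of successful functioning has all its elements working — can also be computed by inclusion-exclusion over the minimal path sets. The sketch below uses that equivalent formulation (it is not the orthogonalization algorithm of stage 4), and the example networks and element reliabilities are assumptions.

```python
from itertools import combinations

def system_reliability(path_sets, p):
    """P(at least one minimal path set has all elements working),
    by inclusion-exclusion over the path sets.
    path_sets: list of sets of element names; p: element -> reliability.
    Elements are assumed statistically independent."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for subset in combinations(path_sets, k):
            elems = set().union(*subset)       # union of the chosen paths
            prob = 1.0
            for e in elems:
                prob *= p[e]
            total += (-1) ** (k + 1) * prob    # alternating inclusion-exclusion sign
    return total
```

Inclusion-exclusion is exponential in the number of path sets, which is exactly why the paper's orthogonalization into ODNF (where terms are pairwise disjoint and probabilities simply add) matters for systems of realistic size.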
Procedia PDF Downloads 73
15478 Accelerating Side Channel Analysis with Distributed and Parallelized Processing
Authors: Kyunghee Oh, Dooho Choi
Abstract:
Although there may be no theoretical weakness in a cryptographic algorithm, side channel analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential power analysis (DPA) is one of the most popular such analyses; it computes statistical correlations between secret-key hypotheses and power consumption. It usually requires processing huge amounts of data and takes a long time; for some devices with countermeasures it may take several weeks. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
Keywords: DPA, distributed computing, parallelized processing, side channel analysis
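The core computation being distributed here is a correlation between hypothetical leakage values and measured traces, and it parallelizes cleanly because Pearson correlation reduces to six partial sums that workers can accumulate independently and merge exactly. The sketch below illustrates that decomposition with a thread pool; the trace data is synthetic and this is not the authors' system.

```python
from concurrent.futures import ThreadPoolExecutor
import math

def partial_sums(xs, ys):
    """Sufficient statistics for Pearson correlation over one chunk of traces."""
    n = len(xs)
    return (n, sum(xs), sum(ys), sum(x * x for x in xs),
            sum(y * y for y in ys), sum(x * y for x, y in zip(xs, ys)))

def combine_correlation(chunks):
    """Merge per-chunk partial sums into a single correlation coefficient."""
    n = sx = sy = sxx = syy = sxy = 0.0
    for cn, csx, csy, csxx, csyy, csxy in chunks:
        n += cn; sx += csx; sy += csy
        sxx += csxx; syy += csyy; sxy += csxy
    num = n * sxy - sx * sy
    den = math.sqrt(n * sxx - sx * sx) * math.sqrt(n * syy - sy * sy)
    return num / den

def distributed_corr(xs, ys, workers=4):
    """Split the samples into chunks, compute partial sums in parallel,
    then combine -- the result is exact, not an approximation."""
    size = max(1, len(xs) // workers)
    jobs = [(xs[i:i + size], ys[i:i + size]) for i in range(0, len(xs), size)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        chunks = list(ex.map(lambda j: partial_sums(*j), jobs))
    return combine_correlation(chunks)
```

In a real correlation power analysis, `xs` would be the hypothetical power model for one key guess and `ys` the measured samples, with the same partial-sum merge used across machines rather than threads.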
Procedia PDF Downloads 428
15477 Mediation in Turkey
Authors: Ibrahim Ercan, Mustafa Arikan
Abstract:
In recent years, alternative dispute resolution methods have attracted the attention of many countries' legislators. Instead of resolving disputes through litigation, having the parties themselves put an end to a dispute is more important for the preservation of social peace. Therefore, alternative dispute resolution (ADR) methods have been discussed more intensively in Turkey, as in the rest of the world. Following these discussions, the Mediation Act was adopted on 07.06.2012 and entered into force on 21.06.2013. Under the Mediation Act, only disputes arising from private law may be mediated. Mediation is optional, not compulsory, in Turkish law; the parties are completely free to choose mediation for dispute resolution. Mediators must be lawyers with at least five years of experience; non-lawyers cannot serve as mediators. Beyond the five years of experience, training and success in examinations, especially on body language and psychology, are also required to become a mediator. If the parties reach a settlement as a result of mediation, a document is issued. Under certain circumstances, this document is enforceable, so the parties do not need to apply to the court again; instead, they can have the document executed and thus recover their claims. However, although the Mediation Act has been in force for nearly two years, interest in mediation has not reached the expected level. Therefore, making mediation mandatory for some disputes has been discussed recently. If mediation becomes mandatory and good results follow, the institution may attract serious interest in Turkey. Otherwise, if the results are not satisfying, the mediation method may be abandoned.
Keywords: alternative dispute resolution methods, mediation act, mediation, mediator, mediation in Turkey
Procedia PDF Downloads 365
15476 Multi Objective Near-Optimal Trajectory Planning of Mobile Robot
Authors: Amar Khoukhi, Mohamed Shahab
Abstract:
This paper formulates the optimal control problem of mobile robot motion as a nonlinear programming problem (NLP) solved using a direct method of numerical optimal control. The NLP is initialized with a B-spline whose node locations are optimized using a genetic search. The system acceleration inputs and sampling periods are considered as optimization variables. Different scenarios with different objective weights are implemented and investigated. Interesting results are found in terms of complying with the expected behavior of a mobile robot system and time-energy minimization.
Keywords: multi-objective control, non-holonomic systems, mobile robots, nonlinear programming, motion planning, B-spline, genetic algorithm
Procedia PDF Downloads 369
15475 Application of Adaptive Particle Filter for Localizing a Mobile Robot Using 3D Camera Data
Authors: Maysam Shahsavari, Seyed Jamalaldin Haddadi
Abstract:
There are several approaches to localizing a mobile robot: relative, absolute, and probabilistic. In this paper, the particle filter is used, owing to its simple implementation and the fact that it does not need to know the starting position. This method estimates the position of the mobile robot using a probabilistic distribution, relying on a known map of the environment instead of predicting it, and then updates this estimate by reading input sensors and control commands. To receive information about the surrounding world, such as the distance to obstacles, a Kinect is used, which is much cheaper than a laser range finder. Finally, after explaining the adaptive particle filter method and its implementation in detail, we compare it with the dead reckoning method and show that the particle filter is much more suitable for situations in which a map of the environment is available.
Keywords: particle filter, localization, methods, odometry, Kinect
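The predict-update-resample loop at the heart of the method can be shown in one dimension. This is an illustrative toy (a range sensor against a single landmark on a known map), not the paper's adaptive variant or its Kinect-based measurement model; all noise levels and the map are assumptions.

```python
import random, math

def particle_filter(measurements, moves, n=2000, landmark=10.0,
                    motion_noise=0.2, sensor_noise=0.5, seed=0):
    """Estimate a robot's 1D position from noisy range measurements
    to a landmark at a known map location. Particles start spread
    uniformly: no knowledge of the starting position is needed."""
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, 20.0) for _ in range(n)]
    for move, z in zip(moves, measurements):
        # predict: apply the commanded motion with noise
        particles = [p + move + rng.gauss(0, motion_noise) for p in particles]
        # update: weight each particle by the measurement likelihood,
        # where the expected range is (landmark - position)
        weights = [math.exp(-((landmark - p) - z) ** 2 / (2 * sensor_noise ** 2))
                   for p in particles]
        total = sum(weights) or 1e-300
        weights = [w / total for w in weights]
        # resample particles in proportion to their weights
        particles = rng.choices(particles, weights=weights, k=n)
    return sum(particles) / n   # posterior mean as the position estimate
```

An adaptive variant, as in the paper, would additionally vary the particle count with the spread of the posterior; with a Kinect, the scalar likelihood above becomes a comparison of the depth image against the known map.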
Procedia PDF Downloads 269
15474 A Review Paper for Detecting Zero-Day Vulnerabilities
Authors: Tshegofatso Rambau, Tonderai Muchenje
Abstract:
Zero-day attacks (ZDA) are increasing day by day; many vulnerabilities in systems and software date back decades. Companies keep discovering vulnerabilities in their systems and software and work to release patches and updates. A zero-day vulnerability is a software fault that is not widely known and is unknown to the vendor; attackers work very quickly to exploit such vulnerabilities. These are major security threats with a high success rate because businesses lack the essential safeguards to detect and prevent them. This study focuses on the factors and techniques that can help detect zero-day attacks. There are various methods and techniques for detecting vulnerabilities, and various companies offer penetration testing and smart vulnerability management solutions. As part of the study process, we undertake literature studies on zero-day attacks and detection methods, as well as modeling approaches and simulations.
Keywords: zero-day attacks, exploitation, vulnerabilities
Procedia PDF Downloads 102
15473 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
Procedia PDF Downloads 226
15472 Evaluation of Microbiological Quality and Safety of Two Types of Salads Prepared at Libyan Airline Catering Center in Tripoli
Authors: Elham A. Kwildi, Yahia S. Abugnah, Nuri S. Madi
Abstract:
This study was designed to evaluate the microbiological quality and safety of two types of salads prepared at a catering center affiliated with Libyan Airlines in Tripoli, Libya. Two hundred and twenty-one (221) samples (132 economy-class and 89 first-class) were used in this project, which lasted ten months. Biweekly microbiological tests were performed, which included total plate count (TPC) and total coliforms (TCF), in addition to enumeration and/or detection of some pathogenic bacteria, mainly Escherichia coli, Staphylococcus aureus, Bacillus cereus, Salmonella sp., Listeria sp., and Vibrio parahaemolyticus, using conventional as well as compact dry methods. Results indicated that the TPC of the type 1 salad ranged between <10 and 62 × 10³ CFU/g and between <10 and 36 × 10³ CFU/g, while TCF counts were <10 to 41 × 10³ CFU/g and <10 to 66 × 10² CFU/g, using both methods of detection respectively. On the other hand, the TPC of the type 2 salad was 1 × 10 to 52 × 10³ and <10 to 55 × 10³ CFU/g, and in the range of 1 × 10 to 45 × 10³ CFU/g, and TCF counts were between <10 and 55 × 10³ CFU/g and between <10 and 34 × 10³ CFU/g, using the first and second methods of detection respectively. The pathogens mentioned above were detected in both types of salads, but their levels varied according to the type of salad and the method of detection. The level of Staphylococcus aureus, for instance, was 17.4% using the conventional method versus 14.4% using the compact dry method. Similarly, E. coli was 7.6% and 9.8%, while Salmonella sp. recorded the lowest percentage, i.e., 3% and 3.8%, with the two methods respectively. The first-class salads were also found to contain the same pathogens, but the level of E. coli was relatively higher in this case (14.6% and 16.9%) using the conventional and compact dry methods respectively. Staphylococcus aureus ranked second (13.5% and 11.2%), followed by Salmonella (6.74% and 6.70%). The lowest percentage was for Vibrio parahaemolyticus (4.9%), which was detected in the first-class salads only. The other two pathogens, Bacillus cereus and Listeria sp., were not detected in either salad. Finally, it is worth mentioning that there was a significant decline in TPC and TCF counts, in addition to the disappearance of pathogenic bacteria, after the 6th-7th month of the study, which coincided with the first trial of the HACCP system at the center. The ups and downs in the counts during the early stages of the study reveal a need for corrective measures, including more emphasis on training the personnel to apply the HACCP system effectively.
Keywords: air travel, vegetable salads, foodborne outbreaks, Libya
Procedia PDF Downloads 326
15471 Job Shop Scheduling: Classification, Constraints and Objective Functions
Authors: Majid Abdolrazzagh-Nezhad, Salwani Abdullah
Abstract:
The job-shop scheduling problem (JSSP) is an important decision facing those involved in the fields of industry, economics and management. This problem is a class of combinatorial optimization problem known to be NP-hard. JSSPs deal with a set of machines and a set of jobs with various predetermined routes through the machines, where the objective is to assemble a schedule of jobs that minimizes certain criteria such as makespan, maximum lateness, and total weighted tardiness. Over the past several decades, interest in meta-heuristic approaches to address JSSPs has increased due to the ability of these approaches to generate solutions which are better than those generated from heuristics alone. This article provides the classification, constraints and objective functions imposed on JSSPs that are available in the literature.Keywords: job-shop scheduling, classification, constraints, objective functions
Procedia PDF Downloads 444
15470 Computational Fluid Dynamics Simulation Study of Flow near Moving Wall of Various Surface Types Using Moving Mesh Method
Authors: Khizir Mohd Ismail, Yu Jun Lim, Tshun Howe Yong
Abstract:
The study of flow behavior in an enclosed volume using computational fluid dynamics (CFD) has been around for decades. However, owing to the limitations of adaptive grid methods, flow in an enclosed volume near a moving wall is less explored in CFD. Here, a CFD simulation of flow in an enclosed volume near a moving wall was demonstrated and studied by introducing a moving mesh method, modeled with the Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach. A static enclosed volume with a controlled opening size at the bottom was positioned against a moving, translational wall with sliding mesh features. Controlled variables such as smooth, creviced, and corrugated wall characteristics, the distance between the enclosed volume and the wall, and the speed of the moving wall relative to the enclosed chamber were varied to understand how the flow behaves and reacts between these two geometries. The model simulations were validated against experimental results, and the good agreement with the experimental data provided confidence in the results. This study provides better insight into how flow behaves in an enclosed volume when various wall types in motion are introduced at various distances, and creates potential application opportunities involving adaptive grid methods in CFD.
Keywords: moving wall, adaptive grid methods, CFD, moving mesh method
Procedia PDF Downloads 147
15469 Development of Cost-Effective Sensitive Methods for Pathogen Detection in Community Wastewater for Disease Surveillance
Authors: Jesmin Akter, Chang Hyuk Ahn, Ilho Kim, Jaiyeop Lee
Abstract:
The global coronavirus disease (COVID-19) pandemic is caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). To control the spread of the pandemic, wastewater surveillance has been used to monitor SARS-CoV-2 prevalence in the community. The challenging part of establishing wastewater surveillance is the need for a well-equipped laboratory for wastewater sample analysis. According to many previous studies, reverse transcription-polymerase chain reaction (RT-PCR) based molecular tests are the most widely used and popular detection method worldwide. However, RT-qPCR based approaches for the detection or quantification of SARS-CoV-2 genetic fragments (ribonucleic acid, RNA) in wastewater require a specialized laboratory, skilled personnel, expensive instruments, and a workflow that typically takes 6 to 8 hours to provide results for even a minimal number of samples. Rapid and reliable alternative detection methods are needed to enable less-well-qualified practitioners to set up and provide sensitive detection of SARS-CoV-2 in wastewater at less-specialized regional laboratories. Therefore, scientists and researchers are conducting experiments on rapid detection methods for COVID-19; in some cases the structural and molecular characteristics of SARS-CoV-2 are unknown, and various strategies for the correct diagnosis of COVID-19 proposed by research laboratories are presented in this study. The ongoing research and development of highly sensitive and rapid technologies, namely RT-LAMP, ELISA, biosensors, and GeneXpert, offers a wide range of potential options not only for SARS-CoV-2 detection but also for other viruses. The aim of this study is to discuss the above effective, regionally deployable rapid detection and quantification methods for community wastewater as an essential step in advancing scientific goals.
Keywords: rapid detection, SARS-CoV-2, sensitive detection, wastewater surveillance
Procedia PDF Downloads 85
15468 Molecular Biomonitoring of Bacterial Pathogens in Wastewater
Authors: Desouky Abd El Haleem, Sahar Zaki
Abstract:
This work was conducted to develop a one-step multiplex PCR system for rapid, sensitive, and specific detection of three different bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, and Salmonella spp., directly in wastewater without prior isolation on selective media. As a molecular confirmatory test after isolation of the pathogens by classical microbiological methods, PCR-RFLP of their amplified 16S rDNA genes was performed. The developed protocols significantly improve the ability to detect the three pathogens directly in water sensitively, rapidly, and specifically within a short time, representing a considerable advancement over more time-consuming and less sensitive methods for the identification and characterization of these kinds of pathogens.
Keywords: multiplex PCR, bacterial pathogens, Escherichia coli, Pseudomonas aeruginosa, Salmonella spp.
Procedia PDF Downloads 449
15467 Principal Component Analysis Applied to the Electric Power Systems – Practical Guide; Practical Guide for Algorithms
Authors: John Morales, Eduardo Orduña
Abstract:
Currently, Principal Component Analysis (PCA) theory is used to develop algorithms for Electric Power Systems (EPS). In this context, this paper presents a practical tutorial on this technique, detailing its concept and the on-line and off-line mathematical foundations which are necessary and desirable in EPS algorithms. Features of the eigenvectors which are very useful for real-time processing are explained, showing how it is possible to select these parameters through a direct optimization. In order to show the application of PCA to off-line and on-line signals, a step-by-step example using Matlab commands is presented. Finally, a list of different approaches using PCA is given, and some works which could be analyzed using this tutorial are presented.Keywords: practical guide, on-line, off-line, algorithms, faults
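The off-line/on-line split the tutorial describes — eigendecompose the covariance of historical data off-line, then project new samples on-line — can be sketched briefly. The abstract names Matlab; the Python below is a minimal equivalent, and the data matrix and the 95 % variance threshold are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical measurement matrix: rows are time samples, columns are
# power-system signals (e.g., bus voltages); values are illustrative only.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4)) @ np.array([[2.0, 0.5, 0.1, 0.0],
                                          [0.0, 1.0, 0.2, 0.1],
                                          [0.0, 0.0, 0.5, 0.3],
                                          [0.0, 0.0, 0.0, 0.1]])

# Off-line stage: center the data and eigendecompose its covariance matrix.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]          # eigh returns ascending order
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Select the eigenvectors explaining (here) 95 % of the variance.
explained = eigvals / eigvals.sum()
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1

# On-line stage: project incoming samples onto the retained directions.
scores = Xc @ eigvecs[:, :k]
print(k, scores.shape)
```

The direct-optimization view mentioned in the abstract corresponds to the fact that these eigenvectors maximize retained variance among all orthonormal projections of the given rank.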
Procedia PDF Downloads 563
15466 Investigation of Long-Term Thermal Insulation Performance of Vacuum Insulation Panels with Various Enveloping Methods
Authors: Inseok Yeo, Tae-Ho Song
Abstract:
To practically apply vacuum insulation panels (VIPs) to buildings or home appliances, VIPs must provide a long service life together with outstanding insulation performance. Service lives of VIPs enveloped with Al-foil and with a three-layer Al-metallized envelope are calculated. For the Al-foil envelope, the service life is longer, but edge conduction is too large compared with the Al-metallized envelope. To increase service life even further, the proposed double enveloping method and metal-barrier-added enveloping method are analyzed. The service lives of VIPs employing the two enveloping methods are calculated, and the pressure increase and thermal insulation performance characteristics are investigated. For the metal-barrier-added enveloping method, the increase of effective thermal conductivity with time is close to that of the Al-foil envelope, especially for getter-inserted VIPs. For the double enveloping method, if water vapor is perfectly adsorbed, the effect of service life enhancement becomes much greater. With these methods, the VIP can be guaranteed a service life of more than 20 years.Keywords: vacuum insulation panels, service life, double enveloping, metal-barrier-added enveloping, edge conduction
Procedia PDF Downloads 433
15465 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 2: Condensation and Solidification Experiments on Liquid Waste
Authors: Sou Watanabe, Hiromichi Ogi, Atsuhiro Shibata, Kazunori Nomura
Abstract:
As a part of the STRAD project conducted by JAEA, condensation of radioactive liquid waste containing various chemical compounds using a reverse osmosis (RO) membrane filter was examined for efficient and safe treatment of the liquid wastes accumulated inside hot laboratories. NH4+ ion in the feed solution was successfully concentrated, and the NH4+ concentration in the effluents fell below the target value of 100 ppm. Solidification of simulated aqueous and organic liquid wastes was also tested. Those liquids were successfully solidified by adding cement or coagulants. Nevertheless, optimization of the materials for confinement of chemicals is required for long-term storage of the final solidified wastes.Keywords: condensation, radioactive liquid waste, solidification, STRAD project
Procedia PDF Downloads 158
15464 Comparison of Finite-Element and IEC Methods for Cable Thermal Analysis under Various Operating Environments
Authors: M. S. Baazzim, M. S. Al-Saud, M. A. El-Kady
Abstract:
In this paper, the steady-state ampacity (current carrying capacity) of an underground power cable system is evaluated using analytical and numerical methods for different conditions (depth of cable, spacing between phases, soil thermal resistivity, ambient temperature, wind speed) at two system voltage levels, 132 and 380 kV. The analytical (traditional) method is based on the thermal analysis approach developed by Neher-McGrath, further enhanced by the International Electrotechnical Commission (IEC) and published in standard IEC 60287. The numerical method is the finite element method, applied here through commercial finite-element software.Keywords: cable ampacity, finite element method, underground cable, thermal rating
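The analytical rating referred to above reduces, for a buried AC cable, to a single steady-state heat-balance formula from IEC 60287-1-1. The sketch below is a simplified rendering of that formula; all numerical parameter values are illustrative assumptions, not data from the paper:

```python
import math

def iec60287_ampacity(delta_theta, R, Wd, T1, T2, T3, T4,
                      n=1, lambda1=0.0, lambda2=0.0):
    """Simplified IEC 60287-1-1 steady-state rating of a buried AC cable.

    delta_theta: conductor temperature rise above ambient (K)
    R:  AC conductor resistance at operating temperature (ohm/m)
    Wd: dielectric losses per unit length (W/m)
    T1..T4: thermal resistances of insulation, bedding,
            serving, and surrounding soil (K.m/W)
    n:  number of load-carrying conductors in the cable
    lambda1, lambda2: sheath and armour loss factors
    """
    numerator = delta_theta - Wd * (0.5 * T1 + n * (T2 + T3 + T4))
    denominator = (R * T1
                   + n * R * (1 + lambda1) * T2
                   + n * R * (1 + lambda1 + lambda2) * (T3 + T4))
    return math.sqrt(numerator / denominator)

# Illustrative single-core cable: 90 C conductor, 25 C soil -> 65 K rise.
I = iec60287_ampacity(delta_theta=65.0, R=3.0e-5, Wd=0.5,
                      T1=0.4, T2=0.1, T3=0.05, T4=1.2)
print(round(I), "A")
```

The finite-element approach solves the same heat-conduction problem numerically over the actual geometry, which is why the paper can vary burial depth, phase spacing, and soil resistivity beyond the standard's assumptions.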
Procedia PDF Downloads 379
15463 Survey of Methods for Solutions of Spatial Covariance Structures and Their Limitations
Authors: Joseph Thomas Eghwerido, Julian I. Mbegbu
Abstract:
In modelling environmental processes, we apply multidisciplinary knowledge to explain, explore, and predict the Earth's response to natural and human-induced environmental changes. In the analysis of spatial-time ecological and environmental studies, the spatial parameters of interest are always heterogeneous, which often negates the assumption of stationarity. Hence, the dispersion of transported atmospheric pollutants, landscape or topographic effects, and weather patterns all depend on a good estimate of the spatial covariance. The generalized linear mixed model, although linear in the expected-value parameters, has a likelihood that varies nonlinearly as a function of the covariance parameters. As a consequence, computing estimates for a linear mixed model requires the iterative solution of a system of simultaneous nonlinear equations. In order to predict the variables at unsampled locations, we need good estimates from the sampled variables. The geostatistical methods for solving this spatial problem assume a covariance that is stationary (locally defined) and uniform in space, which is not always valid because spatial processes often exhibit nonstationary covariance and hence have a globally defined covariance. We consider different existing methods for the solution of the spatial covariance of space-time processes at unsampled locations, where the stationary covariance changes with location for multiple time sets, together with some asymptotic properties.Keywords: parametric, nonstationary, kernel, kriging
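The role of the covariance estimate in predicting at unsampled locations, mentioned above, can be illustrated with simple kriging under an assumed stationary exponential covariance. The locations, observed values, and range parameter below are hypothetical, chosen only to show the mechanics:

```python
import numpy as np

def exp_cov(h, sigma2=1.0, rng=2.0):
    """Stationary, isotropic exponential covariance C(h) = sigma^2 exp(-h/range)."""
    return sigma2 * np.exp(-h / rng)

# Hypothetical sampled locations (1-D for brevity) and observed values.
x_obs = np.array([0.0, 1.0, 3.0, 4.5])
z_obs = np.array([1.2, 0.9, -0.3, 0.4])
x_new = 2.0                                   # unsampled location to predict

# Simple kriging (known zero mean): weights w solve  C w = c0.
H = np.abs(x_obs[:, None] - x_obs[None, :])   # pairwise distances
C = exp_cov(H)                                # covariance among observations
c0 = exp_cov(np.abs(x_obs - x_new))           # covariance to the new location
w = np.linalg.solve(C, c0)
z_pred = w @ z_obs
print(z_pred)
```

The limitation the survey discusses is visible here: the whole predictor hinges on `exp_cov` being a valid, globally applicable model, which a nonstationary process violates.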
Procedia PDF Downloads 255
15462 Intelligent and Optimized Placement for CPLD Devices
Authors: Abdelkader Hadjoudja, Hajar Bouazza
Abstract:
PLD/CPLD devices have been widely used for logic synthesis for several decades. Based on a sum-of-product-terms (PTs) architecture, PLDs/CPLDs offer a high degree of flexibility to support various application requirements. They are suitable for large combinational logic and finite state machines, as well as for I/O-intensive designs. CPLDs offer very predictable timing characteristics and are therefore ideal for critical control applications. This paper describes how logic synthesis techniques such as 1) XOR detection, 2) logic doubling, and 3) complementation of a Boolean function are combined, applied, and used to optimize the architecture of CPLD devices based on PAL-like macrocells. Our goal is to use these techniques to minimize the number of macrocells required to implement a circuit and to minimize the delay of the mapped circuit.Keywords: CPLD, doubling, optimization, XOR
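Why XOR detection matters for a PT-based macrocell can be seen by counting minterms: the canonical sum-of-products form of an n-input parity function needs 2^(n-1) product terms, whereas a macrocell with a dedicated XOR gate implements it with a handful. A small sketch of that count (the helper below is illustrative, not the paper's algorithm):

```python
from itertools import product

def sop_minterm_count(f, n):
    """Number of product terms in the canonical sum-of-products form of f,
    i.e., the number of input vectors on which f evaluates to 1."""
    return sum(f(*bits) for bits in product((0, 1), repeat=n))

def parity(*bits):
    """n-input parity: a chain of XORs."""
    acc = 0
    for b in bits:
        acc ^= b
    return acc

# Without XOR detection the PT requirement grows exponentially with n.
for n in (2, 3, 4, 5):
    print(n, sop_minterm_count(parity, n))   # 2, 4, 8, 16 product terms
```

Complementation plays a similar role: when the complement of a function has fewer minterms than the function itself, mapping the complement and inverting at the macrocell output saves product terms.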
Procedia PDF Downloads 282
15461 A Diagnostic Accuracy Study: Comparison of Two Different Molecular-Based Tests (Genotype HelicoDR and Seeplex Clar-H. pylori ACE Detection), in the Diagnosis of Helicobacter pylori Infections
Authors: Recep Kesli, Huseyin Bilgin, Yasar Unlu, Gokhan Gungor
Abstract:
Aim: The aim of this study was to compare the diagnostic values of two different molecular-based tests (GenoType® HelicoDR and Seeplex® H. pylori-ClaR-ACE Detection) in detecting the presence of H. pylori in gastric biopsy specimens. In addition, the study aimed to determine the resistance ratios of H. pylori strains, isolated from gastric biopsy material cultures, against clarithromycin and quinolones, using both genotypic (GenoType® HelicoDR, Seeplex® H. pylori-ClaR-ACE Detection) and phenotypic (gradient strip, E-test) methods. Material and methods: A total of 266 patients who were admitted to Konya Education and Research Hospital, Department of Gastroenterology, with dyspeptic complaints between January 2011 and June 2013 were included in the study. Microbiological and histopathological examinations of biopsy specimens taken from the antrum and corpus regions were performed. The presence of H. pylori in all the biopsy samples was investigated by five different diagnostic methods together: culture (C) (Portagerm pylori-PORT PYL, Pylori agar-PYL, GENbox microaer, bioMerieux, France), histology (H) (Giemsa, Hematoxylin and Eosin staining), rapid urease test (RUT) (CLOtest, Kimberly-Clark, USA), and two different molecular tests: GenoType® HelicoDR (Hain, Germany), based on a DNA strip assay, and Seeplex® H. pylori-ClaR-ACE Detection (Seegene, South Korea), based on multiplex PCR. Antimicrobial resistance of H. pylori isolates against clarithromycin and levofloxacin was determined by the GenoType® HelicoDR, Seeplex® H. pylori-ClaR-ACE Detection, and gradient strip (E-test, bioMerieux, France) methods. Culture positivity alone, or positivity of both histology and RUT together, was accepted as the gold standard for H. pylori positivity. Sensitivity and specificity rates of the two molecular methods used in the study were calculated against the two gold standards previously mentioned.
Results: A total of 266 patients between 16 and 83 years old, of whom 144 (54.1 %) were female and 122 (45.9 %) were male, were included in the study. 144 patients were found to be culture positive, and 157 were positive by H and RUT together. 179 patients were found to be positive with both GenoType® HelicoDR and Seeplex® H. pylori-ClaR-ACE Detection. Sensitivity and specificity rates of the five studied methods were found as follows: C, 80.9 % and 84.4 %; H + RUT, 88.2 % and 75.4 %; GenoType® HelicoDR, 100 % and 71.3 %; and Seeplex® H. pylori-ClaR-ACE Detection, 100 % and 71.3 %. A strong correlation was found between C and H+RUT, between C and GenoType® HelicoDR, and between C and Seeplex® H. pylori-ClaR-ACE Detection (r: 0.644 and p: 0.000; r: 0.757 and p: 0.000; r: 0.757 and p: 0.000, respectively). Of all the 144 isolated H. pylori strains, 24 (16.6 %) were detected as resistant to clarithromycin, and 18 (12.5 %) to levofloxacin. Genotypic clarithromycin resistance was detected in only 15 cases with GenoType® HelicoDR and in 6 cases with Seeplex® H. pylori-ClaR-ACE Detection. Conclusion: In our study, it was concluded that GenoType® HelicoDR and Seeplex® H. pylori-ClaR-ACE Detection were the most sensitive diagnostic methods among all the investigated ones (C, H, and RUT).Keywords: Helicobacter pylori, GenoType® HelicoDR, Seeplex® H. pylori-ClaR-ACE Detection, antimicrobial resistance
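The sensitivity and specificity figures reported above follow from the standard 2x2 contingency-table definitions against a gold standard. A minimal sketch (the counts below are illustrative placeholders, not values reconstructed from the study):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Diagnostic accuracy against a gold standard:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec

# Illustrative counts for a test with no false negatives, as with the
# two molecular tests' 100 % sensitivity pattern described above.
sens, spec = sensitivity_specificity(tp=140, fp=35, fn=0, tn=91)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

Note that a test which flags more patients than the gold standard (here, 179 molecular positives against 144 culture positives) gains sensitivity at the cost of specificity, which is exactly the trade-off visible in the reported 100 % / 71.3 % figures.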
Procedia PDF Downloads 168
15460 Augmented Reality in Teaching Children with Autism
Authors: Azadeh Afrasyabi, Ali Khaleghi, Aliakbar Alijarahi
Abstract:
Training at an early age is very important because of the tremendous changes of adolescence, including the formation of character, physical changes, and other factors. One of the most sensitive groups in this field is children with disabilities, special children who have trouble communicating with their environment. One of the emerging technologies in the field of education that can be used effectively here is augmented reality, where the combination of the real world and virtual images in real time produces new concepts that can facilitate learning. The purpose of this paper is to propose an effective training method for special and disabled children based on augmented reality. In particular, the efficiency of augmented reality in teaching children with autism is considered, and the various aspects of this condition and the different learning methods in this area are examined.Keywords: technology in education, augmented reality, special education, teaching methods
Procedia PDF Downloads 371