Search results for: multiple detection
6899 Comprehensive Study of X-Ray Emission by APF Plasma Focus Device
Authors: M. Habibi
Abstract:
Time-resolved studies of soft and hard X-rays were carried out over a wide range of argon pressures by simultaneously employing an array of eight filtered PIN photodiodes and a scintillation detector. In 50% of the discharges, the soft X-rays are emitted in short multiple pulses corresponding to different compressions, whereas the hard X-rays appear as a single pulse corresponding to only the first strong compression. It should be noted that multiple compressions occur predominantly at low pressures, while high pressures are mostly in the single-compression regime. In 43% of the discharges, at all pressures except the optimum pressure, the first period is characterized by two or more sharp peaks. The X-ray signal intensity during the second and subsequent compressions is much smaller than that of the first compression.
Keywords: plasma focus device, SXR, HXR, PIN diode, argon plasma
Procedia PDF Downloads 408
6898 Effect of Drying on the Concrete Structures
Authors: A. Brahma
Abstract:
The drying of hydraulic materials is unavoidable and leads to significant spontaneous deformations. In this study, we show that it is possible to describe the drying shrinkage of high-performance concrete by a simple expression. A multiple regression model was developed for the prediction of the drying shrinkage of high-performance concrete. The proposed model was assessed by a set of statistical tests. The developed model takes into consideration the main mixing and curing parameters. There was very good agreement between the drying shrinkage predicted by the multiple regression model and the experimental results. The developed model adjusts easily to all hydraulic concrete types.
Keywords: hydraulic concretes, drying, shrinkage, prediction, modeling
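As a rough illustration of the kind of multiple regression model described above, the sketch below fits shrinkage against a few hypothetical mix and storage variables; the predictor names (water-cement ratio, relative humidity, drying time) and all numbers are invented for the example and are not the authors' actual parameters.

```python
# Minimal sketch of a multiple regression model for drying shrinkage,
# assuming hypothetical predictors (w/c ratio, relative humidity, drying time).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 40
X = np.column_stack([
    rng.uniform(0.25, 0.45, n),   # water-cement ratio (assumed predictor)
    rng.uniform(40, 80, n),       # relative humidity in % (assumed predictor)
    rng.uniform(7, 180, n),       # drying time in days (assumed predictor)
])
# Synthetic shrinkage values (microstrain), only to make the sketch runnable.
y = 800 * X[:, 0] - 3 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 20, n)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2 on training data:", r2_score(y, model.predict(X)))
```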
Procedia PDF Downloads 368
6897 A Rapid Code Acquisition Scheme in OOC-Based CDMA Systems
Authors: Keunhong Chae, Seokho Yoon
Abstract:
We propose a code acquisition scheme called improved multiple-shift (IMS) for optical code division multiple access systems, where the optical orthogonal code is used instead of the pseudo noise code. Although the IMS algorithm has a similar process to that of the conventional MS algorithm, it has a better code acquisition performance than the conventional MS algorithm. We analyze the code acquisition performance of the IMS algorithm and compare the code acquisition performances of the MS and the IMS algorithms in single-user and multi-user environments.
Keywords: code acquisition, optical CDMA, optical orthogonal code, serial algorithm
Procedia PDF Downloads 540
6896 Application of Support Vector Machines in Fault Detection and Diagnosis of Power Transmission Lines
Authors: I. A. Farhat, M. Bin Hasan
Abstract:
An approach developed for the protection of power transmission lines using the Support Vector Machine (SVM) technique is presented. In this paper, the SVM technique is utilized for the classification and isolation of faults in power transmission lines. Accurate fault classification and location results are obtained for all possible types of short-circuit faults. As in distance protection, the approach utilizes the post-fault voltage and current samples as inputs. The main advantage of the method introduced here is that it can easily be extended to any power transmission line.
Keywords: fault detection, classification, diagnosis, power transmission line protection, support vector machines (SVM)
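A minimal sketch of SVM-based fault-type classification of the kind described above is given below; the feature construction, label set and training data are synthetic placeholders rather than the paper's actual post-fault voltage and current samples.

```python
# Sketch of SVM-based fault-type classification from post-fault samples.
# Features and labels are synthetic; real inputs would be voltage and
# current samples taken from the protected line after the fault.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n_per_class = 100
fault_types = ["AG", "BC", "ABC"]        # assumed subset of short-circuit fault labels
X, y = [], []
for k, label in enumerate(fault_types):
    # Synthetic 6-dimensional features (3 voltages + 3 currents), class-shifted.
    X.append(rng.normal(loc=k, scale=0.5, size=(n_per_class, 6)))
    y += [label] * n_per_class
X = np.vstack(X)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```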
Procedia PDF Downloads 559
6895 Design of Replication System for Computer-Generated Hologram in Optical Component Application
Authors: Chih-Hung Chen, Yih-Shyang Cheng, Yu-Hsin Tu
Abstract:
Holographic optical elements (HOEs) have recently become some of the most suitable components in optoelectronic technology owing to the requirement for product systems with compact size. Computer-generated holography (CGH) is a well-known technology for HOE production. In some cases, a well-designed diffractive optical element with multifunctional components is also an important issue and is needed for an advanced optoelectronic system. The spatial light modulator (SLM) is one of the key components that has great capability to display CGH patterns and is widely used in various applications, such as image projection systems. As for multifunctional components, such as phase and amplitude modulation of light, a high-resolution hologram with a multiple-exposure procedure is also one of the suitable candidates. However, for holographic recording under multiple exposures, the diffraction efficiency of the final hologram is inevitably lower than that with a single-exposure process. In this study, a two-step holographic recording method, including master hologram fabrication and replicated hologram production, will be designed. Since there exists a reduction factor M² in the diffraction efficiency of multiple-exposure holograms (M multiple exposures), single exposure would be more efficient for hologram replication. In the second step of holographic replication, a stable optical system with one-shot copying is introduced. For commercial applications, one may utilize this concept of holographic copying to obtain duplications of HOEs with higher optical performance.
Keywords: holographic replication, holography, one-shot copying, optical element
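The M² penalty mentioned above can be stated compactly; assuming M exposures that share the recording dynamic range equally, the diffraction efficiency of each multiplexed hologram scales roughly as

```latex
% Approximate efficiency penalty for M equal multiple exposures
% (assumption: equal allocation of the recording dynamic range)
\eta_{\text{multiple}} \approx \frac{\eta_{\text{single}}}{M^{2}}, \qquad M = \text{number of exposures}
```

which is why a single-exposure copy of a well-made master hologram is expected to outperform a directly multiplexed recording.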
Procedia PDF Downloads 156
6894 Simultaneous Determination of p-Phenylenediamine, N-Acetyl-p-phenylenediamine and N,N-Diacetyl-p-phenylenediamine in Human Urine by LC-MS/MS
Authors: Khaled M. Mohamed
Abstract:
Background: p-Phenylenediamine (PPD) is used in the manufacture of hair dyes and in skin decoration. In some developing countries, suicidal, homicidal and accidental cases involving PPD have been recorded. In this work, a sensitive LC-MS/MS method for the determination of PPD and its metabolites N-acetyl-p-phenylenediamine (MAPPD) and N,N-diacetyl-p-phenylenediamine (DAPPD) in human urine has been developed and validated. Methods: PPD, MAPPD and DAPPD were extracted from urine with methylene chloride at alkaline pH. Acetanilide was used as the internal standard (IS). The analytes and IS were separated on an Eclipse XDB-C18 column (150 x 4.6 mm, 5 µm) using a mobile phase of acetonitrile-1% formic acid in gradient elution. Detection was performed by LC-MS/MS using electrospray positive ionization in multiple reaction monitoring mode. The transition ions m/z 109 → 92, m/z 151 → 92, m/z 193 → 92, and m/z 136 → 77 were selected for the quantification of PPD, MAPPD, DAPPD, and IS, respectively. Results: Calibration curves were linear in the range 10–2000 ng/mL for all analytes. The mean recoveries for PPD, MAPPD and DAPPD were 57.62, 74.19 and 50.99%, respectively. Intra-assay and inter-assay imprecisions were within 1.58–9.52% and 5.43–9.45%, respectively, for PPD, MAPPD and DAPPD. Inter-assay accuracies were within -7.43 and 7.36 for all compounds. PPD, MAPPD and DAPPD were stable in urine at −20 °C for 24 hours. Conclusions: The method was successfully applied to the analysis of PPD, MAPPD and DAPPD in urine samples collected from suicidal cases.
Keywords: p-Phenylenediamine, metabolites, urine, LC-MS/MS, validation
Procedia PDF Downloads 355
6893 The Benefits of Security Culture for Improving Physical Protection Systems at Detection and Radiation Measurement Laboratory
Authors: Ari S. Prabowo, Nia Febriyanti, Haryono B. Santosa
Abstract:
The security function known as the Physical Protection System (PPS) serves to detect, delay, and respond. The PPS in the Detection and Radiation Measurement Laboratory needs to be improved continually by using internal resources. Nuclear security culture provides some potential to support this research. The study starts by identifying the weaknesses of the security function and the strengths of the security culture. Secondly, the strengths of the security culture are implemented in the laboratory management. Finally, a simulation was done to measure their effectiveness. Some changes occurred in laboratory personnel behaviors and procedures; all became more prudent. The results showed a good influence of nuclear security culture on laboratory security functions.
Keywords: laboratory, physical protection system, security culture, security function
Procedia PDF Downloads 185
6892 Cyber-Med: Practical Detection Methodology of Cyber-Attacks Aimed at Medical Devices Eco-Systems
Authors: Nir Nissim, Erez Shalom, Tomer Lancewiki, Yuval Elovici, Yuval Shahar
Abstract:
Background: A Medical Device (MD) is an instrument, machine, implant, or similar device that includes a component intended for the diagnosis, cure, treatment, or prevention of disease in humans or animals. Medical devices play increasingly important roles in health services eco-systems, including: (1) patient diagnostics and monitoring; (2) medical treatment and surgery; and (3) patient life support devices and stabilizers. MDs are part of the medical device eco-system and are connected to the network, sending vital information to the internal medical information systems of medical centers that manage this data. Wireless components (e.g. Wi-Fi) are often embedded within medical devices, enabling doctors and technicians to control and configure them remotely. All these functionalities, roles, and uses of MDs make them attractive targets of cyber-attacks launched for many malicious goals; this trend is likely to increase significantly over the next several years, with increased awareness regarding MD vulnerabilities, the enhancement of potential attackers' skills, and expanded use of medical devices. Significance: We propose to develop and implement Cyber-Med, a unique collaborative project of Ben-Gurion University of the Negev and the Clalit Health Services Health Maintenance Organization. Cyber-Med focuses on the development of a comprehensive detection framework that relies on a critical attack repository that we aim to create. Cyber-Med will allow researchers and companies to better understand the vulnerabilities and attacks associated with medical devices, as well as provide a comprehensive platform for developing detection solutions. Methodology: The Cyber-Med detection framework will consist of two independent but complementary detection approaches: one for known attacks, and the other for unknown attacks. These modules incorporate novel ideas and algorithms inspired by our team's domains of expertise, including cyber security, biomedical informatics, advanced machine learning, and temporal data mining techniques. The establishment and maintenance of Cyber-Med's up-to-date attack repository will strengthen the capabilities of Cyber-Med's detection framework. Major Findings: Based on our initial survey, we have already found more than 15 types of vulnerabilities and possible attacks aimed at MDs and their eco-system. Many of these attacks target individual patients who use devices such as pacemakers and insulin pumps. In addition, such attacks are also aimed at MDs that are widely used by medical centers, such as MRIs, CTs, and dialysis machines; the information systems that store patient information; protocols such as DICOM; standards such as HL7; and medical information systems such as PACS. However, current detection tools, techniques, and solutions generally fail to detect both the known and unknown attacks launched against MDs. Very little research has been conducted to protect these devices from cyber-attacks, since most of the development and engineering efforts are aimed at the devices' core medical functionality, the contribution to patients' healthcare, and the business aspects associated with the medical device.
Keywords: medical device, cyber security, attack, detection, machine learning
Procedia PDF Downloads 357
6891 Analysis of a Generalized Sharma-Tasso-Olver Equation with Variable Coefficients
Authors: Fadi Awawdeh, O. Alsayyed, S. Al-Shará
Abstract:
Considering the inhomogeneities of media, the variable-coefficient Sharma-Tasso-Olver (STO) equation is investigated here with the aid of symbolic computation. A newly developed simplified bilinear method is described for the solution of the considered equation. Without any constraints on the coefficient functions, multiple kink solutions are obtained. A parametric analysis is carried out in order to analyze the effects of the coefficient functions on the stability and propagation characteristics of the solitonic waves.
Keywords: Hirota bilinear method, multiple kink solution, Sharma-Tasso-Olver equation, inhomogeneity of media
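For reference, a commonly cited constant-coefficient form of the Sharma-Tasso-Olver equation is shown below; the variable-coefficient generalization studied here replaces the constant α by coefficient functions, whose exact arrangement is not stated in the abstract.

```latex
% A commonly cited constant-coefficient form of the Sharma-Tasso-Olver equation:
u_t + 3\alpha\, u_x^{2} + 3\alpha\, u^{2} u_x + 3\alpha\, u\, u_{xx} + \alpha\, u_{xxx} = 0
```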
Procedia PDF Downloads 517
6890 Multiple Intelligences as Basis for Differentiated Classroom Instruction in Technology Livelihood Education: An Impact Analysis
Authors: Sheila S. Silang
Abstract:
This research seeks to make an impact analysis of multiple intelligences as the basis for differentiated classroom instruction in TLE. It also addresses the felt need of how the TLE subject could be taught effectively, exhausting all possible means. This study seeks the effect of giving differentiated instruction according to the ability of the students with respect to the following objectives: 1. students' technological skills enhancement, 2. learning potential improvements, and 3. better linkage between school and community in soliciting different learning devices and materials for the learners' academic progress. General Luna, Quezon is composed of twenty-seven barangays and has only two public high schools. We are aware that the K-12 curriculum is focused on providing sufficient time for mastery of concepts and skills, developing lifelong learners, and preparing graduates for tertiary education, middle-level skills development, employment, and entrepreneurship. The challenge is: with TLE offering a vast area of specializations, how would multiple intelligences play their vital role as a basis for classroom instruction in meeting the requirements of the said curriculum? The research questions are: 1. To what extent do the respondent students manifest the following types of intelligences: Visual-Spatial, Body-Kinesthetic, Musical, Interpersonal, Intrapersonal, Verbal-Linguistic, Logical-Mathematical and Naturalistic? 2. What media, appropriate to the students' learning styles, should be used: Visual, Printed Words, Sound, Motion, Color or Realia? 3. What is the impact of multiple intelligences as a basis for differentiated instruction in T.L.E. based on the following student abilities: learning characteristics, reading ability, and performance? 4. To what extent do the intelligences of the students relate to their academic performance? The following were the findings derived from the study: In consideration of the vast areas of study of TLE, the importance it plays in the school curriculum coinciding with the expectation of turning students into technologically competent, contributing members of society, either in the field of technical/vocational expertise or entrepreneurial-based competencies, as well as the government's concern for it, we visualize TLE classroom teachers making use of multiple intelligences as a basis for differentiated classroom instruction in teaching the subject. Somehow, multiple intelligence samples such as the Linguistic, Logical-Mathematical, Bodily-Kinesthetic, Interpersonal, Intrapersonal, and Spatial abilities that an individual student may or may not have can be a basis for a TLE teacher's instructional method or design.
Keywords: education, multiple, differentiated classroom instruction, impact analysis
Procedia PDF Downloads 445
6889 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit
Authors: Davit Mirzoyan, Ararat Khachatryan
Abstract:
A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increasing its detection accuracy and decreasing its power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type and p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e. its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.
Keywords: detection, monitoring, process corner, process variation
Procedia PDF Downloads 525
6888 Fault Diagnosis in Induction Motor
Authors: Kirti Gosavi, Anita Bhole
Abstract:
The paper demonstrates the simulation and steady-state performance of a three-phase squirrel-cage induction motor and the detection of a broken rotor bar fault using MATLAB. This simulation model is successfully used in the detection of broken rotor bar faults in induction machines. A dynamic model using a PWM inverter and a mathematical model of the motor are developed. The dynamic simulation of the small-power induction motor is one of the key steps in the validation of the design process of the motor drive system, and it is needed for eliminating inadvertent design errors and the resulting errors in prototype construction and testing. The simulation model will be helpful in detecting faults in a three-phase induction motor using motor current signature analysis.
Keywords: squirrel cage induction motor, pulse width modulation (PWM), fault diagnosis, induction motor
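Motor current signature analysis identifies a broken rotor bar from characteristic sidebands around the supply frequency in the stator-current spectrum; the standard relation, with f_s the supply frequency, s the per-unit slip and k = 1, 2, 3, ..., is

```latex
% Characteristic broken-rotor-bar sideband frequencies in the stator-current spectrum
f_{\text{brb}} = (1 \pm 2ks)\, f_s
```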
Procedia PDF Downloads 633
6887 Cancer Survivor’s Adherence to Healthy Lifestyle Behaviours; Meeting the World Cancer Research Fund/American Institute of Cancer Research Recommendations, a Systematic Review and Meta-Analysis
Authors: Daniel Nigusse Tollosa, Erica James, Alexis Hurre, Meredith Tavener
Abstract:
Introduction: Lifestyle behaviours such as a healthy diet, regular physical activity and maintaining a healthy weight are essential for cancer survivors to improve quality of life and longevity. However, there is no study that synthesizes cancer survivors' adherence to healthy lifestyle recommendations. The purpose of this review was to collate existing data on the prevalence of adherence to healthy behaviours and produce the pooled estimate among adult cancer survivors. Method: Multiple databases (Embase, Medline, Scopus, Web of Science and Google Scholar) were searched for relevant articles published since 2007, reporting cancer survivors' adherence to more than two lifestyle behaviours based on the WCRF/AICR recommendations. The pooled prevalence of adherence to single and multiple behaviours (operationalized as adherence to more than 75% (3/4) of the health behaviours included in a particular study) was calculated using a random effects model. Subgroup analysis of adherence to multiple behaviours was undertaken corresponding to the mean survival years and the year of publication. Results: A total of 3322 articles were generated through our search strategies. Of these, 51 studies matched our inclusion criteria, presenting data from 2,620,586 adult cancer survivors. The highest prevalence of adherence was observed for smoking (pooled estimate: 87%, 95% CI: 85%, 88%) and alcohol intake (pooled estimate: 83%, 95% CI: 81%, 86%), and the lowest was for fiber intake (pooled estimate: 31%, 95% CI: 21%, 40%). Thirteen studies reported the proportion of cancer survivors (all used a simple summative index method) adherent to multiple healthy behaviours, whereby the prevalence of adherence ranged from 7% to 40% (pooled estimate: 23%, 95% CI: 17% to 30%). Subgroup analysis suggests that short-term survivors ( < 5 years survival time) had relatively better adherence to multiple behaviours (pooled estimate: 31%, 95% CI: 27%, 35%) than long-term ( > 5 years survival time) cancer survivors (pooled estimate: 25%, 95% CI: 14%, 36%). Pooling of estimates according to the year of publication (since 2007) also suggests an increasing trend of adherence to multiple behaviours over time. Conclusion: Overall, adherence to multiple lifestyle behaviours was poor (not satisfactory), and it is relatively a greater concern for long-term than for short-term cancer survivors. Cancer survivors need to comply with healthy lifestyle recommendations related to physical activity, fruit and vegetable, fiber, red/processed meat and sodium intake.
Keywords: adherence, lifestyle behaviours, cancer survivors, WCRF/AICR
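A minimal sketch of the pooled-prevalence calculation is given below, using inverse-variance weights and a DerSimonian-Laird between-study variance on the raw proportion scale; the study counts are invented, and the review itself may have pooled on a transformed scale or with dedicated meta-analysis software.

```python
# Sketch of a random-effects pooled prevalence (DerSimonian-Laird), computed on the
# raw proportion scale for simplicity; the study counts below are invented examples.
import numpy as np

events = np.array([30, 55, 120, 18])     # survivors adherent to multiple behaviours (invented)
totals = np.array([150, 200, 400, 90])   # study sample sizes (invented)

p = events / totals
v = p * (1 - p) / totals                 # within-study variance of each proportion
w = 1 / v                                # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of the between-study variance tau^2
p_fe = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fe) ** 2)
df = len(p) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

w_re = 1 / (v + tau2)                    # random-effects weights
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled prevalence: {p_re:.3f} (95% CI {p_re - 1.96*se:.3f} to {p_re + 1.96*se:.3f})")
```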
Procedia PDF Downloads 183
6886 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
The detection of anomalies due to the presence of contaminants, carried out by so-called early detection systems in water treatment plants, has become a critical point that deserves in-depth study for the improvement of these systems and their adaptation to current requirements. The design of these systems requires a detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman's correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis could be considered a vital step in developing advanced models focused on machine learning that allow optimized data management in real time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify the possible correlations between the measured parameters and the presence of the glyphosate contaminant in the single-pass system. The interaction between the initial concentration of glyphosate and the location of the sensors on the reading of the reported parameters was studied.
Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
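Two of the statistical steps named above (Spearman's correlation and k-fold cross-validation) can be prototyped in a few lines; the sensor readings and glyphosate concentrations below are synthetic placeholders, not data from the monitored distribution system.

```python
# Sketch: Spearman correlation between a sensor parameter and glyphosate concentration,
# and k-fold cross-validation of a simple predictive model. All data are synthetic.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score, KFold

rng = np.random.default_rng(42)
n = 120
glyphosate = rng.uniform(0, 10, n)                           # contaminant concentration (arbitrary units)
conductivity = 500 + 12 * glyphosate + rng.normal(0, 5, n)   # hypothetical correlated sensor parameter
ph = 7.2 + rng.normal(0, 0.1, n)                             # hypothetical uncorrelated parameter

rho, pval = spearmanr(conductivity, glyphosate)
print(f"Spearman rho (conductivity vs glyphosate): {rho:.2f}, p = {pval:.1e}")

X = np.column_stack([conductivity, ph])
scores = cross_val_score(LinearRegression(), X, glyphosate,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0), scoring="r2")
print("5-fold cross-validated R^2:", scores.round(2))
```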
Procedia PDF Downloads 121
6885 Real-Time Quantitative Polymerase Chain Reaction Assay for the Detection of microRNAs Using Bi-Directional Extension Sequences
Authors: Kyung Jin Kim, Jiwon Kwak, Jae-Hoon Lee, Soo Suk Lee
Abstract:
MicroRNAs (miRNAs) are a class of endogenous, single-stranded, small, non-protein-coding RNA molecules typically 20-25 nucleotides long. They are thought to regulate the expression of a broad range of other genes by binding to the 3'-untranslated regions (3'-UTRs) of specific mRNAs. The detection of miRNAs is very important for understanding the function of these molecules and in the diagnosis of a variety of human diseases. However, the detection of miRNAs is very challenging because of their short length and the high sequence similarities within miRNA families. So, a simple-to-use, low-cost, and highly sensitive method for the detection of miRNAs is desirable. In this study, we demonstrate a novel bi-directional extension (BDE) assay. In the first step, a specific linear RT primer is hybridized to 6-10 base pairs from the 3'-end of a target miRNA molecule and then reverse transcribed to generate a cDNA strand. After reverse transcription, the cDNA was hybridized to the BDE sequence at its 3'-end; this product served as the PCR template. The PCR template was amplified in a SYBR green-based quantitative real-time PCR. To prove the concept, we used human brain total RNA. It could be detected quantitatively over a range of seven orders of magnitude with excellent linearity and reproducibility. To evaluate the performance of the BDE assay, we contrasted the sensitivity and specificity of the BDE assay against a commercially available poly(A) tailing method, using miRNA let-7e extracted from A549 human epithelial lung cancer cells. The BDE assay displayed good performance compared with the poly(A) tailing method in terms of specificity and sensitivity; the CT values differed by 2.5, and the melting curve showed a sharper peak than with the poly(A) tailing method. We have demonstrated an innovative, cost-effective BDE assay that allows improved sensitivity and specificity in the detection of miRNAs. The dynamic range of the SYBR green-based RT-qPCR for miR-145 could be represented quantitatively over a range of 7 orders of magnitude, from 0.1 pg to 1.0 μg of human brain total RNA. Finally, the BDE assay for the detection of miRNA species such as let-7e shows good performance compared with the poly(A) tailing method in terms of specificity and sensitivity. Thus, BDE proves to be a simple, low-cost, and highly sensitive assay for various miRNAs and should provide significant contributions to research on miRNA biology and the application of disease diagnostics with miRNAs as targets.
Keywords: bi-directional extension (BDE), microRNA (miRNA), poly (A) tailing assay, reverse transcription, RT-qPCR
Procedia PDF Downloads 166
6884 Video Heart Rate Measurement for the Detection of Trauma-Related Stress States
Authors: Jarek Krajewski, David Daxberger, Luzi Beyer
Abstract:
Finding objective and non-intrusive measurements of emotional and psychopathological states (e.g., post-traumatic stress disorder, PTSD) is an important challenge. Thus, the approach proposed here uses photoplethysmographic imaging (PPGI), applying facial RGB camera videos to estimate heart rate levels. A pipeline for the signal processing of the raw images has been proposed, containing different preprocessing approaches, e.g., Independent Component Analysis, Non-negative Matrix Factorization, and various other artefact correction approaches. Under resting and constant light conditions, we reached a sensitivity of 84% for pulse peak detection. The results indicate that PPGI can be a suitable solution for providing indirectly derived heart rate data for assessing post-traumatic stress states.
Keywords: heart rate, PTSD, PPGI, stress, preprocessing
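A bare-bones version of the PPGI pipeline described here (ICA applied to the mean RGB traces of a facial region, followed by a spectral peak search in the plausible heart-rate band) could look like the sketch below; the RGB traces are simulated rather than extracted from real video, and the frame rate is an assumption.

```python
# Sketch of a PPGI-style heart-rate estimate: ICA on mean RGB traces of a facial
# region, then a spectral peak search in the plausible heart-rate band (40-180 bpm).
# The RGB traces are simulated here instead of being extracted from video frames.
import numpy as np
from sklearn.decomposition import FastICA

fs = 30.0                                     # assumed camera frame rate (Hz)
t = np.arange(0, 30, 1 / fs)                  # 30 s of "video"
pulse = 0.02 * np.sin(2 * np.pi * 1.2 * t)    # simulated 72 bpm pulse component
rgb = np.column_stack([
    0.6 * pulse + 0.01 * np.random.randn(t.size),
    1.0 * pulse + 0.01 * np.random.randn(t.size),
    0.3 * pulse + 0.01 * np.random.randn(t.size),
])

sources = FastICA(n_components=3, random_state=0).fit_transform(rgb)

def dominant_bpm(x, fs):
    # Return the strongest spectral peak within the 40-180 bpm band.
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    band = (freqs >= 40 / 60) & (freqs <= 180 / 60)
    return 60 * freqs[band][np.argmax(spectrum[band])]

bpm_candidates = [dominant_bpm(sources[:, i], fs) for i in range(3)]
print("candidate heart rates (bpm):", [round(b, 1) for b in bpm_candidates])
```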
Procedia PDF Downloads 124
6883 Mike Hat: Coloured-Tape-in-Hat as a Head Circumference Measuring Instrument for Early Detection of Hydrocephalus in an Infant
Authors: Nyimas Annissa Mutiara Andini
Abstract:
Every year, children develop hydrocephalus during the first year of life. If it is not treated, hydrocephalus can lead to brain damage, a loss of mental and physical abilities, and even death. To treat it, we first have to make a proper diagnosis, using examinations aimed especially at detecting hydrocephalus early. One of the examinations that can be done uses a head circumference measurement. An increased head circumference is the first and main sign of hydrocephalus, especially in infants (0-1 year of age). Head circumference is a measurement of the largest area of a child's head. In this measurement, we want to obtain the distance from above the eyebrows and ears and around the back of the head using a measurement tape. If the head circumference of an infant is larger than normal, the infant might potentially suffer from hydrocephalus. If early diagnosis and timely treatment of hydrocephalus can be provided, most children recover successfully. There are some problems with the early detection of hydrocephalus using a regular tape for head circumference measurement. One of the problems is the infant's comfort. We need to make the infant comfortable throughout the head circumference measurement to get a proper result; for that, we can use a helpful item, such as a hat. This paper aims to describe the possibility of using a head circumference measuring instrument for the early detection of hydrocephalus in an infant with a mike hat, a coloured-tape-in-hat. At birth, an infant's head circumference is about 35 centimeters. In the first three months after birth, the head circumference gains 2 centimeters each month. In the second three months, it increases by 1 cm each month. For the six months after that, the rate is 0.5 cm per month, ending at an average of 47 centimeters. This formula is compared to the WHO's head circumference growth chart. The shape of this tape-in-hat is similar to an upper-arm measurement tape. The tape in this hat measures up to about 47 centimeters. It contains twelve different colours ranging by age. If the reading falls outside the normal colour, the infant potentially suffers from hydrocephalus. This examination should be done monthly. If two consecutive measurements remain in the same abnormal head circumference range, or the head circumference grows rapidly, the infant should be referred to a pediatrician. There is a pink hat for girls and a blue hat for boys. Based on this paper, we know that this measurement can be used to help in the early detection of hydrocephalus in an infant.
Keywords: head circumference, hydrocephalus, infant, mike hat
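As a quick check, the growth rule quoted above does reproduce the 47 cm end value:

```latex
% Expected head circumference at 12 months from the stated growth rule
35\ \text{cm} + (3 \times 2\ \text{cm}) + (3 \times 1\ \text{cm}) + (6 \times 0.5\ \text{cm}) = 47\ \text{cm}
```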
Procedia PDF Downloads 266
6882 Numerical Simulation of Multiple Arrays Arrangement of Micro Hydro Power Turbines
Authors: M. A. At-Tasneem, N. T. Rao, T. M. Y. S. Tuan Ya, M. S. Idris, M. Ammar
Abstract:
River flow over micro hydro power (MHP) turbines in multiple-array arrangements is simulated with computational fluid dynamics (CFD) software to obtain the flow characteristics. In this paper, CFD software is used to simulate the water flow over MHP turbines as they are placed in a river. Multiple-array arrangements of MHP turbines can generate a large amount of power. In this study, a river model is created and simulated in CFD software to obtain the water flow characteristics. The process then continues by simulating different types of array arrangements in the river model. An MHP turbine model consists of a turbine outer body and a static propeller blade inside it. Five types of arrangements are used, namely parallel, series, triangular, square and rhombus, with different spacing sizes. The velocity profiles on the MHP turbines are identified at the mouth of each turbine body. This study is required to identify the arrangement and spacing size that can produce the highest power density through the water flow variation.
Keywords: micro hydro power, CFD, arrays arrangement, spacing sizes, velocity profile, power
Procedia PDF Downloads 358
6881 Implementation of Successive Interference Cancellation Algorithms in the 5g Downlink
Authors: Mokrani Mohamed Amine
Abstract:
In this paper, we have implemented successive interference cancellation (SIC) algorithms in the 5G downlink. We have calculated the maximum throughput in Frequency Division Duplex (FDD) mode in the downlink, where we obtained a value equal to 836932 b/ms. The transmitter is of the Multiple Input Multiple Output (MIMO) type with eight transmitting and receiving antennas. Each of the eight antennas simultaneously transmits a data rate of 104616 b/ms that contains the binary messages of the three users; in this case, the Cyclic Redundancy Check (CRC) is negligible, and the MIMO category is spatial diversity. The technology used for this is called Non-Orthogonal Multiple Access (NOMA) with Quadrature Phase Shift Keying (QPSK) modulation. The transmission is done in a Rayleigh fading channel with the presence of obstacles. The MIMO SIC receiver with two transmitting and receiving antennas recovers its binary message without errors for certain values of transmission power such as 50 dBm, with 0.054485% errors when the transmitted power is 20 dBm and with 0.00286763% errors for a transmitted power of 32 dBm (in the case of user 1), as well as with 0.0114705% errors when the transmitted power is 20 dBm and with 0.00286763% errors for a power of 24 dBm (in the case of user 2), by applying the steps involved in SIC.
Keywords: 5G, NOMA, QPSK, TBS, LDPC, SIC, capacity
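A stripped-down, two-user power-domain NOMA example of the SIC steps mentioned here is sketched below; the power split, noise level, channel gain and QPSK mapping are illustrative assumptions, not the paper's actual link parameters.

```python
# Toy two-user power-domain NOMA downlink with SIC at the near (strong-channel) user.
# Power allocation, noise level and channel gain are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
n_sym = 10_000
qpsk = (np.array([1, -1])[rng.integers(0, 2, (2, n_sym, 2))] * [1, 1j]).sum(-1) / np.sqrt(2)
s_far, s_near = qpsk[0], qpsk[1]          # far user gets more power, near user less

p_far, p_near = 0.8, 0.2                  # assumed power allocation
x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near

h = 1.0                                   # flat channel gain at the near user (assumption)
noise = np.sqrt(0.005 / 2) * (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))
y = h * x + noise

def qpsk_decide(z):
    # Hard decision back onto the unit-energy QPSK constellation.
    return (np.sign(z.real) + 1j * np.sign(z.imag)) / np.sqrt(2)

# SIC at the near user: decode the far user's (stronger) signal first, subtract it,
# then decode the near user's own symbols from the residual.
s_far_hat = qpsk_decide(y / (h * np.sqrt(p_far)))
residual = y - h * np.sqrt(p_far) * s_far_hat
s_near_hat = qpsk_decide(residual / (h * np.sqrt(p_near)))

ser = np.mean(s_near_hat != s_near)
print("near-user symbol error rate after SIC:", ser)
```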
Procedia PDF Downloads 103
6880 Electrical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection, feature extraction, then general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process since event detection techniques lead to the extraction of appliance features. Appliance features are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting areas of operation of each residential appliance based on the power demand, and then detecting the time at which each selected appliance changes its state. In order to fit the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator (LPG) software, which has not previously been considered for NILM purposes in the literature. LPG is a numerical software tool that uses behaviour simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect. It also facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector containing the values delimiting the state transitions of the appliance. After this, appliance signatures are formed from the extracted power, geometrical and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both simulated data from LPG and the real-time Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall and error rate. The performance analysis of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: electrical disaggregation, DTW, general appliance modeling, event detection
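Since DTW is central to the identification step described above, a small self-contained implementation is given below; it is a generic DTW distance between two power sequences, not the authors' specific configuration, and the example signatures are invented.

```python
# Generic dynamic time warping (DTW) distance between two 1-D sequences,
# of the kind used to match an observed power signature against appliance models.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Example: a measured power step (W) compared against two invented appliance signatures.
measured = np.array([0, 0, 1500, 1500, 1480, 20, 0], dtype=float)
kettle   = np.array([0, 1500, 1500, 1500, 0], dtype=float)
fridge   = np.array([0, 120, 120, 120, 120, 0], dtype=float)

print("DTW to kettle:", dtw_distance(measured, kettle))
print("DTW to fridge:", dtw_distance(measured, fridge))
```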
Procedia PDF Downloads 78
6879 Biochar as a Strong Adsorbent for Multiple-Metal Removal from Contaminated Water
Authors: Eman H. El-Gamal, Mai E. Khedr, Randa Ghonim, Mohamed Rashad
Abstract:
In the past few years, biochar - a highly carbon-rich material produced from agro-wastes by a pyrolysis process - has been used as an effective adsorbent for heavy metal removal from polluted water. In this study, different types of biochar (rice straw 'RSB', corn cob 'CCB', and Jatropha shell 'JSB') were used to evaluate the adsorption capacity for heavy metal removal from multiple-metal solutions (Cu, Mn, Zn, and Cd). Kinetic modeling has been examined to illustrate potential adsorption mechanisms. The results showed that the potential removal of a metal depends on the metal and biochar types. The adsorption capacity of the biochars followed the order: RSB > JSB > CCB. In general, RSB and JSB biochars presented high potential removal of heavy metals from polluted water, which was higher than 90% and 80%, respectively, after 2 hrs of contact time for all metals. According to the kinetics data, the pseudo-second-order model agreed strongly with Cu, Mn, Zn, and Cd adsorption onto the biochars (R2 ≥ 0.97), indicating the dominance of a specific adsorption process, i.e., chemisorption. In conclusion, this study revealed that RSB and JSB biochars have the potential to be strong adsorbents for multiple-metal removal from wastewater.
Keywords: adsorption, biochar, chemisorption, polluted water
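The chemisorption-dominated behaviour reported here is normally judged against the linearized pseudo-second-order model, whose standard form is

```latex
% Pseudo-second-order kinetic model (linearized form), where q_t and q_e are the
% amounts adsorbed at time t and at equilibrium, and k_2 is the rate constant:
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
```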
Procedia PDF Downloads 150
6878 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, the Kalman filter was used to retain a complete trajectory for each human object. Finally, the motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved a classification accuracy of 80% for interaction events and 95% for non-interaction events, respectively. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
Keywords: motion detection, motion tracking, trajectory analysis, video surveillance
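To illustrate how a Kalman filter can carry a trajectory through an occlusion, the sketch below runs a constant-velocity filter on a 2-D centroid track and simply skips the update step when no detection is available; the noise settings and detections are arbitrary illustrative values.

```python
# Constant-velocity Kalman filter for a 2-D object centroid; during occlusion
# (no measurement) only the prediction step runs, keeping the trajectory continuous.
# Process/measurement noise values are arbitrary illustrative choices.
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)      # state: [x, y, vx, vy]
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)      # only the centroid position is observed
Q = 0.01 * np.eye(4)
R = 4.0 * np.eye(2)

x = np.array([0.0, 0.0, 2.0, 1.0])             # initial state
P = np.eye(4)

# Simulated detections; None marks frames where the person is occluded.
detections = [np.array([2.1, 1.0]), np.array([4.0, 2.2]), None, None, np.array([10.2, 5.1])]

track = []
for z in detections:
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update only when a detection exists
    if z is not None:
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
    track.append(x[:2].copy())

print(np.round(np.array(track), 2))
```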
Procedia PDF Downloads 548
6877 Subpixel Corner Detection for Monocular Camera Linear Model Research
Authors: Guorong Sui, Xingwei Jia, Fei Tong, Xiumin Gao
Abstract:
Camera calibration is a fundamental issue in high-precision noncontact measurement, and it is necessary to analyze and study the reliability and application range of the linear camera model that is often used in camera calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by the subpixel corner detection method. Without considering the aberration of the optical system, the feature extraction and linearity analysis of the line segments in the template are performed. Moreover, the experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is obtained by fitting the 11 groups of data. The camera model measurement results show that the relative error does not exceed 1%, and the repeated measurement error is not more than 0.1 mm in magnitude. Meanwhile, it is found that the model shows some measurement differences across different regions and object distances. The experimental results show that this linear model is simple and practical and has good linearity within a certain object distance. These experimental results provide a powerful basis for the establishment of a linear camera model, and this work will have potential value for actual engineering measurement.
Keywords: camera linear model, geometric imaging relationship, image pixel coordinates, three dimensional space coordinates, sub-pixel corner detection
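The linearity check described here essentially amounts to fitting a first-order relation between object-space coordinates and the detected subpixel image coordinates and inspecting the residuals; a minimal one-dimensional version with invented coordinate values is sketched below.

```python
# Minimal sketch of a linearity check: fit a first-order model between an object-space
# coordinate and the corresponding subpixel image coordinate, then look at residuals.
# The coordinate values are invented for illustration.
import numpy as np

world_mm = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)           # template positions (mm)
pixel_u  = np.array([102.1, 151.9, 202.3, 252.0, 301.8, 352.2, 401.9])  # detected subpixel corners

slope, intercept = np.polyfit(world_mm, pixel_u, 1)
residuals_px = pixel_u - (slope * world_mm + intercept)
relative_error = np.abs(residuals_px) / (pixel_u.max() - pixel_u.min())

print(f"scale: {slope:.3f} px/mm, intercept: {intercept:.2f} px")
print("max residual (px):", np.abs(residuals_px).max().round(3))
print("max relative error:", relative_error.max().round(4))
```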
Procedia PDF Downloads 277
6876 Assay for SARS-Cov-2 on Chicken Meat
Authors: R. Mehta, M. Ghogomu, B. Schoel
Abstract:
Reports appeared in 2020 about China detecting SARS-Cov-2 (Covid-19) on frozen meat, shrimp, and food packaging material. In this study, we examined the use of swabs for the detection of Covid-19 on meat samples, with chicken breast (CB) used as a model. Methods: Heat-inactivated SARS-Cov-2 virus (IV) from Microbiologics was loaded onto the CB, swabbing was done, and the recovered inactivated virus was subjected to the Macherey-Nagel NucleoSpin RNA Virus kit for RNA isolation according to the manufacturer's instructions. For RT-PCR, the IDT 2019-nCoV RUO Covid-19 test kit was used with the Taqman Fast Virus 1-step master mix. The limit of detection (LOD) of the viral load recovered from the CB was determined under various conditions: first on frozen CB where the IV was introduced on a defined area, then on frozen CB with the IV spread out, and finally on thawed CB. Results: The lowest amount of IV that could be reliably detected on frozen CB was a load of 1,000 - 2,000 IV copies, where the IV was loaded on one spot of about 1 square inch. Next, the IV was spread out over a whole frozen CB of about 16 square inches. The IV could be recovered at a lowest load of 4,000 to 8,000 copies. Furthermore, the effects of temperature change on viral load recovery were investigated, i.e., if raw unfrozen meat became contaminated and remained for 1 hour at 4°C or was refrozen. The amount of IV recovered successfully from the CB kept at 4°C and from the refrozen CB was similar to the recovery obtained from loading the IV directly onto the frozen CB. In conclusion, an assay using swabs was successfully established for the detection of SARS-Cov-2 on frozen or raw (unfrozen) CB with a minimal load of up to 8,000 copies spread over 16 square inches.
Keywords: assay, COVID-19, meat, SARS-Cov-2
Procedia PDF Downloads 203
6875 Clinical Efficacy of Indigenous Software for Automatic Detection of Stages of Retinopathy of Prematurity (ROP)
Authors: Joshi Manisha, Shivaram, Anand Vinekar, Tanya Susan Mathews, Yeshaswini Nagaraj
Abstract:
Retinopathy of prematurity (ROP) is abnormal blood vessel development in the retina of the eye of a premature infant. The principal object of the invention is to provide a technique for detecting the demarcation line and the ridge in a given ROP image, which facilitates early detection of ROP in stage 1 and stage 2. The demarcation line is an indicator of Stage 1 ROP, and the ridge is the hallmark of typically Stage 2 ROP. Thirty RetCam images of Asian Indian infants obtained during routine ROP screening have been used for the analysis. A graphical user interface has been developed to detect the demarcation line/ridge and to extract ground truth. This novel algorithm uses multilevel vessel enhancement to enhance tubular structures in the digital ROP images. It has been observed that the orientation of the demarcation line/ridge is normal to the direction of the blood vessels, which is used for the identification of the ridge/demarcation line. Quantitative analysis has been presented based on gold-standard images marked by an expert ophthalmologist. The image-based analysis is based on the length and the position of the detected ridge. In the image-based evaluation, the average sensitivity and positive predictive value were found to be 92.30% and 85.71%, respectively. In the pixel-based evaluation, the average sensitivity, specificity, positive predictive value and negative predictive value achieved were 60.38%, 99.66%, 52.77% and 99.75%, respectively.
Keywords: ROP, ridge, multilevel vessel enhancement, biomedical
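For reference, the reported pixel-based metrics follow directly from confusion-matrix counts; a small helper such as the one below reproduces that calculation (the counts used are placeholders, not the study's data).

```python
# Confusion-matrix metrics of the kind reported above (sensitivity, specificity,
# PPV, NPV). The counts below are placeholders, not the study's actual data.
def pixel_metrics(tp, fp, tn, fn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(pixel_metrics(tp=640, fp=573, tn=169_000, fn=420))
```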
Procedia PDF Downloads 410
6874 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models
Authors: Yoonsuh Jung
Abstract:
As DNA microarray data contain a relatively small sample size compared to the number of genes, high-dimensional models are often employed. In high-dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection, which selects the parameter value with the smallest cross-validated score. However, selecting a single value as the "optimal" value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, and then average the candidates with different weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve traditional cross-validation. We show that the value selected by the suggested methods often leads to stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation, via real data and simulated data sets.
Keywords: cross validation, parameter averaging, parameter selection, regularization parameter search
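One simple way to realize the "average the candidates with different weights" idea is sketched below: run cross-validation over a penalty grid, keep the best few candidates, and average them with weights inversely proportional to their cross-validated error. The inverse-error weighting and the candidate count are assumptions made for illustration; the paper's own weighting scheme may differ.

```python
# Sketch of averaging several tuning-parameter candidates instead of picking one:
# cross-validate a lambda grid, keep the best few candidates, and average them with
# weights inversely proportional to their CV error (an illustrative weighting choice).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=60, n_features=200, n_informative=10, noise=5.0, random_state=0)

lambdas = np.logspace(-2, 1, 20)
cv_mse = np.array([
    -cross_val_score(Lasso(alpha=lam, max_iter=10_000), X, y,
                     scoring="neg_mean_squared_error", cv=5).mean()
    for lam in lambdas
])

k = 5                                       # number of candidates to average (assumed)
best = np.argsort(cv_mse)[:k]
weights = 1.0 / cv_mse[best]
weights /= weights.sum()
lam_avg = np.sum(weights * lambdas[best])

print("single best lambda    :", lambdas[np.argmin(cv_mse)])
print("weighted-average lambda:", lam_avg)
```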
Procedia PDF Downloads 415
6873 Improvements and Implementation Solutions to Reduce the Computational Load for Traffic Situational Awareness with Alerts (TSAA)
Authors: Salvatore Luongo, Carlo Luongo
Abstract:
This paper discusses the implementation solutions to reduce the computational load for the Traffic Situational Awareness with Alerts (TSAA) application, based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology. In 2008, there were 23 total mid-air collisions involving general aviation fixed-wing aircraft, 6 of which were fatal, leading to 21 fatalities. These collisions occurred during visual meteorological conditions, indicating the limitations of the see-and-avoid concept for mid-air collision avoidance as defined by the Federal Aviation Administration (FAA). Commercial aviation aircraft are already equipped with a collision avoidance system called TCAS, which is based on classic transponder technology. This system dramatically reduced the number of mid-air collisions involving air transport aircraft. In general aviation, the same reduction in mid-air collisions has not occurred, so this reduction is the main objective of the TSAA application. The major difference between the original conflict detection application and the TSAA application is that conflict detection is focused on preventing loss of separation in en-route environments, whereas TSAA is devoted to reducing the probability of mid-air collision in all phases of flight. The TSAA application increases the flight crew's traffic situation awareness by providing alerts for traffic detected in conflict with ownship, in support of the see-and-avoid responsibility. Considerable effort has been spent in the design process and the code generation in order to maximize efficiency and performance in terms of computational load and memory consumption reduction. The TSAA architecture is divided into two high-level systems: the “Threats database” and the “Conflict detector”. The first receives the traffic data from the ADS-B device and stores each target's data history. The conflict detector module estimates the ownship and target trajectories in order to detect possible future losses of separation between the ownship and each target. Finally, the alerts are verified by additional conflict verification logic, in order to prevent possible undesirable behaviors of the alert flag. In order to reduce the computational load, a pre-check evaluation module is used. This pre-check is only a computational optimization, so the performance of the conflict detector system is not modified in terms of the number of alerts detected. The pre-check module uses analytical trajectory propagation for both the target and the ownship. This allows greater accuracy and avoids the step-by-step propagation, which requires a larger computational load. Furthermore, the pre-check permits the exclusion of targets that are certainly not threats, using an analytical and efficient geometrical approach, in order to decrease the computational load for the following modules. This software improvement is not suggested by FAA documents, and so it is the main innovation of this work. The efficiency and efficacy of this enhancement are verified using fast-time and real-time simulations and by execution on a real device in several FAA scenarios. The final implementation also permits FAA software certification in compliance with the DO-178B standard. The computational load reduction allows the installation of the TSAA application also on devices with multiple applications and/or low capacity in terms of available memory and computational capabilities.
Keywords: traffic situation awareness, general aviation, aircraft conflict detection, computational load reduction, implementation solutions, software certification
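The geometric pre-check described above can be illustrated with a standard closest-point-of-approach (CPA) test under a constant-velocity assumption: if the minimum future separation between ownship and a target already exceeds the protection threshold, the target can be discarded before the step-by-step conflict detector runs. The threshold, look-ahead horizon and traffic values below are placeholders.

```python
# Closest-point-of-approach (CPA) pre-check under a constant-velocity assumption:
# targets whose minimum future separation exceeds the protection threshold are
# excluded before the detailed conflict detector runs. Thresholds are placeholders.
import numpy as np

def cpa(own_pos, own_vel, tgt_pos, tgt_vel, horizon_s):
    """Return (time of closest approach within the horizon, distance at that time)."""
    dp = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    dv = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    denom = dv @ dv
    t_star = 0.0 if denom == 0 else -(dp @ dv) / denom
    t_star = min(max(t_star, 0.0), horizon_s)        # clamp to the look-ahead window
    return t_star, np.linalg.norm(dp + t_star * dv)

own_pos, own_vel = (0.0, 0.0), (60.0, 0.0)            # metres and metres/second (illustrative)
targets = {
    "converging": ((5000.0, 300.0), (-55.0, -3.0)),
    "diverging":  ((2000.0, 4000.0), (10.0, 40.0)),
}
THRESHOLD_M, HORIZON_S = 900.0, 60.0                  # placeholder protection distance and look-ahead

for name, (pos, vel) in targets.items():
    t_cpa, d_cpa = cpa(own_pos, own_vel, pos, vel, HORIZON_S)
    status = "keep for detailed check" if d_cpa < THRESHOLD_M else "discard in pre-check"
    print(f"{name}: CPA {d_cpa:.0f} m at t = {t_cpa:.0f} s -> {status}")
```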
Procedia PDF Downloads 285
6872 Analysis of a IncResU-Net Model for R-Peak Detection in ECG Signals
Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar
Abstract:
Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in the electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology. It is a non-invasive, pain-free procedure that measures the heart's electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG's form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged in time, which can further complicate visual diagnosis and considerably delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large data sets and can provide early and precise diagnoses. Therefore, the cardiology field is one of the areas that can benefit most from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is the detection of R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model's ability to generalize its outcomes. The performance of the model for the detection of R-peaks in clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise, and when presented with data it has not been trained on. It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences from the normal cardiac activity of their patients.
Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks
Procedia PDF Downloads 186
6871 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking
Authors: Noga Bregman
Abstract:
Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
Keywords: earthquake, detection, phase picking, S waves, P waves, transformer, deep learning, seismic waves
Procedia PDF Downloads 52
6870 A Simple Approach to Reliability Assessment of Structures via Anomaly Detection
Authors: Rims Janeliukstis, Deniss Mironovs, Andrejs Kovalovs
Abstract:
Operational Modal Analysis (OMA) is widely applied as a method for Structural Health Monitoring, for structural damage identification and assessment by tracking the changes of the identified modal parameters over time. Unfortunately, modal parameters also depend on such external factors as temperature and loads. Any structural condition assessment using modal parameters should be done taking those external factors into consideration; otherwise, there is a high chance of false positives. A method of structural reliability assessment based on an anomaly detection technique called the Mahalanobis Squared Distance (MSD) is proposed. It requires a set of reference conditions to learn the healthy state of a structure, against which all future parameters are compared. In this study, structural modal parameters (natural frequencies and mode shapes), as well as the ambient temperature and the loads acting on the structure, are used as features. Numerical tests were performed on a finite element model of a carbon fibre reinforced polymer composite beam with delamination damage at various locations and of various severities. The advantages of the demonstrated approach include relatively few computational steps, the ability to distinguish between healthy and damaged conditions, and the ability to discriminate between different damage severities. It is anticipated to be promising for the reliability assessment of mass-produced structural parts.
Keywords: operational modal analysis, reliability assessment, anomaly detection, damage, mahalanobis squared distance
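A compact version of the MSD novelty check described here is sketched below: the mean and covariance of the reference (healthy-state) feature set define the metric, each new observation receives a squared distance, and a chi-squared quantile serves as an illustrative threshold; the feature values are simulated.

```python
# Mahalanobis-squared-distance (MSD) anomaly check: learn mean and covariance from
# reference (healthy-state) feature vectors, score new observations, and flag those
# beyond a chi-squared threshold. Feature values are simulated; in the application the
# features would be natural frequencies, mode-shape coordinates, temperature and load.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
n_ref, n_feat = 200, 4
reference = rng.normal(0.0, 1.0, (n_ref, n_feat))          # healthy-state training features

mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def msd(x):
    d = x - mu
    return float(d @ cov_inv @ d)

threshold = chi2.ppf(0.99, df=n_feat)                        # illustrative 99% threshold

healthy_obs = rng.normal(0.0, 1.0, n_feat)
damaged_obs = rng.normal(0.0, 1.0, n_feat) + np.array([3.0, 0.0, -2.5, 0.0])  # shifted features

for name, obs in [("healthy-like", healthy_obs), ("damaged-like", damaged_obs)]:
    score = msd(obs)
    label = "anomaly" if score > threshold else "normal"
    print(f"{name}: MSD = {score:.1f} ({label}, threshold {threshold:.1f})")
```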
Procedia PDF Downloads 114