Search results for: subtle change detection and quantification
9697 Alternator Fault Detection Using Wigner-Ville Distribution
Authors: Amin Ranjbar, Amir Arsalan Jalili Zolfaghari, Amir Abolfazl Suratgar, Mehrdad Khajavi
Abstract:
This paper describes a two-stage, learning-based fault detection procedure for alternators. The procedure distinguishes three machine conditions: a shortened brush, a high-impedance relay, and a healthy alternator. The fault detection algorithm uses the Wigner-Ville distribution as a feature extractor together with an appropriate feature classifier. In this work, an ANN (artificial neural network) and an SVM (support vector machine) were compared, with performance evaluated by the mean squared error criterion, to determine which is more suitable. The modules work together to detect possible faulty operating conditions of the machine. To test the method's performance, a signal database was prepared by inducing the different conditions on a laboratory setup. The results obtained by implementing this method are satisfactory.
Keywords: alternator, artificial neural network, support vector machine, time-frequency analysis, Wigner-Ville distribution
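The abstract does not give implementation details for the feature extractor; as a rough sketch of what a discrete Wigner-Ville distribution looks like in code (NumPy, with an illustrative test tone, not the paper's alternator currents):

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal.

    Returns an (N, N) real array: rows index time, columns index
    frequency bins k, with physical frequency f = k * fs / (2 * N).
    """
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        tau_max = min(n, N - 1 - n)          # lags limited by the signal borders
        r = np.zeros(N, dtype=complex)       # instantaneous autocorrelation
        for tau in range(-tau_max, tau_max + 1):
            r[tau % N] = x[n + tau] * np.conj(x[n - tau])
        W[n] = np.fft.fft(r).real            # FFT over the lag variable
    return W

# A pure tone at normalized frequency 0.125 should concentrate its
# energy at bin k = 2 * 0.125 * N = 32.
N = 128
x = np.exp(2j * np.pi * 0.125 * np.arange(N))
W = wigner_ville(x)
peak_bin = int(np.argmax(W.mean(axis=0)))
```

For a pure tone, the distribution concentrates energy along a single frequency bin; it is this sharp time-frequency localization that makes it attractive as a feature extractor for transient fault signatures.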
Procedia PDF Downloads 374
9696 Robust Segmentation of Salient Features in Automatic Breast Ultrasound (ABUS) Images
Authors: Lamees Nasser, Yago Diez, Robert Martí, Joan Martí, Ibrahim Sadek
Abstract:
Automated 3D breast ultrasound (ABUS) is a novel medical imaging modality that shares common characteristics with other ultrasound modalities while providing three orthogonal planes (axial, sagittal, and coronal) that are useful in the analysis of tumors. In the literature, few automatic approaches exist for typical tasks such as segmentation or registration. In this work, we address two problems concerning ABUS images: nipple and rib detection. The nipple and the ribs are the most visible and salient features in ABUS images. Determining the nipple position plays a key role in applications such as the evaluation of registration results or lesion follow-up. We present a nipple detection algorithm based on the color and shape of the nipple, as well as an automatic approach to detect the ribs; rib detection is one of the main stages in chest wall segmentation. The approach consists of four steps. First, images are normalized in order to minimize the intensity variability across regions within the same image or across a set of images. Second, the normalized images are smoothed with an anisotropic diffusion filter. Next, the ribs are detected in each slice by analyzing the eigenvalues of the 3D Hessian matrix. Finally, a breast mask and a probability map of regions detected as ribs are used to remove false positives (FP). A qualitative and quantitative evaluation on a total of 22 cases was performed. Across all cases, the average and standard deviation of the root mean square error (RMSE) between manually annotated points placed on the rib surface and detected points on rib borders are 15.1188 mm and 14.7184 mm, respectively.
Keywords: automated 3D breast ultrasound, eigenvalues of Hessian matrix, nipple detection, rib detection
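The Hessian-eigenvalue step generalizes a classic ridge test: at a bright tubular or ridge-like structure, the eigenvalue across the structure is strongly negative. A minimal 2D analog in NumPy (the synthetic image and score are illustrative; the paper works on the 3D Hessian of ABUS volumes):

```python
import numpy as np

def hessian_ridge_score(img):
    """Per-pixel ridge score from 2D Hessian eigenvalues.

    For a bright ridge the eigenvalue across the ridge is strongly
    negative; the score is -min(eigenvalue), clipped at zero.
    """
    gy, gx = np.gradient(img)            # first derivatives (row, col)
    hyy, hyx = np.gradient(gy)           # second derivatives of gy
    hxy, hxx = np.gradient(gx)           # second derivatives of gx
    # Eigenvalues of the symmetric 2x2 Hessian, in closed form.
    tr = hxx + hyy
    det = hxx * hyy - hxy * hyx
    disc = np.sqrt(np.maximum((tr / 2) ** 2 - det, 0))
    lam_min = tr / 2 - disc              # most negative eigenvalue
    return np.maximum(-lam_min, 0)

# Synthetic image: a bright horizontal line on a dark background.
img = np.zeros((32, 32))
img[16, :] = 1.0
score = hessian_ridge_score(img)
on_ridge = score[16, 5:-5].mean()        # response on the line
off_ridge = score[4, 5:-5].mean()        # response on the background
```

In 3D the same reasoning applies with three eigenvalues, where a rib-like tube has one eigenvalue near zero (along the tube) and two strongly negative ones.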
Procedia PDF Downloads 331
9695 Architectural Adaptation for Road Humps Detection in Adverse Light Scenario
Authors: Padmini S. Navalgund, Manasi Naik, Ujwala Patil
Abstract:
A road hump is a semi-cylindrical elevation built across the road at specific locations. A vehicle must maneuver over the hump at reduced speed to pass safely and avoid damage. Road humps identified in advance help maintain the security and stability of vehicles, especially in adverse visibility conditions such as night scenarios. We propose a deep learning architecture adaptation for detecting Indian road humps in adverse light scenarios, implementing the Mish activation function and developing a new classification loss function called "Effective Focal Loss". We captured images of marked and unmarked road humps with two different types of cameras across South India to build a heterogeneous dataset, which enabled the algorithm to train on varied data and improve detection accuracy. The images were pre-processed and annotated for two classes, marked hump and unmarked hump, and the resulting dataset was used to train a single-stage object detection algorithm. We also used an algorithm to synthetically generate scenarios with reduced hump visibility. We observed that the proposed framework effectively detected marked and unmarked humps in both clear and adverse light environments. This architectural adaptation provides an option for early detection of Indian road humps in reduced visibility conditions, thereby helping autonomous driving technology handle a wider range of real-world scenarios.
Keywords: Indian road hump, reduced visibility condition, low light condition, adverse light condition, marked hump, unmarked hump, YOLOv9
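The paper's "Effective Focal Loss" is not specified in the abstract; for orientation, the standard focal loss it presumably modifies, and the Mish activation, can be sketched as follows (NumPy, illustrative parameter values):

```python
import numpy as np

def mish(x):
    """Mish activation: x * tanh(softplus(x))."""
    return x * np.tanh(np.log1p(np.exp(x)))

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Standard binary focal loss.

    p: predicted probability of the positive class, y: label in {0, 1}.
    The (1 - p_t)^gamma factor down-weights easy, well-classified examples.
    """
    p_t = np.where(y == 1, p, 1 - p)
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return -alpha_t * (1 - p_t) ** gamma * np.log(np.clip(p_t, 1e-12, 1.0))

# An easy, well-classified example contributes far less loss than a
# hard, misclassified one; this is what helps with rare hump classes.
easy = float(focal_loss(np.array(0.9), np.array(1)))
hard = float(focal_loss(np.array(0.1), np.array(1)))
mish_at_zero = float(mish(0.0))
```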
Procedia PDF Downloads 27
9694 Synthesis and Characterization of CNPs Coated Carbon Nanorods for Cd2+ Ion Adsorption from Industrial Waste Water and Reusable for Latent Fingerprint Detection
Authors: Bienvenu Gael Fouda Mbanga
Abstract:
This study reports a new approach for preparing a carbon nanoparticle-coated cerium oxide nanorod (CNPs/CeONRs) nanocomposite and reusing the spent Cd2+-CNPs/CeONRs adsorbent for latent fingerprint (LFP) detection after removing Cd2+ ions from aqueous solution. The CNPs/CeONRs nanocomposite was prepared from CNPs and CeONRs by an adsorption process. The prepared nanocomposite was characterized by UV-visible spectroscopy, Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy-dispersive X-ray spectroscopy (EDS), zeta potential, and X-ray photoelectron spectroscopy (XPS). The average size of the CNPs was 7.84 nm. The synthesized CNPs/CeONRs nanocomposite proved to be a good adsorbent for Cd2+ removal from water, with an optimum pH of 8 and a dosage of 0.5 g/L. The results were best described by the Langmuir model, which indicated a linear fit (R2 = 0.8539-0.9969). The adsorption capacity of the nanocomposite (qm = 32.28-59.92 mg/g) compares favorably with previous reports. The adsorption followed pseudo-second-order kinetics and an intra-particle diffusion process. The ∆G and ∆H values indicated spontaneity at high temperature (40 °C) and the endothermic nature of the adsorption process. The CNPs/CeONRs nanocomposite therefore shows potential as an effective adsorbent. Furthermore, the metal-loaded adsorbent Cd2+-CNPs/CeONRs proved to be sensitive and selective for LFP detection on various porous substrates. Hence, the Cd2+-CNPs/CeONRs nanocomposite can be reused as a good fingerprint labelling agent in LFP detection, avoiding secondary environmental pollution from disposal of the spent adsorbent.
Keywords: Cd2+-CNPs/CeONRs nanocomposite, cadmium adsorption, isotherm, kinetics, thermodynamics, reusable for latent fingerprint detection
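The Langmuir parameters qm and KL quoted above are commonly obtained from the linearized isotherm Ce/qe = Ce/qm + 1/(KL*qm); a minimal fitting sketch on synthetic equilibrium data (not the paper's measurements):

```python
import numpy as np

def fit_langmuir(Ce, qe):
    """Fit the linearized Langmuir isotherm Ce/qe = Ce/qm + 1/(KL*qm).

    Ce: equilibrium concentrations (mg/L), qe: adsorbed amounts (mg/g).
    Returns (qm, KL): maximum adsorption capacity and affinity constant.
    """
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qm = 1.0 / slope
    KL = slope / intercept
    return qm, KL

# Synthetic equilibrium data generated from qm = 50 mg/g, KL = 0.2 L/mg.
Ce = np.linspace(1.0, 40.0, 10)
qe = 50.0 * 0.2 * Ce / (1.0 + 0.2 * Ce)
qm_fit, KL_fit = fit_langmuir(Ce, qe)
```

On noisy measured data the same regression yields the R2 values reported in isotherm studies such as this one.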
Procedia PDF Downloads 121
9693 Teachers as Agents of Change in Diverse Classrooms: An Overview of the Literature
Authors: Anna Sanczyk
Abstract:
Diverse students may experience different forms of discrimination. Among the forms of oppression students experience in schools are racism, sexism, classism, and homophobia, which may affect their achievement, and teachers need to make sure they create inclusive, equitable classroom environments. The broader literature on social change in education shows that teachers who challenge oppression and want to promote equitable and transformative education face institutional, social, and political constraints. This paper discusses research on teachers' work to create socially just and culturally inclusive classrooms and schools. The practical contribution of this literature review is a comprehensive compilation of studies presenting teachers' roles and efforts in effecting social change. The examination of the research on social change in education points to the urgency of teachers addressing the needs of marginalized students and resisting systemic oppression in schools. The implications of this literature review are that schools should provide greater advocacy for marginalized students in diverse learning contexts, and that teacher education programs should prepare teachers to be active advocates for diverse students. The literature review has the potential to inform educators seeking to enhance educational equity and improve the learning environment. It illustrates teachers as agents of change in diverse classrooms and contributes to understanding various ways of taking action towards fostering more equitable and transformative education in today's schools.
Keywords: agents of change, diversity, oppression, social change
Procedia PDF Downloads 140
9692 Automatic Vowel and Consonant's Target Formant Frequency Detection
Authors: Othmane Bouferroum, Malika Boudraa
Abstract:
In this study, a dual exponential model for CV formant transition is derived from locus theory of speech perception. Then, an algorithm for automatic vowel and consonant's target formant frequency detection is developed and tested on real speech. The results show that vowels and consonants are detected through transitions rather than their small stable portions. Also, vowel reduction is clearly observed in our data. These results are confirmed by the observations made in perceptual experiments in the literature.
Keywords: acoustic invariance, coarticulation, formant transition, locus equation
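Locus theory is commonly operationalized through locus equations: linear regressions of F2 at the consonant-vowel onset against F2 at the vowel midpoint, whose slope indexes the degree of coarticulation. A minimal sketch with made-up formant values (not the paper's data):

```python
import numpy as np

def locus_equation(f2_vowel, f2_onset):
    """Fit the locus equation F2_onset = slope * F2_vowel + intercept.

    The slope indexes coarticulation (0 = none, 1 = maximal), and the
    extrapolated consonant locus is intercept / (1 - slope).
    """
    slope, intercept = np.polyfit(f2_vowel, f2_onset, 1)
    locus = intercept / (1.0 - slope)
    return slope, intercept, locus

# Made-up alveolar-like data: onsets drawn toward a ~1800 Hz locus.
f2_vowel = np.array([900.0, 1200.0, 1500.0, 2000.0, 2400.0])
true_slope, true_locus = 0.4, 1800.0
f2_onset = true_slope * f2_vowel + (1 - true_slope) * true_locus
slope, intercept, locus = locus_equation(f2_vowel, f2_onset)
```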
Procedia PDF Downloads 272
9691 Sensor Fault-Tolerant Model Predictive Control for Linear Parameter Varying Systems
Authors: Yushuai Wang, Feng Xu, Junbo Tan, Xueqian Wang, Bin Liang
Abstract:
In this paper, a sensor fault-tolerant control (FTC) scheme using robust model predictive control (RMPC) and set-theoretic fault detection and isolation (FDI) is extended to linear parameter varying (LPV) systems. First, a group of set-valued observers is designed for passive fault detection (FD), and the observer gains are obtained by minimizing the size of the invariant set of the state estimation-error dynamics. Second, an input set for fault isolation (FI) is designed offline through set theory for actively isolating faults after FD. Third, an RMPC controller based on state estimation for LPV systems is designed to control the system in the presence of disturbance and measurement noise and to tolerate faults. In addition, an FTC algorithm is proposed to keep the plant operating in the corresponding mode when a fault occurs. Finally, a numerical example is used to show the effectiveness of the proposed results.
Keywords: fault detection, linear parameter varying, model predictive control, set theory
Procedia PDF Downloads 254
9690 Absolute Quantification of the Bexsero Vaccine Component Factor H Binding Protein (fHbp) by Selected Reaction Monitoring: The Contribution of Mass Spectrometry in Vaccinology
Authors: Massimiliano Biagini, Marco Spinsanti, Gabriella De Angelis, Sara Tomei, Ilaria Ferlenghi, Maria Scarselli, Alessia Biolchi, Alessandro Muzzi, Brunella Brunelli, Silvana Savino, Marzia M. Giuliani, Isabel Delany, Paolo Costantino, Rino Rappuoli, Vega Masignani, Nathalie Norais
Abstract:
The gram-negative bacterium Neisseria meningitidis serogroup B (MenB) is an exclusively human pathogen and the major cause of meningitis and severe sepsis in infants and children, but also in young adults. The pathogen is carried by about 30% of the healthy population, which acts as a reservoir, spreading it through saliva and respiratory fluids during coughing, sneezing, and kissing. Among the surface-exposed protein components of this diplococcus, factor H binding protein (fHbp) is a lipoprotein proven to be a protective antigen and used as a component of the recently licensed Bexsero vaccine. fHbp is a highly variable meningococcal protein: to reflect its remarkable sequence variability, it has been classified into three variants (or two subfamilies), with poor cross-protection among the different variants. Furthermore, the level of fHbp expression varies significantly among strains, and this has also been considered an important factor for predicting MenB strain susceptibility to anti-fHbp antisera. Different methods have been used to assess fHbp expression on meningococcal strains; however, all of them use anti-fHbp antibodies, and for this reason the results are affected by the different affinity that antibodies can have for the different antigenic variants. To overcome the limitations of antibody-based quantification, we developed a quantitative mass spectrometry (MS) approach. Selected Reaction Monitoring (SRM) has recently emerged as a powerful MS tool for detecting and quantifying proteins in complex mixtures. SRM is based on the targeted detection of proteotypic peptides (PTPs), which are unique signatures of a protein that can be easily detected and quantified by MS.
This approach, proven to be highly sensitive, quantitatively accurate, and highly reproducible, was used to quantify the absolute amount of the fHbp antigen in total extracts derived from 105 clinical isolates, evenly distributed among the three main variant groups and selected to be representative of the fHbp subvariants circulating around the world. We extended the study to the genetic level, investigating the correlation between the differential levels of expression and polymorphisms present within the genes and their promoter sequences. The implications of fHbp expression for the susceptibility of strains to killing by anti-fHbp antisera are also presented. To date, this is the first comprehensive fHbp expression profiling in a large panel of Neisseria meningitidis clinical isolates driven by an antibody-independent MS-based methodology, opening the door to new applications in vaccine coverage prediction and reinforcing the molecular understanding of released vaccines.
Keywords: quantitative mass spectrometry, Neisseria meningitidis, vaccines, Bexsero, molecular epidemiology
Procedia PDF Downloads 314
9689 Analysis, Evaluation and Optimization of Food Management: Minimization of Food Losses and Food Wastage along the Food Value Chain
Authors: G. Hafner
Abstract:
A method developed at the University of Stuttgart will be presented: 'Analysis, Evaluation and Optimization of Food Management'. A major focus is the quantification of food losses and food waste, as well as their classification and evaluation with regard to system optimization through waste prevention. For the quantification and accounting of food, food losses, and food waste along the food chain, a clear definition of core terms is required at the outset. This includes their methodological classification and demarcation within the sectors of the food value chain. The food chain is divided into agriculture, industry and crafts, trade, and consumption (at home and out of home). To harmonize the core terms, the authors have cooperated with relevant stakeholders in Germany with the goal of holistic and agreed definitions for the whole food chain. This includes modeling of sub-systems within the food value chain, definition of terms, differentiation between food losses and food wastage, and methodological approaches. 'Food losses' and 'food wastes' are assigned to the individual sectors of the food chain, including a description of the respective methods. The method for analysis, evaluation, and optimization of food management systems consists of the following parts: Part I: Terms and Definitions. Part II: System Modeling. Part III: Procedure for Data Collection and Accounting. Part IV: Methodological Approaches for Classification and Evaluation of Results. Part V: Evaluation Parameters and Benchmarks. Part VI: Measures for Optimization. Part VII: Monitoring of Success. The method will be demonstrated using the example of an investigation of food losses and food wastage in the Federal State of Bavaria, including an extrapolation of the results to quantify food wastage in Germany.
Keywords: food losses, food waste, resource management, waste management, system analysis, waste minimization, resource efficiency
Procedia PDF Downloads 405
9688 Real Time Detection of Application Layer DDoS Attack Using Log Based Collaborative Intrusion Detection System
Authors: Farheen Tabassum, Shoab Ahmed Khan
Abstract:
Attacks on networks and critical infrastructures have been on the rise over recent years and appear likely to continue to do so. The distributed denial-of-service attack is the most prevalent and easiest attack on the availability of a service, owing to the cheap availability of large botnets and the general lack of protection against these attacks. An application layer DDoS attack is a DDoS attack that targets a web server, application server, or database server. These attacks are much more sophisticated and challenging, as they get around most conventional network security devices: the attack traffic often impersonates normal traffic and cannot be recognized through network layer anomalies. Conventional single-host security systems are becoming gradually less effective in the face of such complicated and synchronized multi-front attacks. To protect against such attacks and intrusions, cooperation among all network devices is essential. To this end, a collaborative intrusion detection system (CIDS) is proposed, in which multiple network devices share valuable information to identify attacks, as a single device might not be capable of sensing malevolent action on its own. This makes it possible to take decisions after analyzing information collected from different sources. The proposed attack detection technique helps to detect seemingly benign packets that target the availability of critical infrastructure, and the solution methodology enables incident response teams to detect and react to DDoS attacks at the earliest stage, ensuring that the uptime of the service remains unaffected.
Experimental evaluation shows that the proposed collaborative detection approach is much more effective and efficient than previous approaches.
Keywords: Distributed Denial-of-Service (DDoS), Collaborative Intrusion Detection System (CIDS), Slowloris, OSSIM (Open Source Security Information Management tool), OSSEC HIDS
Procedia PDF Downloads 355
9687 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Based on Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran
Authors: M. Ahmadi, M. Kafil, H. Ebrahimi
Abstract:
Nowadays, induction motors play a significant role in industry. Condition monitoring (CM) of this equipment has gained remarkable importance in recent years due to huge production losses, substantial imposed costs, and increases in vulnerability, risk, and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM, and it can be used to detect broken rotor bars. Signal processing methods such as the fast Fourier transform (FFT), wavelet transform, and empirical mode decomposition (EMD) are used to analyze MCSA output data. In this study, these signal processing methods are used to detect the broken bar problem in induction motors of the Mobarakeh steel company. Based on the wavelet transform, an index for fault detection, CF, is introduced, defined as the variation of the maximum relative to the mean of the wavelet transform coefficients. We find that, in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and we find that when motor bars become broken, the energy of the IMFs increases.
Keywords: broken bar, condition monitoring, diagnostics, empirical mode decomposition, Fourier transform, wavelet transform
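Interpreting the CF index as the ratio of the maximum to the mean of the absolute wavelet detail coefficients (an assumption; the abstract only names the quantities involved), the idea can be sketched with a one-level Haar transform on synthetic signals, not real motor currents:

```python
import numpy as np

def haar_detail(x):
    """One-level Haar wavelet detail coefficients of an even-length signal."""
    x = np.asarray(x, dtype=float)
    return (x[0::2] - x[1::2]) / np.sqrt(2.0)

def cf_index(x):
    """CF index (assumed form): max over mean of |detail coefficients|."""
    d = np.abs(haar_detail(x))
    return d.max() / d.mean()

# A localized transient (as a broken bar would inject) raises CF well
# above the value obtained for a smooth, healthy-looking current.
t = np.linspace(0, 1, 256, endpoint=False)
healthy = np.sin(2 * np.pi * 50 * t)
faulty = healthy.copy()
faulty[100] += 2.0                     # synthetic localized disturbance
cf_healthy = cf_index(healthy)
cf_faulty = cf_index(faulty)
```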
Procedia PDF Downloads 150
9686 Development of Real Time System for Human Detection and Localization from Unmanned Aerial Vehicle Using Optical and Thermal Sensor and Visualization on Geographic Information Systems Platform
Authors: Nemi Bhattarai
Abstract:
In recent years, there has been a rapid increase in the use of unmanned aerial vehicles (UAVs) in search and rescue (SAR) operations, disaster management, and many other areas where information about the location of human beings is important. This research primarily focuses on the use of optical and thermal cameras on a UAV platform for real-time detection, localization, and visualization of human beings on GIS. The work will be beneficial in disaster management for searching for lost humans in wilderness or difficult terrain, detecting abnormal human behavior in border or high-security areas, studying the distribution of people at night, counting people density in crowds, managing people flow during evacuations, planning provisions in areas with high human density, and many more applications.
Keywords: UAV, human detection, real-time, localization, visualization, Haar-like, GIS, thermal sensor
Procedia PDF Downloads 466
9685 Evaluating the Effect of Climate Change and Land Use/Cover Change on Catchment Hydrology of Gumara Watershed, Upper Blue Nile Basin, Ethiopia
Authors: Gashaw Gismu Chakilu
Abstract:
Climate change and land cover change are very important issues in the global context because of their responses to environmental and socio-economic drivers. The dynamics of these two factors are currently affecting the environment, including watershed hydrology, in an unbalanced way. In this paper, the individual and combined impacts of climate change and land use/land cover change on hydrological processes were evaluated by applying the Soil and Water Assessment Tool (SWAT) model to the Gumara watershed, Upper Blue Nile basin, Ethiopia. Regional climate data (temperature and rainfall) for the past 40 years in the study area were prepared, and changes were detected by trend analysis using the Mann-Kendall test. The land use/land cover data were obtained from Landsat images and processed with ERDAS IMAGINE 2010 software. Three land use/land cover datasets (1973, 1986, and 2013) were prepared and used for the baseline, model calibration, and change study, respectively. The effects of these changes on the high flow and low flow of the catchment were also evaluated separately: the high flow was analyzed using the annual maximum (AM) model, and the low flow was evaluated with the seven-day sustained low flow model. Both temperature and rainfall showed increasing trends; the extent of the changes was then evaluated on a monthly basis using two decadal periods, with 1973-1982 taken as the baseline and 2004-2013 used for the change study. The efficiency of the model was determined by the Nash-Sutcliffe (NS) coefficient and relative volume error (RVe); their values were 0.65 and 0.032 for calibration and 0.62 and 0.0051 for validation, respectively. The impact of climate change on the stream flow of the catchment was higher than that of land use/land cover change: the flow increased by 16.86% and 7.25% due to climate and LULC change, respectively, while the combined changes accounted for a 22.13% increase in flow.
The overall results indicate that climate change is more responsible for high flow than for low flow, whereas land use/land cover change has a more significant effect on low flow than on high flow of the catchment. We conclude that the hydrology of the catchment has been altered by the changes in climate and land cover of the study area.
Keywords: climate, LULC, SWAT, Ethiopia
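Two of the quantities used above are straightforward to reproduce: the Mann-Kendall S statistic behind the trend test, and the Nash-Sutcliffe efficiency used to judge model skill. A minimal sketch on synthetic series (not the Gumara data):

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences.

    S > 0 indicates an increasing trend, S < 0 a decreasing one; the
    full test normalizes S by its variance to get a z-score.
    """
    x = np.asarray(x, dtype=float)
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return int(s)

def nash_sutcliffe(obs, sim):
    """NS = 1 - SSE / variance sum; 1 is a perfect fit, 0 matches the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# A strictly increasing series of length n gives the maximum S = n(n-1)/2.
s = mann_kendall_s(np.arange(10.0))
ns_perfect = nash_sutcliffe([1.0, 2.0, 3.0, 4.0], [1.0, 2.0, 3.0, 4.0])
```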
Procedia PDF Downloads 376
9684 Pyramidal Lucas-Kanade Optical Flow Based Moving Object Detection in Dynamic Scenes
Authors: Hyojin Lim, Cuong Nguyen Khac, Yeongyu Choi, Ho-Youl Jung
Abstract:
In this paper, we propose a simple moving object detection method based on motion vectors obtained from pyramidal Lucas-Kanade optical flow. The proposed method detects moving objects such as pedestrians, other vehicles, and obstacles in front of the host vehicle, and it can provide a warning to the driver. Motion vectors are obtained using pyramidal Lucas-Kanade optical flow, and outliers are eliminated by comparing the amplitude of each vector with a pre-defined threshold value. The background model is obtained by calculating the mean and variance of the amplitudes of recent motion vectors in a rectangular local region called a cell. The model is applied as the reference to classify motion vectors into those of moving objects and those of the background. Motion vectors are clustered into rectangular regions using the unsupervised K-means algorithm. A labeling method is then applied to merge groups that are close to each other, based on the distance between the center points of the rectangles. Through simulations of four kinds of scenarios, such as a motorbike, a vehicle, and pedestrians approaching the host vehicle, we show that the proposed method is simple but efficient for moving object detection in parking lots.
Keywords: moving object detection, dynamic scene, optical flow, pyramidal optical flow
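At each window, Lucas-Kanade solves the least-squares system [Ix Iy] v = -It built from spatial and temporal image gradients. A single-window, non-pyramidal sketch on a synthetic sub-pixel shift (illustrative only; the paper uses the pyramidal variant):

```python
import numpy as np

def lucas_kanade(I0, I1):
    """Estimate one flow vector (vx, vy) for a whole window.

    Solves [Ix Iy] v = -It in least squares over all pixels.
    """
    Iy, Ix = np.gradient(I0)                 # spatial gradients (row, col)
    It = I1 - I0                             # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                                 # (vx, vy)

# Smooth synthetic pattern translated by (0.3, 0.0) pixels.
y, x = np.mgrid[0:32, 0:32].astype(float)
pattern = lambda x, y: np.sin(0.3 * x) + np.cos(0.2 * y)
I0 = pattern(x, y)
I1 = pattern(x - 0.3, y)                     # content moved +0.3 in x
vx, vy = lucas_kanade(I0, I1)
```

The pyramidal variant simply repeats this solve from coarse to fine image scales so that larger displacements stay within the linearization's validity.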
Procedia PDF Downloads 350
9683 Neural Network-based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children
Authors: Budhvin T. Withana, Sulochana Rupasinghe
Abstract:
Dyslexia and dysgraphia, two learning disabilities that affect reading and writing abilities, respectively, are a major concern for the educational system. Due to the complexity and uniqueness of the Sinhala language, these conditions are especially difficult to detect in children who speak it. Traditional risk detection methods for dyslexia and dysgraphia frequently rely on subjective assessments, which makes broad risk screening difficult and time-consuming. As a result, diagnoses may be delayed and opportunities for early intervention may be lost. The project was approached by developing a hybrid model that utilizes various deep learning techniques for detecting the risk of dyslexia and dysgraphia. Specifically, ResNet50, VGG16, and YOLOv8 were integrated to detect handwriting issues, and their outputs were fed into an MLP model along with several other inputs. The hyperparameters of the MLP model were fine-tuned using grid search cross-validation, which allowed the optimal values to be identified. This approach proved effective in accurately predicting the risk of dyslexia and dysgraphia, providing a valuable tool for early detection and intervention. The ResNet50 model achieved an accuracy of 0.9804 on the training data and 0.9653 on the validation data. The VGG16 model achieved an accuracy of 0.9991 on the training data and 0.9891 on the validation data. The MLP model achieved a training accuracy of 0.99918 and a testing accuracy of 0.99223, with a loss of 0.01371. These results demonstrate that the proposed hybrid model achieves a high level of accuracy in predicting the risk of dyslexia and dysgraphia.
Keywords: neural networks, risk detection system, dyslexia, dysgraphia, deep learning, learning disabilities, data science
Procedia PDF Downloads 117
9682 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation; however, the published results do not show satisfactory classification accuracy. This work aimed to resolve this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (the AF Termination Challenge Database, MIT-BIH AF Database, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Database) were used for assessment. All time series were segmented into 1-minute RR interval windows, and four specific features were calculated for each. Two pattern recognition methods, principal component analysis (PCA) and the learning vector quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find the important features for discriminating between AF and normal sinus rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detection holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
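LVQ training itself is only a few lines: the nearest prototype is pulled toward a correctly classified sample and pushed away from a misclassified one. A minimal LVQ1 sketch on toy 2D data (standing in for the RR-interval features; not the paper's model):

```python
import numpy as np

def train_lvq1(X, y, prototypes, proto_labels, lr=0.1, epochs=30):
    """LVQ1: move the winning prototype toward (or away from) each sample."""
    P = prototypes.copy()
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            k = np.argmin(np.linalg.norm(P - xi, axis=1))   # winner
            step = lr if proto_labels[k] == yi else -lr
            P[k] += step * (xi - P[k])
    return P

def predict_lvq(X, prototypes, proto_labels):
    """Classify each sample by its nearest prototype's label."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]

# Two well-separated toy clusters standing in for AF vs. normal rhythm.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels = np.array([0, 1])
protos = train_lvq1(X, y, np.array([[0.5, 0.5], [1.5, 1.5]]), labels)
acc = float((predict_lvq(X, protos, labels) == y).mean())
```

Prediction needs only a handful of distance computations, which is consistent with the abstract's point that the model suits low-power telecare devices.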
Procedia PDF Downloads 269
9681 Survey of Intrusion Detection Systems and Their Assessment of the Internet of Things
Authors: James Kaweesa
Abstract:
The Internet of Things (IoT) has become a critical component of modern technology, enabling the connection of numerous devices to the internet. The interconnected nature of IoT devices, along with their heterogeneous and resource-constrained character, makes them vulnerable to various types of attacks, such as malware, denial-of-service attacks, and network scanning. Intrusion detection systems (IDSs) are a key mechanism for protecting IoT networks from attacks by identifying suspicious activities and alerting administrators. This review discusses the different types of IDSs available for IoT systems and evaluates their effectiveness in detecting and preventing attacks. It also examines the various evaluation methods used to assess the performance of IDSs and the challenges associated with evaluating them in IoT environments. The review highlights the need for effective and efficient IDSs that can cope with the unique characteristics of IoT networks, including their heterogeneity, dynamic topology, and resource constraints. The paper concludes by indicating where further research is needed to develop IDSs that can address these challenges and effectively protect IoT systems from cyber threats.
Keywords: cyber-threats, IoT, intrusion detection system, networks
Procedia PDF Downloads 81
9680 Critical Success Factors Quality Requirement Change Management
Authors: Jamshed Ahmad, Abdul Wahid Khan, Javed Ali Khan
Abstract:
Managing software quality requirement change is a difficult task in the field of software engineering. Rejecting incoming changes results in user dissatisfaction, while accommodating too many requirement changes may delay product delivery. Poor requirements management is widely considered the primary cause of software failure, and the problem becomes even more challenging in global software outsourcing. Addressing the success factors in quality requirement change management is needed today due to the frequent change requests from end-users. In this research study, success factors are identified and scrutinized with the help of a systematic literature review (SLR). In total, 16 success factors were identified that significantly impact software quality requirement change management. The findings show that proper requirement change management, rapid delivery, quality software product, access to market, project management, skills and methodologies, low cost/effort estimation, clear plan and road map, agile processes, low labor cost, user satisfaction, communication/close coordination, proper scheduling and time constraints, frequent technological changes, robust model, and geographical distribution/cultural differences are the key factors that influence software quality requirement change. The recognized success factors were validated with the help of various research methods, i.e., case studies, interviews, surveys, and experiments, and were then scrutinized by continent, database, company size, and time period. Based on these findings, requirement change can be implemented in a better way.
Keywords: global software development, requirement engineering, systematic literature review, success factors
Procedia PDF Downloads 197
9679 An in Situ DNA Content Detection Enabled by Organic Long-Persistent Luminescence Materials with Tunable Afterglow-Time in Water and Air
Authors: Desissa Yadeta Muleta
Abstract:
Purely organic long-persistent luminescence materials (OLPLMs) have been developed as emerging organic materials owing to their simple production process, low preparation cost and good biocompatibility. Notably, OLPLMs with afterglow-time-tunable long-persistent luminescence (LPL) characteristics enable higher-level protection applications and have great prospects in biological applications. Realizing these advanced performances depends on the ability to gradually tune the LPL duration under ambient conditions; however, strategies to achieve this are few due to the lack of unambiguous mechanisms. Here, we propose a two-step strategy to gradually tune the LPL duration of OLPLMs over a wide range of seconds in water and air, by using derivatives as the guest and introducing a third-party material into the host-immobilized host–guest doping system. Based on this strategy, we develop an analysis method for deoxyribonucleic acid (DNA) content detection without DNA separation in aqueous samples, which circumvents the influence of chromophores, fluorophores and other interferents in vivo, enabling a degree of in situ detection that is difficult to achieve with today's methods. This work will expedite the development of afterglow-time-tunable OLPLMs and open new horizons for their applications in data protection, bio-detection, and bio-sensing.
Keywords: deoxyribonucleic acid, long persistent luminescent materials, water, air
Procedia PDF Downloads 77
9678 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained with the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
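The sliding-window paradigm described in this abstract can be sketched as follows. The window length, step size, and the particular per-window features (mean, standard deviation, energy, line length) are illustrative assumptions for the sketch, not the actual feature set of the Training Builder tool:

```python
import numpy as np

def sliding_window_features(signal, win_len, step):
    """Extract simple per-window features from a 1-D signal using
    the sliding-window paradigm."""
    feats = []
    for start in range(0, len(signal) - win_len + 1, step):
        w = signal[start:start + win_len]
        feats.append([
            w.mean(),                    # average amplitude
            w.std(),                     # amplitude variability
            np.sum(w ** 2),              # signal energy
            np.sum(np.abs(np.diff(w))),  # line length (waveform complexity)
        ])
    return np.asarray(feats)

# Synthetic 1-second "EEG" trace sampled at 256 Hz: a 10 Hz rhythm plus noise.
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * np.linspace(0, 1, 256)) + 0.1 * rng.standard_normal(256)
X = sliding_window_features(eeg, win_len=64, step=32)
print(X.shape)  # one feature row per window
```

Each row of `X` would then be one training example for the neural network classifier.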
Procedia PDF Downloads 189
9677 Highly Specific DNA-Aptamer-Based Electrochemical Biosensor for Mercury (II) and Lead (II) Ions Detection in Water Samples
Authors: H. Abu-Ali, A. Nabok, T. Smith
Abstract:
Aptamers are single-stranded DNA or RNA nucleotide sequences designed in vitro using a selection process known as SELEX (systematic evolution of ligands by exponential enrichment); they have been developed for the selective detection of many toxic materials. In this work, we have developed an electrochemical biosensor for highly selective and sensitive detection of Hg2+ and Pb2+ using a specific aptamer probe (SAP) labelled with ferrocene (or methylene blue) at its (5′) end and with a thiol group at its (3′) terminus. The SAP has a specific coil structure matching the G-G (for Pb2+) and T-T (for Hg2+) ion-binding nucleotides, respectively. Aptamers were immobilized onto the surface of screen-printed gold electrodes via the SH groups; then cyclic voltammograms were recorded in binding buffer with the addition of the above metal salts in different concentrations. The anodic current values increase upon binding of heavy metal ions to the aptamers due to the presence of the electrochemically active probe, i.e., the ferrocene or methylene blue group. The correlation between the anodic current values and the concentrations of Hg2+ and Pb2+ ions has been established in this work. To the best of our knowledge, this is the first example of using specific DNA aptamers for electrochemical detection of heavy metals. Each 0.1 μM increase in concentration results in an increase in the anodic current value in a simple DC electrochemical test, i.e., cyclic voltammetry, thus providing an easy way of determining Hg2+ and Pb2+ concentration.
Keywords: aptamer, based, biosensor, DNA, electrochemical, highly, specific
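The reported linear correlation between ion concentration and anodic current lends itself to a simple calibration-curve readout. The numbers below are hypothetical, not measurements from this work; the sketch only illustrates fitting such a calibration line and inverting it to read an unknown concentration off a measured current:

```python
import numpy as np

# Hypothetical calibration data: anodic peak current (uA) measured at
# known Hg2+ concentrations (uM). Values are illustrative only.
conc = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
current = np.array([1.02, 2.05, 2.97, 4.01, 5.03])

slope, intercept = np.polyfit(conc, current, 1)  # linear calibration fit

def estimate_concentration(i_measured):
    """Invert the calibration line to convert a measured anodic current
    into an estimated ion concentration."""
    return (i_measured - intercept) / slope

print(round(estimate_concentration(3.5), 3))  # uM, between the 0.3 and 0.4 standards
```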
Procedia PDF Downloads 162
9676 Detection of Cardiac Arrhythmia Using Principal Component Analysis and XGBoost Model
Authors: Sujay Kotwale, Ramasubba Reddy M.
Abstract:
Electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease which can lead to the death of patients when left untreated. Early detection of cardiac arrhythmia would help doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve the performance, this paper combines principal component analysis (PCA) with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The resulting significant ECG features were fed into the XGBoost model, and the performance of the model was evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost
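The PCA dimensionality-reduction step can be sketched with a numpy-only SVD implementation. The data below are synthetic stand-ins for ECG-derived feature vectors, and the XGBoost classification stage the paper pairs with PCA is omitted here:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project feature vectors onto the top principal components via SVD.
    Returns the reduced data and the explained-variance ratio per component."""
    Xc = X - X.mean(axis=0)                # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (S ** 2) / np.sum(S ** 2)  # variance ratio per component
    return Xc @ Vt[:n_components].T, explained[:n_components]

rng = np.random.default_rng(1)
# Stand-in for beat-aligned ECG segments: 200 samples x 50 correlated features.
X = rng.standard_normal((200, 50)) @ rng.standard_normal((50, 50))
Z, ratios = pca_reduce(X, n_components=10)
print(Z.shape)                  # reduced feature matrix
print(ratios[0] >= ratios[-1])  # components sorted by explained variance
```

The reduced matrix `Z` would then be the input to the gradient-boosted classifier.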
Procedia PDF Downloads 120
9675 Monocular 3D Person Tracking via Demographic Classification and Projective Image Processing
Authors: McClain Thiel
Abstract:
Object detection and localization has historically required two or more sensors due to the loss of information in going from 3D to 2D space; however, most surveillance systems currently in use in the real world have only one sensor per location. Generally, this consists of a single low-resolution camera positioned above the area under observation (mall, jewelry store, traffic camera). This is not sufficient for robust 3D tracking for applications such as security or, of more recent relevance, contact tracing. This paper proposes a lightweight system for 3D person tracking that requires no additional hardware, based on compressed object-detection convolutional nets, facial landmark detection, and projective geometry. The approach classifies the target into a demographic category, makes assumptions about the relative locations of facial landmarks from the demographic information, and from there uses simple projective geometry and known constants to find the target's location in 3D space. Preliminary testing, although limited, suggests reasonable success in 3D tracking under ideal conditions.
Keywords: monocular distancing, computer vision, facial analysis, 3D localization
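The core projective-geometry step, recovering depth from an assumed physical size, reduces to the pinhole-camera relation Z = f·W/w. The focal length and landmark spacing below are illustrative assumptions, not values from the paper:

```python
def distance_from_landmark(focal_px, real_width_m, pixel_width):
    """Pinhole-camera depth estimate: Z = f * W / w, where W is the assumed
    physical spacing of a facial landmark pair (here taken to be the
    interpupillary distance inferred from the demographic class) and w is
    the spacing measured in the image, in pixels."""
    return focal_px * real_width_m / pixel_width

# Illustrative numbers: 800 px focal length, 0.063 m assumed interpupillary
# distance for the predicted demographic group, 42 px measured in the image.
z = distance_from_landmark(800, 0.063, 42)
print(round(z, 2))  # depth in metres along the optical axis
```

Combined with the landmark's pixel coordinates, this depth fixes the target's full 3D position without a second sensor.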
Procedia PDF Downloads 141
9674 Welfare Estimation in a General Equilibrium Model with Cities
Authors: Oded Hochman
Abstract:
We first show that current measures of welfare changes in the whole economy do not apply to an economy with cities. In addition, since such measures are defined over a partial equilibrium, they capture only part of the effect of a welfare change. We then define a unique and additive measure, which we term the modified economic surplus (mES), that fully captures the welfare effects caused by a change in the price of a nationally traded good. We show that the price change causes, on the one hand, a change in land rents in the economy and, on the other hand, an equal change in mES that can be estimated by measuring areas in the price-quantity national demand and supply plane. We construct for each city a cost function from which we derive a city's and, after aggregation, economy-wide demand and supply functions of nationwide prices and of either the unearned incomes (Marshallian functions) or the utility levels (compensated functions).
Keywords: city cost function, welfare measures, modified compensated variation, modified economic surplus, unearned income function, differential land rents, city size
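For reference, the area-based estimation the abstract refers to is of the same form as the textbook surplus integral for a price change of a traded good from $p_0$ to $p_1 < p_0$; this is the standard formula, not the paper's own definition of mES:

```latex
\Delta CS = \int_{p_1}^{p_0} D(p)\,dp , \qquad
\Delta PS = -\int_{p_1}^{p_0} S(p)\,dp ,
```

where $D(p)$ and $S(p)$ are the national demand and supply functions; the claimed mES change is likewise read off as an area between these curves in the price-quantity plane.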
Procedia PDF Downloads 321
9673 Modeling the Impacts of Road Construction on Land Values
Authors: Maha Almumaiz, Harry Evdorides
Abstract:
A change in land value typically occurs when a new interurban road construction increases accessibility; this change in adjacent land values differs according to land characteristics such as geographic location, land use type, land area and sale time (appraisal time). A multiple regression model is obtained to predict the percent change in land value (CLV) based on four independent variables, namely the land's distance from the constructed road, the area of the land, the nature of the land use, and the time since the works completion of the road. Random values of percent change in land value were generated using Microsoft Excel with a range of up to 35%. The trend of change in land value with the four independent variables was determined from literature references. The statistical analysis and model building were carried out using the IBM SPSS V23 software. The regression model suggests that, for lands located within 3 miles straight-line distance from the road, the percent CLV lies between 0 and 35%, depending on factors including distance from the constructed road, land use, land area and time since works completion of the new road.
Keywords: interurban road, land use types, new road construction, percent CLV, regression model
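A regression of this form can be sketched with ordinary least squares. The predictors, coefficients, and data below are simulated stand-ins mirroring the abstract's four variables and its 35% cap, not the study's Excel-generated values or its fitted SPSS model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
# Hypothetical predictors mirroring the abstract's four variables:
dist = rng.uniform(0, 3, n)     # miles from the new road
area = rng.uniform(0.1, 5, n)   # land area (acres)
use = rng.integers(0, 2, n)     # land-use dummy (0 = agricultural, 1 = commercial)
years = rng.uniform(0, 10, n)   # years since works completion

# Simulated percent CLV with made-up coefficients, capped at 0-35% as in the study.
clv = np.clip(30 - 8 * dist + 1.5 * area + 5 * use + 0.5 * years
              + rng.normal(0, 2, n), 0, 35)

X = np.column_stack([np.ones(n), dist, area, use, years])  # design matrix
beta, *_ = np.linalg.lstsq(X, clv, rcond=None)             # OLS fit
print(beta.shape)   # intercept plus four slope estimates
print(beta[1] < 0)  # value change falls with distance from the road
```

The fitted coefficients recover the expected signs, e.g. a negative slope on distance.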
Procedia PDF Downloads 266
9672 Socio-Economic Setting and Implications to Climate Change Impacts in Eastern Cape Province, South Africa
Authors: Kenneth Nhundu, Leocadia Zhou, Farhad Aghdasi, Voster Muchenje
Abstract:
Climate change poses increased risks to rural communities that rely on natural resources, such as forests, cropland and rangeland, waterways, and open spaces. Because of their connection to the land and the potential for climate change to impact natural resources and disrupt ecosystems and seasons, rural livelihoods and well-being are disproportionately vulnerable to climate change. Climate change has the potential to affect the environment in a number of ways that place increased stress on everyone, but disproportionately on the most vulnerable populations, including the young, the old, those with chronic illness, and the poor. The communities in the study area are predominantly rural and resource-based and are generally surrounded by public or private lands dominated by natural resources, including forests, rangelands, and agriculture. The livelihoods of these communities are tied to natural resources; therefore, targeted coping strategies will be required. This paper assesses household socio-economic characteristics and their implications for household vulnerability to climate change impacts in the rural Eastern Cape Province, South Africa. The results indicate that these rural communities are climate-vulnerable populations, as they have a large proportion of people who are less economically or physically capable of adapting to climate change. The study therefore recommends that, at each level, the needs, knowledge, and voices of vulnerable populations, including indigenous peoples and resource-based communities, be considered and incorporated so that climate change policy (1) ensures that all people are supported and able to act, (2) provides as robust a strategy as possible to address a rapidly changing environment, and (3) enhances equity and justice.
Keywords: climate change, vulnerable, socio-economic, livelihoods
Procedia PDF Downloads 356
9671 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems
Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran
Abstract:
Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm that is robust against illumination changes and noise, based on an adaptive Gaussian mixture model (GMM), and provides a novel and practical choice for intelligent video surveillance systems using static cameras. In previous methods, the image of still objects (the background image) is not treated as significant. On the contrary, this method is based on forming a meticulous background image and exploiting it to separate moving objects from their background. The background image is specified either manually, by taking an image without vehicles, or is detected in real time by forming a mathematical or exponential average of successive images. The proposed scheme offers low image degradation. The simulation results demonstrate a high degree of performance for the proposed method.
Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model
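The exponential-average background model mentioned in the abstract (the simpler alternative to the full per-pixel GMM) can be sketched as follows. The frame size, learning rate, and threshold are illustrative assumptions:

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running average of successive frames, forming the
    background image as the abstract describes."""
    return (1 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, thresh=25):
    """Pixels deviating from the background model by more than thresh
    are classified as foreground (moving objects)."""
    return np.abs(frame.astype(float) - bg) > thresh

# Toy 8x8 greyscale scene: uniform background with one bright moving block.
bg = np.full((8, 8), 100.0)
frame = bg.copy()
frame[2:4, 2:4] = 200              # "vehicle" enters the scene
mask = foreground_mask(bg, frame)
print(mask.sum())                  # number of pixels flagged as foreground
bg = update_background(bg, frame)  # background slowly absorbs the change
```

A full GMM replaces the single running average with several Gaussian modes per pixel, which is what makes the method robust to illumination changes.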
Procedia PDF Downloads 516
9670 Developing Commitment to Change in Egyptian Modern Bureaucracies
Authors: Nada Basset
Abstract:
Purpose: To examine the nature of the civil service sector as an employer by identifying likely ways to develop employees' commitment towards change in the civil service sector. Design/Methodology/Approach: A qualitative research approach was followed. Data were collected via a triangulation of interviews, non-participant observation and archival document analysis. Non-probability sampling took place, with a case-study method applied to a sample of 33 civil servants working in the Egyptian Ministry of State for Administrative Development (MSAD), the civil service entity acting as the change agent responsible for managing the government's administrative reform plan in the civil service sector. All study participants were working in one of the change projects/programmes and had a minimum of 12 months of service in the civil service. Interviews were digitally recorded and transcribed as MS-Word documents; data transcripts were analyzed manually using MS-Excel worksheets, from which the main research themes were developed and statistics drawn. Findings: The results demonstrate that developing civil servants' commitment towards change may require a number of solutions, such as (1) employee involvement and participation in the planning and implementation processes, (2) linking employee support for change to tangible rewards and incentives, (3) appointing inspirational change leaders to act as role models, and (4) as a last resort, enforcing employees' commitment towards change by coercion and authoritarianism. Practical Implications: It is clear that civil servants' lack of organizational commitment is not directly related to their level of commitment towards change.
The research findings showed that civil servants' commitment towards change can be raised and promoted by getting them involved in the planning and implementation processes, as this develops a sense of belongingness and ownership; thus there is a fair chance that civil servants with low organizational commitment can develop high commitment towards change, given a favorable environment where they are invited to participate and get involved in the change effort. Originality/Value: The research addresses a relatively new area, developing organizational commitment in modern bureaucracies, by investigating the levels of civil servants' commitment towards their jobs and/or organizations on one hand, and suggesting different ways of developing their commitment towards administrative reform and change initiatives in the Egyptian civil service sector on the other.
Keywords: change, commitment, Egypt, bureaucracy
Procedia PDF Downloads 483
9669 Multiaxial Fatigue Analysis of a High Performance Nickel-Based Superalloy
Authors: P. Selva, B. Lorraina, J. Alexis, A. Seror, A. Longuet, C. Mary, F. Denard
Abstract:
Over the past four decades, the fatigue behavior of nickel-based alloys has been widely studied. In recent years, however, significant advances in the fabrication process leading to grain size reduction have been made in order to improve the fatigue properties of aircraft turbine discs. Indeed, a change in particle size affects the initiation mode of fatigue cracks as well as the fatigue life of the material. The present study aims to investigate the fatigue behavior of a newly developed nickel-based superalloy under biaxial-planar loading. Low Cycle Fatigue (LCF) tests are performed at different stress ratios so as to study the influence of the multiaxial stress state on the fatigue life of the material. Full-field displacement and strain measurements as well as crack initiation detection are obtained using Digital Image Correlation (DIC) techniques. The aim of this presentation is first to provide an in-depth description of both the experimental set-up and protocol: the multiaxial testing machine, the specific design of the cruciform specimen and the performance of the DIC code are introduced. Second, results for sixteen specimens tested at different load ratios are presented: crack detection, strain amplitude and number of cycles to crack initiation vs. triaxial stress ratio are given for each loading case. Third, fractographic investigations by scanning electron microscopy show that the mechanism of fatigue crack initiation does not depend on the triaxial stress ratio and that most fatigue cracks initiate from subsurface carbides.
Keywords: cruciform specimen, multiaxial fatigue, nickel-based superalloy
Procedia PDF Downloads 296
9668 Vehicle Timing Motion Detection Based on Multi-Dimensional Dynamic Detection Network
Authors: Jia Li, Xing Wei, Yuchen Hong, Yang Lu
Abstract:
Detecting vehicle behavior has always been a focus of intelligent transportation, but with the explosive growth in the number of vehicles and the complexity of the road environment, the vehicle behavior videos captured by traditional surveillance can no longer satisfy the study of vehicle behavior. The traditional method of manually labeling vehicle behavior is too time-consuming and labor-intensive, while existing object detection and tracking algorithms have poor practicability and low behavioral location detection rates. This paper proposes a vehicle behavior detection algorithm based on a dual-stream convolution network and a multi-dimensional video dynamic detection network. In the videos, the straight-line behavior of the vehicle defaults to background behavior; changing lanes, turning and turning around are set as target behaviors. The purpose of this model is to automatically mark the target behavior of the vehicle in untrimmed videos. First, the target behavior proposals in the long video are extracted through the dual-stream convolution network: the model uses the dual-stream convolutional network to generate a one-dimensional action score waveform and then extracts segments with scores above a given threshold M as preliminary vehicle behavior proposals. Second, the preliminary proposals are pruned and identified using the multi-dimensional video dynamic detection network. Drawing on hierarchical reinforcement learning, the multi-dimensional network includes a Timer module and a Spacer module, where the Timer module mines temporal information in the video stream and the Spacer module extracts spatial information in the video frame. The Timer and Spacer modules are implemented with Long Short-Term Memory (LSTM) networks and start from an all-zero hidden state. The Timer module uses the Transformer mechanism to extract timing information from the video stream and extracts features by linear mapping and other methods.
Finally, the model fuses the temporal and spatial information and obtains the location and category of the behavior through the softmax layer. This paper uses recall and precision to measure the performance of the model. Extensive experiments show that, on the dataset of this paper, the proposed model has obvious advantages over existing state-of-the-art behavior detection algorithms. When the Time Intersection over Union (TIoU) threshold is 0.5, the mean Average Precision (mAP) reaches 36.3% (versus 21.5% for the baselines). In summary, this paper proposes a vehicle behavior detection model based on a multi-dimensional dynamic detection network, introducing spatial and temporal information to extract vehicle behaviors from long videos. Experiments show that the proposed algorithm is accurate in vehicle timing behavior detection. In the future, the focus will be on simultaneously detecting the timing behavior of multiple vehicles in complex traffic scenes (such as a busy street) while ensuring accuracy.
Keywords: vehicle behavior detection, convolutional neural network, long short-term memory, deep learning
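The proposal-extraction step described above, thresholding the one-dimensional action score waveform at M and keeping the above-threshold segments, can be sketched as:

```python
def extract_proposals(scores, m):
    """Group consecutive frames whose action score exceeds m into
    (start, end) proposal segments; end is exclusive."""
    proposals, start = [], None
    for i, s in enumerate(scores):
        if s > m and start is None:
            start = i                     # segment opens
        elif s <= m and start is not None:
            proposals.append((start, i))  # segment closes
            start = None
    if start is not None:                 # waveform ends mid-segment
        proposals.append((start, len(scores)))
    return proposals

# A toy per-frame score waveform from the dual-stream network.
scores = [0.1, 0.2, 0.8, 0.9, 0.7, 0.2, 0.1, 0.6, 0.8, 0.3]
print(extract_proposals(scores, m=0.5))  # [(2, 5), (7, 9)]
```

Each (start, end) pair is a preliminary behavior proposal to be pruned and classified by the Timer/Spacer network in the second stage.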
Procedia PDF Downloads 132