Search results for: software vulnerability detection
7944 Labview-Based System for Fiber Links Events Detection
Authors: Bo Liu, Qingshan Kong, Weiqing Huang
Abstract:
With the rapid development of modern communication, real-time diagnosis of fiber-optic quality and faults has attracted wide attention. In this paper, a LabVIEW-based system is proposed for fiber-optic fault detection. The wavelet threshold denoising method combined with Empirical Mode Decomposition (EMD) is applied to denoise the optical time domain reflectometer (OTDR) signal. Then a method based on the Gabor representation is used to detect events. Experimental measurements show that the signal-to-noise ratio (SNR) of the OTDR signal is improved by 1.34 dB on average compared with the wavelet threshold denoising method alone. The proposed system scores highly in event detection capability and accuracy. The maximum detectable fiber length of the proposed LabVIEW-based system is 65 km.
Keywords: empirical mode decomposition, events detection, Gabor transform, optical time domain reflectometer, wavelet threshold denoising
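As an illustration of the denoising chain described in this abstract, the following Python sketch combines EMD with wavelet soft-thresholding, assuming the third-party PyEMD (EMD-signal) and PyWavelets packages; the synthetic trace, wavelet choice, and universal threshold rule are illustrative assumptions, not the authors' exact settings.

```python
# Sketch of EMD + wavelet-threshold denoising of an OTDR trace.
# Assumes PyEMD (pip install EMD-signal) and PyWavelets are available.
import numpy as np
import pywt
from PyEMD import EMD

def denoise_otdr(trace, wavelet="db4", level=4):
    """Decompose the trace with EMD, soft-threshold each IMF's wavelet
    coefficients, and rebuild the trace from the cleaned IMFs plus residue."""
    emd = EMD()
    emd.emd(trace)
    imfs, residue = emd.get_imfs_and_residue()
    cleaned = residue.copy()
    for imf in imfs:
        coeffs = pywt.wavedec(imf, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise level estimate
        thr = sigma * np.sqrt(2 * np.log(len(imf)))           # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        cleaned += pywt.waverec(coeffs, wavelet)[: len(imf)]
    return cleaned

# Synthetic stand-in for an OTDR trace: decaying backscatter plus noise
t = np.linspace(0, 65, 4096)                                  # fibre length axis, km
trace = np.exp(-0.05 * t) + 0.02 * np.random.randn(t.size)
print(denoise_otdr(trace).shape)
```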
Procedia PDF Downloads 123
7943 Talent Sourcing Practices in Sri Lankan Software Industry
Authors: Malmi Amadoru, Chandana Gamage
Abstract:
Sri Lanka is emerging as a global IT-BPO hub, ranking among the top 20 global outsourcing destinations. When setting up a new venture in Sri Lanka, talent sourcing is one of the key functions because of the rapid growth of the workforce. Getting competent people with the right skills into the right positions helps an organization achieve its vision, mission and objectives, and also helps it earn a competitive advantage over industry rivals. It is therefore crucial to scan for and recruit the best employees. However, there is no published information available on the recruitment methods used in the Sri Lankan software industry, as a study of this nature had not been conducted previously in Sri Lanka. The main objective of this study was to explore the various talent sourcing practices used in the Sri Lankan software industry. The study also analyses the extent to which Sri Lanka has adopted recruitment strategies employed worldwide, and where it deviates from them. The research outcome is beneficial for HR professionals in identifying current trends in recruitment practices. Moreover, investors who are interested in IT-BPO engagements can gain a thorough knowledge of talent sourcing techniques in the Sri Lankan software industry. Finally, this research points to trending areas that can be further investigated in future.
Keywords: IT-BPO, recruitment, Sri Lanka, software industry, talent
Procedia PDF Downloads 488
7942 Indicator-Immobilized, Cellulose Based Optical Sensing Membrane for the Detection of Heavy Metal Ions
Authors: Nisha Dhariwal, Anupama Sharma
Abstract:
The synthesis of cellulose nanofibrils quaternized with 3‐chloro‐2‐hydroxypropyltrimethylammonium chloride (CHPTAC) in NaOH/urea aqueous solution has been reported. Xylenol Orange (XO) has been used as an indicator for selective detection of Sn (II) ions, by its immobilization on quaternized cellulose membrane. The effects of pH, reagent concentration and reaction time on the immobilization of XO have also been studied. The linear response, limit of detection, and interference of other metal ions have also been studied and no significant interference has been observed. The optical chemical sensor displayed good durability and short response time with negligible leaching of the reagent.
Keywords: cellulose, chemical sensor, heavy metal ions, indicator immobilization
Procedia PDF Downloads 301
7941 Surface Hole Defect Detection of Rolled Sheets Based on Pixel Classification Approach
Authors: Samira Taleb, Sakina Aoun, Slimane Ziani, Zoheir Mentouri, Adel Boudiaf
Abstract:
Rolling is a pressure treatment technique that modifies the shape of steel ingots or billets between rotating rollers. During this process, defects may form on the surface of the rolled sheets and are likely to affect the performance and quality of the finished product. In our study, we developed a method for detecting surface hole defects using a pixel classification approach. This work includes several steps. First, we performed image preprocessing to delimit areas with and without hole defects on the sheet image. Then, we developed the histograms of each area to generate the gray level membership intervals of the pixels that characterize each area. As we noticed an intersection between the characteristics of the gray level intervals of the images of the two areas, we finally performed a learning step based on a series of detection tests to refine the membership intervals of each area, and to choose the defect detection criterion in order to optimize the recognition of the surface hole.
Keywords: classification, defect, surface, detection, hole
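A minimal NumPy sketch of the gray-level interval idea described above: learn a membership interval per labelled region from its histogram tails, then label pixels that fall in only one interval. The percentile bounds and the synthetic patches are illustrative assumptions, not the authors' refined criterion.

```python
# Sketch of pixel classification by gray-level membership intervals.
# Percentile bounds and the toy patches are illustrative assumptions.
import numpy as np

def learn_interval(region_pixels, low_pct=1, high_pct=99):
    """Gray-level interval characterising a labelled region (defect or sound),
    taken from the tails of its histogram."""
    return np.percentile(region_pixels, low_pct), np.percentile(region_pixels, high_pct)

def classify_pixels(image, defect_interval, sound_interval):
    """Label each pixel: 1 = hole defect, 0 = sound surface,
    -1 = ambiguous (falls in the overlap of the two intervals)."""
    in_defect = (image >= defect_interval[0]) & (image <= defect_interval[1])
    in_sound = (image >= sound_interval[0]) & (image <= sound_interval[1])
    labels = np.full(image.shape, -1, dtype=int)
    labels[in_defect & ~in_sound] = 1
    labels[in_sound & ~in_defect] = 0
    return labels

# Toy 8-bit example: dark hole pixels versus brighter sheet surface
defect_patch = np.random.randint(0, 60, 500)
sound_patch = np.random.randint(40, 200, 5000)
sheet = np.random.randint(0, 200, (64, 64))
labels = classify_pixels(sheet, learn_interval(defect_patch), learn_interval(sound_patch))
print((labels == 1).sum(), "candidate defect pixels")
```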
Procedia PDF Downloads 17
7940 Minimizing the Impact of Covariate Detection Limit in Logistic Regression
Authors: Shahadut Hossain, Jacek Wesolowski, Zahirul Hoque
Abstract:
In many epidemiological and environmental studies, covariate measurements are subject to a detection limit. In most applications, covariate measurements are truncated from below, which is known as left-truncation, because the measuring device used to measure the covariate fails to detect values falling below a certain threshold. In regression analyses, this inflates the bias and the mean squared error (MSE) of the estimators. This paper suggests a response-based regression calibration method to correct the deleterious impact introduced by the covariate detection limit on the estimators of the parameters of the simple logistic regression model. Compared to the maximum likelihood method, the proposed method is computationally simpler, and hence easier to implement. It is robust to violation of the distributional assumption about the covariate of interest. The performance of the proposed method in producing correct inference, compared to other competing methods, has been investigated through extensive simulations. A real-life application of the method is also shown using data from a population-based case-control study of non-Hodgkin lymphoma.
Keywords: environmental exposure, detection limit, left truncation, bias, ad-hoc substitution
Procedia PDF Downloads 237
7939 Development of an Integrated Route Information Management Software
Authors: Oluibukun G. Ajayi, Joseph O. Odumosu, Oladimeji T. Babafemi, Azeez Z. Opeyemi, Asaleye O. Samuel
Abstract:
The need for complete automation of every procedure of surveying, and especially of its engineering applications, cannot be overemphasized given the many demerits of the conventional manual or analogue approach. This paper presents the summarized details of the development of a Route Information Management (RIM) software. The software, codenamed ‘AutoROUTE’, was encoded using the Microsoft Visual Studio (Visual Basic) package, and it offers complete automation of the computational procedures and plan production involved in route surveying. It was tested using route survey data (longitudinal profile and cross sections) of a 2.7 km road which stretches from Dama to Lunko village in Minna, Niger State, acquired with the aid of a Hi-Target DGPS receiver. The developed software (AutoROUTE) is capable of computing the various simple curve parameters, the horizontal curve, and the vertical curve, and it can also plot the road alignment, longitudinal profile, and cross-sections, with the capability to store these in the SQL database incorporated into the Visual Basic environment. The plans plotted with AutoROUTE were compared with the plans produced with the conventional AutoCAD Civil 3D software, and AutoROUTE proved to be more user-friendly and accurate because it plots to three decimal places whereas AutoCAD plots to two decimal places. It was also found that AutoROUTE is faster in plotting and that the stages involved are less cumbersome compared to AutoCAD Civil 3D.
Keywords: automated systems, cross sections, curves, engineering construction, longitudinal profile, route surveying
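For readers unfamiliar with the simple curve parameters such a tool computes, the following is a generic textbook sketch of the standard formulas (tangent length, curve length, long chord, external distance, middle ordinate) from a radius and deflection angle; it is not AutoROUTE's code, and the input values are arbitrary.

```python
# Standard simple circular curve parameters from radius R and deflection angle Δ;
# a textbook sketch of the kind of computation AutoROUTE automates.
import math

def simple_curve(radius_m, deflection_deg):
    d = math.radians(deflection_deg)
    return {
        "tangent_length": radius_m * math.tan(d / 2),               # T = R tan(Δ/2)
        "curve_length": radius_m * d,                               # L = RΔ (Δ in radians)
        "long_chord": 2 * radius_m * math.sin(d / 2),               # C = 2R sin(Δ/2)
        "external_distance": radius_m * (1 / math.cos(d / 2) - 1),  # E = R(sec(Δ/2) - 1)
        "middle_ordinate": radius_m * (1 - math.cos(d / 2)),        # M = R(1 - cos(Δ/2))
    }

for name, value in simple_curve(radius_m=300.0, deflection_deg=24.0).items():
    print(f"{name}: {value:.3f} m")   # reported to three decimal places, as in the plotted output
```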
Procedia PDF Downloads 148
7938 Hybrid Anomaly Detection Using Decision Tree and Support Vector Machine
Authors: Elham Serkani, Hossein Gharaee Garakani, Naser Mohammadzadeh, Elaheh Vaezpour
Abstract:
Intrusion detection systems (IDS) are the main components of network security. These systems analyze network events for intrusion detection. An IDS is designed by training on normal traffic data or attack data, and machine learning methods are among the best ways to design one. In the method presented in this article, the pruning algorithm of the C5.0 decision tree is used to reduce the features of the traffic data, and the IDS is trained with the least squares support vector machine algorithm (LS-SVM). The features are ranked according to the predictor importance criterion, and the least important features are eliminated in order. The features remaining at this stage, which yield the highest accuracy in the LS-SVM, are selected as the final features. Compared with other similar articles that have examined feature selection for the least squares support vector machine model, the obtained features give better accuracy, a higher true positive rate, and a lower false positive rate. The results are tested on the UNSW-NB15 dataset.
Keywords: decision tree, feature selection, intrusion detection system, support vector machine
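A hedged sketch of the two-stage pipeline described above, using scikit-learn's DecisionTreeClassifier and SVC as stand-ins for C5.0 and LS-SVM; the random arrays stand in for UNSW-NB15 features, and the pruning parameter and subset sizes are placeholders.

```python
# Sketch of the two-stage pipeline: rank features with a pruned decision tree,
# keep the most important ones, then train an SVM on the reduced feature set.
# DecisionTreeClassifier and SVC stand in for C5.0 and LS-SVM respectively.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

X = np.random.rand(2000, 40)                  # placeholder for UNSW-NB15 features
y = np.random.randint(0, 2, 2000)             # 0 = normal, 1 = attack
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stage 1: pruned tree gives a predictor-importance ranking
tree = DecisionTreeClassifier(ccp_alpha=0.001, random_state=0).fit(X_tr, y_tr)
ranking = np.argsort(tree.feature_importances_)[::-1]

# Stage 2: keep the top-k features that maximise SVM accuracy
best_k, best_acc = None, 0.0
for k in range(5, 41, 5):
    cols = ranking[:k]
    acc = SVC(kernel="rbf").fit(X_tr[:, cols], y_tr).score(X_te[:, cols], y_te)
    if acc > best_acc:
        best_k, best_acc = k, acc
print(f"best feature subset size: {best_k}, accuracy: {best_acc:.3f}")
```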
Procedia PDF Downloads 265
7937 Assessment of Exploitation Vulnerability of Quantum Communication Systems with Phase Encryption
Authors: Vladimir V. Nikulin, Bekmurza H. Aitchanov, Olimzhon A. Baimuratov
Abstract:
Quantum communication technology takes advantage of the intrinsic properties of laser carriers, such as very high data rates and low power requirements, to offer unprecedented data security. Quantum processes at the physical layer of encryption are used for signal encryption with very competitive performance characteristics. The ultimate range of applications for QC systems spans from fiber-based to free-space links and from secure banking operations to mobile airborne and space-borne networking where they are subjected to channel distortions. Under practical conditions, the channel can alter the optical wave front characteristics, including its phase. In addition, phase noise of the communication source and photo-detection noises alter the signal to bring additional ambiguity into the measurement process. If quantized values of photons are used to encrypt the signal, exploitation of quantum communication links becomes extremely difficult. In this paper, we present the results of analysis and simulation studies of the effects of noise on phase estimation for quantum systems with different numbers of encryption bases and operating at different power levels.
Keywords: encryption, phase distortion, quantum communication, quantum noise
Procedia PDF Downloads 553
7936 Developing an Accurate AI Algorithm for Histopathologic Cancer Detection
Authors: Leah Ning
Abstract:
This paper discusses the development of a machine learning algorithm that accurately detects metastatic breast cancer (cancer that has spread beyond its site of origin) in selected images from pathology scans of lymph node sections. Developing an accurate artificial intelligence (AI) algorithm would help significantly in breast cancer diagnosis, since manual examination of lymph node scans is both tedious and often highly subjective. Using AI in the diagnosis process provides a more straightforward, reliable, and efficient method for medical professionals and would enable faster diagnosis and, therefore, more immediate treatment. The overall approach was to train a convolutional neural network (CNN) on a set of pathology scan data and use the trained model to classify whether a new scan is benign or malignant, outputting a 0 or a 1, respectively. The final model’s prediction accuracy is high, with 100% on the training set and over 70% on the test set. Achieving such accuracy with an AI model is significant for medical pathology and cancer detection: having AI as a tool capable of quick detection will greatly help medical professionals and patients suffering from cancer.
Keywords: breast cancer detection, AI, machine learning, algorithm
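A minimal Keras sketch of a CNN binary classifier of the kind described; the layer sizes, the 96x96 patch shape, and the placeholder arrays are illustrative assumptions rather than the author's actual architecture or data pipeline.

```python
# Minimal CNN binary classifier (0 = benign, 1 = malignant) in the spirit of
# the approach above; layer sizes and the 96x96 patch shape are assumptions.
import numpy as np
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(96, 96, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),     # probability of malignancy
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder arrays standing in for lymph-node patch data
x_train = np.random.rand(128, 96, 96, 3).astype("float32")
y_train = np.random.randint(0, 2, 128)
model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)
print((model.predict(x_train[:4], verbose=0) > 0.5).astype(int).ravel())
```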
Procedia PDF Downloads 91
7935 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it gathers more and more computing capability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications face the challenge of increasingly large amounts of data, and data-parallel computing will be an important way to further improve their performance. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched with respect to traditional algorithms.
Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability
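The decompose-then-test idea can be illustrated with a vectorised broad-phase test in NumPy, whose array operations map naturally onto SIMD units; treating the "simple objects" as axis-aligned bounding boxes is an assumption made for this sketch, not the paper's exact decomposition.

```python
# Data-parallel broad-phase collision test: each complex object is decomposed
# into simple axis-aligned boxes, and all box pairs are tested in one
# vectorised (SIMD-friendly) operation. Using AABBs as the "simple objects"
# is an illustrative assumption.
import numpy as np

def aabb_overlaps(boxes_a, boxes_b):
    """boxes_*: (N, 6) arrays of [xmin, ymin, zmin, xmax, ymax, zmax].
    Returns an (Na, Nb) boolean matrix of overlapping pairs."""
    mins_a, maxs_a = boxes_a[:, None, :3], boxes_a[:, None, 3:]
    mins_b, maxs_b = boxes_b[None, :, :3], boxes_b[None, :, 3:]
    return np.all((mins_a <= maxs_b) & (mins_b <= maxs_a), axis=-1)

# Two "complex objects" decomposed into random unit boxes
rng = np.random.default_rng(0)
def random_boxes(n):
    lo = rng.uniform(0, 10, (n, 3))
    return np.hstack([lo, lo + 1.0])

hits = aabb_overlaps(random_boxes(200), random_boxes(300))
print("colliding simple-object pairs:", int(hits.sum()))
```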
Procedia PDF Downloads 290
7934 Root Mean Square-Based Method for Fault Diagnosis and Fault Detection and Isolation of Current Fault Sensor in an Induction Machine
Authors: Ahmad Akrad, Rabia Sehab, Fadi Alyoussef
Abstract:
Nowadays, induction machines are widely used in industry thanks to their advantages compared with other technologies; indeed, they are in high demand because of their reliability, robustness and cost. The objective of this paper is to deal with the diagnosis, detection and isolation of faults in a three-phase induction machine. Among possible faults, the inter-turn short-circuit fault (ITSC), current sensor faults and the single-phase open circuit fault are selected for study. A fault detection method is suggested using residual errors generated from the root mean square (RMS) of the phase currents. The application of this method is based on an asymmetric nonlinear model of the induction machine that accounts for the winding fault in the three-axis frame state space. In addition, current sensor redundancy and sensor fault detection and isolation (FDI) are adopted to ensure safe operation of the induction machine drive. Finally, a validation is carried out by simulation in healthy and faulty operating modes to show the ability of the proposed method to detect and locate, with high reliability, the three types of faults.
Keywords: induction machine, asymmetric nonlinear model, fault diagnosis, inter-turn short-circuit fault, root mean square, current sensor fault, fault detection and isolation
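A minimal sketch of the RMS-residual idea: compare the running RMS of each phase current against a healthy reference and flag the phase whose residual exceeds a threshold. The window length, threshold, and synthetic currents are illustrative assumptions, not the paper's model-based residual generator.

```python
# Sketch of RMS-based residual generation for a three-phase machine: the
# running RMS of each phase current is compared with a healthy reference RMS,
# and a fault is flagged (and located by phase) when the residual exceeds a
# threshold. Window length and threshold are illustrative assumptions.
import numpy as np

def running_rms(x, window):
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(x**2, kernel, mode="same"))

def detect_phase_faults(i_abc, fs, reference_rms, threshold=0.15, window_ms=20):
    window = int(fs * window_ms / 1000)
    residuals = np.array([np.abs(running_rms(i, window) - reference_rms) / reference_rms
                          for i in i_abc])
    core = residuals[:, window:-window]        # ignore convolution edge effects
    faulty = core.max(axis=1) > threshold      # faulty[k] flags and isolates phase k
    return residuals, faulty

# Synthetic currents: healthy phases A and C, open-circuit fault on phase B after 0.5 s
fs, f0 = 10_000, 50
t = np.arange(0, 1, 1 / fs)
i_a = np.sqrt(2) * np.sin(2 * np.pi * f0 * t)
i_b = np.where(t < 0.5, np.sqrt(2) * np.sin(2 * np.pi * f0 * t - 2 * np.pi / 3), 0.0)
i_c = np.sqrt(2) * np.sin(2 * np.pi * f0 * t + 2 * np.pi / 3)
_, faulty = detect_phase_faults([i_a, i_b, i_c], fs, reference_rms=1.0)
print("faulty phases:", [p for p, f in zip("ABC", faulty) if f])
```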
Procedia PDF Downloads 199
7933 Optimizing Machine Learning Through Python Based Image Processing Techniques
Authors: Srinidhi. A, Naveed Ahmed, Twinkle Hareendran, Vriksha Prakash
Abstract:
This work reviews some advanced image processing techniques for deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks covered. The paper looks at these in great detail, given that such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We review some of the methods for the assessment of image quality, more specifically sharpness, which is crucial to ensure robust model performance. Further, we discuss the development of deep learning models specific to facial emotion detection, age classification, and gender classification, and how the preprocessing techniques relate to model performance. Conclusions from this study pinpoint best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and retention of the image features critical for effective training of deep learning models.
Keywords: image processing, machine learning applications, template matching, emotion detection
Procedia PDF Downloads 17
7932 Self-Organizing Maps for Credit Card Fraud Detection
Authors: ChunYi Peng, Wei Hsuan CHeng, Shyh Kuang Ueng
Abstract:
This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited to pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies
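A hedged sketch of SOM-based anomaly scoring for transactions using the MiniSom package; scoring by quantization error (distance to the best matching unit) is a common convention and an assumption here, not necessarily the exact criterion used by the authors, and the transaction features are synthetic placeholders.

```python
# SOM-based anomaly scoring for transactions with MiniSom (pip install minisom).
# Quantization error is used as the anomaly score; map size, training length
# and the 99th-percentile cut-off are illustrative assumptions.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(1)
normal = rng.normal(0, 1, (5000, 6))          # placeholder transaction features
suspect = rng.normal(4, 1, (10, 6))           # a few atypical transactions
data = np.vstack([normal, suspect])

som = MiniSom(10, 10, input_len=6, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(data, 5000)

def anomaly_score(x):
    """Distance from a transaction to its best matching unit on the map."""
    w = som.get_weights()[som.winner(x)]
    return np.linalg.norm(x - w)

scores = np.array([anomaly_score(x) for x in data])
threshold = np.percentile(scores, 99)         # illustrative cut-off
print("flagged transactions:", int((scores > threshold).sum()))
```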
Procedia PDF Downloads 58
7931 On the Representation of Actuator Faults Diagnosis and Systems Invertibility
Authors: F. Sallem, B. Dahhou, A. Kamoun
Abstract:
In this work, the main problem considered is the detection and isolation of actuator faults. A new formulation of the linear system is derived to obtain the conditions for actuator fault diagnosis. The proposed method is based on representing the actuator as a subsystem connected in cascade with the process system; from this formulation, the conditions for actuator fault detection and isolation are obtained. Detectability conditions are expressed in terms of invertibility notions. An example and a comparative analysis with the classic formulation illustrate the performance of this approach for simple actuator fault diagnosis using the linear model of a nuclear reactor.
Keywords: actuator fault, fault detection, left invertibility, nuclear reactor, observability, parameter intervals, system inversion
Procedia PDF Downloads 405
7930 A Procedure for Post-Earthquake Damage Estimation Based on Detection of High-Frequency Transients
Authors: Aleksandar Zhelyazkov, Daniele Zonta, Helmut Wenzel, Peter Furtner
Abstract:
In the current research, structural health monitoring is considered for addressing the critical issue of post-earthquake damage detection. A non-standard approach for damage detection via acoustic emission is presented: acoustic emissions are monitored in the low-frequency range (up to 120 Hz), and such emissions are termed high-frequency transients. Further, a damage indicator defined as the Time-Ratio Damage Indicator is introduced. The indicator relies on time-instance measurements of damage initiation and deformation peaks, and based on these measurements a procedure for estimating the maximum drift ratio is proposed. Monitoring data are used from a shaking-table test of a full-scale reinforced concrete bridge pier. Damage to the experimental column is successfully detected, and the proposed damage indicator is calculated.
Keywords: acoustic emission, damage detection, shaking table test, structural health monitoring
Procedia PDF Downloads 232
7929 Real-Time Control of Grid-Connected Inverter Based on labVIEW
Authors: L. Benbaouche, H. E. , F. Krim
Abstract:
In this paper we propose real-time control of a grid-connected single-phase inverter, which is flexible and efficient. The first step is devoted to the study and design of the controller through simulation, conducted with the LabVIEW software on the 'host' computer. The second step is running the application from the PXI 'target'. LabVIEW, combined with NI-DAQmx, provides the tools to easily build applications that use the digital-to-analog converter to generate the PWM control signals. Experimental results show the effectiveness of LabVIEW software applied to power electronics.
Keywords: real-time control, labview, inverter, PWM
Procedia PDF Downloads 509
7928 A Method and System for Secure Authentication Using One Time QR Code
Authors: Divyans Mahansaria
Abstract:
User authentication is an important security measure for protecting confidential data and systems. However, the vulnerability of the authentication process has significantly increased; thus, appropriate mechanisms must be deployed while authenticating a user to safeguard him/her from attacks. The proposed solution implements a novel authentication mechanism to counter various forms of security breach, including phishing, Trojan horse, replay, key logging, Asterisk logging, shoulder surfing, brute force search and others. A QR code (Quick Response Code) is a type of matrix or two-dimensional barcode that can be used for storing URLs, text, images and other information. In the proposed solution, during each new authentication request, a QR code is dynamically generated and presented to the user. A piece of generic information is mapped to a plurality of elements and stored within the QR code. The mapping of the generic information to the plurality of elements is randomized at each new login, so the QR code generated for each authentication request is for one-time use only. In order to authenticate into the system, the user needs to decode the QR code using any QR code decoding software, which must be installed on a handheld mobile device such as a smartphone or personal digital assistant (PDA). On decoding the QR code, the user is presented with the mapping between the generic piece of information and the plurality of elements, from which the user derives cipher secret information corresponding to his/her actual password. In place of the actual password, the user then uses this cipher secret information to authenticate into the system. The authentication terminal receives the cipher secret information and uses a validation engine to decipher it; if the entered secret information is correct, the user is granted access to the system. A usability study has been carried out on the proposed solution, and the new authentication mechanism was found to be easy to learn and adapt to. A mathematical analysis of the time taken to carry out a brute force attack on the proposed solution has also been carried out, and its result showed that the solution is almost completely resistant to brute force attack. Today’s standard methods for authentication are subject to a wide variety of software, hardware, and human attacks. The proposed scheme can be very useful in controlling the various types of authentication-related attacks, especially in a networked computer environment where the use of a username and password for authentication is common.
Keywords: authentication, QR code, cipher / decipher text, one time password, secret information
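A toy sketch of the one-time mapping idea: the server draws a fresh random substitution for each login, embeds it in a QR code via the qrcode package, and verifies the ciphered password derived from that mapping. The alphabet, JSON encoding, plaintext storage, and all helper names are hypothetical simplifications, not the patented scheme's actual format.

```python
# Toy sketch of one-time QR challenge-response: a fresh random substitution is
# generated per login, shipped inside a QR code, and the user types the cipher
# of their password under that substitution. All names and the encoding format
# are hypothetical; a real system would not store plaintext passwords.
import json
import secrets
import string
import qrcode

ALPHABET = string.ascii_letters + string.digits

def new_challenge():
    """Random one-time substitution mapping, regenerated for every login."""
    shuffled = list(ALPHABET)
    secrets.SystemRandom().shuffle(shuffled)
    return dict(zip(ALPHABET, shuffled))

def challenge_qr(mapping, path="challenge.png"):
    qrcode.make(json.dumps(mapping)).save(path)   # the user decodes this on a phone
    return path

def cipher(password, mapping):
    return "".join(mapping.get(ch, ch) for ch in password)

# Server side: issue a challenge, then verify what the user typed
mapping = new_challenge()
challenge_qr(mapping)
stored_password = "S3cret9"                       # hypothetical stored secret
user_entry = cipher(stored_password, mapping)     # what the user would type in
print("authenticated:", user_entry == cipher(stored_password, mapping))
```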
Procedia PDF Downloads 268
7927 The Suitability of Agile Practices in Healthcare Industry with Regard to Healthcare Regulations
Authors: Mahmood Alsaadi, Alexei Lisitsa
Abstract:
Nowadays, medical devices rely completely on software, whether as standalone software or as embedded software; therefore, organizations that develop medical device software can benefit from adopting agile practices. Using agile practices in healthcare software development would bring benefits such as producing a high-quality product at low cost and in a short period. However, medical device software development companies have faced challenges in adopting agile practices, due to the gaps that exist between agile practices and the requirements of healthcare regulations, such as documentation, traceability, and formality. This research paper conducts a study to investigate the adoption rate of agile practices in medical device software development, and it extracts and outlines the requirements of healthcare regulations such as the Food and Drug Administration (FDA), the Health Insurance Portability and Accountability Act (HIPAA), and the Medical Device Directive (MDD) that affect the software development life cycle directly or indirectly. Moreover, this paper evaluates the suitability of using agile practices in healthcare industries by analyzing the most popular agile practices, such as eXtreme Programming (XP), Scrum, and Feature-Driven Development (FDD), from a healthcare industry point of view and in comparison with the requirements of healthcare regulations. Finally, the authors propose an agile mixture model that consists of different practices from different agile methods. As a result, the adoption rate of agile practices in healthcare industries is still low, and agile practices should be enhanced with regard to the requirements of healthcare regulations in order to be used in healthcare software development organizations. The proposed agile mixture model may therefore assist in minimizing the gaps existing between healthcare regulations and agile practices and increase the adoption rate in the healthcare industry. As this research paper is part of an ongoing project, an evaluation of the agile mixture model will be conducted in the near future.
Keywords: adoption of agile, agile gaps, agile mixture model, agile practices, healthcare regulations
Procedia PDF Downloads 236
7926 Self-Organizing Maps for Credit Card Fraud Detection and Visualization
Authors: Peng Chun-Yi, Chen Wei-Hsuan, Ueng Shyh-Kuang
Abstract:
This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. SOM, as an artificial neural network, is particularly suited to pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies
Procedia PDF Downloads 59
7925 Malaria and Environmental Sanitation
Authors: Soorya Vennila
Abstract:
A comprehensive study of malaria in 165 villages (hamlets) in Harur block, Dharmapuri district, has revealed distinct episodes of malaria due to the vector An. culicifacies, which causes persistent transmission in the revenue village called Vedakatamaduvu. A total of 300 household adults were randomly selected to study, both quantitatively and qualitatively, vulnerability to malaria. On the basis of the responses, the problem common to the groups was identified as outdoor routine, particularly open defecation, by which the sample needed to be stratified into two major groups: users of toilets (21) and those who practice open defecation (279). Open defecation, as the habit-based vulnerability, is measured with the Pearson correlation coefficient to estimate the relationship between malaria and open defecation. It is also verified from the literature that plant fluids provide mosquitoes not only with energy but also with nutrition, to the extent that they can develop fertile eggs. In the endemic areas, the bushy Prosopis juliflora, which naturally serves as a feeding and resting spot for mosquitoes, also serves as a cover for the practice of open defecation. Consequently, those who resort to Prosopis for open defecation have a higher chance of being exposed to mosquito bites and infected with malaria. The study concludes that the combination of bushy Prosopis juliflora and open defecation leaves the place perpetually vulnerable to malaria.
Keywords: malaria, open defecation, endemic, Prosopis juliflora
Procedia PDF Downloads 100
7924 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings
Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Different literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given an adequate span of attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series; however, in reality, sensor values are not uniformly sampled. So the issue to solve is: beyond which delay does each sensor become faulty? The use of time series is required for the detection of abnormal delays. The efficiency of the method is evaluated on measurements obtained from a real power plant: an office at the Grenoble Institute of Technology equipped with 30 sensors.
Keywords: building system, time series, diagnosis, outliers, delay, data gap
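A minimal sketch of per-sensor thresholding for gap detection: the typical sampling interval of each sensor is learned from its own timestamps, and a gap is flagged when a delay exceeds that sensor's threshold. The quantile rule, multiplier, and synthetic timestamps are illustrative assumptions, not the paper's exact criterion.

```python
# Per-sensor automatic thresholding for data-gap detection: the threshold is
# learned from each sensor's own inter-sample delays. Quantile and multiplier
# are illustrative assumptions.
import numpy as np
import pandas as pd

def gap_threshold(timestamps, quantile=0.95, factor=3.0):
    deltas = np.diff(timestamps.astype("int64")) / 1e9       # seconds between samples
    return factor * np.quantile(deltas, quantile)

def find_gaps(timestamps):
    thr = gap_threshold(timestamps)
    deltas = np.diff(timestamps.astype("int64")) / 1e9
    idx = np.where(deltas > thr)[0]
    return [(timestamps[i], timestamps[i + 1]) for i in idx], thr

# Synthetic sensor: nominally 1-minute sampling with a 2-hour outage
stamps = pd.date_range("2024-01-01", periods=500, freq="1min").to_numpy()
stamps = np.concatenate([stamps[:200], stamps[200:] + np.timedelta64(2, "h")])
gaps, thr = find_gaps(stamps)
print(f"threshold = {thr:.0f} s, gaps found: {len(gaps)}")
```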
Procedia PDF Downloads 245
7923 A Dynamic Neural Network Model for Accurate Detection of Masked Faces
Authors: Oladapo Tolulope Ibitoye
Abstract:
Neural networks have become prominent and are widely engaged in algorithmic machine learning systems, and they solve many day-to-day problems well. Neural networks are computing systems with several interconnected nodes, and one of their numerous areas of application is object detection. This area has become prominent due to the coronavirus disease pandemic and the post-pandemic phases: wearing a face mask in public slows the spread of the virus, according to experts. This calls for the development of a reliable and effective model for detecting face masks on people's faces during compliance checks. Existing neural network models for face mask detection are characterized by their black-box nature and large dataset requirements, and these challenges have compromised their performance. The proposed model utilizes a Faster R-CNN model with an Inception V3 backbone to reduce system complexity and dataset requirements. The model was trained and validated with very few datasets, and evaluation results show an overall accuracy of 96% regardless of skin tone.
Keywords: convolutional neural network, face detection, face mask, masked faces
Procedia PDF Downloads 68
7922 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction
Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey
Abstract:
In this paper, we propose a novel approach combining neural network and particle swarm optimization methods for software reliability prediction. We first explain how to apply a compound function in the neural network so that a Flexible Logistic (S-shaped) Growth Curve (FLGC) model can be derived. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid trapping in local minima, we apply the particle swarm optimization method to train the proposed model using failure test data sets. We derive our proposed model using computational intelligence modeling; the proposed model thus becomes a Neuro-Particle Swarm Optimization (NPSO) model. We test the model with different inertia weights for updating particle positions and velocities, and we obtain results based on the best inertia weight, compared alongside a personal-best-oriented PSO (pPSO) that helps to choose the local best in the network neighborhood. The applicability of the proposed model is demonstrated on a real failure data set collected during testing. The results obtained from the experiments show that the proposed model has a fairly accurate prediction capability for software reliability.
Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization
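A compact sketch of fitting an S-shaped cumulative-failure curve with a plain particle swarm; the generic logistic form m(t) = a / (1 + b e^{-ct}), the synthetic failure data, and all PSO constants (inertia weight, acceleration factors) are assumptions standing in for the paper's FLGC/NPSO formulation.

```python
# PSO fit of an S-shaped cumulative failure curve m(t) = a / (1 + b*exp(-c t)).
# The logistic form, data and PSO constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1, 31, dtype=float)                       # test weeks
failures = 100 / (1 + 25 * np.exp(-0.3 * t)) + rng.normal(0, 2, t.size)

def sse(params):
    a, b, c = params
    return np.sum((failures - a / (1 + b * np.exp(-c * t))) ** 2)

n_particles, dims, iters = 30, 3, 300
w, c1, c2 = 0.7, 1.5, 1.5                               # inertia, cognitive, social
pos = rng.uniform([50, 1, 0.01], [200, 100, 1.0], (n_particles, dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([sse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dims)), rng.random((n_particles, dims))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([sse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("fitted (a, b, c):", np.round(gbest, 3))
```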
Procedia PDF Downloads 344
7921 Multi-Vehicle Detection Using Histogram of Oriented Gradients Features and Adaptive Sliding Window Technique
Authors: Saumya Srivastava, Rina Maiti
Abstract:
In order to achieve a better performance of vehicle detection in a complex environment, we present an efficient approach for a multi-vehicle detection system using an adaptive sliding window technique. For a given frame, image segmentation is carried out to establish the region of interest. Gradient computation followed by thresholding, denoising, and morphological operations is performed to extract the binary search image. Near-region field and far-region field are defined to generate hypotheses using the adaptive sliding window technique on the resultant binary search image. For each vehicle candidate, features are extracted using a histogram of oriented gradients, and a pre-trained support vector machine is applied for hypothesis verification. Later, the Kalman filter is used for tracking the vanishing point. The experimental results show that the method is robust and effective on various roads and driving scenarios. The algorithm was tested on highways and urban roads in India.
Keywords: gradient, vehicle detection, histograms of oriented gradients, support vector machine
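A sketch of the hypothesis-verification step alone, using scikit-image's hog descriptor and a linear SVM; the window size, HOG parameters, and random training patches are placeholders, and the adaptive window generation itself is omitted.

```python
# Hypothesis verification with HOG features and a linear SVM
# (scikit-image + scikit-learn). Window size, HOG parameters and the training
# patches are placeholders; adaptive window generation is omitted.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

WIN = (64, 64)                                   # assumed candidate-window size

def hog_features(patch):
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Placeholder training set: random patches standing in for vehicle / non-vehicle
rng = np.random.default_rng(0)
patches = rng.random((200, *WIN))
labels = rng.integers(0, 2, 200)                 # 1 = vehicle, 0 = background
clf = LinearSVC().fit(np.array([hog_features(p) for p in patches]), labels)

def verify_hypothesis(window_patch):
    """Return True if the candidate window is classified as a vehicle."""
    return bool(clf.predict([hog_features(window_patch)])[0] == 1)

print(verify_hypothesis(rng.random(WIN)))
```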
Procedia PDF Downloads 124
7920 Advances in Fiber Optic Technology for High-Speed Data Transmission
Authors: Salim Yusif
Abstract:
Fiber optic technology has revolutionized telecommunications and data transmission, providing unmatched speed, bandwidth, and reliability. This paper presents the latest advancements in fiber optic technology, focusing on innovations in fiber materials, transmission techniques, and network architectures that enhance the performance of high-speed data transmission systems. Key advancements include the development of ultra-low-loss optical fibers, multi-core fibers, advanced modulation formats, and the integration of fiber optics into next-generation network architectures such as Software-Defined Networking (SDN) and Network Function Virtualization (NFV). Additionally, recent developments in fiber optic sensors are discussed, extending the utility of optical fibers beyond data transmission. Through comprehensive analysis and experimental validation, this research offers valuable insights into the future directions of fiber optic technology, highlighting its potential to drive innovation across various industries.
Keywords: fiber optics, high-speed data transmission, ultra-low-loss optical fibers, multi-core fibers, modulation formats, coherent detection, software-defined networking, network function virtualization, fiber optic sensors
Procedia PDF Downloads 61
7919 Concentric Circle Detection based on Edge Pre-Classification and Extended RANSAC
Authors: Zhongjie Yu, Hancheng Yu
Abstract:
In this paper, we propose an effective method to detect concentric circles with imperfect edges. First, the gradient of each edge pixel is coded and a 2-D lookup table is built to speed up normal generation. Then an accumulator is used to estimate the rough center, and plausible edges of the concentric circles are collected based on gradient and distance. Next, a contour-based method, which uses the intersection of contours and edges, pre-classifies the edges. Finally, the extended RANSAC method is used to find all candidate circles, and the center of the concentric circles is determined by the two circles with the highest concentricity. Experimental results demonstrate that the proposed method has both good performance and high accuracy for the detection of concentric circles.
Keywords: concentric circle detection, gradient, contour, edge pre-classification, RANSAC
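The RANSAC stage can be illustrated on its own: repeatedly fit a circle through three random edge points and keep the model with the most inliers. The sketch below uses the standard three-point circumcircle formula; the gradient lookup table, edge pre-classification, and concentricity ranking described above are omitted, and the tolerance and iteration count are illustrative assumptions.

```python
# RANSAC circle fitting on noisy edge points: sample three points, fit the
# circumcircle, count inliers, keep the best model. Tolerance and iteration
# count are illustrative assumptions.
import numpy as np

def circle_from_3pts(p1, p2, p3):
    """Circumcircle (cx, cy, r) of three non-collinear points."""
    ax, ay = p1; bx, by = p2; cx_, cy_ = p3
    d = 2 * (ax * (by - cy_) + bx * (cy_ - ay) + cx_ * (ay - by))
    if abs(d) < 1e-9:
        return None
    ux = ((ax**2 + ay**2) * (by - cy_) + (bx**2 + by**2) * (cy_ - ay)
          + (cx_**2 + cy_**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx_ - bx) + (bx**2 + by**2) * (ax - cx_)
          + (cx_**2 + cy_**2) * (bx - ax)) / d
    return ux, uy, np.hypot(ax - ux, ay - uy)

def ransac_circle(points, iters=500, tol=1.5, rng=np.random.default_rng(0)):
    best, best_inliers = None, 0
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        model = circle_from_3pts(*sample)
        if model is None:
            continue
        cx, cy, r = model
        inliers = np.sum(np.abs(np.hypot(points[:, 0] - cx, points[:, 1] - cy) - r) < tol)
        if inliers > best_inliers:
            best, best_inliers = model, inliers
    return best, best_inliers

# Noisy edge points of a circle centred at (50, 40) with radius 20
theta = np.linspace(0, 2 * np.pi, 200)
pts = np.c_[50 + 20 * np.cos(theta), 40 + 20 * np.sin(theta)] + np.random.normal(0, 0.5, (200, 2))
print(ransac_circle(pts))
```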
Procedia PDF Downloads 131
7918 Electrochemical Bioassay for Haptoglobin Quantification: Application in Bovine Mastitis Diagnosis
Authors: Soledad Carinelli, Iñigo Fernández, José Luis González-Mora, Pedro A. Salazar-Carballo
Abstract:
Mastitis is the most relevant inflammatory disease in cattle, affecting animal health and causing important economic losses on dairy farms. The disease takes place in the mammary gland or udder when opportunistic microorganisms, such as Staphylococcus aureus, Streptococcus agalactiae, Corynebacterium bovis, etc., invade the teat canal. According to the severity of the inflammation, mastitis can be classified as sub-clinical, clinical or chronic. Standard methods for mastitis detection include somatic cell counts, cell culture, electrical conductivity of the milk, and the California test (evaluation of “gel-like” matrix consistency after cells are lysed with detergents). However, these assays present some limitations for the accurate detection of subclinical mastitis. Recently, haptoglobin, an acute phase protein, has been proposed as a novel and effective biomarker for mastitis detection. In this work, an electrochemical biosensor based on polydopamine-modified magnetic nanoparticles (MNPs@pDA) for haptoglobin detection is reported. MNPs@pDA were synthesized by our group and functionalized with hemoglobin due to its high affinity for haptoglobin. The protein was labeled with specific antibodies modified with the alkaline phosphatase enzyme for its electrochemical detection using an electroactive substrate (1-naphthyl phosphate) by differential pulse voltammetry. After optimization of the assay parameters, haptoglobin determination was evaluated in milk. The strategy presented in this work shows a wide detection range, achieving a limit of detection of 43 ng/mL. The accuracy of the strategy was determined by recovery assays and was 84% and 94.5% for two Hp levels around the cut-off value. Real milk samples were tested, and the prediction capacity of the electrochemical biosensor was compared with a commercial haptoglobin ELISA kit. The performance of the assay demonstrates that this strategy is an excellent and realistic alternative as a screening method for sub-clinical bovine mastitis detection.
Keywords: bovine mastitis, haptoglobin, electrochemistry, magnetic nanoparticles, polydopamine
Procedia PDF Downloads 173
7917 A Model of Human Security: A Comparison of Vulnerabilities and Timespace
Authors: Anders Troedsson
Abstract:
For us humans, risks are intimately linked to human vulnerabilities: where there is vulnerability, there is potentially insecurity, and risk. Reducing vulnerability through compensatory measures means increasing security and decreasing risk. The paper suggests that a meaningful way to approach the study of risks (including threats, assaults, crises, etc.) is to understand the vulnerabilities these external phenomena evoke in humans. As is argued, the basis of risk evaluation, as well as of responses, is the more or less subjective perception by the individual person, or group of persons, exposed to the external event or phenomenon in question. This perception is determined primarily by the vulnerability or vulnerabilities that the external factor is perceived to evoke. In this way, risk perception is primarily an inward dynamic rather than an outward one. Therefore, a route towards an understanding of the perception of risks is a closer scrutiny of the vulnerabilities they can evoke, thereby approaching an understanding of what the paper calls the essence of risk (including threat, assault, etc.), or that which a certain perceived risk means to an individual or group of individuals. As a necessary basis for gauging the wide spectrum of potential risks and their meaning, the paper proposes a model of human vulnerabilities, drawing from, among other sources, a long tradition of needs theory. In order to account for the subjectivity factor, which mediates between the innate vulnerabilities on the one hand and the event or phenomenon out there on the other, an ensuing ontological discussion about the timespace characteristics of risk, threat and assault as perceived by humans leads to the positing of two dimensions. These two dimensions are applied to the vulnerabilities, resulting in a modelling effort featuring four realms of vulnerabilities which are related to each other and together represent a dynamic whole. In approaching the problem of risk perception, the paper thus defines the relevant realms of vulnerabilities, depicting them as a dynamic whole. With reference to a substantial body of literature and a growing international policy trend since the 1990s, this model is put in the language of human security, a concept relevant not only for international security studies and policy but also for other academic disciplines and spheres of human endeavor.
Keywords: human security, timespace, vulnerabilities, risk perception
Procedia PDF Downloads 336
7916 Application of Hybrid Honey Bees Mating Optimization Algorithm in Multiuser Detection of Wireless Communication Systems
Abstract:
Wireless communication systems have changed dramatically and shown spectacular evolution over the past two decades. These radio technologies are engaged in an endless quest for high-speed transmission, coupled with a constant need to improve transmission quality. Various radio communication systems under development use the code division multiple access (CDMA) technique. This work analyses a hybrid honey bees mating optimization algorithm (HBMO) applied to multiuser detection (MuD) in CDMA communication systems. HBMO is a swarm-based optimization algorithm which simulates the mating process of real honey bees. We apply a hybridization of HBMO with simulated annealing (SA) in order to improve the solution generated by the HBMO. Simulation results show that detection based on the hybrid HBMO, in terms of bit error rate (BER), is a viable option compared with the classic detectors from the literature under a Rayleigh flat fading channel.
Keywords: BER, DS-CDMA multiuser detection, genetic algorithm, hybrid HBMO, simulated annealing
Procedia PDF Downloads 435
7915 Heat Vulnerability Index (HVI) Mapping in Extreme Heat Days Coupled with Air Pollution Using Principal Component Analysis (PCA) Technique: A Case Study of Amiens, France
Authors: Aiman Mazhar Qureshi, Ahmed Rachid
Abstract:
Extreme heat events are an emerging human environmental health concern in dense urban areas due to anthropogenic activities. High spatial and temporal resolution heat maps are important for urban heat adaptation and mitigation, helping to indicate hotspots that require the attention of city planners. The Heat Vulnerability Index (HVI) is an important approach used by decision-makers and urban planners to identify heat-vulnerable communities and areas that require heat stress mitigation strategies. Amiens is a medium-sized French city where the average temperature has been increasing by +1°C since the year 2000. Extreme heat events were recorded in the month of July for the last three consecutive years, 2018, 2019 and 2020, and poor air quality, especially ground-level ozone, has been observed mainly during the same hot periods. In this study, we evaluated the HVI in Amiens during the extreme heat days recorded in those three years (2018, 2019, 2020). The Principal Component Analysis (PCA) technique is used for fine-scale vulnerability mapping. The main data considered for developing the HVI model are (a) socio-economic and demographic data; (b) air pollution; (c) land use and cover; (d) elderly heat illness; (e) social vulnerability; and (f) remote sensing data (land surface temperature (LST), mean elevation, NDVI and NDWI). The output maps identify the hot zones through comprehensive GIS analysis. The resultant map shows that high HVI exists in three typical areas: (1) where population density is quite high and vegetation cover is small, (2) artificial surfaces (built-up areas), and (3) industrial zones that release thermal energy and ground-level ozone, while areas with low HVI are located in natural landscapes such as rivers and grasslands. The study also illustrates, with a causal diagram built after the data analysis, how anthropogenic activities and air pollution correspond with extreme heat events in the city. The suggested index can be a useful tool to guide urban planners, municipalities, decision-makers and public health professionals in targeting areas at high risk of extreme heat and air pollution for future adaptation and mitigation interventions.
Keywords: heat vulnerability index, heat mapping, heat health-illness, remote sensing, urban heat mitigation
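A sketch of the PCA step behind such an index: standardise the indicator table, extract principal components, and combine component scores weighted by explained variance into one score per spatial unit. The indicator names, random data, and the variance-weighted aggregation rule are common practice and assumptions here, not the study's exact variable set or weighting.

```python
# PCA-based heat vulnerability index: standardise indicators, keep the leading
# components, and aggregate their scores weighted by explained variance.
# Indicator names and the aggregation rule are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

indicators = ["pop_density", "elderly_share", "no2", "ozone",
              "land_surface_temp", "ndvi", "impervious_share"]
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.random((250, len(indicators))), columns=indicators)  # one row per grid cell

Z = StandardScaler().fit_transform(df)
pca = PCA(n_components=0.8, svd_solver="full")   # keep components explaining 80% of variance
scores = pca.fit_transform(Z)
weights = pca.explained_variance_ratio_ / pca.explained_variance_ratio_.sum()
hvi = scores @ weights                            # one vulnerability score per cell
df["HVI"] = (hvi - hvi.min()) / (hvi.max() - hvi.min())   # rescale to 0-1 for mapping

print(df["HVI"].describe().round(2))
```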
Procedia PDF Downloads 148