Search results for: fault detection and classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5548

3748 Review on Quaternion Gradient Operator with Marginal and Vector Approaches for Colour Edge Detection

Authors: Nadia Ben Youssef, Aicha Bouzid

Abstract:

Gradient estimation is one of the most fundamental tasks in image processing in general, and for color images in particular, since research on color image gradients remains limited. The most widely used method is Di Zenzo’s gradient operator, which is based on the measure of squared local contrast of color images. The gradient mechanism proposed in this paper is based on the principle of Di Zenzo’s approach using a quaternion representation. This edge detector is compared to a marginal approach based on the multiscale product of the wavelet transform and to another vector approach based on quaternion convolution and the vector gradient. The experimental results indicate that the proposed color gradient operator outperforms the marginal approach; however, it is less efficient than the second vector approach.
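
As a point of reference for the comparison above, the sketch below is a minimal NumPy implementation of the classical Di Zenzo structure-tensor gradient that the proposed quaternion operator builds on; the function name, the use of np.gradient for the partial derivatives, and the random test image are illustrative assumptions, not code from the paper.

```python
import numpy as np

def di_zenzo_gradient(img):
    """Di Zenzo colour gradient magnitude for an RGB image (H x W x 3, float).

    Returns the square root of the largest eigenvalue of the local
    structure tensor, i.e. the maximal rate of colour change at each pixel.
    """
    gxx = np.zeros(img.shape[:2])
    gyy = np.zeros(img.shape[:2])
    gxy = np.zeros(img.shape[:2])
    for c in range(img.shape[2]):
        dy, dx = np.gradient(img[:, :, c])   # per-channel partial derivatives
        gxx += dx * dx
        gyy += dy * dy
        gxy += dx * dy
    # Largest eigenvalue of the 2x2 structure tensor [[gxx, gxy], [gxy, gyy]]
    lam = 0.5 * (gxx + gyy + np.sqrt((gxx - gyy) ** 2 + 4.0 * gxy ** 2))
    return np.sqrt(lam)

# Example: edge-strength map of a random colour image
edges = di_zenzo_gradient(np.random.rand(64, 64, 3))
```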

Keywords: gradient, edge detection, color image, quaternion

Procedia PDF Downloads 222
3747 An Aptasensor Based on Magnetic Relaxation Switch and Controlled Magnetic Separation for the Sensitive Detection of Pseudomonas aeruginosa

Authors: Fei Jia, Xingjian Bai, Xiaowei Zhang, Wenjie Yan, Ruitong Dai, Xingmin Li, Jozef Kokini

Abstract:

Pseudomonas aeruginosa is a Gram-negative, aerobic, opportunistic human pathogen that is present in soil, water, and food. This microbe has been recognized as a representative food-borne spoilage bacterium that can lead to many types of infections. Considering the casualties and property losses caused by P. aeruginosa, the development of a rapid and reliable technique for its detection is crucial. The whole-cell aptasensor, an emerging biosensor that uses an aptamer as a capture probe to bind the whole cell, has attracted much attention for food-borne pathogen detection because of its convenience and high sensitivity. Here, a low-field magnetic resonance imaging (LF-MRI) aptasensor for the rapid detection of P. aeruginosa was developed. The basic detection principle of the magnetic relaxation switch (MRSw) nanosensor lies in the ‘T₂-shortening’ effect of magnetic nanoparticles in NMR measurements. Briefly, the transverse relaxation time (T₂) of neighboring water protons is shortened when magnetic nanoparticles are clustered, due to cross-linking upon the recognition and binding of biological targets, or simply when the concentration of the magnetic nanoparticles increases. Such shortening is related to both the state change (aggregation or dissociation) and the concentration change of the magnetic nanoparticles and can be detected using NMR relaxometry or MRI scanners. In this work, two sizes of magnetic nanoparticles, 10 nm (MN₁₀) and 400 nm (MN₄₀₀) in diameter, were first immobilized with anti-P. aeruginosa aptamer through 1-ethyl-3-(3-dimethylaminopropyl) carbodiimide (EDC)/N-hydroxysuccinimide (NHS) chemistry, separately, to capture and enrich P. aeruginosa cells. When incubated with the target, a ‘sandwich’ (MN₁₀-bacteria-MN₄₀₀) complex is formed, driven by the binding of MN₄₀₀ to P. aeruginosa through aptamer recognition as well as the conjugate aggregation of MN₁₀ on the surface of P. aeruginosa. Owing to the different magnetic performance of MN₁₀ and MN₄₀₀ in a magnetic field, caused by their different saturation magnetization, the MN₁₀-bacteria-MN₄₀₀ complex, as well as the unreacted MN₄₀₀ in the solution, can be quickly removed by magnetic separation, so that only unreacted MN₁₀ remain in the solution. The remaining MN₁₀, which are superparamagnetic and stable in a low magnetic field, serve as the signal readout for the T₂ measurement. Under optimum conditions, the LF-MRI platform provides both image analysis and quantitative detection of P. aeruginosa, with a detection limit as low as 100 cfu/mL. The feasibility and specificity of the aptasensor are demonstrated by detecting real food samples and validated using plate counting methods. Requiring only two steps and less than 2 hours for the detection procedure, this robust aptasensor can detect P. aeruginosa over a wide linear range from 3.1 × 10² cfu/mL to 3.1 × 10⁷ cfu/mL, which is superior to the conventional plate counting method and other molecular biology testing assays. Moreover, the aptasensor has the potential to detect other bacteria or toxins by switching to suitable aptamers. Considering its excellent accuracy, feasibility, and practicality, the whole-cell aptasensor provides a promising platform for the quick, direct and accurate determination of food-borne pathogens at the cell level.

Keywords: magnetic resonance imaging, meat spoilage, P. aeruginosa, transverse relaxation time

Procedia PDF Downloads 138
3746 Monitoring of Quantitative and Qualitative Changes in Combustible Material in the Białowieża Forest

Authors: Damian Czubak

Abstract:

The Białowieża Forest is a very valuable natural area, included in the UNESCO World Natural Heritage, where Norway spruce (Picea abies) stands have deteriorated due to infestation by the bark beetle (Ips typographus). This catastrophic scenario led to an increase in fire danger, owing to the occurrence of large amounts of dead wood and of grass cover that developed as light penetrated to the bottom of the stands. In a dry state, these materials favour the ignition and rapid spread of fire. One of the objectives of the study was to monitor the quantitative and qualitative changes of combustible material on the permanent decay plots of spruce stands from 2012-2022. In addition, the size of the area covered by highly flammable vegetation was monitored, and the stands of the Białowieża Forest were classified by flammability class. The key factor that determines the potential fire hazard of a forest is combustible material: primarily its type, quantity, moisture content, size and spatial structure. Based on the inventory data for the forest districts in the Białowieża Forest, the average fire load and its changes over the years were calculated. The analysis was carried out taking into account the changes in the health status of the stands and sanitary operations. The quantitative and qualitative assessment of fallen timber and the fire load of ground cover used the results of the 2019 and 2021 inventories. Approximately 9,000 circular plots were used for the study. An assessment was made of the amount of potential fuel, understood as ground cover vegetation and dead wood debris. In addition, monitoring of areas with vegetation that poses a high fire risk was conducted using data from 2019 and 2021. All sub-areas were inventoried where vegetation posing a specific fire hazard represented at least 10% of the area with species characteristic of that cover. In addition to the size of the area with fire-prone vegetation, a very important element is the size of the fire load on the indicated plots. On representative plots, the biomass of the ground cover was measured over an area of 10 m², and the amount of biomass of each component was then determined. The resulting element of the variability of ground covers in stands was their flammability classification. The classification developed made it possible to track changes in the flammability classes of stands over the period covered by the measurements.

Keywords: classification, combustible material, flammable vegetation, Norway spruce

Procedia PDF Downloads 81
3745 Collision Avoidance Based on Model Predictive Control for Nonlinear Octocopter Model

Authors: Doğan Yıldız, Aydan Müşerref Erkmen

Abstract:

Octocopter control is mostly based on PID controllers. For complex maneuvers such as collision avoidance, PID controllers have limited performance. When an octocopter needs to avoid an obstacle, it must instantly execute an agile maneuver, and this kind of maneuver is severely affected by the nonlinear characteristics of the octocopter. When these limitations are considered, the situation is highly challenging for a PID controller. In the proposed study, these challenges are minimized by using model predictive control (MPC) for collision avoidance with a nonlinear octocopter model. The aim is to show that MPC-based collision avoidance can deal with fast-varying conditions in case of obstacle detection and diminish the nonlinear effects of the octocopter under varying disturbances.
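
The sketch below is a heavily simplified illustration of the receding-horizon idea behind MPC-based avoidance, using a 2-D point-mass model rather than the paper's nonlinear octocopter dynamics; the horizon length, cost weights, obstacle position and solver choice are assumptions made for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

DT, HORIZON = 0.1, 10                              # step size and prediction horizon (assumed)
OBSTACLE, SAFE_R = np.array([2.0, 2.0]), 0.5       # illustrative obstacle and safety radius

def rollout(x0, u_flat):
    """Propagate a 2-D point mass (state [px, py, vx, vy]) under acceleration inputs."""
    x, traj = x0.copy(), []
    for u in u_flat.reshape(HORIZON, 2):
        x[2:] += u * DT          # velocity update
        x[:2] += x[2:] * DT      # position update
        traj.append(x.copy())
    return np.array(traj)

def cost(u_flat, x0, goal):
    traj = rollout(x0, u_flat)
    goal_cost = np.sum((traj[:, :2] - goal) ** 2)
    effort = 0.01 * np.sum(u_flat ** 2)
    # Soft barrier that grows as the predicted path nears the obstacle
    dist = np.linalg.norm(traj[:, :2] - OBSTACLE, axis=1)
    obstacle_cost = 10.0 * np.sum(np.exp(-(dist - SAFE_R) / 0.2))
    return goal_cost + effort + obstacle_cost

def mpc_step(x0, goal):
    """Solve the finite-horizon problem and apply only the first control input."""
    res = minimize(cost, np.zeros(HORIZON * 2), args=(x0, goal), method="L-BFGS-B")
    return res.x[:2]

state, goal = np.array([0.0, 0.0, 0.0, 0.0]), np.array([4.0, 4.0])
for _ in range(50):                                # receding-horizon loop
    u = mpc_step(state, goal)
    state[2:] += u * DT
    state[:2] += state[2:] * DT
```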

Keywords: model predictive control, nonlinear octocopter model, collision avoidance, obstacle detection

Procedia PDF Downloads 181
3744 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when the amount of data is large. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
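
For orientation, the following is a minimal sketch of the Gini-based split search at the core of CART, with the per-feature search parallelized via joblib to suggest how the computation can be distributed; it is not the authors' extended algorithm, and all names and parameters are illustrative.

```python
import numpy as np
from joblib import Parallel, delayed

def gini(y):
    """Gini impurity of a label vector."""
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split_for_feature(x, y):
    """Best threshold on one feature by weighted Gini impurity of the two children."""
    best = (np.inf, None)
    for t in np.unique(x)[:-1]:
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best[0]:
            best = (score, t)
    return best

def best_split(X, y, n_jobs=-1):
    """Search all features in parallel -- the step a distributed CART would scale out."""
    results = Parallel(n_jobs=n_jobs)(
        delayed(best_split_for_feature)(X[:, j], y) for j in range(X.shape[1])
    )
    j = int(np.argmin([r[0] for r in results]))
    return j, results[j][1], results[j][0]   # feature index, threshold, impurity

X = np.random.rand(200, 5)
y = (X[:, 2] > 0.5).astype(int)
print(best_split(X, y))
```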

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 132
3743 Classification of Factors Influencing Buyer-Supplier Relationship: A Case Study from the Cement Industry

Authors: Alberto Piatto, Zaza Nadja Lee Hansen, Peter Jacobsen

Abstract:

This paper examines the quantitative and qualitative factors influencing the buyer-supplier relationship. Understanding and acting on the right factors influencing supplier relationship management is crucial when a company outsources an important part of its business, as is the case for an engineer-to-order (ETO) company executing only the design part in-house. Acting on these factors increases the quality of the relationship, so that both parties obtain what they want and expect from an improved relationship. Best practices in supplier relationship management are considered, and a case study of a large global company operating in the cement business, here called Cement A/S, is carried out. One study is conducted involving this large international company and hundreds of its suppliers. Data from the company are collected using semi-structured interviews, and data from the suppliers are collected using a survey. Based on these inputs and an extensive literature review, a classification of the factors influencing the buyer-supplier relationship is presented and discussed. The results show that different managers within the company assess suppliers from various perspectives; a standard approach to measuring supplier performance does not exist. The factors currently used in the company to measure supplier performance are mostly related to time and cost. Quality is a key factor, but it has not been addressed properly since no data are available in the system. From a practical perspective, managers can learn from this paper which factors to consider when applying best practices of supplier relationship management. Furthermore, from a theoretical perspective, this paper contributes new knowledge to the area, as limited research has been conducted in collaboration with companies; few studies involving a company and its suppliers exist for this type of industry. For further research, it is suggested to determine the correlation of the factors with the profitability of the company and to calculate their impact. When conducting this analysis, it is important to focus on the efficient and effective use of factors that are measurable and accepted by the supplier.

Keywords: buyer-supplier relationship, cement industry, classification of factors, ETO

Procedia PDF Downloads 264
3742 The Complementary Effect of Internal Control System and Whistleblowing Policy on Prevention and Detection of Fraud in Nigerian Deposit Money Banks

Authors: Dada Durojaye Joshua

Abstract:

The study examined the combined effect of the internal control system and the whistleblowing policy while pursuing the following specific objectives: to examine the relationship between monitoring activities and fraud detection and prevention, and to investigate the effect of control activities on fraud detection and prevention in Nigerian Deposit Money Banks (DMBs). The population of the study comprises the 89,275 members of staff in the 20 DMBs in Nigeria as at June 2019. Purposive and convenience sampling techniques were used in the selection of 80 members of staff at the supervisory level of the Internal Audit Departments of the head offices of the sampled banks, that is, 4 respondents (Audit Executive/Head, Internal Control; Manager, Operation Risk Management; Head, Financial Crime Control; Chief Compliance Officer) from each of the 20 DMBs in Nigeria. A standard questionnaire was adapted from the 2017/2018 Internal Control Questionnaire and Assessment of the Bureau of Financial Monitoring and Accountability, Florida Department of Economic Opportunity, and was modified to serve the purpose of this study. It was self-administered to gather data from the 80 respondents at the respective headquarters of the sampled banks across Nigeria. Two Likert scales were used in achieving the stated objectives, and a logit regression was used in analysing the stated hypotheses. It was found that monitoring activities, measured using the constructs of conduct of ongoing or separate evaluation (COSE) and evaluation and communication of deficiencies (ECD), are significant and positively related to fraud detection and prevention in Nigerian DMBs. Likewise, it was found that control activities, measured using selection and development of control activities (SDCA), selection and development of general controls over technology to prevent financial fraud (SDGCTF), and development of control activities that give room for transparency through procedures that put policies into action (DCATPPA), contributed to fraud detection and prevention in Nigerian DMBs. In addition, it was found that transparency, accountability, reliability, independence and value relevance have a significant effect on fraud detection and prevention in Nigerian DMBs. The study concluded that the board of directors demonstrated independence from management and exercised oversight of the development and performance of internal control. Part of the conclusion was that there was accountability on the part of the owners and preparers of the financial reports and that the system gives room for the members of staff to account for their responsibilities. Among the recommendations was that the management of Nigerian DMBs should create and establish a standard internal control system strong enough to deter fraud, in order to encourage continuity of operations by ensuring the liquidity, solvency and going concern of the banks. It was also recommended that the banks create a structure that encourages whistleblowing to complement the internal control system.

Keywords: internal control, whistleblowing, deposit money banks, fraud prevention, fraud detection

Procedia PDF Downloads 67
3741 Comparison Of Data Mining Models To Predict Future Bridge Conditions

Authors: Pablo Martinez, Emad Mohamed, Osama Mohsen, Yasser Mohamed

Abstract:

Highway and bridge agencies, such as the Ministry of Transportation in Ontario, use the Bridge Condition Index (BCI), defined as the weighted condition of all bridge elements, to determine the rehabilitation priorities for their bridges. Accurate forecasting of the BCI is therefore essential for bridge rehabilitation budget planning. The large amount of bridge condition data available over several years makes traditional mathematical models infeasible as analysis methods. This research study focuses on investigating different classification models developed to predict the bridge condition index in the province of Ontario, Canada, based on publicly available data for 2800 bridges over a period of more than 10 years. Data preparation is a key factor in developing acceptable classification models, even with the simplest one, the k-NN model. All the models were tested, compared and statistically validated via cross-validation and t-tests. A simple k-NN model showed reasonable results (within 0.5% relative error) when predicting the bridge condition in an incoming year.
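
A minimal sketch of the kind of k-NN pipeline with cross-validation described above is shown below, using scikit-learn and synthetic placeholder data; the feature set, number of neighbours and condition classes are assumptions, not the study's actual variables.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Placeholder features (e.g. age, traffic, span, previous BCI) and a discretised BCI class;
# the real study uses ~2800 Ontario bridges over 10+ years of inspection records.
rng = np.random.default_rng(0)
X = rng.normal(size=(2800, 4))
y = rng.integers(0, 3, size=2800)          # e.g. "good" / "fair" / "poor" condition classes

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(model, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```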

Keywords: asset management, bridge condition index, data mining, forecasting, infrastructure, knowledge discovery in databases, maintenance, predictive models

Procedia PDF Downloads 179
3740 Classifications of Images for the Recognition of People’s Behaviors by SIFT and SVM

Authors: Henni Sid Ahmed, Belbachir Mohamed Faouzi, Jean Caelen

Abstract:

Behavior recognition has been studied for realizing driver-assistance systems and automated navigation, and is an important field of study for intelligent buildings. In this paper, a method for recognizing behaviors from real images was studied. Images were divided into several categories according to the actual weather, distance, angle of view, etc. SIFT (Scale-Invariant Feature Transform) was first used to detect and describe key points, because SIFT features are invariant to image scale and rotation and are robust to changes in viewpoint and illumination. Our goal is to develop a robust and reliable system composed of two fixed cameras in every room of an intelligent building, connected to a computer for the acquisition of video sequences. Using these video sequences as inputs, a program represents the different images of the video sequences with SIFT and uses SVM (support vector machine) Light as the classification tool in order to classify people’s behaviors in the intelligent building, so as to provide maximum comfort with optimized energy consumption.
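
The sketch below illustrates one common way to combine SIFT descriptors with an SVM via a bag-of-visual-words representation, using OpenCV (SIFT is available in opencv-python 4.4+) and scikit-learn's SVC in place of the SVM Light tool mentioned in the abstract; the vocabulary size and helper names are illustrative assumptions.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def sift_descriptors(image_bgr):
    """Detect SIFT key points in one frame and return their 128-D descriptors."""
    sift = cv2.SIFT_create()
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, desc = sift.detectAndCompute(gray, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def bovw_histograms(images, n_words=50):
    """Bag-of-visual-words: cluster all descriptors, then histogram each image."""
    all_desc = [sift_descriptors(img) for img in images]
    codebook = KMeans(n_clusters=n_words, n_init=10).fit(np.vstack(all_desc))
    hists = []
    for desc in all_desc:
        words = codebook.predict(desc)
        hist, _ = np.histogram(words, bins=np.arange(n_words + 1))
        hists.append(hist / max(hist.sum(), 1))
    return np.array(hists), codebook

# images: list of BGR frames from the two fixed cameras; labels: behaviour classes
# X, codebook = bovw_histograms(images)
# clf = SVC(kernel="rbf").fit(X, labels)
```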

Keywords: video analysis, people behavior, intelligent building, classification

Procedia PDF Downloads 370
3739 Complementary Effect of Whistleblowing Policy and Internal Control System on Prevention and Detection of Fraud in Nigerian Deposit Money Banks

Authors: Dada Durojaye Joshua

Abstract:

The study examined the combined effect of the internal control system and the whistleblowing policy while pursuing the following specific objectives: to examine the relationship between monitoring activities and fraud detection and prevention, and to investigate the effect of control activities on fraud detection and prevention in Nigerian Deposit Money Banks (DMBs). The population of the study comprises the 89,275 members of staff in the 20 DMBs in Nigeria as at June 2019. Purposive and convenience sampling techniques were used in the selection of 80 members of staff at the supervisory level of the Internal Audit Departments of the head offices of the sampled banks, that is, 4 respondents (Audit Executive/Head, Internal Control; Manager, Operation Risk Management; Head, Financial Crime Control; Chief Compliance Officer) from each of the 20 DMBs in Nigeria. A standard questionnaire was adapted from the 2017/2018 Internal Control Questionnaire and Assessment of the Bureau of Financial Monitoring and Accountability, Florida Department of Economic Opportunity, and was modified to serve the purpose of this study. It was self-administered to gather data from the 80 respondents at the respective headquarters of the sampled banks across Nigeria. Two Likert scales were used in achieving the stated objectives, and a logit regression was used in analysing the stated hypotheses. It was found that monitoring activities, measured using the constructs of conduct of ongoing or separate evaluation (COSE) and evaluation and communication of deficiencies (ECD), are significant and positively related to fraud detection and prevention in Nigerian DMBs. Likewise, it was found that control activities, measured using selection and development of control activities (SDCA), selection and development of general controls over technology to prevent financial fraud (SDGCTF), and development of control activities that give room for transparency through procedures that put policies into action (DCATPPA), contributed to fraud detection and prevention in Nigerian DMBs. In addition, it was found that transparency, accountability, reliability, independence and value relevance have a significant effect on fraud detection and prevention in Nigerian DMBs. The study concluded that the board of directors demonstrated independence from management and exercised oversight of the development and performance of internal control. Part of the conclusion was that there was accountability on the part of the owners and preparers of the financial reports and that the system gives room for the members of staff to account for their responsibilities. Among the recommendations was that the management of Nigerian DMBs should create and establish a standard internal control system strong enough to deter fraud, in order to encourage continuity of operations by ensuring the liquidity, solvency and going concern of the banks. It was also recommended that the banks create a structure that encourages whistleblowing to complement the internal control system.

Keywords: internal control, whistleblowing, deposit money banks, fraud prevention, fraud detection

Procedia PDF Downloads 61
3738 High Thermal Selective Detection of NOₓ Using High Electron Mobility Transistor Based on Gallium Nitride

Authors: Hassane Ouazzani Chahdi, Omar Helli, Bourzgui Nour Eddine, Hassan Maher, Ali Soltani

Abstract:

Real-time knowledge of the NO and NO₂ concentrations at high temperature would allow automobile manufacturers to meet the upcoming stringent EURO7 anti-pollution measures for diesel engines. Knowledge of the concentration of each of these species will also enable engines to run leaner (i.e., more fuel efficient) while still meeting the anti-pollution requirements. Our proposed technology is promising in the field of automotive sensors. It consists of nanostructured semiconductors based on gallium nitride and zirconium dioxide. The development of new technologies for the selective detection of NO and NO₂ gas species would be a critical enabler of superior depollution. The current response was well correlated to the gas concentration in the ranges of 0-2000 ppm NO, 0-2500 ppm NO₂, and 0-300 ppm NH₃ at a temperature of 600.

Keywords: NOₓ sensors, HEMT transistor, anti-pollution, gallium nitride, gas sensor

Procedia PDF Downloads 234
3737 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach

Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed

Abstract:

Sign languages (SL) are the most accomplished forms of gestural communication. Their automatic analysis is therefore a real challenge, one that involves their lexical and syntactic levels of organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for the visual recognition of complex, structured hand gestures such as those found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on the Type-2 Fuzzy HMM (T2FHMM) are presented. The features used as observables in the training as well as in the recognition phases are based on Singular Value Decomposition (SVD). SVD extends eigendecomposition to non-square matrices and is used here to reduce multi-attribute hand gesture data to feature vectors. SVD optimally exposes the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators with adequate Type-2 fuzzy operators, which permits us to relax the additive constraint of probability measures. T2FHMMs are therefore able to handle both the random and the fuzzy uncertainties that exist universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals and achieve better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
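
As an illustration of the SVD-based feature extraction step described above, the sketch below reduces a grayscale hand image to a vector of its largest singular values; the normalisation and the number of retained singular values (k) are assumptions, and the fuzzy HMM itself is not reproduced here.

```python
import numpy as np

def svd_features(hand_image, k=10):
    """Reduce a grayscale hand image to a k-dimensional feature vector of its
    largest singular values (the observables that would feed the fuzzy HMM)."""
    img = hand_image.astype(float)
    img /= (np.linalg.norm(img) or 1.0)        # normalise the matrix energy
    s = np.linalg.svd(img, compute_uv=False)   # singular values, sorted descending
    return s[:k]

gesture_frame = np.random.rand(64, 64)         # placeholder for a segmented hand image
print(svd_features(gesture_frame))
```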

Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov Model

Procedia PDF Downloads 450
3736 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

A research line of computer science involves the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them for the control of electronic devices. Affective computing research, on the other hand, applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware involves the sensing stage and analog-to-digital conversion. The interface software involves algorithms for pre-processing the signal, time- and frequency-domain analysis, and classification of the patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal were tested separately, using a publicly accessible database, together with a comparison among classifiers in order to determine the best-performing one.
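
A minimal sketch of the signal pre-processing, time/frequency feature extraction and classifier-comparison pipeline described above is given below; the sampling rate, frequency bands, classifiers and synthetic data are assumptions for illustration, not the hardware or algorithms actually developed in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate in Hz (assumed)

def band_power(epoch, lo, hi):
    """Average spectral power of one EEG channel in the [lo, hi] Hz band."""
    f, pxx = welch(epoch, fs=FS, nperseg=FS)
    return pxx[(f >= lo) & (f <= hi)].mean()

def features(epoch):
    """Bandpass the raw signal, then take classic EEG band powers as features."""
    b, a = butter(4, [1, 45], btype="band", fs=FS)
    clean = filtfilt(b, a, epoch)
    bands = [(4, 8), (8, 13), (13, 30), (30, 45)]   # theta, alpha, beta, gamma
    return np.array([band_power(clean, lo, hi) for lo, hi in bands])

# epochs: (n_trials, n_samples) single-channel EEG; labels: emotion classes (placeholders)
epochs = np.random.randn(100, FS * 4)
labels = np.random.randint(0, 2, 100)
X = np.array([features(e) for e in epochs])
for clf in (SVC(), KNeighborsClassifier()):
    print(type(clf).__name__, cross_val_score(clf, X, labels, cv=5).mean())
```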

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 373
3735 The Reasons for Vegetarianism in Estonia and Its Effects on Body Composition

Authors: Ülle Parm, Kata Pedamäe, Jaak Jürimäe, Evelin Lätt, Aivar Orav, Anna-Liisa Tamm

Abstract:

Vegetarianism has gained popularity across the world. It is chosen for multiple reasons, but among Estonians these have remained unknown. Previously, attention has been paid to the bone health and probable nutrient deficiencies of vegetarians, and lower body mass index (BMI) and blood cholesterol levels have been found in vegetarians, but the results are inconclusive. The goal was to explain the reasons for choosing a vegetarian diet in Estonia and the impact of vegetarianism on body composition: BMI, fat percentage (fat%), fat mass (FM), and fat free mass (FFM). The study group comprised 68 vegetarians and 103 omnivores. Body composition was determined with DXA (Hologic) in 2013. Body mass (medical electronic scale, A&D Instruments, Abingdon, UK) and height (Martin metal anthropometer, to the nearest 0.1 cm) were measured and BMI calculated (kg/m²). General data (including physical activity level) were collected with questionnaires. The main reasons why vegetarianism was chosen were the healthiness of the vegetarian diet (59%) and the wish to fight for animal rights (72%). Food additives were consumed by less than half of the vegetarians, more often by men. Vegetarians had a lower BMI than omnivores, especially among men. Based on the BMI classification, vegetarians were less obese than omnivores. However, there were no differences in the FM, FFM and fat percentage figures of the two groups. The higher BMI among omnivores compared with vegetarians might be related to their higher physical activity level. For classifying people as underweight, normal weight, overweight and obese, both BMI and fat% criteria were used. The BMI classification placed more people in the normal weight group; the fat% classification, however, categorized more people as overweight. It can be concluded that the main reasons for choosing vegetarianism in Estonia are the healthiness of the vegetarian diet and the wish to fight for animal rights, and that a vegetarian diet has no effect on body fat percentage, FM and FFM.

Keywords: body composition, body fat percentage, body mass index, vegetarianism

Procedia PDF Downloads 406
3734 Financial Service of Financial Institution for SME in Thailand

Authors: Charawee Butbumrung

Abstract:

This research aims, first, to study the financial services of Thai financial institutions and, second, to identify the ‘best practices’ offered by four financial institutions, namely Kasikornthai Bank, Bangkok Bank, Siam Commercial Bank, and Thanachart Bank. In-depth interviews with managers of the financial institutions and with borrowers reveal best practices from each financial institution. Close monitoring of, and a close relationship with, borrowers appear to be important for the early detection of any problem. Another aspect that may be important is building up loyalty and developing reliability among members. A close and informal relationship with borrowers may also help in monitoring and early detection of problems that may arise in the non-repayment of loans. Other factors that may be considered important to the success of a financial service scheme are cooperation and coordination among the various agencies that provide additional support to borrowers. Indirectly, these support systems contribute to the success of SMEs in Thailand.

Keywords: best practices, financial service, financial institution, SME in Thailand

Procedia PDF Downloads 279
3733 Quality Control of Automotive Gearbox Based On Vibration Signal Analysis

Authors: Nilson Barbieri, Bruno Matos Martins, Gabriel de Sant'Anna Vitor Barbieri

Abstract:

In more complex systems, such as automotive gearboxes, a rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.) and, consequently, several possible sources of error and noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced filtering techniques (selective filtering) of vibration signals, and mathematical morphology. Vibration tests were performed on gearboxes (in good condition and with defects) from the production line of a large vehicle assembler. The vibration signals are obtained using five accelerometers in different positions on the sample. The results obtained using kurtosis, the bispectrum, wavelets and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
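
To illustrate two of the indicators mentioned above, the sketch below computes the kurtosis and wavelet-band energies of a simulated accelerometer signal using SciPy and PyWavelets; the wavelet family, decomposition level and the synthetic fault model are illustrative assumptions.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def vibration_features(signal, wavelet="db4", level=4):
    """Simple damage indicators for one accelerometer channel:
    overall kurtosis plus the relative energy of each wavelet decomposition band."""
    k = kurtosis(signal)                         # impulsive faults raise kurtosis
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    band_energy = np.array([np.sum(c ** 2) for c in coeffs])
    band_energy /= band_energy.sum()
    return k, band_energy

fs = 10_000
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(t.size)
faulty = healthy + (np.random.rand(t.size) < 0.002) * 3.0   # sparse impacts from a defect
print("healthy kurtosis:", vibration_features(healthy)[0])
print("faulty  kurtosis:", vibration_features(faulty)[0])
```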

Keywords: automotive gearbox, mathematical morphology, wavelet, bispectrum

Procedia PDF Downloads 462
3732 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched, but proper classification of this textual information in a given context has been very difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media text of a given context as hate speech or inverted compliments with a high level of accuracy, by assessing different artificial intelligence techniques. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number of studies down to 31. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of performance accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-oriented library functionalities. Based on some of the important findings of this study, we make recommendations for future research.
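
As a concrete illustration of the hybrid CNN+LSTM architecture reported to perform best, the sketch below defines a small Keras model; the vocabulary size, sequence length, layer sizes and training call are assumptions for illustration, not a configuration taken from the reviewed studies.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE, MAX_LEN = 20_000, 200   # illustrative values

def build_cnn_lstm(num_classes=2):
    """Hybrid CNN+LSTM sentiment classifier: convolutions extract local n-gram
    features, the LSTM models their order, and a softmax head gives the class."""
    return models.Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, 128),
        layers.Conv1D(64, 5, activation="relu"),
        layers.MaxPooling1D(4),
        layers.LSTM(64),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])

model = build_cnn_lstm()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(padded_token_ids, labels, validation_split=0.1, epochs=3)
```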

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 105
3731 Hydrothermal Synthesis of Mesoporous Carbon Nanospheres and Their Electrochemical Properties for Glucose Detection

Authors: Ali Akbar Kazemi Asl, Mansour Rahsepar

Abstract:

Mesoporous carbon nanospheres (MCNs) with a uniform particle size distribution (average diameter of 290 nm) and a large specific surface area (274.4 m²/g) were synthesized by a one-step hydrothermal method followed by calcination and then utilized as an enzyme-free glucose biosensor. The morphology, crystal structure, and porous nature of the synthesized nanospheres were characterized by scanning electron microscopy (SEM), X-ray diffraction (XRD), and Brunauer–Emmett–Teller (BET) analysis, respectively. The electrochemical performance of the MCNs@GCE electrode for the measurement of glucose concentration in alkaline media was also investigated by electrochemical impedance spectroscopy (EIS), cyclic voltammetry (CV), and chronoamperometry (CA). The MCNs@GCE electrode shows good sensing performance, including a rapid glucose oxidation response within 3.1 s, a wide linear range of 0.026-12 mM, a sensitivity of 212.34 μA.mM⁻¹.cm⁻², and a detection limit of 25.7 μM with excellent selectivity.

Keywords: biosensor, electrochemical, glucose, mesoporous carbon, non-enzymatic

Procedia PDF Downloads 178
3730 Study of Geological Structure for Potential Fresh-Groundwater Aquifer Determination around Cidaun Beach, Cianjur Regency, West Java Province, Indonesia

Authors: Ilham Aji Dermawan, M. Sapari Dwi Hadian, R. Irvan Sophian, Iyan Haryanto

Abstract:

The study of the geological structure in the area surrounding Cidaun, Cianjur Regency, West Java Province, Indonesia was conducted on the southern coast of Java Island. It aims to identify potential structural traps for freshwater resources in the study area, given that the area is directly adjacent to the beach, where the surrounding water is brackish rather than fresh owing to seawater intrusion. The study uses geomorphological analysis and geological mapping, with data collected directly in the field over a 10x10 km research area. The geomorphological analysis was done by calculating the watershed drainage density and the watershed roundness ratio, with the goal of determining the permeability of the sub-soil, the constituent rocks, and the surface water flow. The field geological mapping aims to collect geological structure data and then reconstruct them to determine the geological conditions of the research area. From the geomorphological point of view, the results show that the area considered to hold potential groundwater consists of permeable surface material and permeable sub-soil with low surface run-off, which is very good for a groundwater recharge area. The geological reconstruction carried out after the mapping shows that the joints present were initiated by the Cipandak Fault, which cuts the Cipandak River. That fault extends to the Cibako Syncline fold, which runs along the Cibako River; this syncline is expected to host an influent groundwater aquifer. The end of the Cibako River then joins the Cipandak River, and the Cipandak River extends along the Cipandak Syncline fold axis in the southern region close to its estuary; this syncline is also expected to host an influent groundwater aquifer.

Keywords: geological structure, groundwater, hydrogeology, influent aquifer, structural trap

Procedia PDF Downloads 194
3729 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach for maintaining the security of communication systems relies on Physical Layer Security mechanisms. However, in wireless time division duplex systems, where uplink and downlink channels are reciprocal, the channel estimate procedure is exposed to attacks known as pilot contamination, with the aim of having an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each of which belongs to a constellation, shifted from the original N-PSK symbols by certain degrees. In this paper, legitimate pilots’ offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attacks (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack but the large number of possible combinations for the shifted constellations makes such a type of attack difficult to successfully mount. For this reason, the number of optimal shift value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from the signals in other cells should be also taken into account. Therefore, the inter-cell interference impact on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK decreases inversely to the signal-to-interference-plus-noise ratio.

Keywords: channel estimation, inter-cell interference, pilot contamination attacks, wireless communications

Procedia PDF Downloads 203
3728 A Dual Channel Optical Sensor for Norepinephrine via In Situ Generated Silver Nanoparticles

Authors: Shalini Menon, K. Girish Kumar

Abstract:

Norepinephrine (NE) is one of the naturally occurring catecholamines, which act both as neurotransmitters and as hormones. Catecholamine levels are used for the diagnosis and regulation of phaeochromocytoma, a neuroendocrine tumor of the adrenal medulla. The development of simple, rapid and cost-effective sensors for NE still remains a great challenge. Herein, a dual-channel sensor has been developed for the determination of NE. A mixture of AgNO₃, NaOH, NH₃·H₂O and cetrimonium bromide in appropriate concentrations was taken as the working solution. An appropriate volume of NE solution was added to the thoroughly vortexed mixture. After a set time, the fluorescence and absorbance were measured, with fluorescence measurements made at an excitation wavelength of 400 nm. The dual-channel optical sensor allows both colorimetric and fluorimetric determination of NE: the metal-enhanced fluorescence property of the nanoparticles forms the basis of the fluorimetric channel of this assay, whereas the appearance of a brown color in the presence of NE provides the colorimetric readout. Wide linear ranges and sub-micromolar detection limits were obtained using both techniques. Moreover, the colorimetric approach was applied to the determination of NE in synthetic blood serum, and the results were compared with the classic high-performance liquid chromatography (HPLC) method. Recoveries between 97% and 104% were obtained using the proposed method. Based on five replicate measurements, the relative standard deviation (RSD) for NE determination in the examined synthetic blood serum was found to be 2.3%. This indicates the reliability of the proposed sensor for real sample analysis.

Keywords: norepinephrine, colorimetry, fluorescence, silver nanoparticles

Procedia PDF Downloads 104
3727 Semi-Supervised Learning Using Pseudo F Measure

Authors: Mahesh Balan U, Rohith Srinivaas Mohanakrishnan, Venkat Subramanian

Abstract:

Positive and unlabeled (PU) learning has recently gained attention in both the academic and industry research literature because of its relevance to existing business problems. Yet there still seem to be challenges in validating the performance of PU learning, as the actual truth of unlabeled data points is unknown, in contrast to binary classification where we know the truth. In this study, we propose a novel PU learning technique based on the Pseudo-F measure that addresses this research gap. In this approach, we train the PU model to discriminate the probability distributions of the positives and the unlabeled in the validation and spy data. The predicted probabilities of the PU model have a two-fold validation: (a) the predicted probabilities of reliable positives and predicted positives should come from the same distribution; (b) the predicted probabilities of predicted positives and predicted unlabeled should come from different distributions. We experimented with this approach on a credit marketing case study in one of the world’s biggest fintech platforms, benchmarked its performance, and backtested it using historical data. This study contributes to the existing literature on semi-supervised learning.
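
The sketch below illustrates the two-fold distribution check described above in a spy-based PU setup, using a random forest scorer and a two-sample Kolmogorov-Smirnov test as one possible way to compare the score distributions; the authors' Pseudo-F formulation is not reproduced, and the spy fraction, threshold rule and model choice are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from scipy.stats import ks_2samp

def pu_distribution_check(X_pos, X_unlabeled, spy_frac=0.1, seed=0):
    """Train positive-vs-unlabeled, then compare score distributions:
    (a) spies (reliable positives) vs predicted positives should look alike,
    (b) predicted positives vs predicted unlabeled should look different."""
    rng = np.random.default_rng(seed)
    spy_mask = rng.random(len(X_pos)) < spy_frac        # hold out some positives as spies
    X_train = np.vstack([X_pos[~spy_mask], X_unlabeled])
    y_train = np.concatenate([np.ones((~spy_mask).sum()), np.zeros(len(X_unlabeled))])

    clf = RandomForestClassifier(n_estimators=200, random_state=seed).fit(X_train, y_train)
    spy_scores = clf.predict_proba(X_pos[spy_mask])[:, 1]
    unl_scores = clf.predict_proba(X_unlabeled)[:, 1]
    threshold = np.quantile(spy_scores, 0.05)           # spies set the "positive" cut-off
    pred_pos = unl_scores[unl_scores >= threshold]
    pred_neg = unl_scores[unl_scores < threshold]

    same = ks_2samp(spy_scores, pred_pos)               # want a HIGH p-value here
    diff = ks_2samp(pred_pos, pred_neg)                 # want a LOW p-value here
    return clf, threshold, same.pvalue, diff.pvalue
```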

Keywords: PU learning, semi-supervised learning, pseudo f measure, classification

Procedia PDF Downloads 224
3726 Application of Biosensors in Forensic Analysis

Authors: Shirin Jalili, Hadi Shirzad, Samaneh Nabavi, Somayeh Khanjani

Abstract:

Biosensors are ideal biological tools for forensic analysis that can be used for rapid and sensitive initial screening and testing to detect suspicious components, such as biological and chemical agents, at crime scenes. The wide use of different biomolecules such as proteins, nucleic acids, microorganisms, antibodies and enzymes makes this possible. These biosensors have great advantages such as rapidity, little sample manipulation and high sensitivity; because of their stability, specificity and low cost, they have also become a very important tool for forensic analysis and crime detection. At crime scenes, different substances, such as rape samples, semen, saliva, fingerprints and blood samples, act as detection targets for biosensors. Successful fluid recovery via a biosensor, in turn, has the propensity to yield a highly valuable source of genetic material, which is important in finding the suspect. Current biological fluid testing techniques are limited in their ability to identify body fluids and have disadvantages; for example, when used simultaneously they often give false positive results. These limitations can negatively affect the outcome of a case through missed or misinterpreted evidence. The use of biosensors enables criminal investigators to detect biological fluids in a highly sensitive and non-destructive way through interaction with several fluid-endogenous markers and other biological and chemical contaminants at the crime scene. For this reason, biosensors used for detecting the biological fluids found at crime scenes play an important role in identifying the suspect and solving the crime.

Keywords: biosensors, forensic analysis, biological fluid, crime detection

Procedia PDF Downloads 1098
3725 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks

Authors: Afnan Al-Romi, Iman Al-Momani

Abstract:

Applying Software Engineering (SE) processes is both of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, associated with this are risks, such as system vulnerabilities and security threats. The probability of those risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in open, possibly unattended environments, in addition to resource constraints in terms of processing, storage and power, places such networks under stringent limitations in terms of lifetime (i.e., period of operation) and security. The importance of WSN applications in many military and civilian domains has drawn the attention of many researchers to their security. To address this important issue and overcome one of the main challenges of WSNs, security solution systems have been developed by researchers. Those solutions are software-based network Intrusion Detection Systems (IDSs). However, it has been witnessed that those developed IDSs are neither secure enough nor accurate enough to detect all malicious attack behaviours. Thus, the problem is the lack of coverage of all malicious behaviours in the proposed IDSs, leading to unpleasant results such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy consumption in WSNs caused by the IDS. In other words, not all requirements are implemented and then traced; moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks of current IDSs are due to researchers and developers not following structured software development processes when developing IDSs. Consequently, this has resulted in inadequate requirement management, validation, and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a real subject that will expand as technology evolves and spreads in industrial applications. Therefore, this paper will study the importance of Requirement Engineering when developing IDSs. It will also study a set of existing IDSs and illustrate the absence of Requirement Engineering and its effect. Conclusions are then drawn regarding applying requirement engineering to systems so as to deliver the required functionalities, with respect to operational constraints, within an acceptable level of performance, accuracy and reliability.

Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN

Procedia PDF Downloads 313
3724 A Graph Theoretic Algorithm for Bandwidth Improvement in Computer Networks

Authors: Mehmet Karaata

Abstract:

Given two distinct vertices (nodes) source s and target t of a graph G = (V, E), the two node-disjoint paths problem is to identify two node-disjoint paths between s ∈ V and t ∈ V . Two paths are node-disjoint if they have no common intermediate vertices. In this paper, we present an algorithm with O(m)-time complexity for finding two node-disjoint paths between s and t in arbitrary graphs where m is the number of edges. The proposed algorithm has a wide range of applications in ensuring reliability and security of sensor, mobile and fixed communication networks.
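
The abstract does not spell out the O(m) algorithm itself; as a hedged illustration of the problem, the sketch below finds two node-disjoint paths on a small example graph using networkx's flow-based routine, which is not the authors' method.

```python
import networkx as nx
from networkx.algorithms.connectivity import node_disjoint_paths

# Small example graph; s-t node connectivity >= 2 guarantees two node-disjoint
# paths (Menger's theorem), which is what flow-based routines exploit.
G = nx.Graph()
G.add_edges_from([
    ("s", "a"), ("a", "b"), ("b", "t"),
    ("s", "c"), ("c", "d"), ("d", "t"),
    ("a", "d"),   # extra edge; the returned paths must still avoid shared nodes
])

paths = list(node_disjoint_paths(G, "s", "t"))
print(paths)      # e.g. [['s', 'a', 'b', 't'], ['s', 'c', 'd', 't']]
```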

Keywords: disjoint paths, distributed systems, fault-tolerance, network routing, security

Procedia PDF Downloads 428
3723 The Inattentional Blindness Paradigm: A Breaking Wave for Attentional Biases in Test Anxiety

Authors: Kritika Kulhari, Aparna Sahu

Abstract:

Test anxiety results from concerns about failure in examinations or evaluative situations. Attentional biases are known to accentuate the symptomatic expression of test anxiety. In recent times, the inattentional blindness (IB) paradigm has shown promise as an attention bias modification treatment (ABMT) for anxiety by overcoming the practice and expectancy effects which pre-existing paradigms fail to counter. The IB paradigm assesses the inability of an individual to attend to a stimulus that appears suddenly while engaging in a perceptual discrimination task. The present study incorporated an IB task with three critical items (book, face, and triangle) appearing randomly in the perceptual discrimination task. The sample (N = 50) consisted of low test anxiety (LTA) and high test anxiety (HTA) groups based on Reactions to Tests scale scores. Attentional biases were assessed as detection and identification of the critical item. Test threat manipulation was done with pre- and post-test assessment of test anxiety using the State Test Anxiety Inventory. A mixed factorial design with gender, test anxiety, presence or absence of test threat, and critical items was used to assess their effects on attentional biases. Results showed only a significant main effect for test anxiety on detection, with higher accuracy of detection of the critical item in the LTA group. The study presents promising results in the realm of ABMT for test anxiety.

Keywords: attentional bias, attentional bias modification treatment, inattentional blindness, test anxiety

Procedia PDF Downloads 211
3722 Assessment of a Rapid Detection Sensor of Faecal Pollution in Freshwater

Authors: Ciprian Briciu-Burghina, Brendan Heery, Dermot Brabazon, Fiona Regan

Abstract:

Good quality bathing water is a highly desirable natural resource which can provide major economic, social, and environmental benefits. Both in Ireland and across Europe, such water bodies are managed under the European Directive for the management of bathing water quality (BWD). The BWD aims mainly: (i) to improve health protection for bathers by introducing stricter standards for faecal pollution assessment (E. coli, enterococci), (ii) to establish a more pro-active approach to the assessment of possible pollution risks and the management of bathing waters, and (iii) to increase public involvement and the dissemination of information to the general public. Standard methods for E. coli and enterococci quantification rely on cultivation of the target organism, which requires long incubation periods (from 18 h to a few days). This is not ideal when immediate action is required for risk mitigation: municipalities that oversee bathing water quality and deploy appropriate signage have to wait for laboratory results, during which time bathers can be exposed to pollution events and health risks. Although forecasting tools exist, they are site-specific and, as a consequence, require extensive historical data to be effective. Another approach for the early detection of faecal pollution is the use of marker enzymes. β-glucuronidase (GUS) is a widely accepted biomarker for E. coli detection in microbiological water quality control. GUS assays are particularly attractive as they are rapid (less than 4 h), easy to perform, and do not require specialised training. A method for on-site detection of GUS from environmental samples in less than 75 min was previously demonstrated. In this study, the capability of ColiSense as an early warning system for faecal pollution in freshwater is assessed. The system successfully detected GUS activity in all of the 45 freshwater samples tested. GUS activity was found to correlate linearly with E. coli (r² = 0.53, N = 45, p < 0.001) and enterococci (r² = 0.66, N = 45, p < 0.001). Although GUS is a marker for E. coli, a better correlation was obtained for enterococci. For this study, water samples were collected from 5 rivers in the Dublin area over 1 month, which suggests that a high diversity of pollution sources (agricultural, industrial, etc.), both point and diffuse, was captured in the sample. Such variety in the sources of E. coli can account for different GUS activities per culturable cell and different ratios of viable-but-not-culturable to viable culturable bacteria. A previously developed protocol for the recovery and detection of E. coli was coupled with a miniaturised fluorometer (ColiSense), and the system was assessed for the rapid detection of FIB in freshwater samples. Further work will be carried out to evaluate the system’s performance on seawater samples.

Keywords: faecal pollution, β-glucuronidase (GUS), bathing water, E. coli

Procedia PDF Downloads 271
3721 Classification of Random Doppler-Radar Targets during the Surveillance Operations

Authors: G. C. Tikkiwal, Mukesh Upadhyay

Abstract:

During surveillance operations in war or peacetime, the radar operator sees a scatter of targets on the screen. A target may be a tracked vehicle (e.g., T72, BMP), a wheeled vehicle (e.g., ALS, TATRA, 2.5 Tonne, Shaktiman), moving troops, moving convoys, etc. The radar operator selects one of the promising targets for single target tracking (STT) mode. Once the target is locked, the operator hears a characteristic audible signal in his headphones. Drawing on experience and training gained over time, the operator then identifies the random target. However, this process is cumbersome and depends solely on the skills of the operator, and may therefore lead to misclassification of the object. In this paper, we present a technique using mathematical and statistical methods, namely the fast Fourier transform (FFT) and principal component analysis (PCA), to identify random objects. The classification process is based on transforming the audible signature of the target into musical octave notes. The whole methodology is then automated by developing suitable software. This automation increases the efficiency of identification of the random target by reducing the chances of misclassification. This whole study is based on live data.
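
The sketch below illustrates the FFT-plus-PCA feature pipeline described above on placeholder audio signatures, with an SVM added as an example classifier; the sampling rate, windowing, number of principal components and the classifier choice are assumptions, since the paper's own octave-note mapping and live data are not available here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 8_000   # assumed sampling rate of the operator's audio signature

def fft_features(signal):
    """Normalised magnitude spectrum of one audio snippet (positive frequencies only)."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    return spectrum / (spectrum.max() or 1.0)

# signals: (n_targets, n_samples) audio signatures; labels: target class (tracked/wheeled/...)
signals = np.random.randn(60, FS)          # placeholder data
labels = np.random.randint(0, 2, 60)
X = np.array([fft_features(s) for s in signals])

model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC())
model.fit(X, labels)                       # PCA keeps the dominant spectral components
```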

Keywords: radar target, FFT, principal component analysis, eigenvector, octave-notes, DSP

Procedia PDF Downloads 386
3720 Prevalence of Lower Third Molar Impactions and Angulations Among Yemeni Population

Authors: Khawlah Al-Khalidi

Abstract:

The purpose of this study was to investigate the prevalence of lower third molars in a sample of patients from Ibb University Affiliated Hospital, to study and categorise their position using the Pell and Gregory classification, and to look into a possible correlation between their position and the indication for extraction. Materials and methods: This is a retrospective, observational study in which a sample of 200 patients from Ibb University Affiliated Hospital was examined, including validation of patient records and orthopantomography performed in screening appointments for people aged 16 to 21. Results and discussion: Males make up 63% of the sample, and people aged 19 to 20 make up 41.2%. Lower third molars were found in 365 of the instances examined, accounting for 91% of the sample under study. According to the Pell and Gregory categorisation, the most common position is IIB, with 37%, followed by IIA with 21%; less common classes are IIIA, IC, and IIIC, with 1%, 3%, and 3%, respectively. It was possible to determine that 56% of the lower third molars in the sample were recommended for extraction during the screening consultation. Finally, there are differences in third molar location and angulation. There was, however, a link between the available space for third molar eruption and the need for tooth extraction.

Keywords: lower third molar, extraction, Pell and Gregory classification, lower third molar impaction

Procedia PDF Downloads 37
3719 A Machine Learning Approach to Detecting Evasive PDF Malware

Authors: Vareesha Masood, Ammara Gul, Nabeeha Areej, Muhammad Asif Masood, Hamna Imran

Abstract:

The universal use of PDF files has prompted hackers to use them for malicious intent by hiding malicious code in the PDF files on their victims’ machines. Machine learning has proven to be the most efficient approach for identifying benign files and detecting files with PDF malware. This paper proposes an approach using a decision tree classifier with tuned parameters. A modern, inclusive dataset, CIC-Evasive-PDFMal2022, produced by Lockheed Martin’s Cyber Security wing, is used; it is one of the most reliable datasets in this field. We designed a PDF malware detection system that achieved 99.2% accuracy. Compared to other cutting-edge models in the same field of study, the suggested model performs very well in detecting PDF malware. Accordingly, we provide the fastest, most reliable, and most efficient PDF malware detection approach in this paper.
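
A minimal sketch of a decision tree classifier on a feature export of the dataset is shown below using scikit-learn; the CSV file name, column names and hyper-parameter values are placeholders, not the authors' actual configuration.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

# Assumes a CSV export of the CIC-Evasive-PDFMal2022 features with a binary
# "Class" column (file path, column names and label values are placeholders).
df = pd.read_csv("PDFMalware2022.csv")
X = df.drop(columns=["Class"]).select_dtypes("number").fillna(0)
y = (df["Class"] == "Malicious").astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)
clf = DecisionTreeClassifier(criterion="gini", max_depth=12, min_samples_leaf=5,
                             random_state=42)       # example hyper-parameters
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy:", accuracy_score(y_te, pred), "F1:", f1_score(y_te, pred))
```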

Keywords: PDF, PDF malware, decision tree classifier, random forest classifier

Procedia PDF Downloads 79