Search results for: signal detection theory
8364 iCount: An Automated Swine Detection and Production Monitoring System Based on Sobel Filter and Ellipse Fitting Model
Authors: Jocelyn B. Barbosa, Angeli L. Magbaril, Mariel T. Sabanal, John Paul T. Galario, Mikka P. Baldovino
Abstract:
The use of technology has become ubiquitous in different areas of business today. With the advent of digital imaging and database technology, business owners have been motivated to integrate technology into their business operations, ranging from small and medium to large enterprises. Technology has been found to bring many benefits that can make a business grow. Hog or swine raising, for example, is a very popular enterprise in the Philippines, whose challenges in production monitoring can be addressed through technology integration. Swine production monitoring can become a tedious task as the enterprise grows larger. Specifically, problems like delayed and inconsistent reports are most likely to happen if counting of swine per pen in each building is done manually. In this study, we present iCount, which aims to ensure efficient swine detection and counting that hastens the swine production monitoring task. We develop a system that automatically detects and counts swine based on the Sobel filter and an ellipse fitting model, given still photos of the group of swine captured in a pen. We improve the Sobel filter detection result through an 8-neighborhood rule implementation. An ellipse fitting technique is then employed for proper swine detection. Furthermore, the system can generate periodic production reports and can identify the specific consumables to be served to the swine according to schedules. Experiments reveal that our algorithm provides an efficient way of detecting swine, thereby providing a significant amount of accuracy in production monitoring.
Keywords: automatic swine counting, swine detection, swine production monitoring, ellipse fitting model, Sobel filter
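The abstract describes the pipeline only at a high level. As a rough, hedged illustration of the kind of processing named (Sobel edges, an 8-neighborhood cleanup, then ellipse fitting per blob), a minimal OpenCV sketch might look as follows; the thresholding choice, kernel size, and minimum contour length are assumptions, not parameters reported by the authors.

```python
import cv2
import numpy as np

def count_swine(image_path, min_points=50):
    """Rough sketch: Sobel edge magnitude -> binary cleanup -> one ellipse per blob."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)

    # Sobel gradient magnitude
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    mag = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))

    # Binarize (Otsu) and clean up with a 3x3 kernel, i.e. an 8-neighborhood closing
    _, edges = cv2.threshold(mag, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, np.ones((3, 3), np.uint8))

    # Fit an ellipse to every sufficiently large external contour and count them
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    ellipses = [cv2.fitEllipse(c) for c in contours if len(c) >= min_points]
    return len(ellipses), ellipses
```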
Procedia PDF Downloads 311
8363 Exploring the Role of Media Activity Theory as a Conceptual Basis for Advancing Journalism Education: A Comprehensive Analysis of Its Impact on News Production and Consumption in the Digital Age
Authors: Shohnaza Uzokova Beknazarovna
Abstract:
This research study provides a comprehensive exploration of the Theory of Media Activity and its relevance as a conceptual framework for journalism education. The author offers a thorough review of existing literature on media activity theory, emphasizing its potential to enhance the understanding of the evolving media landscape and its implications for journalism practice. Through a combination of theoretical analysis and practical examples, the paper elucidates the ways in which the Theory of Media Activity can inform and enrich journalism education, particularly in relation to the interactive and participatory nature of contemporary media. The author presents a compelling argument for the integration of media activity theory into journalism curricula, emphasizing its capacity to equip students with a nuanced understanding of the reciprocal relationship between media producers and consumers. Furthermore, the paper discusses the implications of technological advancements on media production and consumption, highlighting the need for journalism educators to prepare students to navigate and contribute to the future of journalism in a rapidly changing media environment. Overall, this research paper offers valuable insights into the potential benefits of embracing the Theory of Media Activity as a foundational framework for journalism education. Its thorough analysis and practical implications make it a valuable resource for educators, researchers, and practitioners seeking to enhance journalism pedagogy in response to the dynamic nature of contemporary media.Keywords: theory of media activity, journalism education, media landscape, media production, media consumption, interactive media, participatory media, technological advancements, media producers, media consumers, journalism practice, contemporary media environment, journalism pedagogy, media theory, media studies
Procedia PDF Downloads 47
8362 Multi-scale Spatial and Unified Temporal Feature-fusion Network for Multivariate Time Series Anomaly Detection
Authors: Hang Yang, Jichao Li, Kewei Yang, Tianyang Lei
Abstract:
Multivariate time series anomaly detection is a significant research topic in the field of data mining, encompassing a wide range of applications across various industrial sectors such as traffic roads, financial logistics, and corporate production. The inherent spatial dependencies and temporal characteristics present in multivariate time series introduce challenges to the anomaly detection task. Previous studies have typically been based on the assumption that all variables belong to the same spatial hierarchy, neglecting the multi-level spatial relationships. To address this challenge, this paper proposes a multi-scale spatial and unified temporal feature fusion network, denoted as MSUT-Net, for multivariate time series anomaly detection. The proposed model employs a multi-level modeling approach, incorporating both temporal and spatial modules. The spatial module is designed to capture the spatial characteristics of multivariate time series data, utilizing an adaptive graph structure learning model to identify the multi-level spatial relationships between data variables and their attributes. The temporal module consists of a unified temporal processing module, which is tasked with capturing the temporal features of multivariate time series. This module is capable of simultaneously identifying temporal dependencies among different variables. Extensive testing on multiple publicly available datasets confirms that MSUT-Net achieves superior performance on the majority of datasets. Our method is able to model and accurately detect systems data with multi-level spatial relationships from a spatial-temporal perspective, providing a novel perspective for anomaly detection analysis.Keywords: data mining, industrial system, multivariate time series, anomaly detection
Procedia PDF Downloads 15
8361 E-Bike FE Model Analysis: Connection Stiffness of Elements with Different DOFs
Authors: Lele Zhang, Hui Leng Choo, Alexander Konyukhov, Shuguang Li
Abstract:
A Finite Element (FE) model of a simplified e-bike structure was generated from the main frame with two tiers, which consisted of pipe, mass, beam, and shell elements (pipe289, beam188, shell181, shell281, combin14, link11, mass21). These elements are introduced and demonstrated using mathematical formulas. Based on coupling theory, constraint equations were proposed. Using all the parameters obtained from the theoretical part, the connection stiffness matrix between each of these elements for the whole e-bike structure was determined.
Keywords: coupling theory, stiffness matrix, e-bike, finite element model
Procedia PDF Downloads 375
8360 The Significance of ‘Practice’ in Art Research: Indian and Western Perspective
Authors: Mukta Avachat-Shirke
Abstract:
The process of manifestation in art has been studied deeply by various Indian and Western philosophers through the ages. In the art of painting, ‘Practice’ is always considered as techniques or making, and ‘Theory’ is related to intelligence or the ‘conceptual’. The question about the significance of ‘Practice’ in artistic research has been a topic of debate. The aim of this qualitative study is to find the relevance of practice and theory while creating artworks. This study analyzes the thoughts and philosophy of Abhinavgupta, Hegel, and Croce to find a new perspective for looking at practice and theory within artistic research. With the method of grounded theory, the study attempts to establish the importance of both in artistic research. It discusses issues like the stages of creating art, the role of tacit knowledge, and the importance of the decision-making ability of the artist. This comparative analysis of these three philosophers, along with the present systems, can be used as a point of reference for further developments in the pedagogy of art research and for artists, to understand the psychology and to follow the process of creativity effectively.
Keywords: artistic research, Indian philosophy, practice, Western philosophy
Procedia PDF Downloads 299
8359 A Fast Community Detection Algorithm
Authors: Chung-Yuan Huang, Yu-Hsiang Fu, Chuen-Tsai Sun
Abstract:
Community detection represents an important data-mining tool for analyzing and understanding real-world complex network structures and functions. We believe that at least four criteria determine the appropriateness of a community detection algorithm: (a) it produces useable normalized mutual information (NMI) and modularity results for social networks, (b) it overcomes resolution limitation problems associated with synthetic networks, (c) it produces good NMI results and performance efficiency for Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks, and (d) it produces good modularity and performance efficiency for large-scale real-world complex networks. To our knowledge, no existing community detection algorithm meets all four criteria. In this paper, we describe a simple hierarchical arc-merging (HAM) algorithm that uses network topologies and rule-based arc-merging strategies to identify community structures that satisfy the criteria. We used five well-studied social network datasets and eight sets of LFR benchmark networks to validate the ground-truth community correctness of HAM, eight large-scale real-world complex networks to measure its performance efficiency, and two synthetic networks to determine its susceptibility to resolution limitation problems. Our results indicate that the proposed HAM algorithm is capable of providing satisfactory performance efficiency and that HAM-identified communities were close to ground-truth communities in social and LFR benchmark networks while overcoming resolution limitation problems.Keywords: complex network, social network, community detection, network hierarchy
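The HAM implementation itself is not publicly packaged, but the two evaluation criteria the abstract relies on (NMI against ground-truth communities and modularity) can be computed with standard tooling. The sketch below is illustrative only and substitutes networkx's greedy modularity detector for HAM.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity
from sklearn.metrics import normalized_mutual_info_score

# Zachary's karate club: a small, well-studied social network with known ground truth
G = nx.karate_club_graph()
truth = [G.nodes[n]["club"] for n in G.nodes]

# Placeholder detector standing in for the HAM arc-merging algorithm
communities = greedy_modularity_communities(G)

# Convert the community list into a per-node label vector for the NMI computation
label_of = {n: i for i, c in enumerate(communities) for n in c}
pred = [label_of[n] for n in G.nodes]

print("modularity:", modularity(G, communities))
print("NMI vs ground truth:", normalized_mutual_info_score(truth, pred))
```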
Procedia PDF Downloads 228
8358 Stabilization of Rotational Motion of Spacecrafts Using Quantized Two Torque Inputs Based on Random Dither
Authors: Yusuke Kuramitsu, Tomoaki Hashimoto, Hirokazu Tahara
Abstract:
The control problem of underactuated spacecraft has attracted a considerable amount of interest. A control method for a spacecraft equipped with fewer than three control torques is useful when one of the three control torques has failed. On the other hand, the quantized control of systems has been one of the important research topics in recent years. The random dither quantization method, which transforms a given continuous signal into a discrete signal by adding artificial random noise to the continuous signal before quantization, has also attracted a considerable amount of interest. The objective of this study is to develop a control method based on the random dither quantization method for stabilizing the rotational motion of a rigid spacecraft with two control inputs. In this paper, the effectiveness of the random dither quantization control method for the stabilization of the rotational motion of spacecraft with two torque inputs is verified by numerical simulations.
Keywords: spacecraft control, quantized control, nonlinear control, random dither method
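The control law itself is not given in the abstract, but the random dither quantizer it builds on is simple to state: add noise drawn uniformly over one quantization step before rounding, so that the quantizer output equals the continuous input in expectation. A minimal numpy sketch follows; the step size and the signal are illustrative assumptions.

```python
import numpy as np

def random_dither_quantize(u, step, rng=np.random.default_rng(0)):
    """Quantize a continuous command u after adding uniform dither of one step width.
    In expectation the quantized output equals u, the property exploited by
    random-dither quantized feedback control."""
    dither = rng.uniform(-step / 2.0, step / 2.0, size=np.shape(u))
    return step * np.round((np.asarray(u, dtype=float) + dither) / step)

# Example: a smoothly varying torque command quantized onto a coarse grid
t = np.linspace(0.0, 10.0, 1001)
torque = 0.05 * np.sin(0.5 * t)
quantized = random_dither_quantize(torque, step=0.02)
print("mean quantization error:", np.mean(quantized - torque))
```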
Procedia PDF Downloads 180
8357 Modeling of Maximum Rainfall Using Poisson-Generalized Pareto Distribution in Kigali, Rwanda
Authors: Emmanuel Iyamuremye
Abstract:
Extreme rainfall events have caused significant damage to agriculture, ecology, and infrastructure, disruption of human activities, injury, and loss of life. They also have significant social, economic, and environmental consequences because they considerably damage urban as well as rural areas. Early detection of extreme maximum rainfall helps to implement strategies and measures before such events occur, hence mitigating the consequences. Extreme value theory has been used widely in modeling extreme rainfall and in various disciplines, such as financial markets, the insurance industry, and failure cases. Climatic extremes have been analyzed by using either generalized extreme value (GEV) or generalized Pareto (GP) distributions, which provides evidence of the importance of modeling extreme rainfall in different regions of the world. In this paper, we focus on the peaks-over-threshold approach, where the Poisson-generalized Pareto distribution is considered the proper distribution for the study of the exceedances. This research also considers the use of the generalized Pareto (GP) distribution with a Poisson model for arrivals to describe peaks over a threshold. The research used statistical techniques to fit models used to predict extreme rainfall in Kigali. The results indicate that the proposed Poisson-GP distribution provides a better fit to maximum monthly rainfall data. Further, the Poisson-GP models are able to estimate various return levels. The research also found a slow increase in return levels for maximum monthly rainfall for higher return periods, and, further, the intervals become increasingly wide as the return period increases.
Keywords: exceedances, extreme value theory, generalized Pareto distribution, Poisson generalized Pareto distribution
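As a hedged illustration of the peaks-over-threshold step described here, the sketch below fits a generalized Pareto distribution to threshold exceedances with scipy and evaluates a return level; the threshold and the synthetic data are placeholders, not the Kigali series.

```python
import numpy as np
from scipy.stats import genpareto

def pot_return_level(monthly_rain, threshold, return_period_years, per_year=12):
    """Fit a GPD to exceedances over `threshold` and estimate the return level
    x_m = u + (sigma / xi) * ((m * zeta_u)**xi - 1), zeta_u being the exceedance rate."""
    exceed = monthly_rain[monthly_rain > threshold] - threshold
    xi, _, sigma = genpareto.fit(exceed, floc=0.0)      # shape, loc (fixed at 0), scale
    zeta_u = len(exceed) / len(monthly_rain)            # empirical P(X > u)
    m = return_period_years * per_year                  # return period in observations
    return threshold + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

# Synthetic stand-in for a 40-year monthly maximum rainfall record (mm)
rng = np.random.default_rng(1)
rain = rng.gamma(shape=2.0, scale=60.0, size=480)
print("50-year return level (mm):",
      round(pot_return_level(rain, threshold=200.0, return_period_years=50), 1))
```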
Procedia PDF Downloads 136
8356 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method
Authors: Arwa Alzughaibi
Abstract:
Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. So, the first need of such a model is a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence even under difficult conditions. By having richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that are able to detect human motion accurately under varying illumination levels and backgrounds. Different sets of features are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Local Decorrelated Channel Feature (LDCF) and Aggregate Channel Feature (ACF). However, we propose an efficient and reliable human motion detection approach by combining the histogram of oriented gradients (HOG) and local phase quantization (LPQ) as the feature set, and implementing a search pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor and the histogram of oriented gradients performs well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The area under the ROC curve (AUC) of the proposed method achieved 0.781 for the UCF dataset and 0.826 for the CDW dataset, which indicates that it performs comparably better than the HOG, DPM, LDCF and ACF methods.
Keywords: human motion detection, histograms of oriented gradients, local phase quantization
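For orientation, a hedged sketch of the combined feature vector is shown below: skimage's HOG descriptor concatenated with a simplified LPQ histogram (sign-quantized short-term Fourier responses, without the decorrelation step of the full LPQ). Window sizes and HOG cell parameters are assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import convolve2d
from skimage.feature import hog

def lpq_histogram(gray, win=7):
    """Simplified LPQ: responses at four low frequencies, sign-quantized into an
    8-bit code per pixel, then histogrammed (decorrelation step omitted)."""
    gray = np.asarray(gray, dtype=float)
    x = (np.arange(win) - (win - 1) / 2.0)[None, :]
    w0 = np.ones_like(x)                       # all-pass window
    w1 = np.exp(-2j * np.pi * x / win)         # lowest non-zero frequency
    resp = [
        convolve2d(convolve2d(gray, w0, "same"), w1.T, "same"),
        convolve2d(convolve2d(gray, w1, "same"), w0.T, "same"),
        convolve2d(convolve2d(gray, w1, "same"), w1.T, "same"),
        convolve2d(convolve2d(gray, w1, "same"), np.conj(w1).T, "same"),
    ]
    code = np.zeros(gray.shape, dtype=int)
    for bit, part in enumerate(p for r in resp for p in (r.real, r.imag)):
        code |= (part > 0).astype(int) << bit
    hist, _ = np.histogram(code, bins=256, range=(0, 256))
    return hist / max(hist.sum(), 1)

def hog_lpq_features(gray):
    """Concatenated HOG + LPQ descriptor for one detection window, as in the abstract."""
    h = hog(gray, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    return np.concatenate([h, lpq_histogram(gray)])
```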
Procedia PDF Downloads 257
8355 Marker-Controlled Level-Set for Segmenting Breast Tumor from Thermal Images
Authors: Swathi Gopakumar, Sruthi Krishna, Shivasubramani Krishnamoorthy
Abstract:
Contactless, painless and radiation-free thermal imaging technology is one of the preferred screening modalities for detection of breast cancer. However, poor signal to noise ratio and the inexorable need to preserve edges defining cancer cells and normal cells, make the segmentation process difficult and hence unsuitable for computer-aided diagnosis of breast cancer. This paper presents key findings from a research conducted on the appraisal of two promising techniques, for the detection of breast cancer: (I) marker-controlled, Level-set segmentation of anisotropic diffusion filtered preprocessed image versus (II) Segmentation using marker-controlled level-set on a Gaussian-filtered image. Gaussian-filtering processes the image uniformly, whereas anisotropic filtering processes only in specific areas of a thermographic image. The pre-processed (Gaussian-filtered and anisotropic-filtered) images of breast samples were then applied for segmentation. The segmentation of breast starts with initial level-set function. In this study, marker refers to the position of the image to which initial level-set function is applied. The markers are generally placed on the left and right side of the breast, which may vary with the breast size. The proposed method was carried out on images from an online database with samples collected from women of varying breast characteristics. It was observed that the breast was able to be segmented out from the background by adjustment of the markers. From the results, it was observed that as a pre-processing technique, anisotropic filtering with level-set segmentation, preserved the edges more effectively than Gaussian filtering. Segmented image, by application of anisotropic filtering was found to be more suitable for feature extraction, enabling automated computer-aided diagnosis of breast cancer.Keywords: anisotropic diffusion, breast, Gaussian, level-set, thermograms
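Neither filter pipeline is spelled out in code here; as a hedged sketch of approach (I), a short Perona-Malik anisotropic diffusion loop followed by skimage's morphological Chan-Vese level set, with circular initial level sets centred on the breast markers, could look as follows. Marker positions, the disk radius, and iteration counts are assumptions.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

def perona_malik(img, n_iter=30, kappa=0.1, gamma=0.15):
    """Minimal anisotropic (Perona-Malik) diffusion of a float image scaled to [0, 1]."""
    u = np.asarray(img, dtype=float).copy()
    for _ in range(n_iter):
        diffs = [np.roll(u, s, axis) - u for axis in (0, 1) for s in (-1, 1)]
        # Edge-stopping weights suppress diffusion across strong gradients
        u += gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in diffs)
    return u

def segment_breast(thermogram, markers, radius=40, n_iter=200):
    """Marker-controlled level set: circular initial contours centred on the markers."""
    smoothed = perona_malik(thermogram)
    yy, xx = np.mgrid[:thermogram.shape[0], :thermogram.shape[1]]
    init = np.zeros(thermogram.shape, dtype=np.int8)
    for r, c in markers:                       # e.g. one marker per breast side
        init[(yy - r) ** 2 + (xx - c) ** 2 <= radius ** 2] = 1
    return morphological_chan_vese(smoothed, n_iter, init_level_set=init, smoothing=2)
```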
Procedia PDF Downloads 380
8354 Meditation and Insight Interpretation Using Quantum Circle Based-on Experiment and Quantum Relativity Formalism
Authors: Somnath Bhattachryya, Montree Bunruangses, Somchat Sonasang, Preecha Yupapin
Abstract:
In this study of meditation and insight, the design of, and experiments with, electronic circuits to manipulate the meditators' mental circles, called the chakras, so that they have the same size is proposed. The circuit is a 4-port device called an add-drop multiplexer, which models the meditation structure called the four-mindfulness foundation and uses an AC power signal as an input instead of the meditation time function, with various behaviors obtained by re-filtering the signal (successive filtering), analogous to the eightfold noble path. The process starts by inputting a signal at a frequency that causes the wave velocity on the perimeter of the circuit to give particles the speed of light in a vacuum. The signal changes between electromagnetic waves and matter waves according to the velocity (frequency) until it reaches the relativistic limit. The electromagnetic waves are transformed into photons with wave-particle properties, overcoming the limit of the speed of light. The matter wave, in contrast, travels to the other side and cannot pass through the relativistic limit; it is called a shadow signal (echo), which can gain power from increasing speed but cannot reach a speed faster than light, i.e., insight. In the experiment, only the side where the velocity is positive, that is, where the speed is above light or at the corresponding frequency, indicates intelligence. The other side (echo) can be obtained by changing the input signal to the other side of the circuit to get the same result, but there is no intelligence or speed beyond light. The circuit is also used to study the stretching and contraction of time and wormholes, which can be applied to teleporting, Bose-Einstein condensates, teleprinting, and the quantum telephone. Teleporting can happen throughout the system with the wave-particle and the echo: when the speed of the particle is faster than the stretching or contraction of time, the particle submerges in the wormhole and, once the destination and time are determined, travels through the wormhole. In a wormhole, time can be set in the future and the past. The experimental results using the microstrip circuit have been found to be in accordance with the principle of quantum relativity, which can be further developed into both tools and practice for meditation practitioners in quantum technology.
Keywords: quantum meditation, insight picture, quantum circuit, absolute time, teleportation
Procedia PDF Downloads 64
8353 Specific Emitter Identification Based on Refined Composite Multiscale Dispersion Entropy
Authors: Shaoying Guo, Yanyun Xu, Meng Zhang, Weiqing Huang
Abstract:
Wireless communication networks are developing rapidly; thus, wireless security becomes more and more important. Specific emitter identification (SEI) is a vital part of wireless communication security as a technique to identify unique transmitters. In this paper, a SEI method based on multiscale dispersion entropy (MDE) and refined composite multiscale dispersion entropy (RCMDE) is proposed. The algorithms of MDE and RCMDE are used to extract features for the identification of five wireless devices, and a cross-validation support vector machine (CV-SVM) is used as the classifier. The experimental results show that the total identification accuracy is 99.3%, even at a low signal-to-noise ratio (SNR) of 5 dB, which proves that MDE and RCMDE can describe the communication signal series well. In addition, compared with other methods, the proposed method is effective and provides better accuracy and stability for SEI.
Keywords: cross-validation support vector machine, refined composite multiscale dispersion entropy, specific emitter identification, transient signal, wireless communication device
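For readers unfamiliar with the feature, a hedged sketch of single-scale dispersion entropy and the plain coarse-graining used for its multiscale variant is given below (the refined composite variant additionally averages pattern probabilities over shifted coarse-grainings). The class count and embedding dimension are typical defaults, not necessarily those used in the paper.

```python
import numpy as np
from collections import Counter
from scipy.stats import norm

def dispersion_entropy(x, classes=6, emb_dim=3, delay=1):
    """Normalized dispersion entropy of a 1-D signal (normal-CDF mapping to classes)."""
    x = np.asarray(x, dtype=float)
    y = norm.cdf((x - x.mean()) / (x.std() + 1e-12))              # map samples to (0, 1)
    z = np.round(classes * y + 0.5).clip(1, classes).astype(int)  # class labels 1..c
    patterns = [tuple(z[i:i + emb_dim * delay:delay])
                for i in range(len(z) - (emb_dim - 1) * delay)]
    p = np.array(list(Counter(patterns).values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p)) / np.log(classes ** emb_dim)

def multiscale_dispersion_entropy(x, max_scale=5, **kw):
    """Coarse-grain by non-overlapping averaging, then compute DE at each scale."""
    x = np.asarray(x, dtype=float)
    return [dispersion_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1), **kw)
            for s in range(1, max_scale + 1)]
```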
Procedia PDF Downloads 129
8352 Modified Poly (Pyrrole) Film-Based Biosensors for Phenol Detection
Authors: S. Korkut, M. S. Kilic, E. Erhan
Abstract:
In order to detect and quantify the phenolic content of wastewater with biosensors, two working electrodes based on modified Poly (Pyrrole) films were fabricated. The enzyme horseradish peroxidase was used as the biomolecule of the prepared electrodes. Various phenolics were tested with the biosensor. Phenol detection was realized by electrochemical reduction of the quinones produced by enzymatic activity. Analytical parameters were calculated, and the results were compared with each other.
Keywords: carbon nanotube, phenol biosensor, polypyrrole, poly (glutaraldehyde)
Procedia PDF Downloads 419
8351 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images
Authors: A. Nachour, L. Ouzizi, Y. Aoura
Abstract:
Recent developments in multi-agent systems have brought a new research field to image processing. Several algorithms are used simultaneously and improved in different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated using several examples on different synthetic and medical images. The experimental results obtained suggest that this approach confirms the efficiency and accuracy of the detected edges.
Keywords: edge detection, medical MR images, multi-agent systems, vector field convolution
Procedia PDF Downloads 391
8350 Edge Detection and Morphological Image for Estimating Gestational Age Based on Fetus Length Automatically
Authors: Retno Supriyanti, Ahmad Chuzaeri, Yogi Ramadhani, A. Haris Budi Widodo
Abstract:
The use of ultrasonography in the medical world has become very popular, including in the diagnosis of pregnancy. In determining pregnancy, ultrasonography has many roles, such as checking the position of the fetus, abnormal pregnancies, fetal age, and others. Unfortunately, all these tasks still require the obstetrician to analyze the images produced by ultrasonography. One of the most striking is the determination of gestational age. Usually, this is done by obstetricians measuring the length of the fetus manually. In this study, we developed a computer-aided diagnosis system for the determination of gestational age by measuring the length of the fetus automatically using an edge detection method and image morphology. Results showed that the system is sufficiently accurate in determining the gestational age based on image processing.
Keywords: computer aided diagnosis, gestational age, diameter of uterus, length of fetus, edge detection method, morphology image
Procedia PDF Downloads 294
8349 A Theoretical Framework for Design Theories in Mobile Learning: A Higher Education Perspective
Authors: Paduri Veerabhadram, Antoinette Lombard
Abstract:
In this paper, a framework for hypothesizing about mobile learning to complement theories of formal and informal learning is presented. As such, activity theory will form the main theoretical lens through which the elements involved in formal and informal learning for mobile learning will be explored, specifically in relation to context-aware mobile learning applications. The author believes that the complexity of the relationships involved can best be analysed using activity theory. Activity theory, as a social and cultural theory of activity, can be used as a mobile learning framework in an academic environment and to develop an optimal artifact through investigation of the system's inherent contradictions. As such, it serves as a powerful modelling tool to explore and understand the design of a mobile learning environment in the study’s environment. The Academic Tool Kit Framework (ATKF) was also employed for designing a constructivist learning environment, effective in assisting universities to enable lecturers to implement learning through the use of mobile devices. Results indicate a positive perspective among students on the use of mobile devices for formal and informal learning, based on the context-aware learning environment developed through the use of activity theory and the ATKF.
Keywords: collaborative learning, cooperative learning, context-aware learning environment, mobile learning, pedagogy
Procedia PDF Downloads 568
8348 Detecting Characters as Objects Towards Character Recognition on Licence Plates
Authors: Alden Boby, Dane Brown, James Connan
Abstract:
Character recognition is a well-researched topic across disciplines. Regardless, creating a solution that can cater to multiple situations is still challenging. Vehicle licence plates lack an international standard, meaning that different countries and regions have their own licence plate formats. A problem that arises from this is that the typefaces and designs from different regions make it difficult to create a solution that can cater to a wide range of licence plates. The main issue concerning detection is the character recognition stage. This paper aims to create an object detection-based character recognition model trained on a custom dataset that consists of typefaces of licence plates from various regions. Given that characters have features that are consistently maintained across an array of fonts, YOLO can be trained to recognise characters based on these features, which may provide better performance than OCR methods such as Tesseract OCR.
Keywords: computer vision, character recognition, licence plate recognition, object detection
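As a hedged sketch of what character-as-object detection looks like at inference time, the snippet below uses the ultralytics YOLO API; the weights file is a placeholder that assumes a model already trained on per-character classes (0-9, A-Z) from a custom plate dataset like the one described above, and single-row plates are assumed for the left-to-right ordering.

```python
from ultralytics import YOLO

def read_plate(image_path, weights="plate_chars.pt", conf=0.4):
    """Detect characters as objects, then order them left-to-right into a plate string."""
    model = YOLO(weights)                                   # hypothetical trained weights
    result = model.predict(image_path, conf=conf, verbose=False)[0]
    boxes = result.boxes
    order = boxes.xyxy[:, 0].argsort()                      # sort by left edge of each box
    return "".join(result.names[int(boxes.cls[i])] for i in order)
```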
Procedia PDF Downloads 121
8347 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data
Authors: M. Kharrat, G. Moreau, Z. Aboura
Abstract:
The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the exerted solicitations. The AE technique is a well-established tool for discriminating between the damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE-features are then extracted from the recorded signals, followed by a data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation was set-up in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for the discrimination between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. This latter is useful to construct a supervised classifier that can be used for automatic recognition of the AE signals. Several materials with different ingredients were tested under various solicitations in order to feed and enrich the learning database. The methodology presented in this work was useful to refine the damage threshold for the new generation materials. The damage mechanisms around this threshold were highlighted. The obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damages without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of solicitation, the identified classes are reproducible and little disturbed. The supervised classifier constructed based on the learning database was able to predict the labels of the classified signals.Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition
Procedia PDF Downloads 155
8346 From Values to Sustainable Actions: A Dual-Theory Approach to Green Consumerism
Authors: Jiyeon Kim
Abstract:
This conceptual paper examines the psychological drivers of green consumerism and sustainable consumption by integrating the Value-Belief-Norm (VBN) Theory and the Theory of Reasoned Action (TRA). With growing environmental concerns, green consumerism promotes eco-friendly choices such as purchasing sustainable products and supporting environmentally responsible companies. However, there remains a need for research that effectively guides strategies to encourage sustainable behaviors. This paper evaluates VBN Theory’s role in driving pro-environmental behaviors. By incorporating TRA, the paper proposes an enhanced model that improves understanding of the factors driving sustained pro-environmental actions. Focusing on values, beliefs, and norms, this integrated model provides a deeper understanding of the cognitive and motivational factors that influence sustainable consumption. The findings offer valuable theoretical and practical insights for developing strategies to support long-term responsible consumer behavior.
Keywords: green consumerism, sustainable behavior, TRA, VBN
Procedia PDF Downloads 9
8345 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection
Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay
Abstract:
With the increase in credit card usage, the volume of credit card misuse has also significantly increased, which may cause appreciable financial losses for both credit card holders and financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, fraudulent credit card activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on some static physical characteristics, such as what the user knows (knowledge-based method), or what he/she has access to (object-based method). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while he/she is interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable and scalable models of credit card fraud detection.
Keywords: credit card fraud detection, user authentication, behavioral biometrics, machine learning, literature survey
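The survey compares many families of models; purely as an illustration of the class-imbalance point it raises (fraud as a rare event, where plain accuracy is misleading), a hedged sklearn sketch on synthetic data could look like this. The feature set and class ratio are invented for the example.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for transactional features; fraud (class 1) is ~0.5% of samples
X, y = make_classification(n_samples=50_000, n_features=20,
                           weights=[0.995, 0.005], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" compensates for the rarity of fraudulent transactions
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
clf.fit(X_tr, y_tr)

# Report per-class precision and recall instead of overall accuracy
print(classification_report(y_te, clf.predict(X_te), digits=3))
```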
Procedia PDF Downloads 121
8344 Analyzing a Tourism System by Bifurcation Theory
Authors: Amin Behradfar
Abstract:
Tourism has a direct impact on the national revenue of all touristic countries. It creates work opportunities, industries, and several investments that serve and raise nations' performance and cultures. This paper is devoted to analyzing the dynamical behaviour of a four-dimensional non-linear tourism-based social-ecological system by using codimension-two bifurcation theory. In fact, we investigate its cusp bifurcation. Implications of our mathematical results for the tourism industry are discussed. Moreover, the profitability, compatibility, and sustainability of the tourism system are shown with the aid of the cusp bifurcation and numerical techniques.
Keywords: tourism-based social-ecological dynamical systems, cusp bifurcation, center manifold theory, profitability, compatibility, sustainability
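For readers unfamiliar with the codimension-two cusp bifurcation named here, its standard topological normal form (after centre-manifold reduction, quoted from bifurcation theory textbooks rather than from the tourism model itself) is:

```latex
\dot{x} = \beta_1 + \beta_2\,x - x^{3},
\qquad
\text{cusp (fold) set: } 4\beta_2^{3} = 27\beta_1^{2},
```

where the number of equilibria changes between one and three as the two unfolding parameters β₁, β₂ cross the cusp set.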
Procedia PDF Downloads 502
8343 A Virtual Set-Up to Evaluate Augmented Reality Effect on Simulated Driving
Authors: Alicia Yanadira Nava Fuentes, Ilse Cervantes Camacho, Amadeo José Argüelles Cruz, Ana María Balboa Verduzco
Abstract:
Augmented reality promises to be present in future driving: its immersive technology can show directions and maps and identify important places with graphic elements when the car driver requires the information. On the other hand, driving is considered a multitasking activity and, for some people, a complex activity where different situations commonly occur that require the immediate attention of the car driver to make decisions that help avoid accidents; therefore, the main aim of the project is the instrumentation of a platform with biometric sensors that allows evaluating driving performance under the influence of augmented reality devices and detecting the drivers' level of attention, since it is important to know the effect that augmented reality produces. In this study, the physiological sensors EPOC X (EEG), ECG06 PRO and EMG Myoware are joined in the driving test platform with a Logitech G29 steering wheel and the simulation software City Car Driving, in which the level of traffic can be controlled, as well as the number of pedestrians that exist within the simulation, obtaining driver interaction in real mode, while an MSP430 microcontroller performs the acquisition of data for storage. The sensors provide a continuous analog signal in time that needs signal conditioning; at this point, a signal amplifier is incorporated because the acquired signals have a sensitivity range of 1.25 mm/mV, together with filtering that consists of eliminating frequency bands of the signal so that it is interpretable and noise-free, before converting it from an analog signal into a digital signal in order to analyze the physiological signals of the drivers; these values are stored in a database. Based on this compilation, we work on the extraction of signal features and implement k-NN (k-nearest neighbor) classification methods and decision trees that enable the study of the data for the identification of patterns and determine, by classification methods, the different effects of augmented reality on drivers. The expected results of this project include a test platform instrumented with biometric sensors for data acquisition during driving and a database with the required variables to determine the effect caused by augmented reality on people in simulated driving.
Keywords: augmented reality, driving, physiological signals, test platform
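A hedged sketch of the classification stage described above (k-NN and a decision tree applied to features extracted from windows of the physiological signals) is shown below; the feature matrix here is random placeholder data standing in for quantities such as EEG band power, heart rate, and EMG RMS per driving window.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Placeholder features: one row per driving window, label = AR overlay active or not
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = rng.integers(0, 2, size=200)

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
tree = DecisionTreeClassifier(max_depth=4, random_state=0)

for name, model in [("k-NN", knn), ("decision tree", tree)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.2f}")
```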
Procedia PDF Downloads 142
8342 Electrochemical Study of Interaction of Thiol Containing Proteins with As (III)
Authors: Sunil Mittal, Sukhpreet Singh, Hardeep Kaur
Abstract:
The affinity of thiol group with heavy metals is a well-established phenomenon. The present investigation has been focused on electrochemical response of cysteine and thioredoxin against arsenite (As III) on indium tin oxide (ITO) electrodes. It was observed that both the compounds produce distinct response in free and immobilised form at the electrode. The SEM, FTIR, and impedance studies of the modified electrode were conducted for characterization. Various parameters were optimized to achieve As (III) effect on the reduction potential of the compounds. Cyclic voltammetry and linear sweep voltammetry were employed as the analysis techniques. The optimum response was observed at neutral pH in both the cases, at optimum concentration of 2 mM and 4.27 µM for cysteine and thioredoxin respectively. It was observed that presence of As (III) increases the reduction current of both the moieties. The linear range of detection for As (III) with cysteine was from 1 to 10 mg L⁻¹ with detection limit of 0.8 mg L⁻¹. The thioredoxin was found more sensitive to As (III) and displayed a linear range from 0.1 to 1 mg L⁻¹ with detection limit of 10 µg L⁻¹.
Keywords: arsenite, cyclic voltammetry, cysteine, thioredoxin
Procedia PDF Downloads 211
8341 Dynamic Analysis of Nanosize FG Rectangular Plates Based on Simple Nonlocal Quasi 3D HSDT
Authors: Sabrina Boutaleb, Fouad Bourad, Kouider Halim Benrahou, Abdelouahed Tounsi
Abstract:
In the present work, the dynamic analysis of the functionally graded rectangular nanoplates is studied. The theory of nonlocal elasticity based on the quasi 3D high shear deformation theory (quasi 3D HSDT) has been employed to determine the natural frequencies of the nanosized FG plate. In HSDT, a cubic function is employed in terms of thickness coordinates to introduce the influence of transverse shear deformation and stretching thickness. The theory of nonlocal elasticity is utilized to examine the impact of the small scale on the natural frequency of the FG rectangular nanoplate. The equations of motion are deduced by implementing Hamilton’s principle. To demonstrate the accuracy of the proposed method, the calculated results in specific cases are compared and examined with available results in the literature, and a good agreement is observed. Finally, the influence of the various parameters, such as the nonlocal coefficient, the material indexes, the aspect ratio, and the thickness-to-length ratio, on the dynamic properties of the FG nanoplates is illustrated and discussed in detail.
Keywords: nonlocal elasticity theory, FG nanoplate, free vibration, refined theory, elastic foundation
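The nonlocal coefficient mentioned in the abstract enters through Eringen's differential constitutive relation, quoted here in its usual form for context (this is the standard relation used with nonlocal plate theories, not an equation reproduced from the paper):

```latex
\left(1 - \mu\,\nabla^{2}\right)\sigma_{ij} = C_{ijkl}\,\varepsilon_{kl},
\qquad \mu = (e_{0}a)^{2},
```

where e₀a is the small-scale (nonlocal) parameter; setting μ = 0 recovers the local elasticity solution.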
Procedia PDF Downloads 120
8340 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Model updating methods have received increasing attention in the damage detection of structures based on measured modal parameters. Therefore, a probability-based damage detection (PBDD) procedure based on a model updating procedure is presented in this paper, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multiagent algorithm based on the movement of ideal gas molecules. Ideal gas molecules disperse rapidly in different directions and cover all the available space; this behavior stems from the high speed of the molecules and their collisions with each other and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, the initial population of gas molecules is randomly generated, and the governing equations related to the velocity of the gas molecules and the collisions between them are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and competitively in the PBDD of structures.
Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification
Procedia PDF Downloads 277
8339 Human Gesture Recognition for Real-Time Control of Humanoid Robot
Authors: S. Aswath, Chinmaya Krishna Tilak, Amal Suresh, Ganesh Udupa
Abstract:
There are technologies to control a humanoid robot in many ways. But the use of Electromyogram (EMG) electrodes has its own importance in setting up the control system. The EMG based control system helps to control robotic devices with more fidelity and precision. In this paper, development of an electromyogram based interface for human gesture recognition for the control of a humanoid robot is presented. To recognize control signs in the gestures, a single channel EMG sensor is positioned on the muscles of the human body. Instead of using a remote control unit, the humanoid robot is controlled by various gestures performed by the human. The EMG electrodes attached to the muscles generates an analog signal due to the effect of nerve impulses generated on moving muscles of the human being. The analog signals taken up from the muscles are supplied to a differential muscle sensor that processes the given signal to generate a signal suitable for the microcontroller to get the control over a humanoid robot. The signal from the differential muscle sensor is converted to a digital form using the ADC of the microcontroller and outputs its decision to the CM-530 humanoid robot controller through a Zigbee wireless interface. The output decision of the CM-530 processor is sent to a motor driver in order to control the servo motors in required direction for human like actions. This method for gaining control of a humanoid robot could be used for performing actions with more accuracy and ease. In addition, a study has been conducted to investigate the controllability and ease of use of the interface and the employed gestures.Keywords: electromyogram, gesture, muscle sensor, humanoid robot, microcontroller, Zigbee
Procedia PDF Downloads 407
8338 Gravitational Frequency Shifts for Photons and Particles
Authors: Jing-Gang Xie
Abstract:
The research, in this case, considers the integration of Quantum Field Theory and General Relativity Theory. Although both are successful models in explaining the behavior of particles, they are incompatible since they work at different mass and energy scales, as evidenced by the description of black holes and the formation of the universe. This remains the case despite previous efforts to merge the two theories, including the likes of String Theory, Quantum Gravity models, and others. In a bid to arrive at an actionable experiment, the paper’s approach starts with the derivations of the existing theories at present. It goes on to test the derivations by applying the same initial assumptions, coupled with several deviations. The resulting equations give results similar to those of the classical Newtonian model, quantum mechanics, and general relativity as long as conditions are normal. However, outcomes differ when conditions are extreme, specifically with no breakdowns even below the Schwarzschild radius or at the Planck length. Even so, this demonstrates the possibility of integrating the two theories.
Keywords: general relativity theory, particles, photons, Quantum Gravity Model, gravitational frequency shift
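For reference, the gravitational frequency shift referred to in the title has the following standard general-relativistic form for a photon emitted at radius r from a static mass M (a textbook result, quoted for context and independent of the paper's own derivation):

```latex
\frac{\nu_{\infty}}{\nu_{\mathrm{emit}}} = \sqrt{1 - \frac{2GM}{r c^{2}}}
\;\approx\; 1 - \frac{GM}{r c^{2}}
\quad\text{(weak field)},
\qquad
\frac{\Delta\nu}{\nu} \approx -\frac{g h}{c^{2}}
\;\;\text{for a small height difference } h \text{ near Earth's surface.}
```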
Procedia PDF Downloads 359
8337 Microstructural Evidences for Exhaustion Theory of Low Temperature Creep in Martensitic Steels
Authors: Nagarjuna Remalli, Robert Brandt
Abstract:
Downsizing of combustion engines in automobiles prevails owing to the required increase in efficiency. This leads to a stress increment on valve springs, which affects their intended function due to an increase in relaxation. High-strength martensitic steels are used for valve spring applications. Recent investigations unveiled that low temperature creep (LTC) in martensitic steels obeys a logarithmic creep law. The exhaustion theory links the logarithmic creep behavior to an activation energy which is characteristic for any given time during creep. This activation energy increases with creep strain due to barriers of low activation energy being exhausted during creep. The assumption of the exhaustion theory is that the material is inhomogeneous on the microscopic scale. According to these assumptions, it is anticipated that small obstacles (e.g., ε-carbides) having a wide size distribution are non-uniformly distributed in the material. X-ray diffraction studies revealed the presence of ε-carbides in high-strength martensitic steels. In this study, high-strength martensitic steels crept in the temperature range of 75–150 °C were investigated with the aid of a transmission electron microscope for evidence of an inhomogeneous distribution of obstacles of different sizes, in order to examine the validity of the exhaustion theory.
Keywords: creep mechanisms, exhaustion theory, low temperature creep, martensitic steels
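The abstract cites the logarithmic creep law and the strain-dependent activation energy of the exhaustion theory without quoting them; the commonly used forms are shown below for orientation (standard expressions, not the paper's specific fit):

```latex
\varepsilon(t) = \varepsilon_{0} + \alpha \,\ln\!\left(1 + \frac{t}{t_{0}}\right),
\qquad
\dot{\varepsilon} \;\propto\; \exp\!\left(-\frac{\Delta G(\varepsilon)}{k_{\mathrm{B}} T}\right),
```

where ΔG(ε) increases with creep strain as the obstacles of lowest activation energy are progressively exhausted.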
Procedia PDF Downloads 263
8336 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for detecting the safety status of goods. Hyperspectral Imaging (HSI) is an attractive tool for researchers to inspect food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection of apples, quality analysis and grading of citrus fruits, bruise detection of strawberries, visualization of the sugar distribution of melons, measuring the ripening of tomatoes, defect detection of pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. This technique yields exceptional detection capability, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for the detection of fat content in ground meat using HSI. The KFKT, which is the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of the ground meat by regarding the fat as the target class, which is to be separated from the remaining classes (as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
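The kernelized FKT is not reproduced here, but the two-class idea is easiest to see in its linear form: whiten the summed class covariances, then eigendecompose one class's transformed covariance; directions with eigenvalues near 1 are dominated by the target (fat) class and those near 0 by the clutter. The numpy sketch below is a hedged, linear stand-in for the KFKT described in the abstract.

```python
import numpy as np

def linear_fkt(X_target, X_clutter, n_keep=5):
    """Linear Fukunaga-Koontz transform (the paper kernelizes this idea).
    Rows of the inputs are spectra; returns target-dominant projection vectors."""
    def cov(X):
        Xc = X - X.mean(axis=0)
        return Xc.T @ Xc / len(X)

    S1, S2 = cov(X_target), cov(X_clutter)
    # Whitening transform P of the summed covariance: P.T @ (S1 + S2) @ P = I
    vals, vecs = np.linalg.eigh(S1 + S2)
    keep = vals > 1e-10
    P = vecs[:, keep] / np.sqrt(vals[keep])
    # In the whitened space the class covariances share eigenvectors and their
    # eigenvalues sum to one: large for the target class means small for clutter.
    lam, V = np.linalg.eigh(P.T @ S1 @ P)
    order = np.argsort(lam)[::-1]
    return P @ V[:, order[:n_keep]]

# Usage sketch: rows are per-pixel spectra from labelled training regions of the HSI cube
# W = linear_fkt(fat_spectra, lean_spectra)
# score = np.linalg.norm(test_spectra @ W, axis=1)   # higher score = more target-like
```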
Procedia PDF Downloads 431
8335 Dependency Theory on Examining the Relationship between the United States and the Middle East: In the Case of Iran, Saudi Arabia, and Turkey
Authors: Abdelhafez Abdel Hafez
Abstract:
Dependency theory has been developed since the 1950s, with economic concerns. It divided the world into two parts: the peripheral states (third-world countries) and the core states (the developed capitalist countries). Another perspective was added to the theory with the introduction of the idea of semi-peripheral states in the new world order. With these divisions (core, peripheral, semi-peripheral), this study aims to develop a concept from the perspective of dependency theory to understand the nature of the relationship of the U.S. with the Middle East region through its relations with Iran, Saudi Arabia, and Turkey. The examined countries (Saudi Arabia, Iran and Turkey) are seeking a foothold and an influential role in the region. The paper argues that the U.S. directs its policies toward the region in a way that guarantees no country of the region will reach the semi-peripheral level (which could create competition or a threat to U.S. interests). Therefore, U.S. policies in the region have varied from declaring war to diplomatic channels and sometimes disregard. The paper is based on dependency theory and other international relations theories used to study the Middle East in the international context.
Keywords: dependency, hegemony, imperialism, middle east
Procedia PDF Downloads 130