Search results for: zernike moment feature descriptor
2111 Analysis and Identification of Different Factors Affecting Students’ Performance Using a Correlation-Based Network Approach
Authors: Jeff Chak-Fu Wong, Tony Chun Yin Yip
Abstract:
The transition from secondary school to university seems exciting for many first-year students but can be more challenging than expected. Enabling instructors to know students' learning habits and styles enhances their understanding of the students' learning backgrounds, allows teachers to provide better support for their students, and therefore has high potential to improve teaching quality and learning, especially in mathematics-related courses. The aim of this research is to collect students' data using online surveys, to analyze student factors using learning analytics and educational data mining, and to discover the characteristics of students at risk of falling behind in their studies based on their previous academic backgrounds and the collected data. In this paper, we use correlation-based distance methods and mutual information for measuring relationships between student factors. We then develop a factor network using the Minimum Spanning Tree method and further analyze the topological properties of these networks using social network analysis tools. Under the framework of mutual information, two graph-based feature filtering methods, i.e., unsupervised and supervised infinite feature selection algorithms, are used to rank and select appropriate subsets of features and yield effective results in identifying the factors affecting students at risk of failing. This discovered knowledge may help students as well as instructors enhance educational quality by identifying possible under-performers at the beginning of the first semester and paying special attention to them in order to support their learning process and improve their learning outcomes.
Keywords: students' academic performance, correlation-based distance method, social network analysis, feature selection, graph-based feature filtering method
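As an illustration of the correlation-based network step described above, the sketch below (in Python, with invented placeholder factor names rather than the paper's survey variables) converts a factor correlation matrix into the common distance d = sqrt(2(1 - r)) and extracts a Minimum Spanning Tree with networkx.

```python
# A minimal sketch of building a student-factor network from a correlation-based
# distance and extracting its Minimum Spanning Tree. Factor names and data are
# illustrative placeholders, not the survey variables used in the paper.
import numpy as np
import pandas as pd
import networkx as nx

rng = np.random.default_rng(0)
factors = ["attendance", "study_hours", "prior_math", "sleep", "exam_score"]
data = pd.DataFrame(rng.normal(size=(200, len(factors))), columns=factors)

corr = data.corr()                          # Pearson correlation between factors
dist = np.sqrt(2.0 * (1.0 - corr))          # common correlation-based distance

G = nx.Graph()
for i, fi in enumerate(factors):
    for fj in factors[i + 1:]:
        G.add_edge(fi, fj, weight=dist.loc[fi, fj])

mst = nx.minimum_spanning_tree(G, weight="weight")   # factor-network backbone
print(sorted(mst.edges(data="weight")))

# simple social-network-analysis style descriptor of the factor network
print(nx.degree_centrality(mst))
```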
Procedia PDF Downloads 129
2110 Features Reduction Using Bat Algorithm for Identification and Recognition of Parkinson Disease
Authors: P. Shrivastava, A. Shukla, K. Verma, S. Rungta
Abstract:
Parkinson's disease is a chronic neurological disorder that directly affects human gait. It leads to slowness of movement, muscle rigidity, and tremors. Gait serves as a primary outcome measure for studies aiming at early recognition of the disease. Using gait data, this paper implements an efficient binary bat algorithm for early detection of Parkinson's disease by selecting the optimal features required for classifying affected patients from others. Data from 166 people, both healthy and affected, are collected, and optimal feature selection is performed using PSO and the bat algorithm. The reduced dataset is then classified using a neural network. The experiments indicate that the binary bat algorithm outperforms traditional PSO and the genetic algorithm and gives a fairly good recognition rate even with the reduced dataset.
Keywords: parkinson, gait, feature selection, bat algorithm
Procedia PDF Downloads 545
2109 Curvelet Features with Mouth and Face Edge Ratios for Facial Expression Identification
Authors: S. Kherchaoui, A. Houacine
Abstract:
This paper presents a facial expression recognition system. It performs identification and classification of the seven basic expressions: happy, surprise, fear, disgust, sadness, anger, and the neutral state. It consists of three main parts. The first is the detection of a face and the corresponding facial features to extract the most expressive portion of the face, followed by normalization of the region of interest. Curvelet coefficients are then computed, with dimensionality reduction through principal component analysis. The resulting coefficients are combined with two ratios, the mouth ratio and the face edge ratio, to constitute the whole feature vector. The third step is the classification of the emotional state using the SVM method in the feature space.
Keywords: facial expression identification, curvelet coefficient, support vector machine (SVM), recognition system
Procedia PDF Downloads 232
2108 Experimental Study on Single Bay RC Frame Designed Using EC8 under In-Plane Cyclic Loading
Authors: N. H. Hamid, M. S. Syaref, M. I. Adiyanto, M. Mohamed
Abstract:
A one-half scale, single-bay, two-storey RC frame together with its foundation beam and a mass concrete block is investigated. The moment-resisting RC frame was designed using EC8, including the provisions for seismic loading and the detailing of its connections. The objective of the experimental work is to determine the seismic behaviour of the RC frame under in-plane lateral cyclic loading using the displacement control method. A double actuator placed at the centre of the mass concrete block at the top of the frame represents the seismic load. The percentage drifts start from ±0.01% and go up to ±2.25% in increments of ±0.25%. The ultimate lateral load of 158.48 kN was recorded at +2.25% drift in the pushing direction and -126.09 kN in the pulling direction. From the experimental hysteresis loops, parameters such as lateral strength capacity, stiffness, ductility, and equivalent viscous damping can be obtained. The RC frame behaves elastically and then inelastically after reaching the yield limit. The ductility value for this type of frame is 4, which lies between the limits of 3 and 6. Therefore, it is recommended to build this RC frame in moderate seismic regions under Ductility Class Medium (DCM), such as in Sabah, East Malaysia.
Keywords: single bay, moment resisting RC frame, ductility class medium, inelastic behavior, seismic load
Procedia PDF Downloads 388
2107 Recognizing Customer Preferences Using Review Documents: A Hybrid Text and Data Mining Approach
Authors: Oshin Anand, Atanu Rakshit
Abstract:
The vast growth of e-commerce ventures makes this area a prominent research stream. Besides several quantified parameters, the textual content of reviews is a storehouse of information that can educate companies and help them earn profit. This study is an attempt in this direction. The article categorizes data based on a computed metric that quantifies the influencing capacity of reviews, rendering two categories of high and low influential reviews. Further, each of these documents is studied to derive several product feature categories. Each of these categories, along with the computed metric, is converted to linguistic identifiers and used in an association mining model. The article makes a novel attempt to combine feature attraction with the quantified metric to categorize review text and finally provide frequent patterns that depict customer preferences. Frequent mentions in highly influential reviews depict customer likes or preferred features of the product, whereas prominent patterns in low-influencing reviews highlight what is not important for customers. This is achieved using a hybrid approach of text mining for feature and term extraction, sentiment analysis, a multicriteria decision-making technique, and an association mining model.
Keywords: association mining, customer preference, frequent pattern, online reviews, text mining
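The following minimal, self-contained Python sketch illustrates the frequent-pattern idea described above; the review items, influence labels, and support threshold are invented examples rather than the paper's categories or settings.

```python
# A minimal sketch of the frequent-pattern step: each review is reduced to a set
# of "linguistic identifiers" (feature mentions plus an influence label), and
# itemsets that co-occur often are reported. Labels below are illustrative only.
from itertools import combinations
from collections import Counter

reviews = [
    {"influence_high", "battery", "camera"},
    {"influence_high", "battery"},
    {"influence_low", "price"},
    {"influence_high", "battery", "camera"},
    {"influence_low", "price", "camera"},
]

def frequent_itemsets(transactions, min_support=0.4, max_size=3):
    """Count co-occurring items and keep itemsets above the support threshold."""
    counts = Counter()
    for t in transactions:
        for k in range(1, max_size + 1):
            for itemset in combinations(sorted(t), k):
                counts[itemset] += 1
    n = len(transactions)
    return {s: c / n for s, c in counts.items() if c / n >= min_support}

for itemset, support in sorted(frequent_itemsets(reviews).items(), key=lambda x: -x[1]):
    print(itemset, round(support, 2))
```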
Procedia PDF Downloads 388
2106 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis
Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen
Abstract:
Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling by implementing the exponential compactness and separation index for determining the number of rules in the fuzzy clustering approach. We first propose a Type-I fuzzy system that reaches an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision. Thus, the imprecise knowledge is managed by using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic is able to diagnose hepatitis with an average accuracy of 93.94%. The classification accuracy obtained is the highest one reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than the Type-I system and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.
Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection
Procedia PDF Downloads 306
2105 Money Laundering and Governance in Cryptocurrencies: The Double-Edged Sword of Blockchain Technology
Abstract:
With the growing popularity of bitcoin transactions, criminals have exploited bitcoin-like cryptocurrencies, and cybercrimes such as money laundering have thrived. Unlike traditional currencies, these Internet-based virtual currencies can be used anonymously via the underpinning blockchain technology. In this paper, we analyze the double-edged-sword features of blockchain technology in the context of money laundering. In particular, the traceability feature of blockchain-based systems facilitates a level of governance, while the decentralization feature of blockchain-based systems may bring governing difficulties. Based on the analysis, we propose guidelines for policy makers in governing blockchain-based cryptocurrency systems.
Keywords: cryptocurrency, money laundering, blockchain, decentralization, traceability
Procedia PDF Downloads 202
2104 Torsional Rigidities of Reinforced Concrete Beams Subjected to Elastic Lateral Torsional Buckling
Authors: Ilker Kalkan, Saruhan Kartal
Abstract:
Reinforced concrete (RC) beams rarely undergo lateral-torsional buckling (LTB), since these beams possess large lateral bending and torsional rigidities owing to their stocky cross-sections, unlike steel beams. However, the problem of LTB has become more and more pronounced in recent decades as the span lengths of concrete beams increase and the cross-sections become more slender with the use of pre-stressed concrete. The buckling moment of a beam mainly depends on its lateral bending rigidity and torsional rigidity. The nonhomogeneous and elastic-inelastic nature of RC complicates the estimation of the buckling moments of concrete beams. Furthermore, the lateral bending and torsional rigidities of RC beams and the buckling moments are affected by different forms of concrete cracking, including flexural, torsional, and restrained shrinkage cracking. The present study pertains to the effects of concrete cracking on the torsional rigidities of RC beams prone to elastic LTB. A series of tests on rather slender RC beams indicated that torsional cracking does not initiate until buckling in elastic LTB, while flexural cracking associated with lateral bending takes place even at the initial stages of loading. Hence, the present study clearly indicated that the uncracked torsional rigidity needs to be used for estimating the buckling moments of RC beams liable to elastic LTB.
Keywords: lateral stability, post-cracking torsional rigidity, uncracked torsional rigidity, critical moment
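For context, the classical elastic lateral-torsional buckling moment of a simply supported beam under uniform moment, with warping neglected (a common simplification for narrow rectangular sections), shows how the critical moment depends only on the lateral bending rigidity and the torsional rigidity; this is the standard textbook expression, not necessarily the exact formulation used by the authors:

```latex
% Classical elastic LTB critical moment of a simply supported beam of span L
% under uniform moment, warping torsion neglected (narrow rectangular section):
M_{cr} = \frac{\pi}{L}\sqrt{(EI_y)(GJ)}
% EI_y : lateral (minor-axis) bending rigidity, GJ : torsional rigidity.
% The study argues that the uncracked GJ should be used for RC beams in elastic LTB.
```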
Procedia PDF Downloads 236
2103 Terrain Classification for Ground Robots Based on Acoustic Features
Authors: Bernd Kiefer, Abraham Gebru Tesfay, Dietrich Klakow
Abstract:
The motivation of our work is to detect the different terrain types traversed by a robot based on acoustic data from the robot-terrain interaction. Different acoustic features and classifiers were investigated, such as Mel-frequency cepstral coefficients and Gammatone frequency cepstral coefficients for the feature extraction, and a Gaussian mixture model and a feed-forward neural network for the classification. We analyze the system's performance by comparing our proposed techniques with other features surveyed from related works. We achieve precision and recall values between 87% and 100% per class, and an average accuracy of 95.2%. We also study the effect of varying the audio chunk size in the application phase of the models and find only a mild impact on performance.
Keywords: acoustic features, autonomous robots, feature extraction, terrain classification
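A hedged Python sketch of the MFCC-plus-Gaussian-mixture pipeline described above is given below; the file names, sampling rate, and number of mixture components are assumptions for illustration, not the authors' settings.

```python
# Sketch: one GMM is fitted per terrain class on MFCC frames, and a test clip is
# assigned to the class whose GMM gives the highest average log-likelihood.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_frames(path, sr=16000, n_mfcc=13):
    y, sr = librosa.load(path, sr=sr)
    # frames as rows: (n_frames, n_mfcc)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

# one GMM per terrain type, trained on that terrain's recordings (placeholder files)
models = {}
for terrain, files in {"grass": ["grass_01.wav"], "gravel": ["gravel_01.wav"]}.items():
    X = np.vstack([mfcc_frames(f) for f in files])
    models[terrain] = GaussianMixture(n_components=8, covariance_type="diag").fit(X)

def classify(path):
    X = mfcc_frames(path)
    scores = {t: gmm.score(X) for t, gmm in models.items()}  # mean log-likelihood
    return max(scores, key=scores.get)

print(classify("unknown_chunk.wav"))
```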
Procedia PDF Downloads 368
2102 Using Scale Invariant Feature Transform Features to Recognize Characters in Natural Scene Images
Authors: Belaynesh Chekol, Numan Çelebi
Abstract:
The main purpose of this work is to recognize individual characters extracted from natural scene images using scale invariant feature transform (SIFT) features as the input to K-nearest neighbor (KNN), a classification learner algorithm. For this task, 1,068 and 78 images of English alphabet characters taken from the Chars74k data set are used to train and test the classifier, respectively. For each character image, we generate describing features using the SIFT algorithm. This set of features is fed to the learner so that it can recognize and label new images of English characters. Two types of KNN (fine KNN and weighted KNN) were trained, and the resulting classification accuracies are 56.9% and 56.5%, respectively. The training time taken was the same for both fine and weighted KNN.
Keywords: character recognition, KNN, natural scene image, SIFT
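The sketch below illustrates one way the described SIFT-to-KNN pipeline could look in Python with OpenCV and scikit-learn; mean-pooling the 128-dimensional descriptors into a fixed-length vector is a simplification assumed here, since the paper does not state its exact aggregation, and the image paths and labels are placeholders.

```python
# Hedged sketch: SIFT descriptors per character image are mean-pooled into one
# fixed-length vector, then fine (k=1) and distance-weighted KNN are trained.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

sift = cv2.SIFT_create()

def sift_vector(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = sift.detectAndCompute(img, None)
    if desc is None:                      # no keypoints found
        return np.zeros(128, dtype=np.float32)
    return desc.mean(axis=0)              # mean-pool the 128-D descriptors

train_paths = ["A_001.png", "A_002.png", "B_001.png", "B_002.png"]  # Chars74k-style crops
train_labels = ["A", "A", "B", "B"]
X_train = np.array([sift_vector(p) for p in train_paths])

fine_knn = KNeighborsClassifier(n_neighbors=1).fit(X_train, train_labels)
weighted_knn = KNeighborsClassifier(n_neighbors=3, weights="distance").fit(X_train, train_labels)

print(fine_knn.predict([sift_vector("test_char.png")]))
```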
Procedia PDF Downloads 281
2101 Behavior Factors Evaluation for Reinforced Concrete Structures
Authors: Muhammad Rizwan, Naveed Ahmad, Akhtar Naeem Khan
Abstract:
Seismic behavior factors are evaluated for the performance assessment of low-rise reinforced concrete (RC) frame structures based on an experimental study of unidirectional dynamic shake table testing of two 1/3 reduced-scale two-storey frames: a code-conforming special moment resisting frame (SMRF) model and a non-compliant model of similar characteristics but built in low-strength concrete. The models were subjected to a scaled accelerogram record of the 1994 Northridge earthquake to deform the test models to the final collapse stage in order to obtain the structural response parameters. The fully compliant model exhibited a more stable beam-sway response, experiencing beam flexural yielding and ground-storey column base yielding when subjected to 100% of the record. The response modification factors (R factors) obtained for the code-compliant and deficient prototype structures were 7.5 and 4.5, respectively, which are about 10% and 40% less than the UBC-97 specified value for special moment resisting reinforced concrete frame structures.
Keywords: Northridge 1994 earthquake, reinforced concrete frame, response modification factor, shake table testing
Procedia PDF Downloads 173
2100 Structural Health Monitoring Using Fibre Bragg Grating Sensors in Slab and Beams
Authors: Pierre van Tonder, Dinesh Muthoo, Kim Twiname
Abstract:
Many existing and newly built structures are constructed on the basis of the engineer's design and the workmanship of the construction company. However, when considering larger structures where more people are exposed to the building, its structural integrity is of great importance for the safety of its occupants (Raghu, 2013). But how can the structural integrity of a building be monitored efficiently and effectively? This is where the fourth industrial revolution steps in: with minimal human interaction, data can be collected, analysed, and stored, and any inconsistencies found in the collected data can be indicated. This is where the fibre Bragg grating (FBG) monitoring system is introduced. This paper illustrates how data can be collected and converted to develop stress-strain behaviour and to produce bending moment diagrams for the utilisation and prediction of the structure's integrity. Embedded fibre optic sensors, fibre Bragg grating sensors in particular, were used in this study. The procedure entailed making use of the wavelength-shift demodulation technique and an inscription process based on the phase mask technique. The fibre optic sensors considered in this report were photosensitive and embedded in the slab and beams for data collection and analysis. Two sets of fibre cables were inserted, one purposely to collect temperature recordings and the other to collect strain and temperature. The data were collected over a period of time and analysed to produce bending moment diagrams and to make predictions of the structure's integrity. The data indicated that the fibre Bragg grating sensing system is useful and can be used for structural health monitoring in any environment. From the experimental data for the slab and beams, the moments were found to be 64.33 kN·m, 64.35 kN·m, and 45.20 kN·m (from the experimental bending moment diagram), whereas the idealistic (Ultimate Limit State) values of 133 kN·m and 226.2 kN·m were obtained. The difference in values gave room for an early warning system, in other words, a reserve capacity of approximately 50% to failure.
Keywords: fibre bragg grating, structural health monitoring, fibre optic sensors, beams
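The wavelength-shift demodulation mentioned above relies on the standard fibre Bragg grating response, in which the Bragg wavelength shift is linear in strain and temperature; the relation below is the commonly used form, and the coefficient values noted are typical for silica fibre rather than measured values from this study:

```latex
% Standard FBG response used in wavelength-shift demodulation:
\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha + \xi)\,\Delta T
% p_e   : effective photo-elastic coefficient (about 0.22 for silica fibre)
% alpha : thermal expansion coefficient of the fibre, xi : thermo-optic coefficient
% The dedicated temperature fibre measures Delta T, so the strain term can be
% isolated from the combined strain/temperature cable.
```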
Procedia PDF Downloads 139
2099 Machine Learning and Deep Learning Approach for People Recognition and Tracking in Crowd for Safety Monitoring
Authors: A. Degale Desta, Cheng Jian
Abstract:
Deep learning applications in computer vision are rapidly advancing, giving systems the ability to monitor the public and quickly identify potentially anomalous behaviour in crowd scenes. The purpose of the current work is therefore to improve the safety of people in crowd events involving panic behaviour by introducing the idea of Aggregation of Ensembles (AOE), which makes use of pre-trained ConvNets and a pool of classifiers to find anomalies in video data with packed scenes. Algorithms and architectures such as K-means, KNN, CNN, SVD, Faster R-CNN, and YOLOv5 learn different levels of semantic representation from crowd videos; the proposed approach leverages an ensemble of various fine-tuned convolutional neural networks (CNNs), allowing for the extraction of enriched feature sets. In addition to the above algorithms, a long short-term memory neural network is used to forecast future feature values, together with a handmade feature that takes the peculiarities of the crowd into consideration in order to understand human behavior. Experiments are run on well-known datasets of panic situations to assess the effectiveness and precision of the suggested method. The results reveal that, compared to state-of-the-art methodologies, the system produces better and more promising results in terms of accuracy and processing speed.
Keywords: action recognition, computer vision, crowd detecting and tracking, deep learning
Procedia PDF Downloads 161
2098 A Deep Learning Approach to Subsection Identification in Electronic Health Records
Authors: Nitin Shravan, Sudarsun Santhiappan, B. Sivaselvan
Abstract:
Subsection identification, in the context of Electronic Health Records (EHRs), is the identification of the important sections for downstream tasks like auto-coding. In this work, we classify the text present in EHRs according to its information content, using machine learning and deep learning techniques. We first describe the problem briefly and formulate it as a text classification problem. Then, we discuss methods from the literature. We try two approaches: traditional feature-extraction-based machine learning methods and deep learning methods. Through experiments on a private dataset, we establish that the deep learning methods perform better than the feature-extraction-based machine learning models.
Keywords: deep learning, machine learning, semantic clinical classification, subsection identification, text classification
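A minimal sketch of the traditional feature-extraction baseline (TF-IDF features with a linear classifier, using scikit-learn) is shown below; the snippet texts and section labels are invented, since the paper's EHR dataset is private.

```python
# Sketch of a feature-extraction-based baseline for subsection identification:
# TF-IDF n-grams feed a logistic-regression text classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "patient reports chest pain radiating to left arm",
    "prescribed metformin 500 mg twice daily",
    "blood pressure 130/85, heart rate 78",
    "follow up in two weeks with cardiology",
]
sections = ["history", "medications", "vitals", "plan"]   # invented section labels

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
clf.fit(texts, sections)
print(clf.predict(["started on atorvastatin 20 mg nightly"]))
```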
Procedia PDF Downloads 217
2097 Application of the Micropolar Beam Theory for the Construction of the Discrete-Continual Model of Carbon Nanotubes
Authors: Samvel H. Sargsyan
Abstract:
Together with the study of the electron-optical properties of nanostructures, and proceeding from experiment-based data, the study of the mechanical properties of nanostructures has become quite topical. For the study of the mechanical properties of fullerene, carbon nanotubes, graphene, and other nanostructures, one of the crucial issues is the construction of adequate mathematical models. Among all mathematical models of graphene or carbon nanotubes, the so-called discrete-continual model is especially important. It substitutes the interactions between atoms with elastic beams or springs. The present paper demonstrates the construction of the discrete-continual beam model for carbon nanotubes or graphene, where the micropolar beam model based on the theory of moment elasticity is adopted. Using the energy balance principle, the elastic moment constants for the beam model, expressed by the physical and geometrical parameters of the carbon nanotube or graphene, are determined. By switching from the discrete-continual beam model to the continual one, the models of a micropolar elastic cylindrical shell and a micropolar elastic plate are confirmed as continual models for carbon nanotubes and graphene, respectively.
Keywords: carbon nanotube, discrete-continual, elastic, graphene, micropolar, plate, shell
Procedia PDF Downloads 159
2096 A Survey of Feature-Based Steganalysis for JPEG Images
Authors: Syeda Mainaaz Unnisa, Deepa Suresh
Abstract:
Due to the increased usage of public domain channels, such as the internet and communication technology, there is concern about the protection of intellectual property and about security threats. This interest has led to growth in researching and implementing techniques for information hiding. Steganography is the art and science of hiding information in a private manner such that its existence cannot be recognized. Communication using steganographic techniques makes not only the secret message but also the presence of hidden communication invisible. Steganalysis is the art of detecting the presence of this hidden communication. Parallel to steganography, steganalysis is also gaining prominence, since the detection of hidden messages can prevent catastrophic security incidents from occurring. Steganalysis can also be incredibly helpful in identifying and revealing holes in current steganographic techniques, which make them vulnerable to attacks. Through the formulation of new, effective steganalysis methods, further research to improve the resistance of tested steganography techniques can be developed. The feature-based steganalysis method for JPEG images calculates the features of an image using the L1 norm of the difference between a stego image and the calibrated version of the image. This calibration can help retrieve some of the parameters of the cover image, revealing the variations between the cover and stego image and enabling a more accurate detection. Applying this method to various steganographic schemes, experimental results were compared and evaluated to derive conclusions and principles for more protected JPEG steganography.
Keywords: cover image, feature-based steganalysis, information hiding, steganalysis, steganography
Procedia PDF Downloads 216
2095 Contactless Attendance System along with Temperature Monitoring
Authors: Nalini C. Iyer, Shraddha H., Anagha B. Varahamurthy, Dikshith C. S., Ishwar G. Kubasad, Vinayak I. Karalatti, Pavan B. Mulimani
Abstract:
The current scenario of the COVID-19 pandemic has made people aware of the need to avoid unnecessary contact in public places. There is a need to avoid contact with physical objects to stop the spreading of infection. A contactless feature has to be included in systems in public places wherever possible; for example, attendance monitoring systems with fingerprint biometrics can be replaced with a contactless feature. Another important protocol followed in the current situation is temperature monitoring and screening. This paper describes an attendance system with a contactless feature and temperature screening for a university. The system displays a QR code to scan, which redirects to the student login web page only if the location is valid (the location where the student scans the QR code should be the location of the display of the QR code). Once the student logs in, the temperature of the student is scanned by the contactless temperature sensor (MLX90614) with an error of 0.5°C. If the temperature falls within the desired range (the range of normal body temperature), then the attendance of the student is marked as present, stored in the database, and the door opens automatically. Otherwise, the attendance is marked as absent, an alert with the measured temperature is displayed, and the door remains closed. The door is automated with the help of a servomotor. To avoid proxies, IR sensors are used to count the number of students in the classroom. The hardware system, consisting of the contactless temperature sensor and IR sensor, is implemented on the NodeMCU microcontroller.
Keywords: NodeMCU, IR sensor, attendance monitoring, contactless, temperature
Procedia PDF Downloads 185
2094 Studying on Pile Seismic Operation with Numerical Method by Using FLAC 3D Software
Authors: Hossein Motaghedi, Kaveh Arkani, Siavash Salamatpoor
Abstract:
Piles are important elements for the safe and economical design of tall and heavy structures, and for this purpose the response of a single pile under dynamic load is of particular interest. The factors that influence the response of a single pile are the pile geometry, the soil properties, and the applied loads. In this study, the finite difference numerical method, implemented in the FLAC 3D software, is used to evaluate the behaviour of a single pile under the peak ground acceleration (PGA) of the El Centro earthquake record in California (1940). The results of these models are compared with the experimental results of other researchers, and it can be seen that they approximately coincide with the experimental data. For example, the maximum moment and displacement at the top of the pile correspond to the experimental results of previous researchers. Furthermore, this paper attempts to evaluate the interaction between the soil and the pile. The results show that by increasing the pile diameter, the pile top displacement decreases, whereas by increasing the length of the pile, the top displacement increases. Also, by increasing the stiffness ratio of the pile to the soil, the moment produced in the pile body increases, and taller piles have more interaction with the soil and higher inertia. These results can therefore directly help in the optimal design of pile dimensions.
Keywords: pile seismic response, interaction between soil and pile, numerical analysis, FLAC 3D
Procedia PDF Downloads 388
2093 USE-Net: SE-Block Enhanced U-Net Architecture for Robust Speaker Identification
Authors: Kilari Nikhil, Ankur Tibrewal, Srinivas Kruthiventi S. S.
Abstract:
Conventional speaker identification systems often fall short of capturing the diverse variations present in speech data due to fixed-scale architectures. In this research, we propose a CNN-based architecture, USENet, designed to overcome these limitations. Leveraging two key techniques, our approach achieves superior performance on the VoxCeleb 1 Dataset without any pre-training. Firstly, we adopt a U-net-inspired design to extract features at multiple scales, empowering our model to capture speech characteristics effectively. Secondly, we introduce the squeeze and excitation block to enhance spatial feature learning. The proposed architecture showcases significant advancements in speaker identification, outperforming existing methods, and holds promise for future research in this domain.
Keywords: multi-scale feature extraction, squeeze and excitation, VoxCeleb1 speaker identification, mel-spectrograms, USENet
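A hedged PyTorch sketch of a squeeze-and-excitation block of the kind added to the U-Net-style encoder is shown below; the channel count and reduction ratio are illustrative assumptions, not values from the USE-Net paper.

```python
# Standard squeeze-and-excitation (SE) block: global average pooling provides
# per-channel context, a small bottleneck MLP produces channel weights, and the
# feature maps are rescaled channel-wise.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global context per channel
        self.fc = nn.Sequential(                     # excitation: channel re-weighting
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                 # rescale feature maps channel-wise

# e.g. applied to a batch of mel-spectrogram feature maps (shapes are illustrative)
features = torch.randn(4, 64, 40, 100)
print(SEBlock(64)(features).shape)                   # torch.Size([4, 64, 40, 100])
```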
Procedia PDF Downloads 74
2092 DCDNet: Lightweight Document Corner Detection Network Based on Attention Mechanism
Authors: Kun Xu, Yuan Xu, Jia Qiao
Abstract:
Document detection plays an important role in optical character recognition and text analysis. Because traditional detection methods have weak generalization ability, and deep neural networks have complex structures and large numbers of parameters that cannot be applied well on mobile devices, this paper proposes a lightweight Document Corner Detection Network (DCDNet). DCDNet is a two-stage architecture. The first stage, with an encoder-decoder structure, adopts depthwise separable convolution to greatly reduce the network parameters. After introducing the Feature Attention Union (FAU) module, the second stage enhances the feature information along the spatial and channel dimensions and adaptively adjusts the size of the receptive field to enhance the feature expression ability of the model. To address the large imbalance in the pixel distribution between corner and non-corner regions, a Weighted Binary Cross Entropy Loss (WBCE Loss) is proposed, defining corner detection as a classification problem to make the training process more efficient. To make up for the lack of datasets for document corner detection, a dataset containing 6620 images, named the Document Corner Detection Dataset (DCDD), is constructed. Experimental results show that the proposed method can obtain fast, stable, and accurate detection results on DCDD.
Keywords: document detection, corner detection, attention mechanism, lightweight
Procedia PDF Downloads 354
2091 Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification
Authors: Fathi Kallel, Abdulelah Alabd Uljabbar, Abdulrahman Aldukhail, Abdulaziz Alomran
Abstract:
The brain is an important organ of the body, since it is responsible for the majority of actions such as vision and memory. However, different diseases, such as Alzheimer's disease and tumors, can affect the brain and lead to a partial or full disorder. Regular diagnosis is necessary as a preventive measure and can help doctors detect possible trouble early and apply the appropriate treatment, especially in the case of brain tumors. Different imaging modalities are used for the diagnosis of brain tumors. The most powerful and most widely used modality is Magnetic Resonance Imaging (MRI). MRI images are analyzed by doctors in order to locate an eventual tumor in the brain and prescribe the appropriate treatment. Diverse image processing methods have also been proposed to help doctors identify and analyze the tumor. In fact, a large number of Computer Aided Diagnosis (CAD) tools based on image processing algorithms have been proposed and are exploited by doctors as a second opinion to analyze and identify brain tumors. In this paper, we propose a new advanced CAD for brain tumor identification, classification, and feature extraction. Our proposed CAD includes three main parts. First, the brain MRI is loaded. Second, a robust technique for brain tumor extraction is proposed. This technique is based on both the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). DWT is characterized by its multiresolution analysis property, which is why it is applied to MRI images with different decomposition levels for feature extraction. Nevertheless, this technique suffers from a main drawback, since it requires huge storage and is computationally expensive. To decrease the dimensions of the feature vector and the computing time, the PCA technique is used. In the last stage, according to the different extracted features, the brain tumor is classified as either benign or malignant using the Support Vector Machine (SVM) algorithm. A CAD tool for brain tumor detection and classification, including all the above-mentioned stages, is designed and developed using a MATLAB GUIDE user interface.
Keywords: MRI, brain tumor, CAD, feature extraction, DWT, PCA, classification, SVM
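A hedged sketch of the DWT-PCA-SVM chain is given below in Python (the paper itself uses MATLAB); the wavelet, decomposition level, number of principal components, and the random arrays standing in for MRI slices are all assumptions made for illustration.

```python
# Sketch: wavelet approximation coefficients are flattened as features,
# compressed with PCA, and classified with an SVM.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline

def dwt_features(image, wavelet="db4", level=3):
    coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
    return coeffs[0].ravel()            # approximation sub-band as the feature vector

rng = np.random.default_rng(0)
images = rng.random((40, 128, 128))     # placeholder "MRI" slices
labels = rng.integers(0, 2, size=40)    # 0 = benign, 1 = malignant (toy labels)

X = np.array([dwt_features(img) for img in images])
model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X, labels)
print(model.predict(X[:5]))
```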
Procedia PDF Downloads 249
2090 Real-Time Pedestrian Detection Method Based on Improved YOLOv3
Authors: Jingting Luo, Yong Wang, Ying Wang
Abstract:
Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty of this task is to accurately locate and detect pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added to extract pedestrian features. Then six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid that performs the pedestrian detection tasks. This method can better characterize pedestrians. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian dataset training method is used, extracting pedestrian data from the VOC dataset and training with the INRIA pedestrian dataset. Experiments show that the proposed RT-YOLOv3 method achieves 93.57% mAP (mean average precision) and 46.52 f/s (frames per second). In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. This method reduces the missed detection rate and false detection rate, improves the positioning accuracy, and meets the requirements of real-time detection of pedestrian objects.
Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3
Procedia PDF Downloads 141
2089 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance
Authors: Sokkhey Phauk, Takeo Okazaki
Abstract:
The challenging task in educational institutions is to maximize the high performance of students and minimize the failure rate of poor-performing students. An effective way to approach this task is to know the students' learning patterns, with their highly influencing factors, and to get an early prediction of student learning outcomes at a timely stage for setting up improvement policies. Educational data mining (EDM) is an emerging field of data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development in the education environment. The aim of this work is to propose techniques in EDM and to integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and high-performing models are subsequently developed to obtain better performance. The hybrid random forest (Hybrid RF) produces the most successful classification. In the context of intervening and improving learning outcomes, a feature selection method called MICHI, which is a combination of the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves the prediction performance; the obtained dominant set is then used as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to get an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system. The system is used to help educational stakeholders and related individuals intervene and improve student performance.
Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance
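A hedged Python sketch of a MICHI-style ranking is given below; since the abstract does not specify the exact rule for combining the mutual information and chi-square scores, a simple average of the two rank positions is used here purely for illustration, and the synthetic data stands in for the student dataset.

```python
# Sketch: mutual information and chi-square scores are computed per feature and
# their rank positions are combined to pick a dominant feature subset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, chi2
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=300, n_features=12, n_informative=5, random_state=0)
X_pos = MinMaxScaler().fit_transform(X)          # chi2 requires non-negative features

mi_scores = mutual_info_classif(X_pos, y, random_state=0)
chi_scores, _ = chi2(X_pos, y)

def ranks(scores):
    # rank 1 = best (highest score)
    order = np.argsort(-scores)
    r = np.empty_like(order)
    r[order] = np.arange(1, len(scores) + 1)
    return r

combined = (ranks(mi_scores) + ranks(chi_scores)) / 2.0
dominant = np.argsort(combined)[:5]              # dominant feature subset
print("selected feature indices:", dominant)
```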
Procedia PDF Downloads 106
2088 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model
Authors: Youngjae Jin, Daeshik Kim
Abstract:
This paper describes cycle-accurate simulation results of the weight values learned by an auto-encoder behavior model in terms of pre-route simulation. Given the results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data through unsupervised feature learning. However, in the process of handling such a huge amount of data, the computational complexity and time of these learning methods have limited advanced research. These limitations came from the fact that the algorithms were computed using only single-core CPUs. For this reason, parallel hardware, such as FPGAs, was seen as a possible solution to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With the auto-encoder behavior model pre-route simulation, we obtained the cycle-accurate results of the parameters of each hidden layer by using MODELSIM. The cycle-accurate results are a very important factor in designing parallel-based digital hardware. Finally, this paper shows an appropriate operation of the behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, unsupervised feature learning
Procedia PDF Downloads 446
2087 Agent-Base Modeling of IoT Applications by Using Software Product Line
Authors: Asad Abbas, Muhammad Fezan Afzal, Muhammad Latif Anjum, Muhammad Azmat
Abstract:
The Internet of Things (IoT) is used to link up real objects that interact using the internet. IoT applications allow handling and operating equipment in accordance with environmental needs, for example in transportation and healthcare. IoT devices are linked together via a number of agents that act as middlemen for communications. The operation of a heat sensor differs indoors and outdoors because agent applications work with environmental variables. In this article, we suggest using a Software Product Line (SPL) to model the features of IoT agents and applications on an XML basis. XML-based feature modelling can handle the contextual diversity within the same application domain and increases the reusability of features. For the purpose of managing contextual variability, we have embraced XML for modelling IoT applications, agents, and internet-connected devices.
Keywords: IoT agents, IoT applications, software product line, feature model, XML
Procedia PDF Downloads 94
2086 Examples of RC Design with Eurocode2
Authors: Carla Ferreira, Helena Barros
Abstract:
The paper entitled “Design of reinforced concrete with Eurocode 2” presents the theory regarding the design of reinforced concrete sections and the development of the tables and abacuses used to verify a concrete section at the ultimate and serviceability limit states. The present paper is a complement to it, showing how to use those tools. Different numerical results are shown, proving the capability of the methodology. When the section of a beam has already been chosen, the computer program presents the reinforcing steel at many locations along the structure, and it is the engineer's task to choose the layout available for construction, considering the maximum regular kind of reinforcing bars. There are many computer programs available for this task, but the interest of the present kind of tools lies in the fast and easy way of making the design and choosing the optimal solution. Another application of these design tools is in the definition of the section dimensions, such that when the stresses are evaluated, the final design is acceptable. In design offices, these are considered by engineers to be a very quick and useful way of designing reinforced concrete sections, employing variable-strength concrete and higher steel classes. Examples of nonlinear analyses and redistribution of the bending moment will be considered, according to the Eurocode 2 recommendations, for sections under bending moment and axial forces. Examples of the evaluation of the serviceability limit state will also be presented.
Keywords: design examples, eurocode 2, reinforced concrete, section design
Procedia PDF Downloads 72
2085 Seismic Response of Belt Truss System in Regular RC Frame Structure at the Different Positions of the Storey
Authors: Mohd Raish Ansari, Tauheed Alam Khan
Abstract:
This research paper is a comparative study of a belt truss in a regular RC frame structure at different floor positions. The method used in this research is the response spectrum method, with the help of the ETABS software; there are six models in this paper with a belt truss. The Indian standard codes used in this work are IS 456:2000, IS 800:2007, IS 875 Part 1, and IS 1893 Part 1:2016. The cross-section of the belt truss is an I-section made of mild steel. The basic model in this research paper is the same; only the position of the belt truss changes, and the dimensions of the belt truss remain constant for all models. The plan area of all models is 24.5 meters x 28 meters, and each model has G+20 storeys, where the height of the ground floor is 3.5 meters and the height of all other floors is a constant 3.0 meters. This comparative research work selected some important seismic parameters to check the stability of all models; the parameters are base shear, fundamental period, storey overturning moment, and maximum storey displacement.
Keywords: belt truss, RC frames structure, ETABS, response spectrum analysis, special moment resisting frame
Procedia PDF Downloads 93
2084 Feature Based Unsupervised Intrusion Detection
Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein
Abstract:
The goal of a network-based intrusion detection system is to classify the activities of network traffic into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps in improving the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, which is a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification has been implemented (normal, attack). The Weka framework, a Java-based open-source software consisting of a collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time in comparison with using the full features of the dataset with the same algorithm.
Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka
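A hedged Python sketch of the information-gain-plus-K-means idea is given below; synthetic data stands in for NSL-KDD, and the number of retained features is an arbitrary illustrative choice.

```python
# Sketch: information gain (mutual information with the class) ranks the
# features, and K-means with two clusters separates normal from attack traffic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.cluster import KMeans

X, y = make_classification(n_samples=1000, n_features=41, n_informative=8, random_state=1)

ig = mutual_info_classif(X, y, random_state=1)      # information-gain style scores
top = np.argsort(-ig)[:10]                          # keep the 10 most informative features
X_red = X[:, top]

km = KMeans(n_clusters=2, n_init=10, random_state=1).fit(X_red)

# map each cluster to the majority class it captures (normal vs. attack)
for c in (0, 1):
    members = y[km.labels_ == c]
    print(f"cluster {c}: majority class {np.bincount(members).argmax()}, size {len(members)}")
```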
Procedia PDF Downloads 296
2083 Detection and Classification of Mammogram Images Using Principle Component Analysis and Lazy Classifiers
Authors: Rajkumar Kolangarakandy
Abstract:
Feature extraction and selection is the primary part of any mammogram classification algorithm. The choice of features, attributes, or measurements has an important influence on any classification system. Discrete Wavelet Transform (DWT) coefficients are among the prominent features for representing images in the frequency domain. The features obtained after the decomposition of mammogram images using wavelet transformations have a high dimension. Even though the features are high-dimensional, they are highly correlated and redundant in nature. Dimensionality reduction techniques play an important role in selecting the optimum number of features from such high-dimensional, highly correlated data. PCA is a mathematical tool that reduces the dimensionality of the data while retaining most of the variation in the dataset. In this paper, a multilevel classification of mammogram images using reduced discrete wavelet transform coefficients and lazy classifiers is proposed. The classification is accomplished at two different levels. At the first level, mammogram ROIs extracted from the dataset are classified as normal or abnormal. At the second level, all the abnormal mammogram ROIs are further classified as benign or malignant. A further classification is also accomplished based on the variation in the structure and intensity distribution of the images in the dataset. The lazy classifiers Kstar, IBL, and LWL are used for classification. The classification results obtained with the reduced feature set are highly promising, and the results are also compared with the performance obtained without dimensionality reduction.
Keywords: PCA, wavelet transformation, lazy classifiers, Kstar, IBL, LWL
Procedia PDF Downloads 335
2082 Fragility Assessment for Vertically Irregular Buildings with Soft Storey
Authors: N. Akhavan, Sh. Tavousi Tafreshi, A. Ghasemi
Abstract:
The seismic behavior of irregular structures over the past decades indicates that such buildings do not have appropriate performance. In this context, the current paper investigates the behavior of special steel moment frames with different vertical configurations of a soft storey. The analysis procedure is based on incremental dynamic analysis (IDA), and the numerical work was carried out with the OpenSees finite element analysis package. To this end, nine 2D steel frames with different numbers of storeys and irregularity positions, modeled with the Ibarra-Krawinkler deterioration model, were subjected to seven pairs of ground motion records applied orthogonally. This paper aims at evaluating the response of two-dimensional buildings incorporating a soft storey and subjected to bi-directional seismic excitation. The IDAs were implemented for different levels of PGA with the various ground motion records in order to determine the maximum inter-storey drift ratio. Based on the statistics and dispersion (standard deviation) of the results, the vulnerability, or probability of exceedance, of the above-mentioned cases has been examined. For this purpose, fragility curves for different placements of the soft storey on the first, middle, and last floors of 4-, 8-, and 16-storey buildings have been generated and compared.
Keywords: special steel moment frame, soft storey, incremental dynamic analysis, fragility curve
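A hedged Python sketch of how a fragility curve can be fitted from IDA output is given below; the capacity values are invented, and the lognormal fit is a common convention rather than the paper's stated procedure.

```python
# Sketch: for each record, the PGA at which the maximum inter-storey drift first
# exceeds a limit is taken as the capacity, and a lognormal CDF is fitted to
# those capacities to obtain the fragility curve.
import numpy as np
from scipy.stats import norm

# placeholder IDA capacities: PGA (g) at which each record exceeds the drift limit
pga_capacity = np.array([0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44])

ln_c = np.log(pga_capacity)
theta = np.exp(ln_c.mean())        # median capacity
beta = ln_c.std(ddof=1)            # lognormal standard deviation (record-to-record)

pga = np.linspace(0.05, 1.2, 50)
p_exceed = norm.cdf(np.log(pga / theta) / beta)   # probability of exceeding the limit state

for g, p in zip(pga[::10], p_exceed[::10]):
    print(f"PGA = {g:.2f} g -> P(exceedance) = {p:.2f}")
```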
Procedia PDF Downloads 349