Search results for: pattern classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4457

4037 Easily Memorable Strong Password Generation and Retrieval

Authors: Shatadru Das, Natarajan Vijayarangan

Abstract:

In this paper, a system and method for generating and recovering an authorization code has been designed and analyzed. The system creates an authorization code by accepting a base sentence from a user. Based on the characters present in this base sentence, the system computes a base-sentence matrix. The system also generates a plurality of patterns; the user can either select a pattern from those suggested by the system or create his/her own. The system then performs multiplications between the base-sentence matrix and the selected pattern matrix at successive stages to obtain a strong authorization code. In case the user forgets the base sentence, the system has a provision to manage and retrieve the 'forgotten authorization code'. This is done by fragmenting the base sentence into different matrices and storing the fragmented matrices in a repository after computing matrix multiplications with a security question-and-answer approach and with a secret key provided by the user.
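
The core idea, building a matrix from the base sentence and multiplying it by a pattern matrix to derive the code, can be sketched as follows. This is an illustrative scheme only: the character encoding, padding, pattern values, and output alphabet below are assumptions, not the authors' exact algorithm.

```python
# Hypothetical sketch: sentence -> matrix, multiply by a pattern matrix,
# map the product back to printable password characters.
import string

def sentence_matrix(sentence, cols=4):
    codes = [ord(c) for c in sentence if not c.isspace()]
    while len(codes) % cols:          # pad to a full rectangle
        codes.append(0)
    return [codes[i:i + cols] for i in range(0, len(codes), cols)]

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def to_password(product):
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(alphabet[v % len(alphabet)]
                   for row in product for v in row)

base = sentence_matrix("my dog ate the homework")        # memorable base sentence
pattern = [[3, 1, 4, 1], [5, 9, 2, 6], [5, 3, 5, 8], [9, 7, 9, 3]]  # user pattern
password = to_password(matmul(base, pattern))
```

The same base sentence and pattern always regenerate the same code, which is what makes retrieval possible without storing the password itself.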

Keywords: easy authentication, key retrieval, memorable passwords, strong password generation

Procedia PDF Downloads 374
4036 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases

Authors: Hao-Hsiang Ku, Ching-Ho Chi

Abstract:

Big data applications have become imperative in many fields, and many researchers have been devoted to increasing correct rates and reducing time complexities. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases to build mass behavior models. Mass behavior models are built with MapReduce techniques and the Hadoop Distributed File System on a Hadoop service platform. The inference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.

Keywords: Hadoop, NoSQL, ontology, backpropagation neural network, Hadoop distributed file system

Procedia PDF Downloads 242
4035 Identification of Breast Anomalies Based on Deep Convolutional Neural Networks and K-Nearest Neighbors

Authors: Ayyaz Hussain, Tariq Sadad

Abstract:

Breast cancer (BC) is one of the most widespread ailments among females globally. Early prognosis of BC can decrease the mortality rate, and accurate identification of benign tumors can avoid unnecessary biopsies and further treatment of patients under investigation. However, due to variations in images, it is a tough job to isolate cancerous cases from normal and benign ones. Machine learning techniques are widely employed in the classification of BC patterns and prognosis. In this research, a deep convolutional neural network (DCNN) with the AlexNet architecture is employed to extract more discriminative features from breast tissues. To achieve higher accuracy, a K-nearest neighbor (KNN) classifier is employed as a substitute for the softmax layer in deep learning. The proposed model is tested on a widely used breast image database, the MIAS dataset, and achieved 99% accuracy.
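
The KNN substitution step can be sketched in a few lines: the deep features that would come from AlexNet's penultimate layer are classified by majority vote among the nearest training vectors. The 3-D feature vectors and labels below are made up for illustration; real DCNN features would have thousands of dimensions.

```python
# Minimal k-nearest-neighbour classifier standing in for the softmax layer.
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    # rank training vectors by Euclidean distance to the query feature vector
    ranked = sorted(range(len(train)),
                    key=lambda i: math.dist(train[i], query))
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

# hypothetical feature vectors for "benign" / "malignant" tissue patches
features = [[0.1, 0.2, 0.1], [0.2, 0.1, 0.0], [0.9, 0.8, 0.7], [0.8, 0.9, 0.9]]
labels   = ["benign", "benign", "malignant", "malignant"]

print(knn_predict(features, labels, [0.85, 0.8, 0.8]))  # → malignant
```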

Keywords: breast cancer, DCNN, KNN, mammography

Procedia PDF Downloads 116
4034 Land Use Change Detection Using Satellite Images for Najran City, Kingdom of Saudi Arabia (KSA)

Authors: Ismail Elkhrachy

Abstract:

Determination of land use change is an important component of regional planning, for applications ranging from urban fringe change detection to monitoring land use change, and these data are very useful for natural resource management. On the other hand, the technologies and methods of change detection have evolved dramatically during the past 20 years, and it is well recognized that change detection with multi-temporal remotely sensed data has become the best method for researching dynamic change of land use. The objective of this paper is to assess, evaluate, and monitor land use change surrounding the area of Najran city, Kingdom of Saudi Arabia (KSA), using Landsat images (June 23, 2009) and an ETM+ image (June 21, 2014). The post-classification change detection technique was applied: two subset images of Najran city are compared on a pixel-by-pixel basis using the post-classification comparison method, the from-to change matrix is produced, and the land use change information is obtained. Three classes were obtained, urban, bare land, and agricultural land, from the unsupervised classification method using Erdas Imagine and ArcGIS software. Accuracy assessment of the classification was performed before calculating change detection for the study area; the obtained accuracy is between 61% and 87% for all the classes. Change detection analysis shows that the urban area increased rapidly by 73.2%, the agricultural area decreased by 10.5%, and the barren area was reduced by 7% between 2009 and 2014. The quantitative study indicated that the urban class had 58.2 km² unchanged, gained 70.3 km², and lost 16 km². The bare land class had 586.4 km² unchanged, gained 53.2 km², and lost 101.5 km². The agricultural class had 20.2 km² unchanged, gained 31.2 km², and lost 37.2 km².
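
The post-classification comparison described above reduces to a pixel-by-pixel tally. A minimal sketch with toy class labels (flattened lists standing in for the actual classified Landsat rasters):

```python
# Build a "from-to" change matrix by comparing two classified images.
from collections import Counter

CLASSES = ["urban", "bare", "agriculture"]

def change_matrix(img_t1, img_t2):
    pairs = Counter(zip(img_t1, img_t2))          # (from, to) pixel counts
    return {f: {t: pairs[(f, t)] for t in CLASSES} for f in CLASSES}

# toy 2009 and 2014 classifications (flattened pixel lists)
t1 = ["bare", "bare", "agriculture", "urban", "bare", "agriculture"]
t2 = ["urban", "bare", "agriculture", "urban", "urban", "bare"]

m = change_matrix(t1, t2)
print(m["bare"]["urban"])   # pixels converted from bare land to urban → 2
```

Multiplying each count by the pixel area gives the unchanged/gained/lost figures in km² reported in the abstract.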

Keywords: land use, remote sensing, change detection, satellite images, image classification

Procedia PDF Downloads 504
4033 Instructional Information Resources

Authors: Parveen Kumar

Abstract:

This article discusses institutional information resources. Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message; information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system. Conceptually, information is the message being conveyed. This concept has numerous other meanings in different contexts. Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, representation, and especially entropy.

Keywords: institutions, information institutions, information services for mission-oriented institute, pattern

Procedia PDF Downloads 349
4032 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for the provision of engineering geological data in a predefined format. This is particularly reflected in highway and railroad tunnel projects, in which there are a number of tunnels and different professional teams involved. In this regard, comprehensive software needs to be designed using accepted methods in order to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. Regarding this necessity, applied software has been designed using macro capabilities and the Visual Basic for Applications (VBA) programming language in Microsoft Excel. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, penetration rate, and so forth, can be calculated and reported in a standard format.

Keywords: engineering geology, rock mass classification, rock mechanics, tunnel

Procedia PDF Downloads 56
4031 Effect of Segregation Pattern of Mn, Si, and C on through Thickness Microstructure and Properties of Hot Rolled Steel

Authors: Waleed M. Al-Othman, Hamid Bayati, Abdullah Al-Shahrani, Haitham Al-Jabr

Abstract:

Pearlite bands commonly form parallel to the surface of hot rolled steel and have a significant influence on its properties. This study investigated the correlation between the segregation pattern of Mn, Si, and C and the formation of pearlite bands in a hot rolled Gr 60 steel plate. Microstructural study indicated the formation of a distinct thick band at the centerline of the plate, with a number of parallel bands through the thickness of the steel plate. The thickness, frequency, and continuity of the bands are reduced from mid-thickness toward the external surface of the plate. Analysis showed a noticeable increase of C, Si, and Mn levels within the bands. Such alloying segregation takes place during metal solidification. EDS analysis verified the presence of particles rich in Ti, Nb, Mn, C, and N within the bands. Texture analysis by electron backscatter diffraction (EBSD) indicated that grain size/misorientation can change noticeably within the bands. The effect of banding on the through-thickness properties of the steel was examined by carrying out microhardness, toughness, and tensile tests. The results suggest that the Mn and C contents change in a sinusoidal pattern through the thickness of the hot rolled plate and that pearlite bands form at the peaks of this sinusoidal segregation pattern. Changes in grain size/misorientation and the formation of highly alloyed particles and pearlite within these bands facilitate crack formation along the boundaries of these bands.

Keywords: pearlite band, alloying segregation, hot rolling, Ti, Nb, N, C

Procedia PDF Downloads 116
4030 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
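
A fast-and-frugal heuristic is an ordered list of one-reason cues, each of which may stop the search and emit a decision. The sketch below illustrates the idea for edge-versus-cloud placement; the cue names, order, and thresholds are illustrative assumptions, not the paper's actual framework.

```python
# One-reason decision list: check cues in order, stop at the first that fires.
def place_dataset(d):
    if d["latency_critical"]:        # cue 1: real-time control must stay local
        return "edge"
    if d["size_mb"] > 100:           # cue 2: large payloads are costly to transmit
        return "edge"
    if d["needs_history"]:           # cue 3: fleet-wide analytics need the cloud
        return "cloud"
    return "cloud"                   # default: archive remotely

sample = {"latency_critical": False, "size_mb": 250, "needs_history": True}
print(place_dataset(sample))  # → edge (search stopped at the size cue)
```

The appeal of this structure is that it is cheap to evaluate per dataset and each placement decision is trivially explainable by the single cue that fired.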

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 156
4029 A Statistical Approach to Predict and Classify the Commercial Hatchability of Chickens Using Extrinsic Parameters of Breeders and Eggs

Authors: M. S. Wickramarachchi, L. S. Nawarathna, C. M. B. Dematawewa

Abstract:

Hatchery performance is critical for the profitability of poultry breeder operations, and some extrinsic parameters of eggs and breeders increase or decrease hatchability. This study aims to identify the extrinsic parameters affecting the commercial hatchability of local chickens' eggs and to determine the most efficient classification model with a hatchability rate greater than 90%. Seven extrinsic parameters were considered: egg weight, moisture loss, breeders' age, number of fertilised eggs, shell width, shell length, and shell thickness. Multiple linear regression was performed to determine the most influential variables on hatchability. First, the correlation between each parameter and hatchability was checked; then a multiple regression model was developed, and the accuracy of the fitted model was evaluated. Linear Discriminant Analysis (LDA), Classification and Regression Trees (CART), k-Nearest Neighbors (kNN), Support Vector Machines (SVM) with a linear kernel, and Random Forest (RF) algorithms were applied to classify hatchability. This grouping process was conducted using binary classification techniques. Hatchability was negatively correlated with egg weight, breeders' age, shell width, and shell length, and positively correlated with moisture loss, number of fertilised eggs, and shell thickness. The multiple linear regression model was more accurate than single linear models, with the highest coefficient of determination (R²) of 94% and minimum AIC and BIC values. According to the classification results, RF, CART, and kNN achieved the highest accuracy values of 0.99, 0.975, and 0.972, respectively, for the commercial hatchery process. Therefore, RF is the most appropriate machine learning algorithm for classifying breeder outcomes as economically profitable or not in a commercial hatchery.
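
The first analysis step above, checking the correlation between each parameter and hatchability, amounts to computing a Pearson coefficient per parameter. A sketch with made-up numbers (not the study's data) for the egg weight parameter, which the study reports as negatively correlated:

```python
# Pearson correlation between one extrinsic parameter and hatchability.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

egg_weight   = [52.0, 55.1, 57.3, 60.2, 63.8]   # g (illustrative values)
hatchability = [94.0, 93.1, 91.5, 90.2, 88.7]   # %
r = pearson(egg_weight, hatchability)            # strongly negative, as reported
```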

Keywords: classification models, egg weight, fertilised eggs, multiple linear regression

Procedia PDF Downloads 72
4028 Kannada Handwritten Character Recognition by Edge Hinge and Edge Distribution Techniques Using Manhattan and Minimum Distance Classifiers

Authors: C. V. Aravinda, H. N. Prakash

Abstract:

In this paper, we convey the fusion and state of the art pertaining to South Indian language (SIL) character recognition systems. In the first step, the text is preprocessed and normalized in order to perform text identification correctly. The second step involves extracting relevant and informative features, and the third step implements the classification decision. The three stages involved are data acquisition and preprocessing, feature extraction, and classification. Here we concentrated on two techniques to obtain features: feature extraction and feature selection. The edge-hinge distribution is a feature that characterizes the changes in direction of a script stroke in handwritten text. It is extracted by means of a window that is slid over an edge-detected binary handwriting image. Whenever the mid pixel of the window is on, the two edge fragments (i.e., connected sequences of pixels) emerging from this mid pixel are measured: their directions are determined and stored as pairs, and a joint probability distribution is obtained from a large sample of such pairs. Despite continuous effort, handwriting identification remains a challenging issue, largely because different approaches use different varieties of features. Therefore, our study focuses on handwriting recognition based on feature selection to simplify the feature extraction task, optimize classification system complexity, reduce running time, and improve classification accuracy.
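
The edge-hinge extraction can be sketched concretely. The simplified version below uses a 3x3 window: for every "on" pixel it records the pair of 8-neighbourhood directions in which edge fragments leave that pixel, then normalises the pair counts into a joint probability distribution (the real feature measures fragment directions over longer edge traces; this is a minimal illustration).

```python
# Simplified edge-hinge distribution over a binary edge image.
from collections import Counter

# 8-neighbourhood direction offsets, indexed 0..7
DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def edge_hinge(img):
    h, w = len(img), len(img[0])
    pairs = Counter()
    for r in range(h):
        for c in range(w):
            if not img[r][c]:
                continue
            # directions of on-neighbours emerging from this mid pixel
            hits = [d for d, (dr, dc) in enumerate(DIRS)
                    if 0 <= r + dr < h and 0 <= c + dc < w and img[r + dr][c + dc]]
            for i in range(len(hits)):
                for j in range(i + 1, len(hits)):
                    pairs[(hits[i], hits[j])] += 1
    total = sum(pairs.values()) or 1
    return {p: n / total for p, n in pairs.items()}   # joint probability

stroke = [[0, 0, 1],
          [0, 1, 0],
          [1, 0, 0]]          # a short diagonal stroke
dist = edge_hinge(stroke)
```

For a straight diagonal stroke the distribution collapses onto a single direction pair; curved strokes spread mass across many pairs, which is what makes the feature discriminative for writing style.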

Keywords: word segmentation and recognition, character recognition, optical character recognition, handwritten character recognition, South Indian languages

Procedia PDF Downloads 475
4027 An Optimal Perspective on Research in Translation Studies

Authors: Andrea Musumeci

Abstract:

The general theory of translation has suffered from the lack of a homogeneous academic dialect, a holistic methodology to account for the diversity of factors involved in the discipline. An underlying pattern among theories of translation belonging to different periods and schools has been identified. Such a pattern, which is linguistics-oriented, could play a role in unifying academic and professional environments, both in terms of research and as a professional category. The implementation of such an approach has also led to a critique of the concept of equivalence as not the best way to account for translating phenomena.

Keywords: optimal, translating, research translation theory, methodology, descriptive analysis

Procedia PDF Downloads 600
4026 Music Genre Classification Based on Non-Negative Matrix Factorization Features

Authors: Soyon Kim, Edward Kim

Abstract:

In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is being provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on the timbre features such as mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) the statistical features such as mean, variance, minimum, and maximum of the timbre features and (2) the modulation spectrum features such as spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. Not only these conventional basic long-term feature vectors, but also NMF based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, an entire full band spectrum was used. However, for NMF-BFV, only low band spectrum was used since high frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. 
In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as a classifier. The GTZAN multi-genre music database was used for training and testing. It is composed of 10 genres and 100 songs for each genre. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as statistical features and modulation spectrum features, the NMF features provided the increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required dimensionality of 460, but NMF-LSM and NMF-BFV required dimensionalities of 10 and 10, respectively. Combining the basic features, NMF-LSM and NMF-BFV together with the SVM with a radial basis function (RBF) kernel produced the significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
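
The NMF step at the heart of the system factors a non-negative matrix V into basis vectors W and weighting values H, with the weights serving as the feature vector once the basis is fixed. A generic multiplicative-update sketch in plain Python (toy dimensions; the paper's NMF-LSM/NMF-BFV pipeline applies this to spectral magnitudes and basic feature vectors per genre):

```python
# Factor V ≈ W·H with non-negative multiplicative updates (Lee-Seung style).
import random

def matmul(a, b):
    return [[sum(x * y for x, y in zip(r, c)) for c in zip(*b)] for r in a]

def transpose(a):
    return [list(r) for r in zip(*a)]

def nmf(v, rank, iters=500, eps=1e-9):
    random.seed(0)
    n, m = len(v), len(v[0])
    w = [[random.random() for _ in range(rank)] for _ in range(n)]
    h = [[random.random() for _ in range(m)] for _ in range(rank)]
    for _ in range(iters):
        wh = matmul(w, h)
        # H <- H * (WᵀV) / (WᵀWH)
        num, den = matmul(transpose(w), v), matmul(transpose(w), wh)
        h = [[h[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(rank)]
        wh = matmul(w, h)
        # W <- W * (VHᵀ) / (WHHᵀ)
        num, den = matmul(v, transpose(h)), matmul(wh, transpose(h))
        w = [[w[i][j] * num[i][j] / (den[i][j] + eps) for j in range(rank)]
             for i in range(n)]
    return w, h

v = [[1.0, 0.0, 2.0], [0.0, 1.0, 1.0], [2.0, 0.0, 4.0]]   # toy non-negative data
w, h = nmf(v, rank=2)
approx = matmul(w, h)    # should closely reconstruct v
```

In the training stage W is learned per genre; at test time W is held fixed and the fitted weights H for a new song become its NMF feature vector.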

Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)

Procedia PDF Downloads 274
4025 Timbuktu Pattern of Islamic Education: A Role Model for the Establishment of Islamic Educational System in Sokoto Caliphate

Authors: A. M. Gada, H. U. Malami

Abstract:

Timbuktu is one of the eight regions of the present-day Republic of Mali. It flourished as one of the earliest centres of Islamic learning in West Africa in the eleventh century CE. The famous Islamic centre in Timbuktu is situated in the Sankore mosque, known as one of the earliest established Islamic universities. This centre produced scholars who were zealous in disseminating Islamic education to different parts of West Africa and beyond; as a result, most of these centres adopted the Timbuktu pattern of learning. Among the beneficiaries of this noble activity are the Muslim scholars responsible for the establishment of the Sokoto Caliphate in the early nineteenth century. This paper reflects on the pattern of Islamic education of the Timbuktu scholars and considers how it influenced the Islamic centres of learning established by these jihad scholars, who succeeded in establishing an Islamic state known as the Sokoto Caliphate.

Keywords: Timbuktu, Sankore, Islamic educational system, Sokoto Caliphate, centres of Islamic learning

Procedia PDF Downloads 389
4024 Experimental Performance of Vertical Diffusion Stills Utilizing Folded Sheets for Water Desalination

Authors: M. Mortada, A. Seleem, M. El-Morsi, M. Younan

Abstract:

The present study introduces folding technology, utilized for the first time in vertical diffusion stills, and models the distillation process utilizing a chevron pattern of folded structure. An experimental setup has been constructed to investigate the performance of the folded sheets in the vertical diffusion still over a specific range of operating conditions, and an experimental comparison between the folded-type and flat-type sheets has been carried out. The folded pattern showed higher performance, with an increase in the condensate-to-feed ratio of 20-30% over hot plate temperatures ranging from 60-90°C. In addition, a parametric analysis of the system using the Design of Experiments statistical technique has been developed from the experimental results to determine the effect of operating conditions on the system's performance, and the best operating conditions of the system have been evaluated.

Keywords: chevron pattern, fold structure, solar distillation, vertical diffusion still

Procedia PDF Downloads 442
4023 Dietary Pattern and Risk of Breast Cancer among Women: A Case-Control Study

Authors: Huma Naqeeb

Abstract:

Epidemiological studies have shown a robust link between breast cancer and dietary pattern, but no previous study conducted in Pakistan has specifically focused on dietary patterns among women with breast cancer. This study aims to examine the association of breast cancer with dietary patterns among Pakistani women. This case-control research was carried out in multiple tertiary care facilities. Newly diagnosed primary breast cancer patients were recruited as cases (n = 408); age-matched controls (n = 408) were randomly selected from the general population. Data on the required parameters were systematically collected using subjective and objective tools. Factor and principal component analysis (PCA) techniques were used to extract women's dietary patterns. Four dietary patterns were identified based on an eigenvalue >1: (i) veg-ovo-fish, (ii) meat-fat-sweet, (iii) mixed (milk and its products, and gourd vegetables), and (iv) lentils-spices. Results of the multiple regressions are displayed as adjusted odds ratios (Adj. OR) with their respective 95% confidence intervals (95% CI). After adjusting for potential confounders, the veg-ovo-fish dietary pattern was found to be robustly associated with a lower risk of breast cancer among women (Adj. OR: 0.68, 95% CI: 0.46-0.99, p<0.01). The study findings conclude that adherence to diets composed mainly of fresh vegetables and high-quality protein sources may contribute to lowering the risk of breast cancer among women.
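
To make the Adj. OR reading concrete, here is how an (unadjusted) odds ratio and its 95% CI fall out of a 2x2 case-control table; the counts are invented for illustration, and the study's figure is additionally adjusted for confounders via multiple regression.

```python
# Odds ratio and Wald 95% CI from a 2x2 exposure table.
import math

def odds_ratio(exp_cases, exp_controls, unexp_cases, unexp_controls):
    or_ = (exp_cases * unexp_controls) / (exp_controls * unexp_cases)
    # standard error of log(OR)
    se = math.sqrt(1 / exp_cases + 1 / exp_controls +
                   1 / unexp_cases + 1 / unexp_controls)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: high adherence to "veg-ovo-fish" among cases vs controls
or_, ci = odds_ratio(120, 160, 288, 248)
# or_ < 1 with CI below 1 would indicate the pattern is protective
```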

Keywords: breast cancer, dietary pattern, women, principal component analysis

Procedia PDF Downloads 104
4022 A Cadaveric Study of Branching Pattern of Arch of Aorta and Its Clinical Significance in Nepalese Population

Authors: Gulam Anwer Khan, A. Gautam

Abstract:

Background: The arch of aorta is a large artery that arches over the root of the left lung and connects the ascending aorta and descending aorta. It is situated in the superior mediastinum behind the manubrium sterni. It gives off three major branches, i.e., the brachiocephalic trunk, left common carotid artery, and left subclavian artery, arising from the superior surface of the arch of aorta from right to left. Material and Methods: This was a descriptive study. It was carried out on 44 cadavers obtained during dissections for undergraduates of the Department of Anatomy, Chitwan Medical College, Bharatpur, Chitwan, between March 2015 and October 2016. Cadavers of both sexes were included in the present study. The arch of aorta was dissected and exposed according to the methods described by Romanes in Cunningham's Manual of Practical Anatomy. Results: Of the 44 dissected cadavers, 35 (79.54%) were male and 9 (20.46%) were female. The normal branching pattern of the arch of aorta was encountered in 28 (63.64%) cadavers, and the remaining 16 (36.36%) cadavers showed variations in the branching pattern. Two different types of variation in the branching pattern of the arch of aorta were noted in the present study, in which 12 (27.27%) cadavers had a common trunk of the arch of aorta. In 3 (5.00%) male cadavers, we found the origin of the thyroid ima artery; this variation was noted in 1 (1.66%) female cadaver. Conclusion: The present study carried out on adult human cadavers revealed wide variations in the branching pattern of the arch of aorta. These variations are of clinical significance and are also very useful for anatomists, radiologists, anesthesiologists, and surgeons during angiography, instrumentation, supra-aortic thoracic surgery, and head and neck surgery.

Keywords: arch of aorta, brachiocephalic trunk, left common carotid artery, left subclavian artery, Thyroidea ima artery

Procedia PDF Downloads 312
4021 Decision Making System for Clinical Datasets

Authors: P. Bharathiraja

Abstract:

Computer-aided decision making systems are used to enhance the diagnosis and prognosis of diseases and to assist clinicians and junior doctors in clinical decision making. Medical data used for decision making should be definite and consistent. Data mining and soft computing techniques are used for cleaning the data and for incorporating human reasoning in decision making systems. The fuzzy rule-based inference technique can be used for classification in order to incorporate human reasoning in the decision making process. In this work, missing values are imputed using the mean or mode of the attribute. The data are normalized using min-max normalization to improve the design and efficiency of the fuzzy inference system. The fuzzy inference system is used to handle the uncertainties that exist in the medical data. Equal-width partitioning is used to partition the attribute values into appropriate fuzzy intervals. Fuzzy rules are generated using a class-based associative rule mining algorithm. The system is trained and tested using the heart disease dataset from the University of California at Irvine (UCI) Machine Learning Repository. The data were split into training and testing data using a hold-out approach. From the experimental results, it can be inferred that classification using the fuzzy inference system performs better than trivial IF-THEN rule-based classification approaches. Furthermore, it is observed that the use of fuzzy logic and the fuzzy inference mechanism handles uncertainty and resembles human decision making. The system can be used in the absence of a clinical expert to assist junior doctors and clinicians in clinical decision making.
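
Two of the preprocessing steps above can be sketched directly: min-max normalization of a raw attribute, followed by equal-width partitioning of the normalized range into overlapping triangular fuzzy sets. The attribute values and the choice of three sets ("low"/"medium"/"high") are illustrative assumptions, not the paper's exact configuration.

```python
# Min-max normalization plus equal-width triangular fuzzy partitioning.
def min_max(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def triangular(x, a, b, c):
    # membership of x in a triangle with feet a, c and peak b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def equal_width_sets(n):
    # n overlapping triangles covering [0, 1]
    step = 1 / (n - 1)
    return [(-step + i * step, i * step, step + i * step) for i in range(n)]

chol = [126, 187, 243, 199, 564]            # raw attribute, e.g. cholesterol
norm = min_max(chol)                        # mapped onto [0, 1]
sets = equal_width_sets(3)                  # "low", "medium", "high"
memberships = [triangular(norm[1], *s) for s in sets]
```

With this partitioning the memberships of any value sum to 1, so each crisp attribute value is smoothly shared between adjacent fuzzy intervals before the rules fire.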

Keywords: decision making, data mining, normalization, fuzzy rule, classification

Procedia PDF Downloads 495
4020 Numerical Simulation of Magnetohydrodynamic (MHD) Blood Flow in a Stenosed Artery

Authors: Sreeparna Majee, G. C. Shit

Abstract:

Unsteady blood flow has been numerically investigated through stenosed arteries to gain insight into the physiological blood flow pattern in diseased arteries. The blood is treated as a Newtonian fluid, and the arterial wall is considered rigid with plaque deposited in its lumen. For direct numerical simulation, the vorticity-stream function formulation has been adopted to solve the problem using an implicit finite difference method based on the well-known Peaceman-Rachford Alternating Direction Implicit (ADI) scheme. The effects of the magnetic parameter and Reynolds number on velocity and wall shear stress are studied and presented quantitatively over the entire arterial segment. The streamlines have been plotted to understand the flow pattern in the stenosed artery, which shows significant alterations downstream of the stenosis in the presence of a magnetic field. The results show only nominal changes in the flow pattern when the magnetic field strength is increased up to 8 T, a finding of notable relevance to MRI machines.

Keywords: magnetohydrodynamics, blood flow, stenosis, energy dissipation

Procedia PDF Downloads 257
4019 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification

Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang

Abstract:

This paper focuses on the classification of breast ultrasound images and investigates the reliability measurement of classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned and doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time enhancement. The effectiveness of this reliability evaluation framework has been verified on the breast ultrasound clinical dataset YBUS, and its robustness has been verified on the public dataset BUSI. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
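
The predictive-reliability idea can be illustrated with a common uncertainty-quantification recipe: run the classifier on several augmented copies of one image and score the entropy of the mean prediction. Note this is a generic sketch under the assumption that "test-time enhancement" works along these lines; the softmax outputs below are toy numbers, not model outputs.

```python
# Predictive entropy over test-time-augmented predictions as an uncertainty score.
import math

def predictive_entropy(probs_per_augmentation):
    n, k = len(probs_per_augmentation), len(probs_per_augmentation[0])
    mean = [sum(p[j] for p in probs_per_augmentation) / n for j in range(k)]
    return -sum(p * math.log(p) for p in mean if p > 0)

consistent = [[0.95, 0.05], [0.93, 0.07], [0.96, 0.04]]   # augmentations agree
unstable   = [[0.90, 0.10], [0.30, 0.70], [0.55, 0.45]]   # augmentations disagree

# higher entropy → lower predictive reliability for that image
assert predictive_entropy(unstable) > predictive_entropy(consistent)
```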

Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI

Procedia PDF Downloads 68
4018 OAS and Interstate Dispute Resolution at the Beginning of the 21st Century: General Pattern and Peculiarities

Authors: Victor Jeifets, Liliia Khadorich

Abstract:

The paper describes the OAS role in dispute resolution. The authors attempt to identify a general pattern in the OAS activities within the peaceful settlement of interstate conflicts at the beginning of the 21st century, as well as to analyze some features of the Honduras-Belize, Nicaragua-Honduras, Honduras-El Salvador, Costa Rica-Nicaragua, and Colombia-Ecuador cases.

Keywords: OAS, peace maintenance, border dispute, dispute resolution, peaceful settlement

Procedia PDF Downloads 473
4017 A Multi-Output Network with U-Net Enhanced Class Activation Map and Robust Classification Performance for Medical Imaging Analysis

Authors: Jaiden Xuan Schraut, Leon Liu, Yiqiao Yin

Abstract:

Computer vision in medical diagnosis has achieved a high level of success in diagnosing diseases with high accuracy. However, conventional classifiers that produce an image-to-label result provide insufficient information for medical professionals to judge, and raise concerns over the trust and reliability of a model whose results cannot be explained. In order to gain local insight into cancerous regions, separate tasks such as image segmentation need to be implemented to aid doctors in treating patients, which doubles the training time and cost and renders the diagnosis system inefficient and difficult for the public to accept. To tackle this issue and drive AI-first medical solutions further, this paper proposes a multi-output network that follows a U-Net architecture for the image segmentation output and features an additional convolutional neural network (CNN) module for an auxiliary classification output. Class activation maps provide insight into the feature maps that lead to a convolutional neural network's classification; in the case of lung diseases, the region of interest is enhanced by U-Net-assisted Class Activation Map (CAM) visualization. Our proposed model therefore combines an image segmentation model and a classifier to crop a chest X-ray's class activation map to the lung region only, providing a visualization that improves explainability while generating classification results simultaneously, which builds trust in AI-led diagnosis systems. The proposed U-Net model achieves 97.61% accuracy and a dice coefficient of 0.97 on testing data from the COVID-QU-Ex dataset, which includes both diseased and healthy lungs.
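The dice coefficient reported for the segmentation output, and the lung-region CAM-cropping idea, can be sketched as follows; `crop_cam_to_lung` is a hypothetical helper illustrating the masking step, not the authors' code:

```python
import numpy as np

def dice_coefficient(pred_mask, true_mask, eps=1e-7):
    """Dice = 2|A∩B| / (|A| + |B|) for binary segmentation masks."""
    pred = np.asarray(pred_mask, dtype=bool)
    true = np.asarray(true_mask, dtype=bool)
    inter = np.logical_and(pred, true).sum()
    return (2.0 * inter + eps) / (pred.sum() + true.sum() + eps)

def crop_cam_to_lung(cam, lung_mask):
    """Zero out CAM activations outside the predicted lung mask,
    so only the lung region contributes to the visualization."""
    return np.asarray(cam, dtype=float) * np.asarray(lung_mask, dtype=float)
```

Cropping the CAM by the U-Net's own mask ties the explanation to anatomically plausible regions instead of the whole X-ray.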

Keywords: multi-output network model, U-net, class activation map, image classification, medical imaging analysis

Procedia PDF Downloads 174
4016 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm

Authors: Safayat Ali Shaikh

Abstract:

Software has been developed for determining the optimal cropping pattern in an irrigation project, considering a land constraint, a water availability constraint, and a pick-up flow constraint, using a modified simplex algorithm. Artificial neural network (ANN) models have been developed to predict rainfall. An AR(1) model was used to generate 1000 years of rainfall data to train the ANN. Simulation has been done with the expected rainfall data. Eight crops and three soil classes have been considered for the optimization model. The area under each crop and each soil class has been quantified using the modified simplex algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
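The AR(1) rainfall-generation step can be sketched as below; the mean, persistence, and noise parameters are placeholder values for illustration, not the project's calibrated ones:

```python
import numpy as np

def ar1_rainfall(n_years, mu, phi, sigma, seed=0):
    """Generate synthetic annual rainfall from an AR(1) process:
    x_t = mu + phi * (x_{t-1} - mu) + e_t,  e_t ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_years)
    x[0] = mu  # start the chain at the long-run mean
    for t in range(1, n_years):
        x[t] = mu + phi * (x[t - 1] - mu) + rng.normal(0.0, sigma)
    return np.clip(x, 0.0, None)  # rainfall cannot be negative

# 1000 synthetic years, as in the abstract (parameter values assumed)
series = ar1_rainfall(1000, mu=1200.0, phi=0.3, sigma=150.0)
```

The generated series preserves the year-to-year persistence (lag-1 autocorrelation ≈ phi) that a plain i.i.d. sampler would lose, which is why it is a common choice for training rainfall predictors.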

Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern

Procedia PDF Downloads 184
4015 Survival Pattern of Under-five Mortality in High Focus States in India

Authors: Rahul Kumar

Abstract:

Background: The Under-Five Mortality Rate (U5MR) of a nation is a widely accepted and long-standing indicator of the well-being of its children. It measures the probability of dying before the age of five (expressed per 1000 live births). The U5MR is an appropriate indicator of the cumulative exposure to the risk of death during the first five years of life and an accepted global indicator of the health and socioeconomic status of a given population. It is also useful for assessing the impact of various intervention programmes aimed at improving child survival. Under-five mortality trends constitute a leading indicator of the level of child health and overall development in countries. Objectives: The first aim of our research is to study the level, trends, and pattern of under-five mortality using different sources of data. The second objective is to examine the survival pattern of under-five mortality by different background characteristics. Data Source and Methodology: SRS and NFHS data have been used for observing the level and trend of the under-five mortality rate. The Kaplan-Meier estimate has been used to understand the survival pattern of under-five mortality. Result: We find that almost all the states made some progress in reducing U5MR in recent decades. During 1992-93, the highest U5MR (per thousand live births) was observed in Assam (142), followed by UP (141), Odisha (131), MP (130), and Bihar (127.5), while the least U5MR (per thousand live births) was observed in Rajasthan (102). Currently, the highest U5MR (per thousand live births) is observed in UP (78.1), followed by MP (64.9) and Chhattisgarh (63.7), which are far away from the national level (50). Among them, Uttarakhand (46.7) had the least U5MR (per thousand live births), followed by Odisha (48.6). The U5MR (per thousand live births) of the combined high focus states is 63.7, which is far away from the national level (50). We identified that the survival probability of under-five children of adolescent mothers is lower in comparison to children born to mothers of other age groups. During the neonatal period, male mortality usually exceeds female mortality, but this differential is reversed in the post-neonatal period. As age increases and approaches five years, we identified that the survival probability of both sexes decreases, but females' survival probability decreases more than males'. The poorer children's survival probability is the lowest. Children using an improved toilet facility have a higher survival probability throughout the five years than those who use an unimproved one. The survival probability of children under five who received full ANC is higher than that of children under five who did not receive any ANC. Conclusions: Improvement of maternal education is an urgent need to improve mothers' health-seeking behavior and thus the health of their children. Awareness of reproductive health and environmental sanitation should be strengthened.
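The Kaplan-Meier estimate used for the survival pattern can be sketched as follows, assuming right-censored data in the standard form (this is the textbook estimator, not the authors' exact implementation):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve S(t) from follow-up data.
    events[i] = 1 if a death was observed at times[i], 0 if censored.
    Returns a list of (time, survival probability) at each death time."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    curve, surv = [], 1.0
    for t in np.unique(times):
        at_t = times == t
        deaths = events[at_t].sum()
        if deaths:
            surv *= 1.0 - deaths / n_at_risk  # product-limit step
            curve.append((float(t), surv))
        n_at_risk -= at_t.sum()  # remove deaths and censored subjects
    return curve
```

Stratifying the input by background characteristics (mother's age group, wealth quintile, sanitation, ANC status) and comparing the resulting curves is what yields the survival-pattern contrasts the abstract reports.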

Keywords: under-five mortality, survival pattern, ANC, trend

Procedia PDF Downloads 108
4014 Smartphone Video Source Identification Based on Sensor Pattern Noise

Authors: Raquel Ramos López, Anissa El-Khattabi, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

An increasing number of mobile devices with integrated cameras has meant that most digital video now comes from these devices. These digital videos can be made anytime, anywhere, and for different purposes. They can also be shared on the Internet within a short period of time and may sometimes contain recordings of illegal acts. The need to reliably trace their origin becomes evident when these videos are used for forensic purposes. This work proposes an algorithm to identify the brand and model of the mobile device that generated a video. Its procedure is as follows: after obtaining the relevant video information, a classification algorithm based on sensor noise and the wavelet transform performs the aforementioned identification process. We also present experimental results that support the validity of the techniques used and show promising results.
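The sensor-pattern-noise idea can be sketched as below. The paper uses a wavelet-based filter; this dependency-free sketch substitutes a simple box filter for the denoising step, so it illustrates only the residual-averaging and correlation logic, not the authors' pipeline:

```python
import numpy as np

def noise_residual(frame, k=3):
    """Approximate sensor noise: frame minus a locally denoised version.
    (A box filter stands in for the paper's wavelet filter.)"""
    f = np.asarray(frame, dtype=float)
    pad = k // 2
    padded = np.pad(f, pad, mode="edge")
    denoised = np.zeros_like(f)
    for dy in range(k):
        for dx in range(k):
            denoised += padded[dy:dy + f.shape[0], dx:dx + f.shape[1]]
    denoised /= k * k
    return f - denoised

def camera_fingerprint(frames):
    """PRNU-style fingerprint: average residuals over many frames so
    scene content cancels and the sensor pattern noise remains."""
    return np.mean([noise_residual(fr) for fr in frames], axis=0)

def correlate(residual, fingerprint):
    """Normalized correlation used to match a query video to a device."""
    a = residual - residual.mean()
    b = fingerprint - fingerprint.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

A query video's residual is compared against stored fingerprints; the device whose fingerprint correlates most strongly is the predicted source.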

Keywords: digital video, forensics analysis, key frame, mobile device, PRNU, sensor noise, source identification

Procedia PDF Downloads 408
4013 Exploring the Impact of Body Shape on Bra Fit: Integrating 3D Body Scanning and Traditional Patternmaking Methods

Authors: Yin-Ching Keung, Kit-Lun Yick

Abstract:

The issue of bra fitting has persisted throughout history despite advancements in molded bra cups. To gain a deeper understanding of the interaction between the breast and bra pattern, this study combines the art of traditional bra patternmaking with 3D body scanning technology. By employing a 2D bra pattern drafting method and analyzing the effect of body shape on the desired bra cup shape, the study focuses on the differentiation of the lower cup among bras designed for flat and round body-shaped breasts. The results shed light on the impact of body shape on bra fit and provide valuable insights for further research and improvements in bra design, pattern drafting, and fit. The integration of 3D body scanning technology enhances the accuracy and precision of measurements, allowing for a more comprehensive analysis of the unique contours and dimensions of the breast and body. Ultimately, the study aims to provide individuals with different body shapes a more comfortable and well-fitted bra-wearing experience, contributing to the ongoing efforts to alleviate the longstanding problem of bra fitting.

Keywords: breast shapes, bra fitting, 3D body scanning, bra patternmaking

Procedia PDF Downloads 32
4012 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. 
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
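The k-mer representation the abstract describes can be sketched as a frequency vector over all 4^k possible k-mers — a minimal version assuming sequences over the plain A/C/G/T alphabet:

```python
from collections import Counter
from itertools import product

def kmer_features(sequence, k):
    """Represent a DNA sequence as a fixed-length vector of k-mer
    frequencies over the 4^k possible k-mers (A, C, G, T)."""
    counts = Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))
    vocab = ["".join(p) for p in product("ACGT", repeat=k)]
    total = max(sum(counts.values()), 1)  # guard against empty input
    return [counts[w] / total for w in vocab]

features = kmer_features("ACGTACGT", 2)  # 16-dimensional vector for k=2
```

Because the vector length is fixed at 4^k regardless of genome length, whole genomes of different sizes become comparable inputs for a standard classifier; the exponential growth in 4^k is also why larger k (such as the study's k=10) strains computing resources.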

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 142
4011 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the size of k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. 
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 129
4010 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data

Authors: M. Kharrat, G. Moreau, Z. Aboura

Abstract:

The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with a 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the exerted solicitations. The AE technique is a well-established tool for discriminating between damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE features are then extracted from the recorded signals, followed by a data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation approach was set up in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, digital image correlation, acoustic emission (AE), and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for discriminating between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. The latter is used to construct a supervised classifier for automatic recognition of AE signals. Several materials with different ingredients were tested under various solicitations in order to feed and enrich the learning database. 
The methodology presented in this work was useful for refining the damage threshold for the new generation of materials. The damage mechanisms around this threshold were highlighted, and the obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damage without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of solicitation, the identified classes are reproducible and little disturbed. The supervised classifier constructed from the learning database was able to predict the labels of the classified signals.
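The unsupervised clustering step (performed without a priori knowledge) can be sketched with plain k-means on per-hit AE descriptors; the specific features (amplitude, duration, counts, energy) and the cluster count are assumptions for illustration, since the abstract does not name the clustering algorithm:

```python
import numpy as np

def kmeans(features, n_clusters, n_iter=100, seed=0):
    """Plain k-means for unsupervised clustering of AE descriptors
    (e.g. amplitude, duration, counts, energy per hit)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(features, dtype=float)
    # Initialize centers from randomly chosen distinct samples
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # Assign each hit to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centers; keep the old one if a cluster empties
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(n_clusters)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

The resulting clusters are then labeled by cross-referencing the other instruments (video-microscopy, DIC, micro-tomography), producing the learning database for the supervised classifier.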

Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition

Procedia PDF Downloads 139
4009 Analyzing the Changing Pattern of Nigerian Vegetation Zones and Its Ecological and Socio-Economic Implications Using Spot-Vegetation Sensor

Authors: B. L. Gadiga

Abstract:

This study assesses the major ecological zones in Nigeria with a view to understanding the spatial pattern of vegetation zones and the implications for conservation over a period of sixteen (16) years. Satellite images used for this study were acquired from SPOT-VEGETATION between 1998 and 2013. The annual NDVI images selected for this study were derived from the SPOT-4 sensor and were acquired within the same season (November) in order to reduce differences in spectral reflectance due to seasonal variations. The images were sliced into five classes based on the literature and knowledge of the area (i.e. <0.16 non-vegetated areas; 0.16-0.22 Sahel savannah; 0.22-0.40 Sudan savannah; 0.40-0.47 Guinea savannah; and >0.47 forest zone). Classification of the 1998 and 2013 images into forested and non-forested areas showed that the forested area decreased from 511,691 km2 in 1998 to 478,360 km2 in 2013. A differencing change detection method was performed on the 1998 and 2013 NDVI images to identify areas of ecological concern. The result shows that areas undergoing vegetation degradation cover 73,062 km2, while areas witnessing some form of restoration cover 86,315 km2. The result also shows that there is a weak correlation between rainfall and the vegetation zones: the non-vegetated areas have a correlation coefficient (r) of 0.0088, the Sahel savannah belt 0.1988, the Sudan savannah belt -0.3343, the Guinea savannah belt 0.0328, and the forest belt 0.2635. The low correlation can be associated with the encroachment of the Sudan savannah belt into the forest belt of the south-eastern part of the country, as revealed by the image analysis. The degradation of the forest vegetation is therefore responsible for the serious erosion problems witnessed in the south-east. The study recommends constant monitoring of vegetation and strict enforcement of environmental laws in the country.
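The NDVI slicing and differencing steps can be sketched as below, using the five thresholds stated in the abstract; the 0.1 change threshold in `change_map` is a placeholder for illustration, not the study's value:

```python
import numpy as np

# NDVI thresholds from the study: <0.16 non-vegetated, 0.16-0.22 Sahel
# savannah, 0.22-0.40 Sudan savannah, 0.40-0.47 Guinea savannah, >0.47 forest.
THRESHOLDS = [0.16, 0.22, 0.40, 0.47]
CLASSES = ["non-vegetated", "Sahel savannah", "Sudan savannah",
           "Guinea savannah", "forest"]

def classify_ndvi(ndvi):
    """Slice an NDVI raster into the five ecological class indices."""
    return np.digitize(np.asarray(ndvi, dtype=float), THRESHOLDS)

def change_map(ndvi_old, ndvi_new, threshold=0.1):
    """Differencing change detection: differences more negative than the
    threshold flag degradation (-1), more positive flag restoration (+1)."""
    diff = np.asarray(ndvi_new, dtype=float) - np.asarray(ndvi_old, dtype=float)
    return np.where(diff < -threshold, -1, np.where(diff > threshold, 1, 0))
```

Summing the pixel areas in the -1 and +1 categories is what yields area totals of the kind the abstract reports (73,062 km2 degraded vs. 86,315 km2 restored).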

Keywords: vegetation, NDVI, SPOT-vegetation, ecology, degradation

Procedia PDF Downloads 194
4008 Pattern of Cybercrime Among Adolescents: An Exploratory Study

Authors: Mohamamd Shahjahan

Abstract:

Background: Cybercrime is a common phenomenon at present in both developed and developing countries. The young generation, especially adolescents, now use the internet frequently, and they commit cybercrime frequently in Bangladesh. Objective: In this regard, the present study on the pattern of cybercrime among the young people of Bangladesh has been conducted. Methods and tools: This study was cross-sectional and descriptive in nature. A non-probability accidental sampling technique was applied to select the sample because of the non-finite population, and the sample size was 167. A printed semi-structured questionnaire was used to collect data. Results: The study shows that adolescents mainly commit hacking (94.6%), pornography (88.6%), software piracy (85%), cyber theft (82.6%), credit card fraud (81.4%), cyber defamation (75.6%), sweetheart swindling on social networks (65.9%), etc., as cybercrime. According to the findings, the major causes of cybercrime among the respondents in Bangladesh were weak laws (88.0%), defective socialization (81.4%), peer group influence (80.2%), easy accessibility to the internet (74.3%), corruption (62.9%), unemployment (58.7%), and poverty (24.6%). It is evident from the study that 91.0% of respondents used password crackers as a technique of cybercriminality. About 76.6%, 72.5%, 71.9%, 68.3%, and 60.5% of respondents used key loggers, network sniffers, exploits, vulnerability scanners, and port scanners, respectively. Conclusion: The study concluded that the pattern of cybercrime is frequently changing and increasing dramatically. Finally, it recommends that public-private partnership and execution of existing laws can control this crime.

Keywords: cybercrime, adolescents, pattern, internet

Procedia PDF Downloads 47