Search results for: object oriented classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4286

3656 The Optimization of Decision Rules in Multimodal Decision-Level Fusion Scheme

Authors: Andrey V. Timofeev, Dmitry V. Egorov

Abstract:

This paper introduces an original method for parametric optimization of the structure of a multimodal decision-level fusion scheme, which combines the partial classification results obtained from an assembly of mono-modal classifiers. As a result, a multimodal fusion classifier with the minimum total error rate is obtained.
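
The abstract does not give the optimization procedure itself; the following is a minimal sketch, assuming a hypothetical weighted-vote fusion and a brute-force weight search, of how decision-level fusion weights could be tuned to minimize the total error rate.

```python
import numpy as np
from itertools import product

def fused_predictions(probs_per_modality, weights):
    """Weighted decision-level fusion of per-modality class probabilities."""
    stacked = np.stack(probs_per_modality)            # (modalities, samples, classes)
    fused = np.tensordot(weights, stacked, axes=1)    # (samples, classes)
    return fused.argmax(axis=1)

def optimize_weights(probs_per_modality, y_true, grid=np.linspace(0, 1, 11)):
    """Brute-force search for the weight vector with minimum total error rate."""
    best_w, best_err = None, np.inf
    for w in product(grid, repeat=len(probs_per_modality)):
        if sum(w) == 0:
            continue
        w = np.array(w) / sum(w)                      # normalize the weights
        err = np.mean(fused_predictions(probs_per_modality, w) != y_true)
        if err < best_err:
            best_w, best_err = w, err
    return best_w, best_err

# Toy usage: two modalities, three samples, two classes.
p1 = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
p2 = np.array([[0.6, 0.4], [0.7, 0.3], [0.1, 0.9]])
y = np.array([0, 0, 1])
print(optimize_weights([p1, p2], y))
```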

Keywords: classification accuracy, fusion solution, total error rate, multimodal fusion classifier

Procedia PDF Downloads 456
3655 Domain Driven Design vs Soft Domain Driven Design Frameworks

Authors: Mohammed Salahat, Steve Wade

Abstract:

This paper presents the SSDDD (Systematic Soft Domain Driven Design) framework and compares it to the DDD (Domain Driven Design) framework as a soft systems approach to information systems development. The framework uses Soft Systems Methodology (SSM) as a guiding methodology within which a sequence of design tasks based on UML is embedded, leading to the implementation of a software system using the Naked Objects framework. The framework has been used in action research projects that involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, SSM is used to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous works; a comparison between SSDDD and DDD is presented in this paper to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.

Keywords: domain-driven design, soft domain-driven design, naked objects, soft language

Procedia PDF Downloads 282
3654 A Case Study of the Digital Translation of the Lucy Lloyd and Wilhelm Bleek |Xam and !Kun Notebooks into The Digital Bleek and Lloyd

Authors: F. Saptouw

Abstract:

This paper will examine the digitization process of the |Xam and !Kun notebooks, authored by Lucy Lloyd, Dorothea Bleek, and Wilhelm Bleek and their collaborators |a!kunta, ||kabbo, ≠kasin, Dia!kwain, !kweiten ta ||ken, |han≠kass'o, !nanni, Tamme, |uma, and Da during the 19th century. Detail will be provided about the status of the archive, the creation of the digital archive, and selected research projects linked to the archive. The Digital Bleek and Lloyd project is an example of institutional collaboration between the University of Cape Town, the University of South Africa, the Iziko South African Museum, the National Library of South Africa, and the Western Cape Provincial Archives and Records Service. The contemporary value of the archive will be discussed in relation to its current manifestation as a collection of archival and digital objects, each with its own set of properties and archival risk factors. This tension between the two ways of accessing the archive will be interrogated to shed light on the slippages between the digital object and the archival object. The primary argument is that the process of digitization generates an ontological shift in the status of the archival object. The secondary argument engages with practices for curating encounters with these ontologically shifted objects and with how a contemporary viewer is to relate to each. In conclusion, this paper will argue for regarding these archival objects according to the interpretive framework utilized to engage secular relics.

Keywords: archive, curatorship, digitization, museum practice

Procedia PDF Downloads 130
3653 An Overview of the Porosity Classification in Carbonate Reservoirs and Their Challenges: An Example of Macro-Microporosity Classification from Offshore Miocene Carbonate in Central Luconia, Malaysia

Authors: Hammad T. Janjuhah, Josep Sanjuan, Mohamed K. Salah

Abstract:

Biological and chemical activities in carbonates are responsible for the complexity of the pore system. Primary porosity is generally of natural origin, while secondary porosity results from chemical reactivity through diagenetic processes. Understanding the carbonate pore system is an integral part of hydrocarbon exploration. However, current porosity classification schemes are too limited to adequately predict the petrophysical properties of reservoirs with different origins and depositional environments. Rock classification provides a descriptive method for explaining the lithofacies but makes no significant contribution to the application of porosity and permeability (poro-perm) correlation. The Central Luconia carbonate system (Malaysia) represents a good example of pore complexity (in terms of nature and origin), mainly related to diagenetic processes which have altered the original reservoir. For quantitative analysis, 32 high-resolution images of each thin section were taken using transmitted light microscopy. The quantification of grains, matrix, cement, and macroporosity (pore types) was achieved using petrographic analysis of thin sections and FESEM images. The point-counting technique was used to estimate the amount of macroporosity from thin sections, which was then subtracted from the total porosity to derive the microporosity. The quantitative observation of thin sections revealed that mouldic porosity (macroporosity) is the dominant porosity type present, whereas microporosity accounts for roughly 40 to 50% of the total porosity. It has been proven that these Miocene carbonates contain a significant amount of microporosity, which significantly complicates the estimation and production of hydrocarbons; neglecting its impact can increase uncertainty in estimating hydrocarbon reserves. Due to the diversity of geological parameters, the application of existing porosity classifications does not allow a better understanding of the poro-perm relationship. However, the classification can be improved by including pore types and pore structures, dividing them into macro- and microporosity. Such studies of microporosity identification and classification now represent a major concern in limestone reservoirs around the world.
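
A minimal sketch of the point-counting arithmetic described above, using hypothetical counts: macroporosity is estimated from the fraction of counted points falling in visible pores, and microporosity is obtained by subtracting it from the total measured porosity.

```python
def point_count_porosity(points_in_pores, total_points):
    """Macroporosity (%) estimated from thin-section point counting."""
    return 100.0 * points_in_pores / total_points

def microporosity(total_porosity_pct, macroporosity_pct):
    """Microporosity is the part of total porosity not visible as macropores."""
    return total_porosity_pct - macroporosity_pct

# Hypothetical example: 300 counted points, 54 fall in visible (mouldic) pores,
# and the measured total porosity is 32%.
macro = point_count_porosity(54, 300)   # 18.0 %
micro = microporosity(32.0, macro)      # 14.0 %, i.e. ~44% of the total porosity
print(macro, micro)
```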

Keywords: overview of porosity classification, reservoir characterization, microporosity, carbonate reservoir

Procedia PDF Downloads 139
3652 Using Time Series NDVI to Model Land Cover Change: A Case Study in the Berg River Catchment Area, Western Cape, South Africa

Authors: Adesuyi Ayodeji Steve, Zahn Munch

Abstract:

This study investigates the use of MODIS NDVI to identify agricultural land cover change areas on an annual time step (2007-2012) and to characterize the trend in the study area. An ISODATA classification was performed on the MODIS imagery to select only the agricultural class, producing three class groups, namely agriculture, agriculture/semi-natural, and semi-natural. NDVI signatures were created for the time series to identify areas dominated by cereals and vineyards with the aid of ancillary, pictometry, and field sample data. The NDVI signature curves and training samples aided in creating a decision tree model in WEKA 3.6.9. From the training samples, two classification models were built in WEKA using the decision tree classifier (J48) algorithm: Model 1 included the ISODATA classification and Model 2 did not, with accuracies of 90.7% and 88.3%, respectively. The two models were used to classify the whole study area, producing two land cover maps with classification accuracies of 77% and 80% for Models 1 and 2, respectively. Model 2 was used to create change detection maps for all the other years. Subtle changes and areas of consistency (unchanged) were observed in the agricultural classes and crop practices over the years as predicted by the land cover classification. 41% of the catchment comprises cereals, with 35% possibly following a crop rotation system. Vineyards largely remained constant over the years, with some conversion to vineyard (1%) from other land cover classes. Some of the changes might be a result of misclassification and the crop rotation system.
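
A minimal sketch of the classification step: WEKA's J48 (C4.5) is not available in Python, so a CART decision tree from scikit-learn stands in for it, and the NDVI time series features and labels are random placeholders rather than the study's training samples.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder training samples: one row per pixel, 23 NDVI composites per year,
# labels 0=agriculture, 1=agriculture/semi-natural, 2=semi-natural.
rng = np.random.default_rng(0)
X = rng.uniform(-0.2, 0.9, size=(1500, 23))
y = rng.integers(0, 3, size=1500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=8, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, tree.predict(X_test)))
```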

Keywords: change detection, land cover, MODIS, NDVI

Procedia PDF Downloads 387
3651 Ontology-Based Backpropagation Neural Network Classification and Reasoning Strategy for NoSQL and SQL Databases

Authors: Hao-Hsiang Ku, Ching-Ho Chi

Abstract:

Big data applications have become imperative in many fields, and many researchers have devoted themselves to increasing correct classification rates and reducing time complexity. Hence, this study designs and proposes an ontology-based backpropagation neural network classification and reasoning strategy for NoSQL big data applications, called ON4NoSQL. ON4NoSQL is responsible for enhancing the performance of classification in NoSQL and SQL databases to build up mass behavior models. Mass behavior models are produced using MapReduce techniques and the Hadoop distributed file system on the Hadoop service platform. The inference engine of ON4NoSQL is the ontology-based backpropagation neural network classification and reasoning strategy. Simulation results indicate that ON4NoSQL can efficiently construct a high-performance environment for data storing, searching, and retrieving.
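
The abstract does not detail the network itself; as a generic, minimal sketch (not the ON4NoSQL engine), a backpropagation-trained multilayer perceptron classifier can be set up as follows, with synthetic data standing in for the ontology-derived features.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Stand-in data: in ON4NoSQL these features would come from the ontology layer
# over NoSQL/SQL records, and the classes would be the mass-behaviour models.
X, y = make_classification(n_samples=2000, n_features=20, n_classes=3,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), activation="relu",
                  solver="adam", max_iter=500, random_state=0),
)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```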

Keywords: Hadoop, NoSQL, ontology, backpropagation neural network, Hadoop distributed file system

Procedia PDF Downloads 251
3650 Efficient Passenger Counting in Public Transport Based on Machine Learning

Authors: Chonlakorn Wiboonsiriruk, Ekachai Phaisangittisagul, Chadchai Srisurangkul, Itsuo Kumazawa

Abstract:

Public transportation is a crucial component of passenger mobility, with buses playing a vital role in the transportation service. Passenger counting is an essential tool for organizing and managing transportation services. However, manual counting is a tedious and time-consuming task, which is why computer vision algorithms are being utilized to make the process more efficient. In this study, different object detection algorithms combined with passenger tracking are investigated to compare passenger counting performance. The system employs the EfficientDet algorithm, which has demonstrated superior performance in terms of speed and accuracy. Our results show that the proposed system can accurately count passengers in varying conditions with an accuracy of 94%.
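
The counting logic itself is not spelled out in the abstract; a common approach, sketched below under the assumption of a detector-plus-tracker pipeline and a virtual counting line at the bus door, is to count each track that crosses the line.

```python
# Hypothetical track format: {track_id: [(frame, x, y), ...]} produced by any
# detector+tracker pipeline (e.g. an EfficientDet detector feeding a tracker).
def count_line_crossings(tracks, line_y):
    """Count tracks whose centroid crosses a horizontal counting line.

    Returns (boarding, alighting) counts, assuming downward motion = boarding.
    """
    boarding, alighting = 0, 0
    for points in tracks.values():
        ys = [y for _, _, y in sorted(points)]   # centroid y ordered by frame
        if not ys:
            continue
        if ys[0] < line_y <= ys[-1]:
            boarding += 1
        elif ys[0] >= line_y > ys[-1]:
            alighting += 1
    return boarding, alighting

# Toy example: one passenger crossing downward over y=100, one crossing upward.
tracks = {1: [(0, 50, 80), (1, 50, 95), (2, 50, 110)],
          2: [(0, 70, 120), (1, 70, 101), (2, 70, 90)]}
print(count_line_crossings(tracks, line_y=100))   # (1, 1)
```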

Keywords: computer vision, object detection, passenger counting, public transportation

Procedia PDF Downloads 133
3649 Employee Happiness: The Influence of Providing Consumers with an Experience versus an Object

Authors: Wilson Bastos, Sigal G. Barsade

Abstract:

Much of what happens in the marketplace revolves around the provision and consumption of goods. Recent research has advanced a useful categorization of these goods—as experiential versus material—and shown that, from the consumers’ perspective, experiences (e.g., a theater performance) are superior to objects (e.g., an electronic gadget) in offering various social and psychological benefits. A common finding in this growing research stream is that consumers gain more happiness from the experiences they have than from the objects they own. By focusing solely on those acquiring the experiential or material goods (the consumers), prior research has remained silent regarding another important group of individuals—those providing the goods (the employees). Do employees whose jobs are primarily focused on offering consumers an experience (vs. an object) also gain more happiness from their occupation? We report evidence from four experiments supporting an experiential-employee advantage. Further, we use mediation and moderation tests to unearth the mechanism responsible for this effect. Results reveal that work meaningfulness is the primary driver of the experiential-employee advantage. Overall, our findings suggest that employees find it more meaningful to provide people with an experience than with a material object, which in turn shapes the happiness they derive from their jobs. We expect this finding to have implications for human development and to be of relevance to researchers and practitioners interested in how to advance the human condition in the workplace.

Keywords: employee happiness, experiential versus material jobs, work meaningfulness

Procedia PDF Downloads 258
3648 Land Use Change Detection Using Satellite Images for Najran City, Kingdom of Saudi Arabia (KSA)

Authors: Ismail Elkhrachy

Abstract:

Determination of land use change is an important component of regional planning, for applications ranging from urban fringe change detection to monitoring land use change; these data are very useful for natural resources management. On the other hand, the technologies and methods of change detection have evolved dramatically over the past 20 years, so it is well recognized that change detection based on multi-temporal remotely sensed data has become one of the best methods for studying dynamic land use change. The objective of this paper is to assess, evaluate, and monitor land use change surrounding the area of Najran city, Kingdom of Saudi Arabia (KSA), using Landsat images (June 23, 2009) and an ETM+ image (June 21, 2014). The post-classification change detection technique was applied: the two subset images of Najran city were compared on a pixel-by-pixel basis using the post-classification comparison method, the from-to change matrix was produced, and the land use change information was obtained. Three classes (urban, bare land, and agricultural land) were obtained from an unsupervised classification method using Erdas Imagine and ArcGIS software. Accuracy assessment of the classification was performed before calculating change detection for the study area; the obtained accuracy is between 61% and 87% for all classes. Change detection analysis shows that the urban area grew rapidly, increasing by 73.2%, the agricultural area decreased by 10.5%, and the barren area was reduced by 7% between 2009 and 2014. The quantitative study indicated that the urban class had 58.2 km² unchanged, gained 70.3 km², and lost 16 km². For the bare land class, 586.4 km² remained unchanged, 53.2 km² was gained, and 101.5 km² was lost, while for the agricultural class, 20.2 km² remained unchanged, 31.2 km² was gained, and 37.2 km² was lost.
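
A minimal sketch of the pixel-by-pixel from-to change matrix described above, using two toy classified rasters; in practice the pixel counts would be multiplied by the pixel area to report km².

```python
import numpy as np

def change_matrix(class_t1, class_t2, n_classes):
    """Pixel-by-pixel from-to change matrix between two classified rasters.

    Entry [i, j] is the number of pixels that were class i at time 1 and class j at time 2.
    """
    idx = class_t1.astype(int) * n_classes + class_t2.astype(int)
    counts = np.bincount(idx.ravel(), minlength=n_classes * n_classes)
    return counts.reshape(n_classes, n_classes)

# Toy rasters with classes 0=urban, 1=bare land, 2=agriculture.
c2009 = np.array([[0, 1, 1], [2, 2, 0]])
c2014 = np.array([[0, 0, 1], [2, 1, 0]])
m = change_matrix(c2009, c2014, n_classes=3)
print(m)            # diagonal = unchanged pixels; off-diagonal = from-to changes
print(np.trace(m))  # total unchanged pixels
```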

Keywords: land use, remote sensing, change detection, satellite images, image classification

Procedia PDF Downloads 512
3647 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using The Behavioral Patterns Analysis (BPA) Approach

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.

Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases

Procedia PDF Downloads 420
3646 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. This is particularly evident in highway and railroad tunnel projects, in which there are a number of tunnels and different professional teams involved. In this regard, comprehensive software needs to be designed using the accepted methods in order to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. Regarding this necessity, applied software has been designed using macro capabilities and Visual Basic for Applications (VBA) within Microsoft Excel. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, the penetration rate, and so forth, can be calculated and reported in a standard format.
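
The abstract does not show the calculations themselves; as an illustration of the kind of routine such a tool automates, here is a minimal sketch that sums user-supplied Rock Mass Rating (RMR) component ratings and reports the rock mass class after Bieniawski (1989). The example ratings are hypothetical, not values from the paper.

```python
def rmr_class(ratings, orientation_adjustment=0):
    """Sum RMR component ratings (intact rock strength, RQD, discontinuity spacing,
    discontinuity condition, groundwater) and return the total and the rock mass
    class after Bieniawski (1989). Ratings are user-supplied, not computed here."""
    total = sum(ratings) + orientation_adjustment
    if total > 80:
        cls = "I (very good rock)"
    elif total > 60:
        cls = "II (good rock)"
    elif total > 40:
        cls = "III (fair rock)"
    elif total > 20:
        cls = "IV (poor rock)"
    else:
        cls = "V (very poor rock)"
    return total, cls

# Hypothetical ratings for one borehole interval.
print(rmr_class([12, 17, 15, 20, 10], orientation_adjustment=-5))  # (69, 'II (good rock)')
```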

Keywords: engineering geology, rock mass classification, rock mechanic, tunnel

Procedia PDF Downloads 63
3645 Development of Industry Oriented Undergraduate Research Program

Authors: Sung Ryong Kim, Hyung Sup Han, Jae-Yup Kim

Abstract:

Many engineering students feel uncomfortable solving industry-related problems. There are many ways to strengthen engineering students' ability to solve the problems they will be assigned once they get a job. Korea National University of Transportation has developed an industry-oriented undergraduate research program (URP). The URP is designed to give engineering students experience in solving a company's research problem. Each URP project is carried out over six months by a team consisting of one company mentor, one professor, and three to four engineering students. Teams of different majors are strongly encouraged in order to integrate the different perspectives of a multidisciplinary background. The corporate research projects proposed by companies are chosen by the major-related student teams. The company mentor gives the students the detailed technical background of the project and also provides basic data, raw materials, and so forth; the company allows students to use its research equipment. The assigned professor adjusts the project scope and level to the students' ability after discussion with the company mentor. Monthly meetings are used to check progress, exchange ideas, and help the students. The program has proven to be an effective engineering education program, not only providing experience of company research but also motivating students in their coursework. It provides a premier interdisciplinary platform for undergraduate students to tackle the practical challenges encountered in their major-related companies, and it is especially helpful for students who want to get a job at the company that proposed the project.

Keywords: company mentor, industry oriented, interdisciplinary platform, undergraduate research program

Procedia PDF Downloads 237
3644 Heart-Rate Resistance Electrocardiogram Identification Based on Slope-Oriented Neural Networks

Authors: Tsu-Wang Shen, Shan-Chun Chang, Chih-Hsien Wang, Te-Chao Fang

Abstract:

For an electrocardiogram (ECG) biometric system, it is a tedious process to pre-install users' high-intensity heart rate (HR) templates. Based only on resting enrollment templates, it is a challenge to identify a person from ECG recorded at the high-intensity HRs caused by exercise and stress. This research provides a heartbeat segmentation method with slope-oriented neural networks that is robust against the ECG morphology changes caused by high-intensity HRs. The method achieves an overall system accuracy of 97.73% across six levels of HR intensity. A cumulative match characteristic curve is also used for comparison with other traditional ECG biometric methods.

Keywords: high-intensity heart rate, heart rate resistant, ECG human identification, decision based artificial neural network

Procedia PDF Downloads 417
3643 Defect Classification of Hydrogen Fuel Pressure Vessels using Deep Learning

Authors: Dongju Kim, Youngjoo Suh, Hyojin Kim, Gyeongyeong Kim

Abstract:

Acoustic emission testing (AET) is widely used to test the structural integrity of operational hydrogen storage containers, and clustering algorithms are frequently used as pattern recognition methods to interpret AET results. However, the interpretation of AET results can vary from user to user, as the tuning of the relevant parameters relies on the user's experience and knowledge of AET. Therefore, it is necessary to use a deep learning model that identifies patterns in acoustic emission (AE) signal data and classifies defects instead. In this paper, a deep learning-based model for classifying the types of defects in hydrogen storage tanks, using AE sensor waveforms, is proposed. As hydrogen storage tanks are commonly constructed from carbon fiber reinforced polymer composite (CFRP), a defect classification dataset is collected through a tensile test on a CFRP specimen with an AE sensor attached. The classification model, using a one-dimensional convolutional neural network (1-D CNN) and synthetic minority oversampling technique (SMOTE) data augmentation, achieved 91.09% accuracy for each defect class. It is expected that the deep learning classification model in this paper, used with AET, will help in evaluating the operational safety of hydrogen storage containers.
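
A minimal sketch of the kind of pipeline described above, with placeholder waveforms, a hypothetical four-class label set, and illustrative layer sizes rather than the authors' architecture: SMOTE balances the minority defect classes before a small 1-D CNN is trained.

```python
import numpy as np
import tensorflow as tf
from imblearn.over_sampling import SMOTE

# Placeholder AE waveforms: 1000 samples per signal, 4 imbalanced defect classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 1000)).astype("float32")
y = rng.choice(4, size=600, p=[0.55, 0.25, 0.15, 0.05])

# SMOTE oversamples the minority defect classes before training.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
X_bal = X_bal[..., np.newaxis]                      # shape (n, 1000, 1) for Conv1D

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 7, activation="relu", input_shape=(1000, 1)),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(4, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_bal, y_bal, epochs=5, batch_size=32, verbose=0)
```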

Keywords: acoustic emission testing, carbon fiber reinforced polymer composite, one-dimensional convolutional neural network, SMOTE data augmentation

Procedia PDF Downloads 80
3642 Correlating Musical Subject and Dialectical Subject to Develop a Critical Approach to Ideology in Musical Analysis and Composition

Authors: James Waide

Abstract:

In music, the subject typically denotes the initial idea to which the entire composition refers—a concept congruous with Aristotle's notion of subject as primary substance, in the sense of an irreducible this particular. Gioseffo Zarlino, who established subject (soggetto) as a musical term, insisted on the composer “rediscovering” the subject within their music in order to “[bring] it to perfection” and, furthermore, held that if the composer had not already rediscovered the subject, one would simply take the first part of the composition to be the subject. Meanwhile, Žižek reads the Hegelian subject (as negativity set against positive object) through Lacanian psychoanalysis (in which the subject is a kind of fictive entity of the clinic: a mere appearance which sits atop the objects of analysis) in the concept of Absolute Recoil. For Žižek, the subject exists retroactively in Absolute Recoil from the object, meaning the subject is a void which only has meaning because of the object it is seen through. Following the work of theorists such as Adorno and Althusser, one can understand the ideological construction of such a subject. It may be argued that in Zarlino, the musical subject can be similarly read as retroactively constructed, either by the composer or the listener. Furthermore, in recent work, Samuel Wilson identifies different kinds of subjects in music which can be psychoanalytically examined, including the fictive subject: a purely musical entity raised to the level of the psychoanalytic subject. On this basis, if, as Adorno insisted, 'authentic' music constitutes 'cognition without concepts', where and what is this subject without concept?

Keywords: absolute recoil, critical theory, ideology, music analysis, psychoanalysis, retroactivity, subject

Procedia PDF Downloads 58
3641 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods for analyzing the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to optimally specify what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and a classification for manufacturing digital data from various sources, used to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which datasets should be processed at the ‘edge’ and which should be sent to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
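
The paper's actual cue set is not given in the abstract; the sketch below illustrates the fast-and-frugal style of decision making it names, using hypothetical cues and thresholds: each cue is checked in turn, and the first one that fires decides whether a dataset stays at the edge or goes to the cloud.

```python
def route_dataset(latency_critical, size_mb, needs_history, bandwidth_mbps):
    """Fast-and-frugal tree: answer one cue at a time, exit on the first decision.

    Cues and thresholds are illustrative, not the paper's actual framework.
    """
    if latency_critical:                            # must react within machine-cycle time
        return "edge"
    if needs_history:                               # long-term trends need archived data
        return "cloud"
    if size_mb / max(bandwidth_mbps, 1e-6) > 10:    # too costly to transmit
        return "edge"
    return "cloud"

print(route_dataset(latency_critical=False, size_mb=2048,
                    needs_history=False, bandwidth_mbps=50))   # edge
```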

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 166
3640 A Statistical Approach to Predict and Classify the Commercial Hatchability of Chickens Using Extrinsic Parameters of Breeders and Eggs

Authors: M. S. Wickramarachchi, L. S. Nawarathna, C. M. B. Dematawewa

Abstract:

Hatchery performance is critical for the profitability of poultry breeder operations. Some extrinsic parameters of eggs and breeders cause the hatchability to increase or decrease. This study aims to identify the extrinsic parameters affecting the commercial hatchability of local chickens' eggs and to determine the most efficient classification model with a hatchability rate greater than 90%. In this study, seven extrinsic parameters were considered: egg weight, moisture loss, breeders' age, number of fertilised eggs, shell width, shell length, and shell thickness. Multiple linear regression was performed to determine the variables with the greatest influence on hatchability. First, the correlation between each parameter and hatchability was checked. Then a multiple regression model was developed, and the accuracy of the fitted model was evaluated. Linear Discriminant Analysis (LDA), Classification and Regression Trees (CART), k-Nearest Neighbors (kNN), Support Vector Machines (SVM) with a linear kernel, and Random Forest (RF) algorithms were applied to classify the hatchability. This grouping process was conducted using binary classification techniques. Hatchability was negatively correlated with egg weight, breeders' age, shell width, and shell length, while positive correlations were identified with moisture loss, number of fertilised eggs, and shell thickness. Multiple linear regression models were more accurate than simple linear models, with the highest coefficient of determination (R²) of 94% and minimum AIC and BIC values. According to the classification results, RF, CART, and kNN achieved the highest accuracy values of 0.99, 0.975, and 0.972, respectively, for the commercial hatchery process. Therefore, RF is the most appropriate machine learning algorithm for classifying whether breeder outcomes are economically profitable in a commercial hatchery.
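
A minimal sketch of the two-stage analysis described above, with synthetic placeholder data in place of the hatchery records: a multiple linear regression relates the seven parameters to hatchability, and a random forest then classifies records against a hypothetical 90% threshold.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

cols = ["egg_weight", "moisture_loss", "breeder_age", "fertilised_eggs",
        "shell_width", "shell_length", "shell_thickness"]

# Placeholder data standing in for the hatchery records.
rng = np.random.default_rng(1)
X = pd.DataFrame(rng.normal(size=(500, len(cols))), columns=cols)
hatchability = 90 + X["moisture_loss"] - X["egg_weight"] + rng.normal(0, 1, 500)

# Multiple linear regression: which parameters drive hatchability?
reg = LinearRegression().fit(X, hatchability)
print(dict(zip(cols, reg.coef_.round(2))))

# Binary classification: hatchability above the assumed 90% commercial threshold or not.
y = (hatchability > 90).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print("RF accuracy:", rf.score(X_te, y_te))
```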

Keywords: classification models, egg weight, fertilised eggs, multiple linear regression

Procedia PDF Downloads 80
3639 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm

Authors: Annalakshmi G., Sakthivel Murugan S.

Abstract:

This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. The classification of coral reef images using texture features is difficult due to the dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed method, the local directional encoded derivative binary pattern (LDEDBP). The proposed approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge response available in a particular region, thereby achieving an extra discriminative feature value. Typically, the LDP extracts the edge details in all eight directions. Integrating edge responses with the local binary pattern yields a more robust texture descriptor than the other descriptors used in texture feature extraction methods. Finally, the proposed technique is applied to an extreme learning machine (ELM) with a meta-heuristic algorithm known as the weighted distance grey wolf optimizer (GWO) to optimize the input weights and biases of single-hidden-layer feed-forward neural networks (SLFN). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms. The proposed method achieves the highest overall classification accuracy of 94% compared to the other state-of-the-art methods.
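
The full LDEDBP descriptor combines LBP with LDP edge responses; the sketch below shows only the plain-LBP building block, computing a uniform LBP histogram for a placeholder grayscale patch with scikit-image.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_image, points=8, radius=1):
    """Uniform LBP histogram as a simple texture descriptor for one image patch."""
    lbp = local_binary_pattern(gray_image, points, radius, method="uniform")
    n_bins = points + 2                      # uniform patterns plus one 'non-uniform' bin
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Placeholder grayscale patch standing in for a coral reef image region.
patch = np.random.default_rng(0).random((64, 64))
print(lbp_histogram(patch))
```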

Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization

Procedia PDF Downloads 150
3638 Realistic Modeling of the Preclinical Small Animal Using Commercial Software

Authors: Su Chul Han, Seungwoo Park

Abstract:

With the increasing incidence of cancer, the technology and modalities of radiotherapy have advanced, and the importance of preclinical models in cancer research is increasing. Furthermore, small animal dosimetry is an essential part of evaluating the relationship between the absorbed dose in a preclinical small animal and the biological effect observed in a preclinical study. In this study, we carried out realistic modeling of a preclinical small animal phantom, in which the irradiated dose can be verified, using commercial software. The small animal phantom was modeled from the 4D digital mouse whole-body (Moby) phantom. To manipulate the Moby phantom in commercial software (Mimics, Materialise, Leuven, Belgium), we converted it to DICOM CT image files using Matlab; the two-dimensional CT images were then converted into a three-dimensional image, which can be segmented and cropped in the sagittal, coronal, and axial views. The CT images of the small animal were modeled as follows. Based on the profile line value, thresholding was carried out to make a mask connecting all the regions within the same threshold range. Using this thresholding method, we segmented the images into three parts (bone, body (tissue), and lung); to separate neighboring pixels between lung and body (tissue), we used the region growing function of the Mimics software. We acquired a 3D object by 3D calculation on the segmented images. The generated 3D object was smoothed by a remeshing operation with a smoothing factor of 0.4 and 5 iterations. The edge mode was selected to perform triangle reduction, with a tolerance of 0.1 mm, an edge angle of 15 degrees, and 5 iterations. The processed 3D object file was converted to an STL file for output on a 3D printer. We modified the 3D small animal file using 3-matic Research (Materialise, Leuven, Belgium) to make space for radiation dosimetry chips, and thus acquired a 3D object of a realistic small animal phantom. The width of the small animal phantom was 2.631 cm, its thickness 2.361 cm, and its length 10.817 cm. Mimics software provided efficient 3D object generation and convenient conversion to STL files. The development of this small preclinical animal phantom should increase the reliability of absorbed dose verification in small animals for preclinical studies.

Keywords: mimics, preclinical small animal, segmentation, 3D printer

Procedia PDF Downloads 357
3637 Expression-Based Learning as a Starting Point to Promote Students’ Creativity in K-12 Schools in China

Authors: Yanyue Yuan

Abstract:

In this paper, the author shares the findings of a pilot study that examines students' creative expressions and their perceptions of creativity when engaged in project-based learning. The study is based on an elective course that the author co-designed and co-taught with a colleague for sixteen grade six and seven students over the spring semester of 2019. Using the Little Prince story as the main prompt, they facilitated students' original creation of a storytelling concert that integrated script writing, music production, lyrics, songs, and visual design as a result of both individual and collaborative work. The author shares the specific challenges met during the project, including the learning culture of the school, class management, teachers' and parents' attitudes, a process-oriented versus product-oriented mindset, and facilities and logistical resources. The findings of this pilot study will inform an ongoing research initiative exploring how creative learning can be fostered in public schools in the Chinese context. While K-12 schools in China's public education system are still dominated by exam-oriented and teacher-centered approaches, the author proposes that expression-based learning can be a starting point for promoting students' creativity and can serve as an experimental effort to initiate incremental changes within the current education framework. The paper also touches upon insights gained from collaborations between universities and K-12 schools.

Keywords: creativity, expression-based learning, K-12, incremental changes

Procedia PDF Downloads 94
3636 Kannada Handwritten Character Recognition by Edge Hinge and Edge Distribution Techniques Using Manhattan and Minimum Distance Classifiers

Authors: C. V. Aravinda, H. N. Prakash

Abstract:

In this paper, we try to convey the fusion and state of the art pertaining to South Indian Language (SIL) character recognition systems. In the first step, the text is preprocessed and normalized to perform the text identification correctly. The second step involves extracting relevant and informative features. The third step implements the classification decision. The three stages involved are data acquisition and preprocessing, feature extraction, and classification. Here we concentrate on two techniques for obtaining features: feature extraction and feature selection. The edge-hinge distribution is a feature that characterizes the changes in direction of a script stroke in handwritten text. The edge-hinge distribution is extracted by means of a window pane that is slid over an edge-detected binary handwriting image. Whenever the mid pixel of the window is on, the two edge fragments (i.e., connected sequences of pixels) emerging from this mid pixel are measured; their directions are measured and stored as pairs, and a joint probability distribution is obtained from a large sample of such pairs. Despite continuous effort, handwriting identification remains a challenging issue, as different approaches use different varieties of features with differing effectiveness. Therefore, our study focuses on handwriting recognition based on feature selection to simplify the feature extraction task, optimize classification system complexity, reduce running time, and improve classification accuracy.

Keywords: word segmentation and recognition, character recognition, optical character recognition, hand written character recognition, South Indian languages

Procedia PDF Downloads 483
3635 Music Genre Classification Based on Non-Negative Matrix Factorization Features

Authors: Soyon Kim, Edward Kim

Abstract:

In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selection by music producers is being provided as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on the timbre features such as mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) the statistical features such as mean, variance, minimum, and maximum of the timbre features and (2) the modulation spectrum features such as spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. Not only these conventional basic long-term feature vectors, but also NMF based feature vectors are proposed to be used together for genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, an entire full band spectrum was used. However, for NMF-BFV, only low band spectrum was used since high frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as a classifier. The GTZAN multi-genre music database was used for training and testing. It is composed of 10 genres and 100 songs for each genre. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as statistical features and modulation spectrum features, the NMF features provided the increased accuracy with a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, but the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required dimensionality of 460, but NMF-LSM and NMF-BFV required dimensionalities of 10 and 10, respectively. Combining the basic features, NMF-LSM and NMF-BFV together with the SVM with a radial basis function (RBF) kernel produced the significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
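
The feature extraction above is specific to the paper; as a simplified, generic illustration of coupling NMF-derived features with an RBF-kernel SVM (not the authors' per-genre basis training), a scikit-learn pipeline on synthetic non-negative data could look like this.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import NMF
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Placeholder "long-term feature" matrix standing in for the statistical and
# modulation-spectrum descriptors of 1000 songs across 10 genres.
X, y = make_classification(n_samples=1000, n_features=60, n_informative=30,
                           n_classes=10, random_state=0)
X = MinMaxScaler().fit_transform(X)          # NMF requires non-negative input

# 10 NMF components play the role of per-genre basis activations, fed to an
# RBF-kernel SVM; the dimensions are illustrative, not the paper's setup.
clf = make_pipeline(NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0),
                    SVC(kernel="rbf", C=10, gamma="scale"))
print(cross_val_score(clf, X, y, cv=10).mean())
```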

Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)

Procedia PDF Downloads 282
3634 Decision Making System for Clinical Datasets

Authors: P. Bharathiraja

Abstract:

Computer-aided decision making systems are used to enhance the diagnosis and prognosis of diseases and to assist clinicians and junior doctors in clinical decision making. Medical data used for decision making should be definite and consistent. Data mining and soft computing techniques are used for cleaning the data and for incorporating human reasoning in decision making systems. A fuzzy rule-based inference technique can be used for classification in order to incorporate human reasoning in the decision making process. In this work, missing values are imputed using the mean or mode of the attribute. The data are normalized using min-max normalization to improve the design and efficiency of the fuzzy inference system. The fuzzy inference system is used to handle the uncertainties that exist in the medical data. Equal-width partitioning is used to partition the attribute values into appropriate fuzzy intervals. Fuzzy rules are generated using a class-based associative rule mining algorithm. The system is trained and tested using the heart disease data set from the University of California at Irvine (UCI) Machine Learning Repository. The data were split into training and testing sets using a hold-out approach. From the experimental results, it can be inferred that classification using the fuzzy inference system performs better than trivial IF-THEN rule-based classification approaches. Furthermore, it is observed that the use of fuzzy logic and the fuzzy inference mechanism handles uncertainty and resembles human decision making. The system can be used in the absence of a clinical expert to assist junior doctors and clinicians in clinical decision making.
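
A minimal sketch of two of the preprocessing steps named above: mean imputation with min-max normalization, followed by equal-width partitioning of the normalized values into intervals that could serve as fuzzy set supports. The attribute values are hypothetical, not taken from the UCI heart disease data.

```python
import numpy as np

def min_max_normalize(x):
    """Scale attribute values to [0, 1] (min-max normalization)."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def equal_width_intervals(x_norm, n_intervals=3):
    """Partition normalized values into equal-width intervals, e.g. as the
    supports of 'low' / 'medium' / 'high' fuzzy sets."""
    edges = np.linspace(0, 1, n_intervals + 1)
    labels = np.clip(np.digitize(x_norm, edges[1:-1]), 0, n_intervals - 1)
    return labels, edges

# Hypothetical cholesterol values with a missing entry imputed by the mean.
chol = np.array([230.0, np.nan, 180.0, 320.0, 250.0])
chol[np.isnan(chol)] = np.nanmean(chol)
labels, edges = equal_width_intervals(min_max_normalize(chol))
print(labels, edges)   # interval index per patient and the interval boundaries
```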

Keywords: decision making, data mining, normalization, fuzzy rule, classification

Procedia PDF Downloads 503
3633 Web Service Architectural Style Selection in Multi-Criteria Requirements

Authors: Ahmad Mohsin, Syda Fatima, Falak Nawaz, Aman Ullah Khan

Abstract:

Selection of an appropriate architectural style is vital to the success of a target web service under development. The nature of architecture design and selection for service-oriented computing applications is quite different from that of traditional software; web services have complex and rigorous architectural styles to choose from. Due to this, selecting an accurate architectural style for web service development has become a more complex decision for architects. Architectural style selection is a multi-criteria decision and demands a great deal of experience in service-oriented computing. Decision support systems are good solutions for simplifying the selection of a particular architectural style. Our research suggests a new approach that uses a DSS for the selection of architectural styles while developing a web service to cater for functional requirements (FRs) and non-functional requirements (NFRs). Our proposed DSS helps architects select the right web service architectural pattern according to the domain and the non-functional requirements. In this paper, a rule-based DSS has been developed using CLIPS (C Language Integrated Production System) to support decisions based on multi-criteria requirements. This DSS takes architectural characteristics, domain requirements, and the software architect's preferences for NFRs as input for the different architectural styles in use today in service-oriented computing. A weighted sum model has been applied to prioritize quality attributes and domain requirements, and scores are calculated over multiple criteria to choose the final architectural style.
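
A minimal sketch of the weighted sum model mentioned above, with hypothetical criteria, weights, and candidate styles rather than the paper's actual rule base: each style's score is the weighted sum of its normalized ratings, and styles are ranked by that score.

```python
# Weighted sum model (WSM) sketch for ranking candidate architectural styles.
# Criteria, weights, and ratings below are illustrative, not taken from the DSS.
criteria_weights = {"performance": 0.35, "scalability": 0.25,
                    "security": 0.25, "interoperability": 0.15}

# Normalized rating of each candidate style against each criterion (0..1).
styles = {
    "REST":        {"performance": 0.9, "scalability": 0.8, "security": 0.6, "interoperability": 0.9},
    "SOAP/WS-*":   {"performance": 0.6, "scalability": 0.6, "security": 0.9, "interoperability": 0.8},
    "Message bus": {"performance": 0.7, "scalability": 0.9, "security": 0.7, "interoperability": 0.6},
}

def wsm_score(ratings, weights):
    return sum(weights[c] * ratings[c] for c in weights)

ranking = sorted(styles, key=lambda s: wsm_score(styles[s], criteria_weights), reverse=True)
for s in ranking:
    print(f"{s}: {wsm_score(styles[s], criteria_weights):.3f}")
```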

Keywords: software architecture, web-service, rule-based, DSS, multi-criteria requirements, quality attributes

Procedia PDF Downloads 347
3632 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System

Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem

Abstract:

Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple-model (MM) multi-Bernoulli (MB) filter for M-EOT, where the M-EOT's ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes in the kinematic state, extension, and observation distribution of an extended object when the target maneuvers, a multiple-model technique is presented based on an MB track-before-detect (TBD) filter supported by an SK sub-random matrix model (RMM) or sub-ellipses framework. Simulation results demonstrate the remarkable impact of this approach.

Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter

Procedia PDF Downloads 66
3631 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification

Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang

Abstract:

This paper focuses on the classification task for breast ultrasound images and conducts research on the reliability measurement of classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned and doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time enhancement. The effectiveness of this reliability evaluation framework has been verified on the breast ultrasound clinical dataset YBUS, and its robustness is verified on the public dataset BUSI. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
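
The abstract reports expected calibration error (ECE) as its headline metric; as a reference, the standard ECE computation (not the paper's specific pipeline) can be sketched as follows, with toy predictions in place of model outputs.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: average |accuracy - confidence| over equal-width confidence bins,
    weighted by the fraction of predictions falling in each bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - confidences[mask].mean())
    return ece

# Toy predictions: an overconfident model gives a non-zero calibration error.
conf = [0.95, 0.9, 0.85, 0.7, 0.6]
hit = [1, 0, 1, 1, 0]
print(expected_calibration_error(conf, hit, n_bins=5))
```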

Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI

Procedia PDF Downloads 78
3630 A Multi-Output Network with U-Net Enhanced Class Activation Map and Robust Classification Performance for Medical Imaging Analysis

Authors: Jaiden Xuan Schraut, Leon Liu, Yiqiao Yin

Abstract:

Computer vision in medical diagnosis has achieved a high level of success in diagnosing diseases with high accuracy. However, conventional classifiers that produce an image-to-label result provide insufficient information for medical professionals to judge, and they raise concerns over the trust and reliability of a model whose results cannot be explained. In order to gain local insight into cancerous regions, separate tasks such as image segmentation need to be implemented to aid doctors in treating patients, which doubles training time and costs and renders the diagnosis system inefficient and difficult for the public to accept. To tackle this issue and drive AI-first medical solutions further, this paper proposes a multi-output network that follows a U-Net architecture for the image segmentation output and features an additional convolutional neural network (CNN) module for an auxiliary classification output. Class activation maps are a method of providing insight into the feature maps that lead to a convolutional neural network's classification; in the case of lung diseases, the region of interest is enhanced by U-Net-assisted Class Activation Map (CAM) visualization. Therefore, our proposed model combines image segmentation models and classifiers to crop out only the lung region of a chest X-ray's class activation map, providing a visualization that improves explainability while generating classification results simultaneously, which builds trust in AI-led diagnosis systems. The proposed U-Net model achieves 97.61% accuracy and a dice coefficient of 0.97 on testing data from the COVID-QU-Ex dataset, which includes both diseased and healthy lungs.

Keywords: multi-output network model, U-net, class activation map, image classification, medical imaging analysis

Procedia PDF Downloads 182
3629 Acquisition of French (L3) Direct Object by Persian (L1) Speakers of English (L2) as EFL Learners

Authors: Ali Akbar Jabbari

Abstract:

The present study assessed the acquisition of L3 French direct objects by Persian speakers who had already learned English as their L2. The ultimate goal of this paper is to extend the current knowledge about the cross-linguistic influence (CLI) phenomenon in the realm of third language acquisition by examining the role of Persian and English as background languages and of learners' English proficiency level in their performance on the French direct object. To this end, the assumptions of three L3 hypotheses, namely the L1 Transfer, the L2 Status Factor, and the Cumulative Enhancement Model, were examined. The research sample comprised 40 undergraduate students in the fields of English language and literature and translation studies at Birjand University in Iran. According to the English proficiency level of the learners, as revealed by the Quick Oxford English Placement Test, the participants were grouped as upper intermediate and lower intermediate. A grammaticality judgment test and a translation test were administered to gather the required data on learners' comprehension and production of the desired structure in French. It was demonstrated that the rate of positive transfer from previously learned languages was stronger than the rate of negative transfer. A comparison of the groups' performances revealed a significant difference between the upper and lower intermediate groups in placing French direct objects correctly; however, the upper intermediate group did not differ significantly from the lower intermediate group in negative transfer. It can be said that by increasing learners' L2 proficiency, they could use their previous linguistic knowledge more efficiently. Although further examination is needed, the current study contributes to a better characterization of cross-linguistic influence in third language acquisition. The findings help French teachers and learners to positively exploit prior knowledge of Persian and English and apply it in the multilingual context of teaching and learning the French direct object.

Keywords: cross-linguistic influence, Persian, French and English direct object, third language acquisition, language transfer

Procedia PDF Downloads 63
3628 Developing Integrated Model for Building Design and Evacuation Planning

Authors: Hao-Hsi Tseng, Hsin-Yun Lee

Abstract:

In the process of building design, designers have to complete the spatial design and consider the evacuation performance at the same time. It is usually difficult to combine the two planning processes, which results in a gap between spatial design and evacuation performance, and designers therefore cannot arrive at an integrated optimal design solution. In addition, the evacuation routing models proposed by previous researchers differ from the practical evacuation decisions made in the field. On the other hand, more and more building design projects are executed with Building Information Modeling (BIM), in which the design content is formed within an object-oriented framework; thus, the integration of BIM and evacuation simulation can make a significant contribution for designers. Therefore, this research plan will establish a model that integrates spatial design and evacuation planning. The proposed model will support spatial design modifications and optimize evacuation planning, allowing designers to complete the integrated design solution in BIM. Besides, this research plan improves the evacuation routing method to make the simulation results more practical. The proposed model will be applied in a building design project for evaluation and validation, where it will provide near-optimal design suggestions. By applying the proposed model, the integration and efficiency of the design process are improved, the evacuation plan becomes more useful, and the quality of the building spatial design will be better.

Keywords: building information modeling, evacuation, design, floor plan

Procedia PDF Downloads 441
3627 The Influence of the Laws of Ergonomics on the Design of High-Rise Buildings

Authors: Valery A. Aurov, Maria D. Bausheva, Elena V. Uliyanova

Abstract:

The sustainability problems of contemporary high-rise buildings now demand an altogether new approach, one which corresponds with the laws of dialectics: we should apply the principle of “going from the mega-object to the so-called mezzo-object.” Researchers have therefore arrived at the conclusion that a contemporary “skyscraper” should not increase in height but rather develop horizontal spatial axes which unite a complex of high-rise buildings into a single composition. This is necessary both for safety and for improving skyscrapers' functional qualities. As a result, architects single out a quality unit in a dominating group of high-rise constructions and draw conclusions about the influence of visual fields on the design parameters of this group.

Keywords: design, high-rise buildings, skyscrapers, sustainability, visual fields, dominating group, regulations, design recommendations

Procedia PDF Downloads 361