Search results for: bug classification


1357 Power Quality Modeling Using Recognition Learning Methods for Waveform Disturbances

Authors: Sang-Keun Moon, Hong-Rok Lim, Jin-O Kim

Abstract:

This paper presents Power Quality (PQ) modeling and filtering processes for distribution system disturbances using recognition learning methods. Typical PQ waveforms with their mathematical formulations, together with gathered field data, are applied to the proposed models. The objective of this paper is to analyze PQ data with respect to monitoring, discriminating, and evaluating the waveforms of power disturbances, in order to support preventive protection against system failures and the estimation of complex system problems. Signal filtering techniques are examined for removing noise from field waveforms and for feature extraction. Using these extraction and learning classification techniques, the efficiency of recognizing PQ disturbances was verified, with a focus on interactive modeling methods. The waveforms of eight selected disturbances are modeled with randomized parameters within the IEEE 1159 PQ ranges, and the ranges, parameters, and weights are updated as field waveforms are obtained. Currents undergo the same process as voltages to obtain waveform features, apart from some ratings and filters. Changing loads cause distortion of the voltage waveform because they draw different patterns of current variation. In conclusion, PQ disturbances in voltage and current waveforms exhibit different patterns of variation and disturbance, and a modified technique based on symmetrical components in the time domain is proposed for PQ disturbance detection and subsequent classification. The method is based on the fact that waveforms captured under the suggested trigger conditions contain the information needed for abnormality detection. The extracted features are sequentially applied to estimation and recognition learning modules for further studies.
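
As a rough illustration of the kind of trigger condition described above, the sketch below flags sags, swells, and interruptions from a sliding one-cycle RMS trajectory using the usual IEEE 1159 magnitude bands; the sampling rate, nominal voltage, and synthetic test waveform are assumptions for illustration, not parameters from the paper.

```python
import numpy as np

def classify_rms_event(voltage, fs, f0=50.0, v_nominal=230.0):
    """Sliding one-cycle RMS trigger for sag/swell/interruption detection.

    Thresholds follow the usual IEEE 1159 magnitude bands (interruption < 0.1 pu,
    sag 0.1-0.9 pu, swell > 1.1 pu); window length and nominal values are
    illustrative assumptions, not taken from the paper.
    """
    cycle = int(fs / f0)                                      # samples per fundamental cycle
    rms = np.sqrt(np.convolve(voltage ** 2, np.ones(cycle) / cycle, mode="valid"))
    pu = rms / v_nominal                                      # per-unit RMS trajectory
    if (pu < 0.1).any():
        return "interruption"
    if (pu < 0.9).any():
        return "sag"
    if (pu > 1.1).any():
        return "swell"
    return "normal"

# Synthetic 50 Hz waveform with a 40% sag between 0.1 s and 0.2 s
fs = 10_000
t = np.arange(0, 0.4, 1 / fs)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
v[(t > 0.1) & (t < 0.2)] *= 0.6
print(classify_rms_event(v, fs))                              # -> "sag"
```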

Keywords: power quality recognition, PQ modeling, waveform feature extraction, disturbance trigger condition, PQ signal filtering

Procedia PDF Downloads 186
1356 Airport Pavement Crack Measurement Systems and Crack Density for Pavement Evaluation

Authors: Ali Ashtiani, Hamid Shirazi

Abstract:

This paper reviews the status of existing practice and research related to measuring pavement cracking and using crack density as a pavement surface evaluation protocol. Crack density for pavement evaluation is currently not widely used within the airport community and its use by the highway community is limited. However, surface cracking is a distress that is closely monitored by airport staff and significantly influences the development of maintenance, rehabilitation and reconstruction plans for airport pavements. Therefore, crack density has the potential to become an important indicator of pavement condition if the type, severity and extent of surface cracking can be accurately measured. A pavement distress survey is an essential component of any pavement assessment. Manual crack surveying has been widely used for decades to measure pavement performance. However, the accuracy and precision of manual surveys can vary depending upon the surveyor, and performing surveys may disrupt normal operations. Given the variability of manual surveys, this method has shown inconsistencies in distress classification and measurement, which can potentially impact the planning for pavement maintenance, rehabilitation and reconstruction and the associated funding strategies. A substantial effort has been devoted over the past 20 years to reducing human intervention and the error associated with it by moving toward automated distress collection methods. The automated methods refer to systems that identify, classify and quantify pavement distresses through processes that require no or very minimal human intervention. This principally involves the use of digital recognition software to analyze and characterize pavement distresses. The lack of established protocols for measurement and classification of pavement cracks captured using digital images is a challenge to developing a reliable automated system for distress assessment. Variations in types and severity of distresses, different pavement surface textures and colors, and the presence of pavement joints and edges all complicate automated image processing and crack measurement and classification. This paper summarizes the commercially available systems and technologies for automated pavement distress evaluation. A comprehensive automated pavement distress survey involves collection, interpretation, and processing of the surface images to identify the type, quantity and severity of the surface distresses. The outputs can be used to quantitatively calculate the crack density. The systems for automated distress survey using digital images reviewed in this paper can assist the airport industry in the development of a pavement evaluation protocol based on crack density. Analysis of automated distress survey data can lead to a crack density index. This index can be used as a means of assessing pavement condition and to predict pavement performance, and can be used by airport owners to determine the type of pavement maintenance and rehabilitation in a more consistent way.
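
A crack density index of the kind discussed above is commonly computed as total mapped crack length (or cracked area) per unit pavement area; the paper does not fix a definition, so the sketch below assumes the simple length-per-area form with invented survey values.

```python
def crack_density(crack_lengths_m, section_area_m2):
    """Crack density as total mapped crack length per unit pavement area (m/m^2).

    The length-per-area definition is an assumption for illustration; an agency
    may instead use cracked area or a severity-weighted sum.
    """
    return sum(crack_lengths_m) / section_area_m2

# Cracks digitized from automated survey images of one 50 m x 15 m slab group
lengths = [12.4, 3.8, 27.1, 9.6]                   # metres of mapped cracking
print(round(crack_density(lengths, 50 * 15), 4))   # 0.0705 m per m^2
```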

Keywords: airport pavement management, crack density, pavement evaluation, pavement management

Procedia PDF Downloads 185
1355 Detecting Indigenous Languages: A System for Maya Text Profiling and Machine Learning Classification Techniques

Authors: Alejandro Molina-Villegas, Silvia Fernández-Sabido, Eduardo Mendoza-Vargas, Fátima Miranda-Pestaña

Abstract:

The automatic detection of indigenous languages in digital texts is essential to promote their inclusion in digital media. Underrepresented languages, such as Maya, are often excluded from language detection tools like Google’s language-detection library, LANGDETECT. This study addresses these limitations by developing a hybrid language detection solution that accurately distinguishes Maya (YUA) from Spanish (ES). Two strategies are employed: the first focuses on creating a profile for the Maya language within the LANGDETECT library, while the second involves training a Naive Bayes classification model with two categories, YUA and ES. The process includes comprehensive data preprocessing steps, such as cleaning, normalization, tokenization, and n-gram counting, applied to text samples collected from various sources, including articles from La Jornada Maya, a major newspaper in Mexico and the only media outlet that includes a Maya section. After the training phase, a portion of the data is used to create the YUA profile within LANGDETECT, which achieves an accuracy rate above 95% in identifying the Maya language during testing. Additionally, the Naive Bayes classifier, trained and tested on the same database, achieves an accuracy close to 98% in distinguishing between Maya and Spanish, with further validation through F1 score, recall, and logarithmic scoring, without signs of overfitting. This strategy, which combines the LANGDETECT profile with a Naive Bayes model, highlights an adaptable framework that can be extended to other underrepresented languages in future research. This fills a gap in Natural Language Processing and supports the preservation and revitalization of these languages.
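
A minimal sketch of the second strategy, a Naive Bayes classifier over character n-grams for YUA versus ES; the training sentences are invented placeholders rather than the La Jornada Maya corpus, and the exact n-gram range is an assumption.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Placeholder training samples; the study used text collected from sources
# such as La Jornada Maya, which is not reproduced here.
texts = ["Bix a beel", "Ma'alob, kux tech", "Buenos días, ¿cómo estás?",
         "In k'aaba'e Juan", "Me llamo Juan y vivo en Mérida"]
labels = ["YUA", "YUA", "ES", "YUA", "ES"]

# Character n-grams (1-3) mirror the n-gram counting step in the preprocessing
# pipeline; the exact range used in the study is an assumption.
model = make_pipeline(
    CountVectorizer(analyzer="char_wb", ngram_range=(1, 3), lowercase=True),
    MultinomialNB(),
)
model.fit(texts, labels)
print(model.predict(["Ko'ox xíimbal"]))   # expected to lean towards YUA
```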

Keywords: indigenous languages, language detection, Maya language, Naive Bayes classifier, natural language processing, low-resource languages

Procedia PDF Downloads 16
1354 Selection of New Business in Brazilian Companies Incubators through Hierarchical Methodology

Authors: Izabel Cristina Zattar, Gilberto Passos Lima, Guilherme Schünemann de Oliveira

Abstract:

In Brazil, there are several institutions committed to the development of new businesses based on product innovation, among them business incubators, universities, and science institutes. Business incubators can be defined as nurseries for new companies, which may be in the technology segment discussed in this article. Business incubators provide services related to infrastructure, such as physical space and meeting rooms. Besides these services, incubators also offer assistance in the form of information and communication, access to finance, relationship networks, and business monitoring and mentoring processes. Business incubators do not support all technology companies: one of their tasks is to assess the nature and feasibility of new business proposals. To assist with this goal, this paper proposes a methodology for evaluating new businesses using the Analytic Hierarchy Process (AHP). The paper presents the concepts used in applying the assessment methodology to new businesses, concepts that have been tested with positive results in practice. The study comprises three main steps. First, a hierarchy was built, based on the new business manuals used by the business incubators; these manuals list business selection requirements, such as innovation status and other technological aspects. Then, a questionnaire was generated in order to guide incubator experts in the pairwise comparisons at all hierarchy levels, and the weight of each requirement was calculated from the questionnaire responses. Finally, the proposed method was applied to evaluate five new business proposals that were applying to join a company incubator. The main result is the classification of these new businesses, which helped the incubator experts decide which companies were more eligible to work with. This classification may also be helpful to the decision-making process of business incubators in future selection processes.
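
The central AHP step described above, turning the experts' pairwise comparisons into requirement weights and checking their consistency, can be sketched as follows; the 3x3 comparison matrix and criteria names are illustrative, not taken from the incubator manuals.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix
    (principal eigenvector method) plus Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                               # normalized weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)               # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index (3-5 criteria)
    return w, ci / ri                          # weights, consistency ratio

# Illustrative comparisons: innovation status vs. market potential vs. team
A = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))            # CR < 0.1 is usually taken as acceptable
```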

Keywords: Analytic Hierarchy Process (AHP), Brazilian companies incubators, technology companies, incubator

Procedia PDF Downloads 373
1353 A Technique for Planning the Application of Buttress Plate in the Medial Tibial Plateau Using the Preoperative CT Scan

Authors: P. Panwalkar, K. Veravalli, R. Gwynn, M. Tofighi, R. Clement, A. Mofidi

Abstract:

When operating on tibial plateau fractures, especially of the medial tibial plateau, it is regularly asked "where do I put my thumb to reduce the fracture". This refers to the ideal placement of the buttress device to hold the fracture until union. The aim of this study was to see whether this sweet spot can be identified using a CT scan. Methods: Forty-five tibial plateau fractures with medial plateau involvement were identified and included in the study. The preoperative CT scans were analysed and the pattern of medial plateau involvement was classified based on the modified radiological classification of stress fractures of the medial tibial plateau by Yukata et al. The involved part of the plateau was compared with the position of the buttress plate, which was classified as medial, posteromedial, or both. The presence and position of the buttress were compared with the ability to achieve and hold the reduction of the fracture until union. Results: Thirteen fractures were type 1, nineteen were type 2, and thirteen were type 3. Sixteen fractures were buttressed correctly according to the potential deformity, twenty-six fractures were not buttressed, and three fractures were partly buttressed correctly. No fracture was over-buttressed. When the fracture was buttressed correctly, the rate of malunion was 0%. When the fracture was partly buttressed, 33% united anatomically and 66% united in the plane of the buttress. When a buttress was not used, fourteen fractures malunited, one malunited in one of the two planes of deformity, and eleven healed anatomically (of which nine were non-displaced). Buttressing resulted in a statistically significantly lower malunion rate (χ² = 7.8, p = 0.0052). Conclusion: The classification based on involvement of the medial condyle can identify the placement of the buttress plate in the tibial plateau. Correct placement of the buttress plate results in predictably satisfactory union. There may be a correlation between the injury shape of the tibial plateau and the fracture type.

Keywords: knee, tibial plateau, trauma, CT scan, surgery

Procedia PDF Downloads 146
1352 Evaluation of Gesture-Based Password: User Behavioral Features Using Machine Learning Algorithms

Authors: Lakshmidevi Sreeramareddy, Komalpreet Kaur, Nane Pothier

Abstract:

Graphical-based passwords have existed for decades. Their major advantage is that they are easier to remember than an alphanumeric password. However, their disadvantage (especially for recognition-based passwords) is the smaller password space, making them more vulnerable to brute force attacks. Graphical passwords are also highly susceptible to shoulder-surfing. The gesture-based password method that we developed is a grid-free, template-free method. In this study, we evaluated the gesture-based passwords for usability and vulnerability, and the results of the study are significant. We developed a gesture-based password application for data collection. Two modes of data collection were used: creation mode and replication mode. In creation mode (Session 1), users were asked to create six different passwords and re-enter each password five times. In replication mode, users saw a password image created by some other user for a fixed duration of time. Three different duration timers, namely 5 seconds (Session 2), 10 seconds (Session 3), and 15 seconds (Session 4), were used to mimic a shoulder-surfing attack. After the timer expired, the password image was removed, and users were asked to replicate the password. A total of 74, 57, 50, and 44 users participated in Sessions 1, 2, 3, and 4, respectively. In this study, machine learning algorithms were applied to determine whether the person is a genuine user or an imposter based on the password entered. Five different machine learning algorithms were deployed to compare their performance in user authentication: Decision Trees, Linear Discriminant Analysis, Naive Bayes Classifier, Support Vector Machines (SVMs) with Gaussian Radial Basis Kernel function, and K-Nearest Neighbor. Gesture-based password features vary from one entry to the next, which makes it difficult to distinguish between a creator and an intruder for authentication. For each password entered by the user, four features were extracted: password score, password length, password speed, and password size. All four features were normalized before being fed to a classifier. Three different classifiers were trained using data from all four sessions. Classifiers A, B, and C were trained and tested using data from the password creation session and the password replication with a timer of 5 seconds, 10 seconds, and 15 seconds, respectively. The classification accuracies for Classifier A using the five ML algorithms are 72.5%, 71.3%, 71.9%, 74.4%, and 72.9%, respectively; for Classifier B they are 69.7%, 67.9%, 70.2%, 73.8%, and 71.2%, respectively; and for Classifier C they are 68.1%, 64.9%, 68.4%, 71.5%, and 69.8%, respectively. SVMs with Gaussian Radial Basis Kernel outperform the other ML algorithms for gesture-based password authentication. Results confirm that the shorter the duration of the shoulder-surfing attack, the higher the authentication accuracy. In conclusion, behavioral features extracted from gesture-based passwords lead to less vulnerable user authentication.
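
A minimal sketch of the authentication step described above: the four normalized behavioral features fed to an SVM with a Gaussian RBF kernel, the best-performing configuration in the study. The feature arrays are random stand-ins for the session data, and the hyperparameters are library defaults rather than the paper's.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Random stand-ins for the four extracted features per password entry
# (score, length, speed, size); label 1 = genuine creator, 0 = imposter.
X = np.vstack([rng.normal(0.0, 1.0, (200, 4)),     # genuine entries
               rng.normal(0.8, 1.2, (200, 4))])    # imposter replications
y = np.array([1] * 200 + [0] * 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Normalize the features, then fit the RBF-kernel SVM that performed best in
# the study; the accuracy printed here only reflects the synthetic data.
clf = make_pipeline(MinMaxScaler(), SVC(kernel="rbf", gamma="scale"))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")
```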

Keywords: authentication, gesture-based passwords, machine learning algorithms, shoulder-surfing attacks, usability

Procedia PDF Downloads 107
1351 Using Hierarchical Methodology to Assist the Selection of New Business in Brazilian Companies Incubators

Authors: Izabel Cristina Zattar, Gilberto Passos Lima, Guilherme Schünemann de Oliveira

Abstract:

In Brazil, there are several institutions committed to the development of new businesses based on product innovation, among them business incubators, universities, and science institutes. Business incubators can be defined as nurseries for new companies, which may be in the technology segment discussed in this article. Business incubators provide services related to infrastructure, such as physical space and meeting rooms. Besides these services, incubators also offer assistance in the form of information and communication, access to finance, relationship networks, and business monitoring and mentoring processes. Business incubators do not support all technology companies: one of their tasks is to assess the nature and feasibility of new business proposals. To assist with this goal, this paper proposes a methodology for evaluating new businesses using the Analytic Hierarchy Process (AHP). The paper presents the concepts used in applying the assessment methodology to new businesses, concepts that have been tested with positive results in practice. The study comprises three main steps. First, a hierarchy was built, based on the new business manuals used by the business incubators; these manuals list business selection requirements, such as innovation status and other technological aspects. Then, a questionnaire was generated in order to guide incubator experts in the pairwise comparisons at all hierarchy levels, and the weight of each requirement was calculated from the questionnaire responses. Finally, the proposed method was applied to evaluate five new business proposals that were applying to join a company incubator. The main result is the classification of these new businesses, which helped the incubator experts decide which companies were more eligible to work with. This classification may also be helpful to the decision-making process of business incubators in future selection processes.

Keywords: Analytic Hierarchy Process (AHP), Brazilian companies incubators, technology companies, incubator

Procedia PDF Downloads 402
1350 Proposed Organizational Development Interventions in Managing Occupational Stressors for Business Schools in Batangas City

Authors: Marlon P. Perez

Abstract:

The study aimed to determine the level of occupational stress experienced by faculty members of private and public business schools in Batangas City, with the end in view of proposing organizational development interventions for managing occupational stressors. Stressors such as factors intrinsic to the job, role in the organization, relationships at work, career development, and organizational structure and climate were used as determinants of the occupational stress level. A descriptive research design was used. The respondents were the only 64 full-time faculty members from private and public business schools in Batangas City: University of Batangas, Lyceum of the Philippines University-Batangas, Golden Gate Colleges, Batangas State University, and Colegio ng Lungsod ng Batangas. A survey questionnaire was used as the data-gathering instrument. It was found that all occupational stressors were assessed as stressful when responses were grouped according to the classification of tertiary schools, although respondents differed in their assessments of the individual stressors. Age was significantly related to the respondents' assessments of factors intrinsic to the job and career development, but not to role in the organization, relationships at work, or organizational structure and climate. Gender, marital status, highest educational attainment, employment status, length of service, area of specialization, and classification of tertiary school were not significantly related to any of the occupational stressors. Various organizational development interventions are proposed to manage the occupational stressors experienced by business faculty members in these institutions.

Keywords: occupational stress, business school, organizational development, intervention, stressors, faculty members, assessment, manage

Procedia PDF Downloads 431
1349 Assessment of the Spatio-Temporal Distribution of Pteridium aquilinum (Bracken Fern) Invasion on the Grassland Plateau in Nyika National Park

Authors: Andrew Kanzunguze, Lusayo Mwabumba, Jason K. Gilbertson, Dominic B. Gondwe, George Z. Nxumayo

Abstract:

Knowledge about the spatio-temporal distribution of invasive plants in protected areas provides a base from which hypotheses explaining the proliferation of plant invasions can be made, alongside the development of relevant invasive plant monitoring programs. The aim of this study was to investigate the spatio-temporal distribution of bracken fern on the grassland plateau of Nyika National Park over the past 30 years (1986-2016) as well as to determine the current extent of the invasion. Remote sensing, machine learning, and statistical modelling techniques (object-based image analysis, image classification, and linear regression analysis) in geographical information systems were used to determine both the spatial and temporal distribution of bracken fern in the study area. Results reveal that bracken fern has been increasing its coverage on the Nyika plateau at an estimated annual rate of 87.3 hectares since 1986. This translates to an estimated net increase of 2,573.1 hectares, from 1,788.1 hectares (1986) to 4,361.9 hectares (2016). As of 2017, bracken fern covered 20,940.7 hectares, approximately 14.3% of the entire grassland plateau. Additionally, the fern was observed to be distributed most densely around Chelinda camp (on the central plateau) as well as along forest verges and roadsides across the plateau. Based on these results, it is recommended that Ecological Niche Modelling approaches be employed to (i) isolate the most important factors influencing bracken fern proliferation and (ii) identify and prioritize areas requiring immediate control interventions, so as to minimize bracken fern proliferation in Nyika National Park.
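
The annual expansion rate quoted above is essentially a linear trend fitted to the classified bracken area per image year. The sketch below uses only the two endpoint areas reported in the abstract, so its two-point slope (about 86 ha/yr) differs slightly from the 87.3 ha/yr estimate, which presumably comes from a regression over all classified image dates.

```python
import numpy as np

# Classified bracken fern area (hectares) per image year; only the 1986 and
# 2016 endpoints are taken from the abstract, intermediate years would come
# from the additional classified Landsat scenes.
years = np.array([1986, 2016])
area_ha = np.array([1788.1, 4361.9])

slope, intercept = np.polyfit(years, area_ha, 1)   # least-squares linear trend
print(f"net increase: {area_ha[-1] - area_ha[0]:.1f} ha over 30 years")
print(f"mean annual increase: {slope:.1f} ha/yr")  # ~86 ha/yr from the two endpoints alone
```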

Keywords: bracken fern, image classification, Landsat-8, Nyika National Park, spatio-temporal distribution

Procedia PDF Downloads 179
1348 Reduplication in Dhiyan: An Indo-Aryan Language of Assam

Authors: S. Sulochana Singha

Abstract:

Dhiyan or Dehan is the name of the community and of the language spoken by the Koch-Rajbangshi people of the Barak Valley of Assam. Ethnically, they are Mongoloids, and their language belongs to the Indo-Aryan language family. However, Dhiyan is absent from existing classifications of Indo-Aryan languages, so its classification under the Indo-Aryan language family is based entirely on the typological features it shares with other Indo-Aryan languages. Typologically, Dhiyan is an agglutinating language, and it shares many features of Indo-Aryan languages, such as the presence of aspirated voiced stops, a non-tonal system, verb-person agreement, adjectives as a distinct word class, prominent tense, and subject-object-verb word order. Reduplication is a productive word-formation process in Dhiyan; besides word formation, it also expresses plurality, intensification, and distribution. Generally, reduplication in Dhiyan operates at the morphological or the lexical level. Morphological reduplication in Dhiyan involves expressives, which include onomatopoeia, sound symbolism, ideophones, and imitatives. Lexical reduplication in the language can be formed by echo formation and word reduplication. Echo formation in Dhiyan involves partial repetition of the base word, through either consonant alternation or vowel alternation; consonant alternation is basically found in onset position, while vowel alternation is basically found in open syllables, particularly in the final syllable. Word reduplication involves the reduplication of nouns, interrogatives, adjectives, and numerals, which can further be class-changing or class-maintaining reduplication. The process of reduplication, whether lexical or morphological, can be partial or complete. The present paper is an attempt to describe some aspects of the formation, function, and usage of reduplication in Dhiyan, which is mainly spoken in ten villages on the eastern side of the Barak River in the Cachar District of Assam.

Keywords: Barak-Valley, Dhiyan, Indo-Aryan, reduplication

Procedia PDF Downloads 217
1347 Hybridization of Manually Extracted and Convolutional Features for Classification of Chest X-Ray of COVID-19

Authors: M. Bilal Ishfaq, Adnan N. Qureshi

Abstract:

COVID-19 is among the most infectious diseases of recent times. It was first reported in Wuhan, the capital city of Hubei province in China, and then spread rapidly throughout the whole world; on 11 March 2020, the World Health Organisation (WHO) declared it a pandemic. Since COVID-19 is highly contagious, it has affected approximately 219M people worldwide and caused 4.55M deaths, which has brought the importance of accurate diagnosis of respiratory diseases such as pneumonia and COVID-19 to the forefront. In this paper, we propose a hybrid approach for the automated detection of COVID-19 using medical imaging, based on the hybridization of manually extracted and convolutional features. Our approach combines Haralick texture features and convolutional features extracted from chest X-rays and CT scans. We also employ a minimum redundancy maximum relevance (MRMR) feature selection algorithm to reduce computational complexity and enhance classification performance. The proposed model is evaluated on four publicly available datasets, including Chest X-ray Pneumonia, COVID-19 Pneumonia, COVID-19 CTMaster, and VinBig data. The results demonstrate high accuracy and effectiveness, with 0.9925 on the Chest X-ray Pneumonia dataset, 0.9895 on the COVID-19, Pneumonia and Normal Chest X-ray dataset, 0.9806 on the COVID CTMaster dataset, and 0.9398 on the VinBig dataset. We further evaluate the effectiveness of the proposed model using ROC curves, where the AUC for the best-performing model reaches 0.96. The proposed model provides a promising tool for the early detection and accurate diagnosis of COVID-19, which can assist healthcare professionals in making informed treatment decisions and improving patient outcomes. The results are plausible, and the system can be deployed in a clinical or research setting to assist in the diagnosis of COVID-19.
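
A sketch of the feature hybridization step: Haralick texture features concatenated with pooled CNN features, then filtered before classification. The backbone (ResNet50), the final SVM classifier, and the mutual-information filter standing in for MRMR (which scikit-learn does not provide) are all assumptions for illustration, and the images are random stand-ins.

```python
import numpy as np
import mahotas
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC

cnn = ResNet50(weights="imagenet", include_top=False, pooling="avg")

def hybrid_features(gray_img_uint8):
    """Concatenate Haralick texture features with pooled ResNet50 features."""
    haralick = mahotas.features.haralick(gray_img_uint8).mean(axis=0)   # 13 texture values
    rgb = np.repeat(gray_img_uint8[..., None], 3, axis=-1).astype("float32")
    deep = cnn.predict(preprocess_input(rgb[None]), verbose=0).ravel()  # 2048 CNN values
    return np.concatenate([haralick, deep])

# Random stand-ins for 224x224 grayscale chest images; labels 0 = normal, 1 = COVID-19.
images = (np.random.rand(16, 224, 224) * 255).astype("uint8")
labels = np.array([0, 1] * 8)

X = np.stack([hybrid_features(img) for img in images])
X_sel = SelectKBest(mutual_info_classif, k=200).fit_transform(X, labels)  # stand-in for MRMR
clf = SVC().fit(X_sel, labels)                                            # classifier choice assumed
print(X.shape, X_sel.shape)
```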

Keywords: COVID-19, feature engineering, artificial neural networks, radiology images

Procedia PDF Downloads 75
1346 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR DATA

Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell

Abstract:

Hedgerows play an important role in a wide range of ecological habitats, in the landscape, and in agricultural management, carbon sequestration, and wood production. Detecting hedgerows accurately from satellite imagery is a challenging remote sensing problem: spatially, a hedge resembles a linear object such as a road, and from a spectral viewpoint it is very similar to forest. Remote sensors with very high spatial resolution (VHR) have recently enabled the automatic detection of hedges by acquiring images with sufficient spectral and spatial resolution. Indeed, VHR remote sensing data now provide the opportunity to detect hedgerows as line features, but difficulties remain in monitoring their characteristics at the landscape scale. This research uses TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015 over a test site in Fermoy County, Ireland, to detect hedgerows. Both channels of the dual-polarization (HH/VV) Spotlight data are used for hedgerow detection. Various SAR image processing techniques are applied in a trial-and-error manner, integrating classification algorithms such as texture analysis, support vector machines, k-means, and random forest, to detect hedgerows and characterize them. Shannon entropy (ShE) and backscattering analysis of single and double bounce in the polarimetric analysis are applied to carry out the object-oriented classification and, finally, to extract the hedgerow network. The work is still in progress, and other methods still need to be applied to identify the best approach for the study area. The preliminary results presented here indicate that polarimetric TerraSAR-X imagery can potentially detect hedgerows.
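
One of the trial-and-error classifiers mentioned above, a random forest applied to per-pixel dual-polarization backscatter plus texture and entropy bands, can be sketched as follows; the feature table and labels are random placeholders rather than the actual TerraSAR-X stack, so the printed scores are meaningless.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Placeholder per-pixel feature table: HH and VV backscatter (dB), a GLCM
# texture measure, and Shannon entropy from the dual-pol decomposition.
# Labels: 1 = hedgerow, 0 = other (road, forest, pasture).
n = 1000
X = np.column_stack([
    rng.normal(-8, 2, n),    # HH backscatter
    rng.normal(-12, 2, n),   # VV backscatter
    rng.random(n),           # texture (e.g. GLCM homogeneity)
    rng.random(n),           # Shannon entropy
])
y = rng.integers(0, 2, n)

rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
rf.fit(X, y)
print("OOB accuracy:", round(rf.oob_score_, 3))
print("feature importances:", np.round(rf.feature_importances_, 3))
```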

Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis

Procedia PDF Downloads 230
1345 Organic Facies Classification, Distribution, and Their Geochemical Characteristics in Sirt Basin, Libya

Authors: Khaled Albriki, Feiyu Wang

Abstract:

The failed rifted epicratonic Sirt basin is located on the northern margin of the African Plate and covers an area of approximately 600,000 km². The classification and characterization of organofacies, and their vertical and horizontal distribution, are carried out in 7 main troughs using 32 selected wells. 7 geological and geochemical cross sections, including Rock-Eval and % TOC data, are considered in order to analyze and characterize the main organofacies with respect to their geochemical and geological controls, and to remove the ambiguity surrounding the types and distributions of organofacies in the basin troughs from which oil and gas are generated and migrate. This study confirms that four classical types of organofacies are distributed in the Sirt basin: F, D/E, C, and B. These four classical organofacies types control the type and amount of hydrocarbons discovered in the Sirt basin. Oil bulk property data from more than 20 oil and gas fields indicate that D/E organofacies are significant oil and gas contributors, similar to B organofacies. In the western Sirt basin, in the Zallah-Dur Al Abd, Hagfa, Kotla, and Dur Atallha troughs, F organofacies is identified in the Etel, Kalash, and Hagfa formations, with % TOC < 0.6, whereas the good-quality D/E and B organofacies present in the Rachmat formation and the Sirte shale formation both have % TOC > 1.1. In the deepest trough (Ajdabiya), the Etel (gas prone in the Whadyat trough), Kalash, and Hagfa formations mainly constitute F organofacies, while the Rachmat and Sirte shale have D/E to B organofacies with % TOC > 1.2, indicating the best organofacies quality in the Ajdabiya trough. In the Maragh trough, results show that the Etel F organofacies and the D/E, C to B organofacies related to the Middle Nubian, Rachmat, and Sirte shale have % TOC > 0.66. Towards the eastern Sirt basin, in the Hameimat, Faregh, and Sarir troughs, results show that the Middle Nubian, Etel, Rachmat, and Sirte shales are strongly dominated by D/E, C to B organofacies (% TOC > 0.75).

Keywords: Etel, Mid-Nubian, organic facies, Rachmat, Sirt basin, Sirte shale

Procedia PDF Downloads 128
1344 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components

Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea

Abstract:

Students' textual feedback can hold unique patterns and useful information about the learning process: it can hold information about the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. The results of analysing such feedback can form a key point for institutions' decision makers to advance and update their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using a part-of-speech (PoS) feature with four machine learning algorithms: support vector machines, decision tree, random forest, and naive Bayes. The proposed framework has two tasks: first, to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets, one tailored to assessment practices (assessment-related) and the other containing the non-assessment-related data; second, to use the same algorithms to build an optimal model for the whole data set and the new data subsets to automatically detect their sentiment. The significance of this paper is in comparing the performance of the above four algorithms using the part-of-speech feature with the performance of the same algorithms using n-gram features. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models, that is, understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithms, interpreting the mined patterns, and consolidating the discovered knowledge. The experiments show that models using either feature performed very well on the first task; on the second task, however, models that used the part-of-speech feature underperformed in comparison with models that used unigram and bigram features.
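
A sketch of the PoS-feature variant of the pipeline: each comment is mapped to its sequence of part-of-speech tags, the tag sequences are vectorized as n-grams, and one of the four classifiers (an SVM here) is trained on them. The feedback comments are invented examples, not the study's corpus.

```python
import nltk
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

nltk.download("averaged_perceptron_tagger", quiet=True)

def pos_sequence(text):
    """Map a feedback comment to its sequence of part-of-speech tags."""
    tags = nltk.pos_tag(text.split())
    return " ".join(tag for _, tag in tags)

# Invented feedback comments labelled assessment-related (1) or not (0).
comments = ["The coursework deadline was too close to the exam",
            "Lecture rooms were cold and crowded",
            "Feedback on the assignment arrived very late",
            "The library opening hours are great"]
labels = [1, 0, 1, 0]

model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),   # n-grams over the PoS tags
    LinearSVC(),
)
model.fit([pos_sequence(c) for c in comments], labels)
print(model.predict([pos_sequence("Marking criteria for the essay were unclear")]))
```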

Keywords: assessment, part of speech, sentiment analysis, student feedback

Procedia PDF Downloads 142
1343 Waste Analysis and Classification Study (WACS) in Ecotourism Sites of Samal Island, Philippines Towards a Circular Economy Perspective

Authors: Reeden Bicomong

Abstract:

Ecotourism activities, though geared towards conservation efforts, still put pressure on the natural state of the environment. An influx of visitors that goes beyond the carrying capacity of an ecotourism site, the wastes generated, and greenhouse gas emissions are just a few of the potential negative impacts of poorly managed ecotourism activities. According to Girard and Nocca (2017), tourism produces many negative impacts because it is configured according to the model of linear economy, operating on a linear model of take, make and dispose (Ellen MacArthur Foundation 2015). With the influx of tourists in an ecotourism area, more wastes are generated, and if unregulated, the natural state of the environment will be at risk. It is in this light that a waste analysis and classification study was conducted in five ecotourism sites of Samal Island, Philippines. The major objective of the study was to analyze the amount and content of wastes generated from ecotourism sites in Samal Island, Philippines and to make recommendations based on a circular economy perspective. Five ecotourism sites in Samal Island, Philippines were identified: Hagimit Falls, Sanipaan Vanishing Shoal, Taklobo Giant Clams, Monfort Bat Cave, and Tagbaobo Community Based Ecotourism. An ocular inspection of each ecotourism site was conducted, and key informant interviews of ecotourism operators and staff were carried out. Wastes generated from these ecotourism sites were analyzed and characterized to come up with recommendations based on the concept of circular economy. Wastes were classified into biodegradables, recyclables, residuals, and special wastes. Regression analysis was conducted to determine whether an increase in the number of visitors corresponds to an increase in the amount of wastes generated. The ocular inspection indicated that each of the five ecotourism sites has its own system of waste collection. All of the sites inspected were found to conduct waste separation at source, since there are different garbage bins for each of the four waste classifications: biodegradables, recyclables, residuals, and special wastes. Furthermore, all five ecotourism sites practice composting of biodegradable wastes and recycling of recyclables, so only residuals are collected by the municipal waste collectors. The key informant interviews revealed that all five ecotourism sites offer mostly nature-based activities such as swimming, diving, sightseeing, bat watching, rice farming experiences, and community living. Among the five ecotourism sites, Sanipaan Vanishing Shoal has the highest average number of weekly visitors and, in the waste assessment conducted, also generated the highest amount of wastes. Further results of the waste analysis revealed that biodegradables constitute the majority of the wastes generated in all five selected ecotourism sites, while special wastes proved to be the least generated, as no waste of this type was observed during the three consecutive weeks in which the WACS was conducted.

Keywords: Circular economy, ecotourism, sustainable development, WACS

Procedia PDF Downloads 220
1342 Identification and Classification of Stakeholders in the Transition to 3D Cadastre

Authors: Qiaowen Lin

Abstract:

The 3D cadastre is an inevitable choice to meet the needs of real cadastral management. Nowadays, more attention is given to the technical aspects of the 3D cadastre, resulting in an imbalance within this field. To fill this research gap, the stakeholder, which has been regarded as the determining factor in cadastral change, is studied here. The Delphi method, Michael rating, and stakeholder mapping are used to identify and classify the stakeholders in the 3D cadastre. It is concluded that project managers should pay more attention to the interests and appeals of the key stakeholders and that different coping strategies should be adopted to facilitate the transition to the 3D cadastre.

Keywords: stakeholders, three dimension, cadastre, transition

Procedia PDF Downloads 290
1341 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases

Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar

Abstract:

The farming community in India, as well as in other parts of the world, is one of the most highly stressed communities, owing to increasing input costs (seeds, fertilizers, pesticides), droughts, and reduced revenue, leading in some cases to farmer suicides. The lack of an integrated farm advisory system in India adds to the farmers' problems. Farmers need the right information during the early stages of the crop lifecycle to prevent damage and loss of revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases using images taken by farmers with their smartphones. The research work leads to building a smart assistant using analytics and big data that could help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, ImageNet classification using convolutional neural networks (CNNs) has been successfully used to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers, and dropout (to avoid overfitting). The models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to adapt the weights learnt on the ImageNet dataset and apply them to crop diseases, which reduces the number of epochs needed for training. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift, and blurring are used to improve accuracy with images taken from farms. Models built using a combination of these techniques are more robust for deployment in the real world. Our model is validated using the tomato crop; in India, tomato is affected by 10 different diseases, and our model achieves an accuracy of more than 95% in correctly classifying them. The main contribution of our research is to create a personal assistant for farmers for managing plant disease; although the model was validated using the tomato crop, it can be easily extended to other crops. The advancement of computing technology and the availability of large data sets have made possible the success of deep learning applications in computer vision, natural language processing, image recognition, etc. With these robust models and huge smartphone penetration, the feasibility of implementing them is high, resulting in timely advice to farmers and thus increasing farmers' income and reducing input costs.
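
A compact sketch of the setup described above: an ImageNet-pretrained backbone reused via transfer learning, rotation/zoom/shift augmentation, dropout, and a sigmoid head for the binary healthy/diseased task. The MobileNetV2 backbone, hyperparameters, and random stand-in batch are assumptions; the paper's own architecture and tomato dataset are not reproduced here.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import MobileNetV2

# Augmentation mirroring the rotation / zoom / shift transforms described.
augment = tf.keras.Sequential([
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.2),
    layers.RandomTranslation(0.1, 0.1),
])

base = MobileNetV2(input_shape=(224, 224, 3), include_top=False,
                   weights="imagenet", pooling="avg")
base.trainable = False                        # transfer learning: reuse ImageNet weights

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),  # map pixel values to [-1, 1]
    augment,
    base,
    layers.Dropout(0.3),                       # dropout against overfitting
    layers.Dense(1, activation="sigmoid"),     # binary: healthy vs. diseased
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random stand-in batch; real training would read the farm photos, e.g. with
# tf.keras.utils.image_dataset_from_directory over healthy/diseased folders.
x = tf.random.uniform((32, 224, 224, 3), 0, 255)
y = tf.random.uniform((32, 1), 0, 2, dtype=tf.int32)
model.fit(x, y, batch_size=8, epochs=1)
```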

Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one shot learning, transfer learning

Procedia PDF Downloads 119
1340 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach

Authors: Helen L. Hein, Joachim Schwarte

Abstract:

As a pillar of sustainable development, ecology has become an important focus of the research community, especially in view of global challenges like climate change. The ecological performance of products can be assessed scientifically with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are attributable to the energy used for building heating. Sustainable construction materials for insulation purposes are therefore essential, and aerogels have been explored intensively in recent years due to their low thermal conductivity. The WALL-ACE project accordingly aims to develop an aerogel-based thermal insulating plaster that achieves very low thermal conductivities. But because in the early development phases a lot of information is still missing or not yet accessible, the ecological performance of innovative products is increasingly based on uncertain data, which can lead to significant deviations in the results. To predict realistically how meaningful the results are and how viable the developed products may be with regard to their respective markets, these deviations have to be considered. Therefore, a classification method is presented in this study that may allow the ecological performance of new products to be compared with already established and competitive materials. To achieve this, an alternative calculation method was used that computes with lower and upper bounds, so that all possible values can be considered even without precise data. The life cycle analysis of the considered products was conducted with this interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that the usability of the results is limited. Nevertheless, further optimization in reducing the environmental impacts of aerogels seems to be needed for them to become more competitive in the future.
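
A minimal sketch of computing with lower and upper bounds as described: each uncertain inventory quantity is carried as an interval, and the impact result is itself an interval. The quantities and emission factors are invented for illustration and are not the WALL-ACE plaster data.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Interval product: take min/max over all corner combinations,
        # which also covers negative bounds.
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return Interval(min(corners), max(corners))

# Invented example: GWP of a plaster mix = sum of (amount x emission factor),
# where the aerogel data are still uncertain in the early development phase.
aerogel_kg  = Interval(0.8, 1.4)     # kg per functional unit
aerogel_gwp = Interval(2.0, 6.5)     # kg CO2-eq per kg (wide, early-stage data)
binder_kg   = Interval(9.5, 10.5)
binder_gwp  = Interval(0.7, 0.9)

total = aerogel_kg * aerogel_gwp + binder_kg * binder_gwp
print(f"GWP per functional unit: [{total.lo:.2f}, {total.hi:.2f}] kg CO2-eq")
```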

Keywords: aerogel-based, insulating material, early development phase, interval arithmetic

Procedia PDF Downloads 142
1339 Impacted Maxillary Canines and Associated Dental Anomalies

Authors: Athanasia Eirini Zarkadi, Despoina Balli, Olga Elpis Kolokitha

Abstract:

Objective: Impacted maxillary canines are a frequent condition and a common reason for patients to seek orthodontic treatment. Their simultaneous presence with dental anomalies raises the question of a possible connection. The aim of this study was to investigate the association of impacted maxillary canines with dental anomalies. Materials and Methods: The files of 874 patients from an orthodontic private practice in Greece were evaluated for the presence of impacted maxillary canines. From this sample, a group of 97 patients (39 males and 58 females) with at least one impacted maxillary canine was selected and constituted the study group (canine impaction group). This group was compared to a control group of 97 patients (42 males and 55 females) without maxillary canine impaction, created by random selection from the initial sample. The impaction diagnosis was made from panoramic radiographs and confirmed at surgery. The association between maxillary canine impaction and dental anomalies was examined with the chi-square test, and a classification tree was created to further investigate the relations between impaction and dental anomalies. The reproducibility of the diagnoses was assessed by re-examining the records of 25 patients two weeks after the first examination. Results: The associated anomalies found were cone-shaped upper lateral incisors and infraocclusion of deciduous molars. The prevalence of distal displacement of the unerupted mandibular second premolar was significantly higher in the canine impaction group (12.4%) than in the control group (7.2%). The classification tree showed that the presence of a cone-shaped maxillary lateral incisor raised the probability of an impacted canine to 83.3%. Conclusions: The presence of cone-shaped maxillary lateral incisors and infraocclusion of deciduous molars can be considered valuable early risk indicators for maxillary canine impaction.
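
The classification-tree step can be sketched as below, predicting impaction from the two binary risk indicators named in the abstract; the toy records are invented and are not the 194 patient files, so the printed tree is only a structural illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy records: [cone-shaped lateral incisor, infraoccluded deciduous molar],
# label 1 = impacted maxillary canine. Invented counts, not the study data.
X = np.array([[1, 0]] * 10 + [[0, 1]] * 8 + [[0, 0]] * 20 + [[1, 1]] * 4)
y = np.array([1] * 9 + [0] * 1 + [1] * 5 + [0] * 3 + [1] * 6 + [0] * 14 + [1] * 4)

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["cone_shaped_incisor", "infraocclusion"]))
```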

Keywords: cone-shaped maxillary lateral incisors, dental anomalies, impacted canines, infraoccluded deciduous molars

Procedia PDF Downloads 148
1338 Image Segmentation Using 2-D Histogram in RGB Color Space in Digital Libraries

Authors: El Asnaoui Khalid, Aksasse Brahim, Ouanan Mohammed

Abstract:

This paper presents an unsupervised color image segmentation method based on a hierarchical analysis of the 2-D histogram in RGB color space. This histogram minimizes the storage space of images and thus facilitates operations between them. The improved segmentation approach shows better identification of objects in a color image while, at the same time, the system remains fast.
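
The 2-D histogram at the core of the method can be computed per channel pair as below (the R-G pair is shown; the bin count is an assumption). The hierarchical analysis described in the paper would then search this and the other channel-pair histograms for dominant modes.

```python
import numpy as np

def rg_histogram_2d(image_rgb, bins=64):
    """2-D histogram over the (R, G) channel pair of an RGB image."""
    r = image_rgb[..., 0].ravel()
    g = image_rgb[..., 1].ravel()
    hist, r_edges, g_edges = np.histogram2d(r, g, bins=bins, range=[[0, 256], [0, 256]])
    return hist

# Random stand-in image; a real image would be loaded with e.g. imageio.
img = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)
h = rg_histogram_2d(img)
print(h.shape, h.sum())   # (64, 64) and 120*160 = 19200 pixels counted
```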

Keywords: image segmentation, hierarchical analysis, 2-D histogram, classification

Procedia PDF Downloads 380
1337 Use of a Business Intelligence Software for Interactive Visualization of Data on the Swiss Elite Sports System

Authors: Corinne Zurmuehle, Andreas Christoph Weber

Abstract:

In 2019, the Swiss Federal Institute of Sport Magglingen (SFISM) conducted a mixed-methods study on the Swiss elite sports system, which yielded a large quantity of research data. In a quantitative online survey, 1151 elite athletes, 542 coaches, and 102 Performance Directors of national sports federations (NF) submitted their perceptions of the national support measures of the Swiss elite sports system. These data provide an essential database for the further development of the Swiss elite sports system. The results were published in a report dividing the findings into 40 Olympic summer and 14 winter sports (Olympic classification). The authors of this paper assume that, in practice, this division is too unspecific to assess where further measures would be needed. The aim of this paper is to find appropriate parameters for data visualization in order to identify disparities in sports promotion, allowing an assessment of where further interventions by Swiss Olympic (the NF umbrella organization) are required. Method: First, the variable 'salary earned from sport' was defined as the variable for measuring the impact of elite sports promotion. This variable was chosen because it is an important indicator of the professionalization of elite athletes and therefore reflects the national-level sports promotion measures applied by Swiss Olympic. The variable salary was then tested for its correlation with the Olympic classification by calculating the Eta coefficient. To estimate appropriate parameters for data visualization, the correlation between salary and four further parameters was analyzed, again by calculating the Eta coefficient: [a] sport; [b] prioritization (from 1 to 5) of the sports by Swiss Olympic; [c] gender; [d] employment level in sports. Results & Discussion: The analyses reveal a very small correlation between salary and Olympic classification (η² = .011, p = .005). Gender demonstrates an even smaller correlation (η² = .006, p = .014). The parameter prioritization correlates with a small effect (η² = .017, p = .001), as does employment level (η² = .028, p < .001). The highest correlation was identified for the parameter sport, with a moderate effect (η² = .075, p = .047). The analyses show that the disparities in sports promotion cannot be determined by a single parameter but are presumably explained by a combination of several parameters. We argue that the possibility of combining parameters for data visualization should be enabled when the analysis is provided to Swiss Olympic for further strategic decision-making. However, the inclusion of multiple parameters massively multiplies the number of graphs and is therefore not suitable for practical use. We therefore suggest applying interactive dashboards for data visualization using Business Intelligence software. Practical & Theoretical Contribution: This contribution provides a first attempt to use Business Intelligence software for strategic decision-making in national-level sports regarding the prioritization of national resources for sports and athletes. It allows parameters with a significant effect to be set as filters. By using filters, parameters can be combined, compared against each other, and set individually for each strategic decision.
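
The reported η² values are correlation ratios: the between-group variance in salary for a grouping parameter divided by the total variance. A minimal sketch with invented salaries grouped by sport:

```python
import numpy as np

def eta_squared(values, groups):
    """Correlation ratio: between-group sum of squares / total sum of squares."""
    values, groups = np.asarray(values, dtype=float), np.asarray(groups)
    grand_mean = values.mean()
    ss_total = ((values - grand_mean) ** 2).sum()
    ss_between = 0.0
    for g in np.unique(groups):
        v = values[groups == g]
        ss_between += len(v) * (v.mean() - grand_mean) ** 2
    return ss_between / ss_total

# Invented salaries (CHF) grouped by sport, purely to show the computation.
salary = [30000, 45000, 38000, 90000, 110000, 20000, 22000, 25000]
sport = ["rowing", "rowing", "rowing", "tennis", "tennis", "judo", "judo", "judo"]
print(round(eta_squared(salary, sport), 3))
```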

Keywords: data visualization, business intelligence, Swiss elite sports system, strategic decision-making

Procedia PDF Downloads 90
1336 The AI Method and System for Analyzing Wound Status in Wound Care Nursing

Authors: Ho-Hsin Lee, Yue-Min Jiang, Shu-Hui Tsai, Jian-Ren Chen, Mei-Yu XU, Wen-Tien Wu

Abstract:

This project presents an AI-based method and system for wound status analysis. The system uses a three-in-one sensor device, combining color, temperature, and a 3D sensor, to analyze wound status and to provide wound information up to 2 mm below the surface, such as redness, heat, and blood circulation information. The system has a 90% accuracy rate, requiring only one manual correction in 70% of cases, with a one-second delay. The system also provides an offline application that allows manual correction of the wound bed range using color-based guidance, estimating wound bed size with 96% accuracy and requiring at most one manual correction in 96% of cases, with a one-second delay. Additionally, AI-assisted wound bed range selection handles 100% of cases without manual intervention, with an accuracy rate of 76%, while AI-based wound tissue type classification achieves an 85.3% accuracy rate over five categories. The AI system also includes similar-case search and expert recommendation capabilities. For AI-assisted wound range selection, the system uses Wi-Fi 6 technology, increasing data transmission speeds by 22 times. The project aims to save up to 64% of the time required for manual wound record keeping and to reduce the estimated time to assess wound status by 96%, with an 80% accuracy rate. Overall, the proposed AI method and system integrate multiple sensors to provide accurate wound information and offer offline and online AI-assisted wound bed size estimation and wound tissue type classification. The system decreases the delay time to one second, reduces the number of manual corrections required, saves time on wound record keeping, and increases data transmission speed, all of which have the potential to significantly improve the efficiency and accuracy of wound care and management.

Keywords: wound status analysis, AI-based system, multi-sensor integration, color-based guidance

Procedia PDF Downloads 115
1335 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks

Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang

Abstract:

Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like convolutional neural networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative adversarial networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a technique called DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved impressive performance metrics, with an accuracy of 0.961, precision of 0.955, recall of 0.970, and an F1-measure of 0.963. These results indicate promising potential for the early detection of diseases in CXR images using this approach.

Keywords: CNN, classification, deep learning, GAN, Resnet50

Procedia PDF Downloads 88
1334 Generating Synthetic Chest X-ray Images for Improved COVID-19 Detection Using Generative Adversarial Networks

Authors: Muneeb Ullah, Dai Shihan, Xiaodong Yang

Abstract:

Deep learning plays a crucial role in identifying COVID-19 and preventing its spread. To improve the accuracy of COVID-19 diagnoses, it is important to have access to a sufficient number of training images of CXRs (chest X-rays) depicting the disease. However, there is currently a shortage of such images. To address this issue, this paper introduces COVID-19 GAN, a model that uses generative adversarial networks (GANs) to generate realistic CXR images of COVID-19, which can be used to train identification models. Initially, a generator model is created that uses digressive channels to generate images of CXR scans for COVID-19. To differentiate between real and fake disease images, an efficient discriminator is developed by combining the dense connectivity strategy and instance normalization. This approach makes use of their feature extraction capabilities on CXR hazy areas. Lastly, the deep regret gradient penalty technique is utilized to ensure stable training of the model. With the use of 4,062 grape leaf disease images, the Leaf GAN model successfully produces 8,124 COVID-19 CXR images. The COVID-19 GAN model produces COVID-19 CXR images that outperform DCGAN and WGAN in terms of the Fréchet inception distance. Experimental findings suggest that the COVID-19 GAN-generated CXR images possess noticeable haziness, offering a promising approach to address the limited training data available for COVID-19 model training. When the dataset was expanded, CNN-based classification models outperformed other models, yielding higher accuracy rates than those of the initial dataset and other augmentation techniques. Among these models, ImagNet exhibited the best recognition accuracy of 99.70% on the testing set. These findings suggest that the proposed augmentation method is a solution to address overfitting issues in disease identification and can enhance identification accuracy effectively.
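
The Fréchet inception distance used above to compare generated and real CXR images reduces to a closed-form distance between two Gaussians fitted to Inception feature vectors. The sketch below shows that final computation only; the feature vectors are random stand-ins (and lower-dimensional than Inception's 2048 features) rather than real network activations.

```python
import numpy as np
from scipy import linalg

def frechet_distance(feats_real, feats_fake):
    """FID between two sets of feature vectors:
    ||mu_r - mu_f||^2 + Tr(C_r + C_f - 2 (C_r C_f)^(1/2))."""
    mu_r, mu_f = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    c_r = np.cov(feats_real, rowvar=False)
    c_f = np.cov(feats_fake, rowvar=False)
    covmean, _ = linalg.sqrtm(c_r @ c_f, disp=False)
    covmean = covmean.real                      # drop tiny imaginary parts
    return float(((mu_r - mu_f) ** 2).sum() + np.trace(c_r + c_f - 2 * covmean))

# Random stand-ins for Inception pool features of real vs. generated CXRs.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(256, 64))     # 64-d instead of 2048-d for speed
fake = rng.normal(0.1, 1.1, size=(256, 64))
print(round(frechet_distance(real, fake), 2))
```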

Keywords: classification, deep learning, medical images, CXR, GAN.

Procedia PDF Downloads 96
1333 Better Defined WHO International Classification of Disease Codes for Relapsing Fever Borreliosis, and Lyme Disease Education Aiding Diagnosis, Treatment Improving Human Right to Health

Authors: Mualla McManus, Jenna Luche Thaye

Abstract:

World Health Organisation International Classification of Diseases (ICD) codes were created to define diseases, including infections, in order to guide and educate diagnosticians. Most infectious diseases, such as syphilis, are clearly defined by their ICD-10 codes, which help educate clinicians in syphilis diagnosis and treatment globally. However, the current ICD-10 codes for relapsing fever borreliosis and Lyme disease are less clearly defined and can impede appropriate diagnosis, especially if the clinician is not familiar with the symptoms of these infectious diseases, despite the substantial number of scientific articles about relapsing fever and Lyme disease published in peer-reviewed journals. In the USA, an estimated 380,000 people contract Lyme disease annually, more cases than breast cancer and six times the number of HIV/AIDS cases. This represents an estimated 0.09% of the USA population. If extrapolated to the global population (7 billion), 0.09% equates to roughly 6.3 million people contracting relapsing fever or Lyme disease. In many regions, the rate of contracting some form of infection from a tick bite may be even higher. Without accurate and appropriate diagnostic codes, physicians are impeded in their ability to properly care for their patients, leaving those patients invisible and marginalized within the medical system and to those guiding public policy. This results in great personal hardship, pain, disability, and expense, and unnecessarily burdens health care systems, governments, families, and society as a whole. With accurate diagnostic codes in place, robust data can guide medical and public health research and health policy, track mortality, and save health care dollars. Better defined ICD codes are the way forward in educating diagnosticians about relapsing fever and Lyme disease.

Keywords: WHO ICD codes, relapsing fever, Lyme diseases, World Health Organisation

Procedia PDF Downloads 193
1332 Landslide and Liquefaction Vulnerability Analysis Using Risk Assessment Analysis and Analytic Hierarchy Process Implication: Suitability of the New Capital of the Republic of Indonesia on Borneo Island

Authors: Rifaldy, Misbahudin, Khalid Rizky, Ricky Aryanto, M. Alfiyan Bagus, Fahri Septianto, Firman Najib Wibisana, Excobar Arman

Abstract:

Indonesia has a high level of disaster risk because it lies on the Pacific Ring of Fire and contains regions where three major tectonic plates meet. Disaster analysis must therefore be carried out continuously to assess the hazards that may occur; this research focuses on landslides and liquefaction. The study analyzes areas that are vulnerable to landslide and liquefaction hazards and relates the results to the assessment of moving the new capital of the Republic of Indonesia to the island of Kalimantan, which has a total area of 612,267.22 km². The analysis uses the Analytic Hierarchy Process (AHP) with consistency ratio testing to decompose a complex, unstructured problem into several parameters to which values are assigned. The parameters used in this analysis are slope, land cover, lithology distribution, wetness index, earthquake data, and peak ground acceleration. A weighted overlay of all these parameters was carried out using the percentage weights obtained from the AHP, with their accuracy confirmed through the consistency ratio, so that the percentage of the area in each vulnerability class was obtained. The analysis yields a vulnerability classification ranging from high to very low: high vulnerability covers 918.40 km² (0.15%), medium 127,045.45 km² (20.75%), low 346,175.89 km² (56.54%), and very low 138,127.48 km² (22.56%). This research is expected to map landslide and liquefaction hazards on the island of Kalimantan and to inform the assessment of the suitability of developing the new capital of the Republic of Indonesia there. The approach can also be applied to other regions analyzing landslide and liquefaction vulnerability or the suitability of regional development.
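
As an illustration of the AHP weighting and consistency check described above, the following Python sketch derives priority weights from a pairwise comparison matrix and computes the consistency ratio. The example matrix, the choice of four parameters, and Saaty's random index values are illustrative assumptions, not the weights used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four of the parameters
# (slope, land cover, lithology, peak ground acceleration) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 3.0, 1/2],
    [1/5, 1/3, 1.0, 1/4],
    [1/2, 2.0, 4.0, 1.0],
])

# Priority weights: principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
cr = ci / ri

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))  # CR < 0.10 is usually acceptable
```

Each raster layer is then reclassified and combined in a weighted overlay using these weights; a consistency ratio below 0.10 is conventionally taken to mean the pairwise judgments are acceptably consistent.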

Keywords: analytic hierarchy process, Borneo Island, landslide and liquefaction, vulnerability analysis

Procedia PDF Downloads 176
1331 A Convolution Neural Network Approach to Predict Pes-Planus Using Plantar Pressure Mapping Images

Authors: Adel Khorramrouz, Monireh Ahmadi Bani, Ehsan Norouzi, Morvarid Lalenoor

Abstract:

Background: Plantar pressure distribution measurement has long been used to assess foot disorders. Plantar pressure is an important component of foot and ankle function, and changes in its distribution can indicate various foot and ankle disorders. Morphologic and mechanical properties of the foot may be important factors affecting the plantar pressure distribution. Accurate and early measurement may help to reduce the prevalence of pes planus. With recent developments in technology, new techniques such as machine learning have been used to assist clinicians in identifying patients with foot disorders. Significance of the study: This study proposes a neural-network-based flat foot classification methodology using static foot pressure distribution. Methodologies: Data were collected from 895 patients who were referred to a foot clinic due to foot disorders. Patients with pes planus were labeled by an experienced physician based on clinical examination. All subjects (with and without pes planus) were then evaluated for static plantar pressure distribution. Patients who were diagnosed with flat foot in both feet were included in the study. In the next step, the leg length was normalized and the network was trained on the plantar pressure mapping images. Findings: Of the 895 images, 581 were labeled as pes planus. A convolutional neural network (CNN) was trained to evaluate the performance of the proposed model, and the prediction model was derived through the proposed methodology. In the basic CNN model, the training accuracy was 79.14% and the test accuracy was 72.09%. Conclusion: This model can be used easily by patients and doctors to classify pes planus and to prescreen for possible musculoskeletal disorders related to this condition. However, more models need to be considered and compared to reach higher accuracy.
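
A minimal sketch of a CNN classifier for plantar pressure maps is shown below, written in PyTorch; the input size, layer widths, and two-class output are assumptions for illustration and do not reproduce the authors' architecture or training setup.

```python
import torch
import torch.nn as nn

class PressureMapCNN(nn.Module):
    """Binary classifier (pes planus vs. normal) for plantar pressure images."""
    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)  # two classes: pes planus / normal

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)
        return self.classifier(x)

# Example forward pass on a batch of 8 single-channel 64x64 pressure maps.
model = PressureMapCNN()
logits = model(torch.randn(8, 1, 64, 64))
print(logits.shape)  # torch.Size([8, 2])
```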

Keywords: foot disorder, machine learning, neural network, pes planus

Procedia PDF Downloads 360
1330 Remote Sensing and Geographic Information Systems for Identifying Water Catchment Areas on the Northwest Coast of Egypt for Sustainable Agricultural Development

Authors: Mohamed Aboelghar, Ayman Abou Hadid, Usama Albehairy, Asmaa Khater

Abstract:

Sustainable agricultural development of the desert areas of Egypt under the pressure of irrigation water scarcity is a significant national challenge. Existing water harvesting techniques on the northwest coast of Egypt do not ensure the optimal use of rainfall for agricultural purposes. Basin-scale hydrology potentialities were studied to investigate how the available annual rainfall could be used to increase agricultural production. All data related to agricultural production were included in the form of geospatial layers. Thematic classification of Sentinel-2 imagery was carried out to produce the land cover and crop maps following the FAO system of land cover classification. Contour lines and spot height points were used to create a digital elevation model (DEM). The DEM was then used to delineate basins, sub-basins, and water outlet points using the Soil and Water Assessment Tool (Arc SWAT). The main soil units of the study area were identified from Land Master Plan maps, and climatic data were collected from existing official sources. The amounts of precipitation, surface water runoff, and potential and actual evapotranspiration for the years 2004 to 2017 are presented as Arc SWAT outputs. The land cover map showed that the two tree crops (olive and fig) cover 195.8 km², while the herbaceous crops (barley and wheat) cover 154 km². The maximum elevation was 250 meters above sea level, while the lowest point was 3 meters below sea level. The study area receives a highly variable amount of precipitation; however, existing water harvesting methods are inadequate for storing this water for agricultural purposes.
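
Basin and sub-basin delineation in the study is handled by Arc SWAT; purely to illustrate the underlying idea of deriving drainage structure from a DEM, the sketch below computes D8 flow directions for a toy elevation array in Python. The DEM values and the simple steepest-descent rule are assumptions and do not reproduce the Arc SWAT workflow.

```python
import numpy as np

def d8_flow_directions(dem: np.ndarray) -> np.ndarray:
    """Return, for each interior cell, the index (0-7) of the steepest
    downslope neighbour (D8 rule), or -1 for pits and flat cells."""
    # Eight neighbours: E, SE, S, SW, W, NW, N, NE with their distances.
    offsets = [(0, 1), (1, 1), (1, 0), (1, -1),
               (0, -1), (-1, -1), (-1, 0), (-1, 1)]
    dists = np.array([1, np.sqrt(2), 1, np.sqrt(2),
                      1, np.sqrt(2), 1, np.sqrt(2)])
    rows, cols = dem.shape
    flowdir = np.full((rows, cols), -1, dtype=int)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            drops = np.array([dem[r, c] - dem[r + dr, c + dc]
                              for dr, dc in offsets]) / dists
            if drops.max() > 0:
                flowdir[r, c] = int(drops.argmax())
    return flowdir

# Toy 5x5 DEM sloping towards the lower-right corner.
dem = np.array([[10, 9, 8, 7, 6],
                [ 9, 8, 7, 6, 5],
                [ 8, 7, 6, 5, 4],
                [ 7, 6, 5, 4, 3],
                [ 6, 5, 4, 3, 2]], dtype=float)
print(d8_flow_directions(dem))
```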

Keywords: water catchments, remote sensing, GIS, sustainable agricultural development

Procedia PDF Downloads 114
1329 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging

Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi

Abstract:

Introduction: Thyroid nodules have an incidence of 33-68% in the general population, and more than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and allow optimal treatment. Among medical imaging methods, ultrasound is the technique of choice for the assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules on ultrasound imaging, in order to support reliable diagnosis and monitoring of thyroid nodules in their early stages without the need for biopsy. Material and Methods: The thyroid ultrasound image database consists of 70 patients (26 benign and 44 malignant) reported by a radiologist and proven by biopsy. Two slices per patient were loaded into Mazda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within each ROI were normalized according to three schemes: N1, default or original gray levels; N2, dynamic intensity limited to µ ± 3σ; and N3, intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI and per normalization scheme were computed with the well-known statistical methods implemented in the Mazda software. From a statistical point of view, not all calculated texture feature parameters are useful for texture analysis, so the set was reduced to the 10 most effective features per normalization scheme based on the maximum Fisher coefficient and on the minimum probability of classification error combined with average correlation coefficients (POE+ACC). These features were analyzed under two standardization states, standard (S) and non-standard (NS), with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Non-Linear Discriminant Analysis (NDA). A 1-NN classifier was used to distinguish between benign and malignant tumors. Confusion matrix and receiver operating characteristic (ROC) curve analyses were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and feature reduction methods on the discriminative power of the obtained features and on the classification results. The feature subset selected under 1%-99% normalization, POE+ACC reduction, and NDA yielded high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, specificity of 100%, and accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method that can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules.
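
The processing chain described above (feature standardization, selection of a small feature subset, discriminant projection, and 1-NN classification) can be sketched in Python with scikit-learn as follows. The synthetic feature matrix and the use of a univariate F-test selector in place of Mazda's Fisher and POE+ACC criteria are assumptions for illustration only.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for the texture feature matrix: 70 nodules x 270 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(70, 270))
y = np.array([0] * 26 + [1] * 44)   # 26 benign, 44 malignant
X[y == 1, :10] += 1.0               # give the first 10 features some class signal

# Standardize, keep the 10 "best" features, project with LDA, classify with 1-NN.
pipe = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=10),
    LinearDiscriminantAnalysis(),
    KNeighborsClassifier(n_neighbors=1),
)

pred = cross_val_predict(pipe, X, y, cv=5)
print(confusion_matrix(y, pred))
print("accuracy:", (pred == y).mean())
```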

Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA

Procedia PDF Downloads 279
1328 An Algebraic Geometric Imaging Approach for Automatic Dairy Cow Body Condition Scoring System

Authors: Thi Thi Zin, Pyke Tin, Ikuo Kobayashi, Yoichiro Horii

Abstract:

Today, dairy farm experts and farmers recognize the importance of the dairy cow Body Condition Score (BCS), since these scores can be used to optimize milk production, manage the feeding system, indicate health abnormalities, and even help manage healthy calving times and processes. Traditionally, BCS measurements are made by animal experts or trained technicians based on visual observations focusing on the pin bones, the pin, thurl and hook areas, tail head shapes, hook angles, and the short and long ribs. Because the traditional technique is manual and subjective, it can produce inconsistent scores and is not cost effective. This paper therefore proposes an algebraic geometric imaging approach for an automatic dairy cow BCS system. The proposed system consists of three functional modules. In the first module, significant landmarks or anatomical points are automatically extracted from the cow image region by using image processing techniques. Specifically, there are 23 anatomical points in the regions of the ribs, hook bones, pin bone, thurl, and tail head, extracted using block-region-based vertical and horizontal histogram methods. According to animal experts, the body condition scores depend mainly on the shape structure of these regions. Therefore, the second module investigates algebraic and geometric properties of the extracted anatomical points. Specifically, a second-order polynomial regression is fitted to a subset of anatomical points to produce regression coefficients that are used as part of the feature vector in the scoring process. In addition, the angles at the thurl, pin, tail head, and hook bone areas are computed to extend the feature vector; a sketch of this feature construction is given below. Finally, in the third module, the extracted feature vectors are used to train a Markov classification process that assigns a BCS to each individual cow. The assigned scores are then revised using a multiple regression method to produce the final BCS for each dairy cow. To confirm the validity of the proposed method, a monitoring video camera was set up at the rotary milking parlor to take top-view images of cows. The proposed method extracts the key anatomical points and the corresponding feature vectors for each individual cow, and the multiple regression calculator and Markov chain classification process are then used to produce the estimated body condition score for each cow. Experimental results on 100 dairy cows from a self-collected dataset and a public benchmark dataset are very promising, with an accuracy of 98%.
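
The following sketch illustrates the kind of feature construction used in the second module: a second-order polynomial fit over a subset of anatomical points plus an angle at a landmark. The coordinate values and helper names are hypothetical and only indicate the form of the feature vector, not the authors' exact implementation.

```python
import numpy as np

def polynomial_features(points: np.ndarray) -> np.ndarray:
    """Fit y = a*x^2 + b*x + c to a set of (x, y) anatomical points and
    return the coefficients (a, b, c) as shape features."""
    x, y = points[:, 0], points[:, 1]
    return np.polyfit(x, y, deg=2)

def angle_at(p_center: np.ndarray, p_a: np.ndarray, p_b: np.ndarray) -> float:
    """Angle (degrees) at p_center formed by the segments to p_a and p_b."""
    v1, v2 = p_a - p_center, p_b - p_center
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# Hypothetical image coordinates (pixels) for a few landmarks.
hook_line = np.array([[120, 80], [140, 74], [160, 72], [180, 75], [200, 82]])
tail_head, pin_left, pin_right = np.array([210, 60]), np.array([190, 90]), np.array([230, 90])

features = np.concatenate([
    polynomial_features(hook_line),               # shape of the hook-bone region
    [angle_at(tail_head, pin_left, pin_right)],   # angle at the tail head
])
print(features)
```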

Keywords: algebraic geometric imaging approach, body condition score, Markov classification, polynomial regression

Procedia PDF Downloads 158