Search results for: λ-levelwise statistical cluster points
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6838


6478 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction

Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun

Abstract:

Usability has become a basic requirement from the consumer's perspective, and a product that fails to meet it risks being abandoned by the customer. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the product design process, yet the lack of studies on analysis methodologies for qualitative text data in the usability field limits the potential of these data for more useful applications. At the same time, rapid developments in data analysis, such as natural language processing for understanding human language computationally and machine learning for predictive modeling and clustering, have made the analysis of qualitative text data feasible. Therefore, this research studies the capability of text-processing algorithms in the analysis of qualitative text data collected from usability activities. It utilizes datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text-processing algorithm, includes training the comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of the comment-vector clustering. The result shows 'volume and music control button' as the usability feature that matches best with the clusters of comment vectors: the centroid comments of one cluster emphasized button positions, while those of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, participants experienced less confusion, and their comments mentioned only the buttons' positions. When the volume and music control buttons were designed as a single button, participants experienced interface issues such as the operating methods of functions and confusion between the functions' buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text-processing algorithms in analyzing qualitative text data from usability testing and evaluations.
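As a rough illustration of the pipeline this abstract describes, the sketch below embeds a few hypothetical headset comments with TF-IDF and clusters them with k-means, then reports the comment nearest each centroid. The vectorizer, cluster count, and comments are assumptions for demonstration, not the authors' actual setup (which also labeled vectors with subject and product data).

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Hypothetical survey comments (illustrative, not from the study's data)
comments = [
    "volume button is hard to reach behind the neckband",
    "music control button placed too far back",
    "confusing which button changes volume versus track",
    "single button for volume and music is confusing to operate",
]

X = TfidfVectorizer().fit_transform(comments).toarray()   # comments -> vector space
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# The comment closest to each centroid characterizes its cluster
for c in range(2):
    members = np.where(km.labels_ == c)[0]
    dists = ((X[members] - km.cluster_centers_[c]) ** 2).sum(axis=1)
    print(c, comments[members[int(np.argmin(dists))]])
```

With real data, the centroid comments would be inspected against extracted usability features, as the abstract describes.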

Keywords: usability, qualitative data, text-processing algorithm, natural language processing

Procedia PDF Downloads 260
6477 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge that calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
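The k-mer representation step mentioned above can be sketched as a simple sliding-window count; the sequence and k below are illustrative, not drawn from the MTB data.

```python
from collections import Counter

def kmer_counts(seq: str, k: int) -> Counter:
    """Count all overlapping k-mers (substrings of length k) in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = kmer_counts("ATGCGATGA", 3)
print(counts["ATG"])  # "ATG" occurs at positions 0 and 5 -> 2
```

A genome is thus reduced to a fixed vocabulary of k-mer frequencies, which is what the classification models consume; larger k yields a richer (but much larger) feature space.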

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 140
6476 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge that calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes and that the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 128
6475 Coping Strategies among Caregivers of Children with Autism Spectrum Disorders: A Cluster Analysis

Authors: Noor Ismael, Lisa Mische Lawson, Lauren Little, Murad Moqbel

Abstract:

Background/Significance: Caregivers of children with Autism Spectrum Disorders (ASD) develop coping mechanisms to overcome daily challenges and successfully parent their child. Coping strategies vary among caregivers of children with ASD, and capturing homogeneity within such variable groups may help elucidate targeted intervention approaches. Study Purpose: This study aimed to identify groups of caregivers of children with ASD based on coping mechanisms, and to examine whether these groups differ in strain level. Methods: This study was a secondary data analysis of survey responses from 273 caregivers of children with ASD. Measures consisted of the COPE Inventory and the Caregiver Strain Questionnaire. Data analyses consisted of a cluster analysis to group caregiver coping strategies and an analysis of variance to compare the caregiver coping groups on strain level. Results: The cluster analysis showed four distinct groups with different combinations of coping strategies: Social-Supported/Planning (group one), Spontaneous/Reactive (group two), Self-Supporting/Reappraisal (group three), and Religious/Expressive (group four). Caregivers in group one (Social-Supported/Planning) demonstrated significantly higher levels of planning, use of instrumental social support, and use of emotional social support relative to the other three groups. Caregivers in group two (Spontaneous/Reactive) used less restraint and less suppression of competing activities relative to the other three groups, and showed significantly lower levels of religious coping. In contrast to group one, caregivers in group three (Self-Supporting/Reappraisal) demonstrated significantly lower levels of the use of instrumental and emotional social support relative to the other three groups, and showed more acceptance and positive reinterpretation and growth coping strategies. Caregivers in group four (Religious/Expressive) demonstrated significantly higher levels of religious coping relative to the other three groups and utilized more venting-of-emotions strategies. The analysis of variance showed no significant differences between the four groups on the strain scores. Conclusions: There are four distinct caregiver groups with different combinations of coping strategies: Social-Supported/Planning, Spontaneous/Reactive, Self-Supporting/Reappraisal, and Religious/Expressive. Each group engaged in a combination of coping strategies to overcome the strain of caregiving.
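A minimal sketch of this cluster-then-compare design follows, with synthetic stand-ins for the COPE subscale and strain scores; the cluster count of four matches the study, but the clustering method, subscale count, and data are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
coping = rng.normal(size=(273, 15))   # 273 caregivers x 15 coping subscales (synthetic)
strain = rng.normal(size=273)         # caregiver strain scores (synthetic)

# Group caregivers by coping profile
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(coping)

# One-way ANOVA: does strain differ across the four coping clusters?
groups = [strain[labels == g] for g in range(4)]
stat, p = f_oneway(*groups)
print(f"F = {stat:.2f}, p = {p:.3f}")
```

With purely random data, as here, no group difference in strain is expected, mirroring the study's null ANOVA result.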

Keywords: autism, caregivers, cluster analysis, coping strategies

Procedia PDF Downloads 265
6474 Closed Urban Block versus Open Housing Estates Structures: Sustainability Surveys in Brno, Czech Republic

Authors: M. Wittmann, G. Kopacik, A. Leitmannova

Abstract:

A prominent place in the spatial arrangement of Czech as well as other post-socialist Central European cities belongs to 19th-century closed urban blocks and to the open concrete-panel housing estates erected during the socialist era in the second half of the 20th century. The characteristics of these two fundamentally different types of residential structures have, as we suppose, different impacts on the sustainable development of the urban area. They may influence the ecological stability of the area, its hygienic qualities, the intensity and manner of its use by various social groups, and also, e.g., real estate prices. These and many other phenomena indicate the environmental, social, and economic sustainability of the urban area. The proposed research methodology assesses specific indicators of sustainability on a scale from 0 to 10 points: 5 points correspond to the general standard in the area, 0 points indicate degradation, and 10 points indicate the highest contribution to sustainable development. The survey results are reflected in an overall sustainability index and in a residents' satisfaction index. The paper analyses the residential structures in the Central European city of Brno, Czech Republic, comparing case studies of the urban blocks near the city centre and of the Brno-Vinohrady housing estate. The results imply that a considerably more positive impact on the sustainable development of the area should be ascribed to the closed urban blocks near the city centre.

Keywords: City of Brno, closed urban block, open housing estate, urban structure

Procedia PDF Downloads 154
6473 Lyapunov Functions for Extended Ross Model

Authors: Rahele Mosleh

Abstract:

This paper surveys results on the global stability of the extended Ross model for malaria by constructing Lyapunov functions for two epidemic cases: the disease-free and the endemic case. The model is a nonlinear seven-dimensional system of ordinary differential equations that simulates the phenomenon in a more realistic fashion. We discuss the existence of positive disease-free and endemic equilibrium points of the model. It is shown that the extended Ross model possesses invariant solutions for humans and mosquitoes in a specific domain of the system.

Keywords: global stability, invariant solutions, Lyapunov function, stationary points

Procedia PDF Downloads 144
6472 A New Complex Method for Integrated Warehouse Design in Aspect of Dynamic and Static Capacity

Authors: Tamas Hartvanyi, Zoltan Andras Nagy, Miklos Szabo

Abstract:

Dynamic and static capacity are two opposing aspects of warehouse design. Static capacity optimization aims to maximize the space used for storing goods, while dynamic capacity requires more free space for handling them; the two conflict in building structure and in area utilization. According to the Pareto principle, 80% of the goods represent 20% of the variety. It is therefore worth storing the large volume of identical products compactly, filling the space with minimal corridors, while the remaining 20% of the goods, which carry 80% of the variety of the whole range, are more important to keep quickly reachable than to store space-efficiently, which worsens the space utilization figures. In present practice, warehouse design decisions are made from intuitive and empirical impressions, and the planning method is tailored to one selected technology, making the warehouse structure homogeneous. Such a result cannot be optimal for inhomogeneous demands. This paper introduces a new model, based on our research, that describes the technical capacities and makes it possible to define an optimal cluster of technologies. Applying this cluster, space utilization and dynamic operation can be optimized together.

Keywords: warehouse, warehouse capacity, warehouse design method, warehouse optimization

Procedia PDF Downloads 109
6471 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis

Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai

Abstract:

The purpose of this study is to forecast the influences of Information and Communication Technology (ICT) on the structural changes of the Japanese economy based on Leontief Input-Output (IO) coefficients. This study establishes a statistical analysis to predict the future interrelationships among industries. We employ the Constrained Multivariate Regression (CMR) model to analyze the historical changes of input-output coefficients; statistical significance of the model is then tested by the Likelihood Ratio Test (LRT). In our model, ICT is represented by two explanatory variables, i.e., computers (including main parts and accessories) and telecommunications equipment. A previous study, which analyzed the influences of these variables on the structural changes of Japanese industrial sectors from 1985 to 2005, concluded that they had significant influences on the changes in the business circumstances of the Japanese commerce, business services and office supplies, and personal services sectors. The future Japanese economic structure projected from this forecast reveals the differentiated direct and indirect outcomes of ICT penetration.

Keywords: forecast, ICT, industrial structural changes, statistical analysis

Procedia PDF Downloads 355
6470 Automatic Seizure Detection Using Weighted Permutation Entropy and Support Vector Machine

Authors: Noha Seddik, Sherine Youssef, Mohamed Kholeif

Abstract:

The automated epileptic seizure detection research field has emerged in recent years, analyzing electroencephalogram (EEG) signals instead of relying on the traditional visual inspection performed by expert neurologists. In this study, a Support Vector Machine (SVM) that uses Weighted Permutation Entropy (WPE) as the input feature is proposed for classifying normal and seizure EEG records. WPE is a modified statistical parameter of permutation entropy (PE) that measures the complexity and irregularity of a time series. It incorporates both the mapped ordinal pattern of the time series and the information contained in the amplitude of its sample points. The proposed system exploits the fact that entropy-based measures for EEG segments during an epileptic seizure are lower than those for normal EEG.
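A hedged sketch of weighted permutation entropy (embedding order m, delay 1), following the general definition of weighting each window's ordinal pattern by the window's variance; the exact weighting and parameters used in the study may differ.

```python
import numpy as np

def weighted_permutation_entropy(x, m=3):
    """WPE: ordinal-pattern entropy where each window is weighted by its variance."""
    x = np.asarray(x, dtype=float)
    weights = {}
    total = 0.0
    for i in range(len(x) - m + 1):
        w = x[i:i + m]
        pattern = tuple(np.argsort(w))   # ordinal pattern of the window
        weight = np.var(w)               # amplitude-based weight
        weights[pattern] = weights.get(pattern, 0.0) + weight
        total += weight
    p = np.array([v / total for v in weights.values()])
    return float(-(p * np.log2(p)).sum() + 0.0)

rng = np.random.default_rng(1)
print(weighted_permutation_entropy(np.arange(100)))        # monotonic: single pattern, entropy 0
print(weighted_permutation_entropy(rng.normal(size=200)))  # irregular series: entropy > 0
```

Low WPE on seizure segments versus normal EEG is the separation the SVM classifier exploits.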

Keywords: electroencephalogram (EEG), epileptic seizure detection, weighted permutation entropy (WPE), support vector machine (SVM)

Procedia PDF Downloads 346
6469 An Adjusted Network Information Criterion for Model Selection in Statistical Neural Network Models

Authors: Christopher Godwin Udomboso, Angela Unna Chukwu, Isaac Kwame Dontwi

Abstract:

In selecting a statistical neural network model, the Network Information Criterion (NIC) has been observed to be sample-biased because it does not account for sample size. The selection of a model from a set of fitted candidate models requires objective, data-driven criteria. In this paper, we derive and investigate the Adjusted Network Information Criterion (ANIC), based on Kullback's symmetric divergence, which is designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The analyses show that, in general, the ANIC improves model selection over the NIC across a wider range of sample sizes.
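The ANIC derivation itself is given in the paper; as a familiar analogue of a sample-size-adjusted criterion, the sketch below contrasts AIC with its small-sample correction AICc for a Gaussian model, showing how the adjustment penalizes extra parameters more heavily at small n. This illustrates the idea of correcting a criterion for sample size; it is not the ANIC formula.

```python
import numpy as np

def aic_gaussian(rss, n, k):
    """AIC for a Gaussian model with residual sum of squares rss and k parameters."""
    return n * np.log(rss / n) + 2 * k

def aicc_gaussian(rss, n, k):
    """AICc adds the small-sample correction 2k(k+1)/(n-k-1)."""
    return aic_gaussian(rss, n, k) + 2 * k * (k + 1) / (n - k - 1)

n = 12                                  # small sample: the correction matters
rss_simple, k_simple = 4.0, 2           # simpler model, worse fit (illustrative)
rss_complex, k_complex = 3.5, 6         # more parameters, slightly better fit

# The complex model's penalty gap widens under AICc
print(aic_gaussian(rss_complex, n, k_complex) - aic_gaussian(rss_simple, n, k_simple))
print(aicc_gaussian(rss_complex, n, k_complex) - aicc_gaussian(rss_simple, n, k_simple))
```

The same logic motivates adjusting the NIC: an unadjusted criterion can favor over-parameterized networks when the training sample is small.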

Keywords: statistical neural network, network information criterion, adjusted network, information criterion, transfer function

Procedia PDF Downloads 538
6468 Enhancing Secondary School Mathematics Retention with Blended Learning: Integrating Concepts for Improved Understanding

Authors: Felix Oromena Egara, Moeketsi Mosia

Abstract:

The study aimed to evaluate the impact of blended learning on mathematics retention among secondary school students. Conducted in the Isoko North Local Government Area of Delta State, Nigeria, the research involved 1,235 senior class one (SS 1) students. Employing a non-equivalent control group pre-test-post-test quasi-experimental design, a sample of 70 students was selected from two secondary schools with ICT facilities through purposive sampling. Random allocation of students into experimental and control groups was achieved through balloting within each selected school. The investigation included three assessment points: pre-Mathematics Achievement Test (MAT), post-MAT, and post-post-MAT (retention), administered systematically by the researchers. Data collection utilized the established MAT instrument, which demonstrated a high reliability score of 0.86. Statistical analysis was conducted using the Statistical Package for Social Sciences (SPSS) version 28, with mean and standard deviation addressing study questions and analysis of covariance scrutinizing hypotheses at a significance level of .05. Results revealed significantly greater improvements in mathematics retention scores among students exposed to blended learning compared to those instructed through conventional methods. Moreover, noticeable differences in mean retention scores were observed, with male students in the blended learning group exhibiting notably higher performance. Based on these findings, recommendations were made, advocating for mathematics educators to integrate blended learning, particularly in geometry teaching, to enhance students’ retention of mathematical concepts.

Keywords: blended learning, flipped classroom model, secondary school students, station rotation model

Procedia PDF Downloads 13
6467 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method

Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter

Abstract:

This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Some previous research has suggested the robustness of this learning mechanism throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study included three distinct groups of participants, including 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with different spatial configurations and orientations (horizontal, vertical, and oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 min. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the most familiar pair from 48 trials, each consisting of a base pair and a non-base pair. Behavioral analysis using t-tests revealed notable findings. The mean score for the first group was significantly above chance, indicating the presence of visual statistical learning. Similarly, the second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old adult group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase for all groups. 
The main difference was that older participants fixated empty cells more often than younger participants did, likely reflecting a decline in the ability to ignore irrelevant information, which would account for the decrease in statistical learning performance.
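The above-chance comparison reported for each group can be sketched as a one-sample t-test of 2AFC accuracies against the 0.5 chance level; the accuracy values below are synthetic stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(0)
# e.g. 21 young adults whose mean 2AFC accuracy sits above chance (synthetic)
accuracies = rng.normal(loc=0.58, scale=0.08, size=21).clip(0.0, 1.0)

# One-sided test: is mean accuracy greater than the 0.5 chance level?
stat, p = ttest_1samp(accuracies, popmean=0.5, alternative="greater")
print(f"t = {stat:.2f}, one-sided p = {p:.4f}")
```

A non-significant result, as found for the old-old group, means the mean score cannot be distinguished from chance performance.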

Keywords: aging, eye tracking, implicit learning, visual statistical learning

Procedia PDF Downloads 54
6466 Study of Interplanetary Transfer Trajectories via Vicinity of Libration Points

Authors: Zhe Xu, Jian Li, Lvping Li, Zezheng Dong

Abstract:

This work studies an optimized strategy for transfers between Earth and Mars via the vicinity of libration points, which play an increasingly important role in trajectory design for deep space missions and can serve as an effective alternative to a direct Earth-Mars transfer in some unusual cases. The vicinities of the libration points of sun-planet systems are becoming potential gateways for future interplanetary transfer missions. By adding fuel to cargo spaceships located at such spaceports, an interplanetary round-trip exploration shuttle of this kind can also become a reusable transportation system. In addition, a spacecraft cruising along invariant manifolds can in some cases save a large amount of fuel. It is therefore worthwhile to search for efficient transfer strategies that use the invariant manifolds of the libration points. It was found that, with appropriate design, Earth L1/L2 halo/Lyapunov orbits and Mars L2/L1 halo/Lyapunov orbits can be connected with reasonable fuel consumption and flight duration. In this paper, the halo hopping method and the coplanar circular method are briefly introduced. The former uses differential corrections to systematically generate low-ΔV transfer trajectories between interplanetary manifolds, while the latter treats escape and capture trajectories to and from halo orbits using impulsive maneuvers at the periapsis of the manifolds of the libration points. Transfer strategies designed with the two methods are then presented, and a comparative performance analysis is carried out based on two main criteria: the total fuel consumption required to perform the transfer and the time of flight. The numerical results show that the coplanar circular method has certain advantages in cost and duration. Finally, an optimized transfer strategy with engineering constraints is identified and shown to be an effective alternative for a given direct transfer mission. This paper thus investigates the main methods and presents an optimized solution for interplanetary transfer via the vicinity of libration points. Although most Earth-Mars mission planners prefer a direct transfer strategy because of its relatively short time of flight, the strategies given here remain effective alternatives, given the advantages mentioned above and a departure window longer than that of a direct transfer.
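The dynamical setting behind these transfers is the circular restricted three-body problem (CR3BP). Below is a minimal planar sketch of its rotating-frame equations of motion, with a check that the Jacobi constant is conserved along an integrated arc; the mass ratio and initial state are illustrative, not a mission trajectory.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU = 0.012150585  # Earth-Moon mass ratio (approximate, for illustration)

def crtbp(t, s, mu=MU):
    """Planar CR3BP equations of motion in the rotating frame."""
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)        # distance to the larger primary
    r2 = np.hypot(x - 1 + mu, y)    # distance to the smaller primary
    ax = x + 2 * vy - (1 - mu) * (x + mu) / r1**3 - mu * (x - 1 + mu) / r2**3
    ay = y - 2 * vx - (1 - mu) * y / r1**3 - mu * y / r2**3
    return [vx, vy, ax, ay]

def jacobi(s, mu=MU):
    """Jacobi constant C = 2*Omega - v^2, the CR3BP's conserved quantity."""
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)
    r2 = np.hypot(x - 1 + mu, y)
    return x**2 + y**2 + 2 * (1 - mu) / r1 + 2 * mu / r2 - vx**2 - vy**2

# Roughly circular motion about the larger primary (illustrative initial state)
s0 = [0.5, 0.0, 0.0, 0.905]
sol = solve_ivp(crtbp, (0.0, 2.0), s0, rtol=1e-11, atol=1e-11)
print(abs(jacobi(sol.y[:, -1]) - jacobi(s0)))   # ~0: Jacobi constant conserved
```

Halo and Lyapunov orbits, and the invariant manifolds used in the paper's transfer design, are families of solutions of these same equations near the libration points.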

Keywords: circular restricted three-body problem, halo/Lyapunov orbit, invariant manifolds, libration points

Procedia PDF Downloads 219
6465 Phylogenetic Studies of Six Egyptian Sheep Breeds Using Cytochrome B

Authors: Othman Elmahdy Othman, Agnés Germot, Daniel Petit, Muhammad Khodary, Abderrahman Maftah

Abstract:

Recently, the control (D-loop) and cytochrome b (Cyt b) regions of mtDNA have received more attention due to their role in genetic diversity and phylogenetic studies of different livestock, providing important knowledge for genetic resource conservation. Studies based on sequencing of sheep mitochondrial DNA have shown that there are five maternal lineages in the world for domestic sheep breeds: A, B, C, D, and E. Using cytochrome b sequencing, we aimed to clarify the genetic affinities and phylogeny of six Egyptian sheep breeds. Blood samples were collected from 111 animals belonging to six Egyptian sheep breeds: Barki, Rahmani, Ossimi, Saidi, Sohagi, and Fallahi. Total DNA was extracted, and specific primers were used for conventional PCR amplification of the cytochrome b region of mtDNA. The PCR products were purified and sequenced. The sequences were aligned using BioEdit software, and DnaSP 5.00 software was used to identify the sequence variation and polymorphic sites in the aligned sequences. The results showed 39 polymorphic sites forming 29 haplotypes. The haplotype diversity in the six tested breeds ranged from 0.643 in the Rahmani breed to 0.871 in the Barki breed. The lowest genetic distance was observed between Rahmani and Saidi (D: 1.436 and Dxy: 0.00127), while the highest was observed between Ossimi and Sohagi (D: 6.050 and Dxy: 0.00534). A neighbour-joining (phylogeny) tree was constructed using Mega 5.0 software, aligning the sequences of the 111 analyzed samples with reference sequences of the haplogroups A, B, C, D, and E. The phylogeny results showed the presence of four haplogroups, HapA, HapB, HapC, and HapE, in the examined samples, whereas haplogroup D was not found. Of the 111 tested animals, 88 clustered with haplogroup B (79.28%), 12 with haplogroup A (10.81%), 10 with haplogroup C (9.01%), and one belonged to haplogroup E (0.90%).
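Haplotype diversity values like those quoted above are conventionally computed with Nei's gene-diversity formula, h = n/(n-1) * (1 - sum(p_i^2)); a sketch with illustrative haplotype counts (not the breed data) follows.

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype diversity: h = n/(n-1) * (1 - sum of squared frequencies)."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1 - sum_p2)

# e.g. 20 animals carrying 4 haplotypes (illustrative counts)
sample = ["H1"] * 10 + ["H2"] * 5 + ["H3"] * 3 + ["H4"] * 2
print(round(haplotype_diversity(sample), 3))
```

A sample dominated by one haplotype gives h near 0; many evenly distributed haplotypes push h toward 1, which is how breed values such as 0.643 and 0.871 are read.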

Keywords: phylogeny, genetic biodiversity, MtDNA, cytochrome B, Egyptian sheep

Procedia PDF Downloads 326
6464 Understanding Loc Trade in Kashmir: References of Global Episodes in Arena of Economy and Confidence Building Measure

Authors: Aarushi Baloria, Joshina Jamwal

Abstract:

The paper attempts to understand the genesis of the Kashmir conflict, the LoC trade, and the various challenges that impede it, and examines how this trade helps mitigate tension between the two countries and acts as a confidence-building measure (CBM). With the help of statistical data, the paper then discusses the positive aspects of LoC trade, such as growth in the state's economy, along with negatives such as the smuggling of arms and drugs, hawala money transfers, and other unconstitutional activities, including terrorism, at trade points across the LoC. Moreover, the paper places the trade in an international context, citing the episodes of Ireland in Europe, Palestine in the Middle East, and Uganda in Africa, where trade served not only as a transaction step but also as a peace channel between fragmented parts. In a nutshell, the paper reflects how trade across the LoC has brought psychological, economic, and political benefits, and is worth the risk when its overall positive effects are taken into consideration.

Keywords: drugs, economy, international, peace, psychological, trade

Procedia PDF Downloads 116
6463 Combination of Unmanned Aerial Vehicle and Terrestrial Laser Scanner Data for Citrus Yield Estimation

Authors: Mohammed Hmimou, Khalid Amediaz, Imane Sebari, Nabil Bounajma

Abstract:

Annual crop production is one of the most important macroeconomic indicators for the majority of countries around the world. This information is valuable, especially for exporting countries which need a yield estimation before harvest in order to correctly plan the supply chain. When it comes to estimating agricultural yield, especially for arboriculture, conventional methods are mostly applied. In the case of the citrus industry, the sale before harvest is largely practiced, which requires an estimation of the production when the fruit is on the tree. However, conventional method based on the sampling surveys of some trees within the field is always used to perform yield estimation, and the success of this process mainly depends on the expertise of the ‘estimator agent’. The present study aims to propose a methodology based on the combination of unmanned aerial vehicle (UAV) images and terrestrial laser scanner (TLS) point cloud to estimate citrus production. During data acquisition, a fixed wing and rotatory drones, as well as a terrestrial laser scanner, were tested. After that, a pre-processing step was performed in order to generate point cloud and digital surface model. At the processing stage, a machine vision workflow was implemented to extract points corresponding to fruits from the whole tree point cloud, cluster them into fruits, and model them geometrically in a 3D space. By linking the resulting geometric properties to the fruit weight, the yield can be estimated, and the statistical distribution of fruits size can be generated. This later property, which is information required by importing countries of citrus, cannot be estimated before harvest using the conventional method. Since terrestrial laser scanner is static, data gathering using this technology can be performed over only some trees. So, integration of drone data was thought in order to estimate the yield over a whole orchard. 
To achieve that, features derived from drone digital surface model were linked to yield estimation by laser scanner of some trees to build a regression model that predicts the yield of a tree given its features. Several missions were carried out to collect drone and laser scanner data within citrus orchards of different varieties by testing several data acquisition parameters (fly height, images overlap, fly mission plan). The accuracy of the obtained results by the proposed methodology in comparison to the yield estimation results by the conventional method varies from 65% to 94% depending mainly on the phenological stage of the studied citrus variety during the data acquisition mission. The proposed approach demonstrates its strong potential for early estimation of citrus production and the possibility of its extension to other fruit trees.
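The drone-to-yield regression step described above can be sketched as a simple least-squares fit: the TLS-scanned trees provide the calibration pairs, and the model then predicts yield for unscanned trees from their DSM features. The single feature (crown volume) and all numbers below are illustrative assumptions, not the study's data.

```python
# Minimal sketch: fit a linear model linking a drone-DSM feature of
# TLS-scanned trees to their laser-based yield estimate, then predict
# the yield of unscanned trees from the same feature.
def fit_linear(x, y):
    """Ordinary least squares for y ~ a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical crown volumes (m^3) and TLS yield estimates (kg)
# for the calibration trees scanned by the laser scanner
crown_volume = [8.0, 10.5, 12.0, 15.5, 18.0]
tls_yield_kg = [42.0, 55.0, 61.0, 80.0, 93.0]

a, b = fit_linear(crown_volume, tls_yield_kg)
# Predict yield for two unscanned trees from their DSM-derived volumes
predicted = [a * v + b for v in [9.0, 14.0]]
```

In practice the study would use several DSM-derived features and a multivariate regression; this one-feature version only illustrates the calibration idea.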

Keywords: citrus, digital surface model, point cloud, terrestrial laser scanner, UAV, yield estimation, 3D modeling

Procedia PDF Downloads 117
6462 The Algorithm of Semi-Automatic Thai Spoonerism Words for Bi-Syllable

Authors: Nutthapat Kaewrattanapat, Wannarat Bunchongkien

Abstract:

The purpose of this research is to study and develop an algorithm for Thai spoonerism by means of a semi-automatic computer program; that is, in the data input stage the syllables are already separated, and in the spoonerism stage the developed algorithm is applied. The algorithm establishes rules and mechanisms for Thai spoonerism of bi-syllable words by analyzing the elements of the syllables, namely the consonant cluster, vowel, intonation mark, and final consonant. The study found that bi-syllable Thai spoonerism has one spoonerism mechanism, namely transposition of the vowel, intonation mark, and final consonant between the two syllables, while keeping each syllable's initial consonant value and consonant cluster (if any). These rules and mechanisms were implemented as Thai spoonerism software in PHP. A performance test found that the program performs bi-syllable Thai spoonerism correctly for 99% of all words used in the test; the remaining 1% of faults arise because a word obtained by spoonerism may not be spelled in conformity with Thai grammar, and a Thai spoonerism can have more than one valid answer.
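The swap rule described above can be sketched on already-decomposed syllables, mirroring the semi-automatic input step. The romanized component values and the dict representation are illustrative assumptions, not real Thai orthography or the authors' PHP implementation.

```python
# Sketch of the bi-syllable spoonerism mechanism: swap vowel, intonation
# mark (tone) and final consonant between the two syllables, keeping each
# syllable's initial consonant / consonant cluster.
def spoonerize(syl1, syl2):
    def combine(keep_initial, take_rest):
        return {"initial": keep_initial["initial"],
                "vowel": take_rest["vowel"],
                "tone": take_rest["tone"],
                "final": take_rest["final"]}
    return combine(syl1, syl2), combine(syl2, syl1)

# Hypothetical romanized components of two syllables
s1 = {"initial": "kr", "vowel": "a", "tone": "mid", "final": "p"}
s2 = {"initial": "m", "vowel": "ii", "tone": "high", "final": "n"}
out1, out2 = spoonerize(s1, s2)
# out1 keeps the cluster "kr" but takes vowel/tone/final from s2
```

As the abstract notes, the real algorithm must additionally check the result against Thai spelling rules, which this sketch omits.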

Keywords: algorithm, spoonerism, computational linguistics, Thai spoonerism

Procedia PDF Downloads 212
6461 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC), advanced cutting, 3D printing, or special-purpose machines). To qualify a typical manufacturing line/cell/new process, ideally we need a sample of parts that can be flown through the process, after which we make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, as in the mobile phone industry, for example, the actual cells or lines can number in the thousands, and qualifying each one of them with statistical confidence means using samples that are very large, which adds to product/manufacturing cost plus huge waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see those samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and developed a two-step binomial testing approach. Further, we also show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
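A two-step binomial rule of the kind described can be sketched as follows; the thresholds, acceptable defect rate, and significance level below are assumptions for illustration, not the authors' exact design.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def two_step_qualification(fails1, n1, fails2, n2, p_bad=0.10, alpha=0.05):
    """Illustrative two-step rule: stop early if step-1 failures are
    already implausible under an acceptable defect rate p_bad; otherwise
    pool both steps and test the combined sample."""
    # Early stop: tail probability of seeing this many or more failures
    p_tail1 = 1 - binom_cdf(fails1 - 1, n1, p_bad)
    if p_tail1 < alpha:
        return "reject early"          # no point adding more samples
    p_tail = 1 - binom_cdf(fails1 + fails2 - 1, n1 + n2, p_bad)
    return "reject" if p_tail < alpha else "qualify"

verdict_ok = two_step_qualification(0, 20, 1, 30)   # healthy cell
verdict_bad = two_step_qualification(8, 20, 0, 0)   # bad parts seen early
```

The saving comes from the early-stop branch: a clearly bad cell never consumes the second sample.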

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 71
6460 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms

Authors: Nebi Gedik

Abstract:

One of the most significant and persistent public health problems in the world is breast cancer. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect the disease in its early stages. However, reading mammograms is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method to classify digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is formed by certain groups of coefficients, independently. The CAD system is designed by calculating statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
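The feature-extraction stage can be sketched as below: per-group statistics of transform coefficients concatenated into one feature vector per image. The particular feature set and the coefficient values are assumptions (the abstract does not list them), and the wave atom transform itself, as well as the SVM step, are outside this sketch.

```python
import statistics as st

def group_features(coeffs):
    """Statistical features of one group of transform coefficients
    (illustrative set: mean, std, energy, mean magnitude)."""
    mean = st.mean(coeffs)
    std = st.pstdev(coeffs)
    energy = sum(c * c for c in coeffs) / len(coeffs)
    mean_mag = sum(abs(c) for c in coeffs) / len(coeffs)
    return [mean, std, energy, mean_mag]

# Hypothetical wave-atom coefficient groups for one mammogram
groups = [[0.1, -0.4, 0.3, 0.2], [1.2, -0.9, 0.8, -1.1]]
feature_vector = [f for g in groups for f in group_features(g)]
# feature_vector would then be fed to the SVM classifier
```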

Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram

Procedia PDF Downloads 201
6459 The Metacognition Levels of Students: A Research School of Physical Education and Sports at Anadolu University

Authors: Dilek Yalız Solmaz

Abstract:

Meta-cognition is an important factor in educating conscious individuals who are aware of their cognitive processes. In this respect, the purposes of this article are to find out the perceived metacognition level of Physical Education and Sports School students at Anadolu University and to identify whether metacognition levels display significant differences in terms of various variables. 416 Anadolu University Physical Education and Sports School students formed the research population. The Meta-Cognitions Questionnaire, originally developed by Cartwright-Hatton and Wells and later shortened to the 30-item form (MCQ-30), was used; the MCQ-30, adapted into Turkish by Tosun and Irak, is a four-point agreement scale. In the data analysis, the arithmetic mean, standard deviation, t-test, and ANOVA were used. There is no statistical difference between the mean scores of girls and boys for uncontrollableness and danger, cognitive awareness, cognitive confidence, or positive beliefs; there is a statistical difference for the need to control thinking. There is no statistical difference between the students' departments for any of the subscales. By grade level, there is no statistical difference for positive beliefs, cognitive confidence, or the need to control thinking, but there is a statistical difference for uncontrollableness and danger and for cognitive awareness.

Keywords: meta cognition, physical education, sports school students, thinking

Procedia PDF Downloads 359
6458 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that consider homogeneity in population and household counts. Compared to the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select an appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and township units, respectively, on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment).
The management and analysis of the statistical data referring to the TGSC in this research are strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the processing of death certificates, the geocoding of street addresses, the quality assurance of geocoded results, the automatic calculation of statistical measures, the standardized encoding of measures, and the geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
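The ASDR figures quoted above come from direct age standardization: age-specific death rates weighted by a standard population. A minimal sketch, with entirely hypothetical age groups and a made-up standard population:

```python
def asdr_per_100k(deaths, population, std_population):
    """Directly age-standardized death rate per 100,000:
    sum of age-specific rates weighted by the standard population."""
    total_std = sum(std_population)
    weighted = sum((d / p) * w
                   for d, p, w in zip(deaths, population, std_population))
    return weighted / total_std * 100_000

# Hypothetical three age groups for one statistical area
deaths = [2, 5, 20]
population = [1200, 900, 400]
std_population = [40_000, 35_000, 25_000]  # assumed standard weights
rate = asdr_per_100k(deaths, population, std_population)
```

Small TGSC dissemination areas have small `population` denominators, which is why their ASDR spreads over a wider range (0 to 2222) than the township-level figures.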

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 232
6457 Aeronautical Noise Management inside an Aerodrome: Analysis of Sound Exposure on Aviation Professional’s Health

Authors: Rafael Felipe Guatura da Silva, José Luis Gomes da Silva, Luiz Antonio, Ferreira Perrone de Brito

Abstract:

Noise can cause serious damage to human health, such as hearing loss, stress, irritability, and fatigue, among others. Aviation is a sector in which every process must be carried out with the utmost attention and commitment from its human resources, hence the need to study the effects of noise in this sector, as aeronautical noise levels are high. This study aimed to evaluate the impact of noise pollution on the performance of professionals regarding the fatigue generated by aeronautical noise and the time of noise exposure. The methodology consists of measurements of sound pressure levels at 42 points of the aerodrome. The selected points are located both inside and outside the airfield hangars, and all are close to the professionals' work areas, seeking to identify the sound pressure levels to which the professionals are subjected. The other part of the research applied a self-report questionnaire to a sample of 207 people working inside the aerodrome, consisting of aircraft mechanics, pilots, maintenance managers, and administrative professionals. The questionnaire was intended to evaluate the knowledge that professionals have about the health risks caused by sound exposure, as well as to identify diseases that professionals have that may be associated with exposure to high sound pressure levels. Preliminary results identify points with sound pressure levels of up to 91.7 dB, highlighting the need for personal protective equipment that reduces noise exposure. A large number of professionals reported being bothered by the sound exposure, and approximately 25% of the professionals interviewed reported having a hearing disorder.
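When point measurements like these are combined into an exposure figure, sound pressure levels must be averaged on an energy basis, not arithmetically. A small sketch (the readings are hypothetical, apart from the 91.7 dB maximum quoted above):

```python
from math import log10

def equivalent_level(levels_db):
    """Energy-based (logarithmic) average of sound pressure levels in dB.
    Arithmetic averaging of dB values would understate the exposure."""
    mean_energy = sum(10 ** (L / 10) for L in levels_db) / len(levels_db)
    return 10 * log10(mean_energy)

# Hypothetical readings near one hangar work area
readings = [85.0, 88.0, 91.7]
leq = equivalent_level(readings)  # about 89.1 dB, above the 88.2 arithmetic mean
```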

Keywords: aeronautical noise, fatigue, noise and health, noise management

Procedia PDF Downloads 126
6456 Computational Fluid Dynamics Analysis for Radon Dispersion Study and Mitigation

Authors: A. K. Visnuprasad, P. J. Jojo, Reshma Bhaskaran

Abstract:

Computational fluid dynamics (CFD) is used to simulate the distribution of indoor radon in a living room with elevated radon concentrations, which vary from 22 Bqm-3 to 1533 Bqm-3 over 24 hours. The finite volume method (FVM) was used for the simulation. The simulation results were experimentally validated at 16 points in two horizontal planes (y=1.4m & y=2.0m) using pin-hole dosimeters and at 3 points using a scintillation radon monitor (SRM). Passive measurements using pin-hole dosimeters were performed in all seasons. Another simulation was done to find a suitable position for a passive ventilation system for the effective mitigation of radon.
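A well-mixed single-zone balance is a useful companion to such a CFD study for sizing the ventilation (it is not the FVM solver itself). The source strength and room volume below are assumptions; only the initial concentration echoes the abstract.

```python
from math import exp

def radon_concentration(t_h, c0, source_bq_per_h, volume_m3, vent_per_h,
                        decay_per_h=7.55e-3):
    """Well-mixed zone balance dC/dt = S/V - (lambda + n)*C, solved
    analytically; lambda is the Rn-222 decay constant per hour."""
    k = decay_per_h + vent_per_h
    c_inf = source_bq_per_h / (volume_m3 * k)   # steady state
    return c_inf + (c0 - c_inf) * exp(-k * t_h)

# Hypothetical room: 50 m^3, 500 Bq/h source, 1 air change per hour
c_start = radon_concentration(0.0, 1533.0, 500.0, 50.0, 1.0)
c_24h = radon_concentration(24.0, 1533.0, 500.0, 50.0, 1.0)
```

The CFD model adds what this balance cannot: the spatial distribution inside the room, which is what determines where the passive ventilation opening should go.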

Keywords: indoor radon, computational fluid dynamics, radon flux, ventilation rate, pin-hole dosimeter

Procedia PDF Downloads 388
6455 Simple Procedure for Probability Calculation of Tensile Crack Occurring in Rigid Pavement: A Case Study

Authors: Aleš Florian, Lenka Ševelová, Jaroslav Žák

Abstract:

The formation of tensile cracks in the concrete slabs of rigid pavement can be (among others) the initiation point of other, more serious failures, which can ultimately lead to complete degradation of the concrete slab and thus the whole pavement. Two measures can be used for reliability assessment of this phenomenon: the probability of failure and/or the reliability index. Different methods can be used for their calculation; the simpler ones are moment methods and simulation techniques. Two methods, the FOSM method and the Simple Random Sampling method, are verified, and their comparison is performed. The influence of information about the probability distribution and the statistical parameters of input variables, as well as of the limit state function, on the calculated reliability index and failure probability is studied at three points on the lower surface of concrete slabs of the older type of rigid pavement formerly used in the Czech Republic.
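The Simple Random Sampling estimate of the two measures can be sketched as below. The limit state (tensile strength minus slab stress) and all distribution parameters are assumptions for illustration, not the paper's inputs.

```python
import random
from statistics import NormalDist

def reliability(limit_state, sample, n=100_000, seed=1):
    """Simple Random Sampling: estimate the failure probability
    P(g < 0) and the corresponding reliability index beta."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0)
    pf = fails / n
    beta = -NormalDist().inv_cdf(pf)
    return pf, beta

# Hypothetical limit state g = R - S: tensile strength minus slab stress
def g(x):
    r, s = x
    return r - s

def draw(rng):
    # Assumed normal distributions (MPa): strength N(4.5, 0.5), stress N(3.0, 0.4)
    return (rng.gauss(4.5, 0.5), rng.gauss(3.0, 0.4))

pf, beta = reliability(g, draw)
```

For this linear limit state with normal inputs, FOSM gives beta = (4.5 - 3.0) / sqrt(0.5**2 + 0.4**2), about 2.34, so the two methods can be checked against each other.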

Keywords: failure, pavement, probability, reliability index, simulation, tensile crack

Procedia PDF Downloads 524
6454 Geometric Imperfections in Lattice Structures: A Simulation Strategy to Predict Strength Variability

Authors: Xavier Lorang, Ahmadali Tahmasebimoradi, Chetra Mang, Sylvain Girard

Abstract:

Additive manufacturing processes (e.g. selective laser melting) allow us to produce lattice structures which have lower weight, higher impact absorption capacity, and better thermal exchange properties compared to classical structures. Unfortunately, geometric imperfections (defects) in the lattice structures are by-products of the manufacturing process. These imperfections decrease the lifetime and the strength of the lattice structures and alter their mechanical responses. The objective of the paper is to present a simulation strategy which allows us to take into account the effect of the geometric imperfections on the mechanical response of the lattice structure. In the first part, an identification method for the geometric imperfection parameters of the lattice structure based on point clouds is presented. These point clouds are based on tomography measurements. The point clouds are fed into the platform LATANA (LATtice ANAlysis) developed by IRT-SystemX to characterize the geometric imperfections. This is done by projecting the point cloud of each microbeam along the beam axis onto a 2D surface. Then, by fitting an ellipse to the 2D projections of the points, the geometric imperfections are characterized by three parameters of an ellipse: the semi-major/minor axes and the angle of rotation. Based on the calculated parameters of the microbeam geometric imperfections, a statistical analysis is carried out to determine a probability density law based on a statistical hypothesis. Microbeam samples are randomly drawn from the density law and are used to generate lattice structures. In the second part, a finite element model for the lattice structure with the simplified geometric imperfections (ellipse parameters) is presented. This numerical model is used to simulate the generated lattice structures.
The propagation of the uncertainties of geometric imperfections is shown through the distribution of the computed mechanical responses of the lattice structures.
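The ellipse characterization step can be sketched via the covariance of the projected points: the semi-axes follow from the eigenvalues of the 2x2 covariance matrix and the rotation angle from its principal direction. This is a simplified stand-in for a least-squares ellipse fit like LATANA's, not its actual algorithm.

```python
from math import atan2, sqrt, degrees

def ellipse_from_points(points):
    """Covariance-based ellipse parameters (semi-major, semi-minor,
    rotation angle in degrees) of 2D-projected microbeam points."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((x - cx) ** 2 for x, _ in points) / n
    syy = sum((y - cy) ** 2 for _, y in points) / n
    sxy = sum((x - cx) * (y - cy) for x, y in points) / n
    # Eigenvalues of the 2x2 covariance matrix via trace/determinant
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    angle = 0.5 * atan2(2 * sxy, sxx - syy)
    return sqrt(l1), sqrt(max(l2, 0.0)), degrees(angle)

# Hypothetical axis-aligned projected cross-section points
semi_major, semi_minor, angle = ellipse_from_points(
    [(2, 0), (-2, 0), (0, 1), (0, -1)])
```

The three parameters per microbeam are then what the statistical analysis fits a density law to.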

Keywords: additive manufacturing, finite element model, geometric imperfections, lattice structures, propagation of uncertainty

Procedia PDF Downloads 166
6453 Analysis of the Engineering Judgement Influence on the Selection of Geotechnical Parameters Characteristic Values

Authors: K. Ivandic, F. Dodigovic, D. Stuhec, S. Strelec

Abstract:

A characteristic value of a certain geotechnical parameter results from an engineering assessment. Its selection has to be based on technical principles and standards of engineering practice. It has been shown that the results of the engineering assessments of different authors for the same problem and input data are significantly dispersed. A survey was conducted in which participants had to estimate the force that causes a 10 cm displacement at the top of an axially compressed in-situ pile. Fifty experts from all over the world took part. The lowest estimated force value was 42% and the highest was 133% of the measured force resulting from the static pile load test in question. These extreme values result in significantly different technical solutions to the same engineering task. When selecting a characteristic value of a geotechnical parameter, the influence of the engineering assessment can be reduced by using statistical methods. An informative annex of Eurocode 1 prescribes the method of selecting the characteristic values of material properties; Eurocode 7 follows with certain specifics linked to selecting characteristic values of geotechnical parameters. The paper shows the procedure of selecting characteristic values of a geotechnical parameter by using a statistical method with different initial conditions. The aim of the paper is to quantify the engineering assessment in the example of determining a characteristic value of a specific geotechnical parameter. It is assumed that this assessment is a random variable and that its statistical features will be determined. For this purpose, a survey was conducted among relevant experts from the field of geotechnical engineering. Finally, the results of the survey and the application of the statistical method were compared.
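The statistical selection referred to can be sketched with the Eurocode-style 5% fractile X_k = mean * (1 - k_n * V). The k_n values below are the tabulated coefficients for the "coefficient of variation unknown" case as commonly cited for EN 1990 Annex D; the sample values are hypothetical.

```python
import statistics as st

def characteristic_value_5pct(samples):
    """5% fractile X_k = mean * (1 - k_n * V), with V estimated from the
    sample; k_n taken from the tabulated small-n coefficients (V unknown).
    Sketch only: n must be one of the tabulated sizes."""
    kn = {3: 3.37, 4: 2.63, 5: 2.33, 6: 2.18,
          8: 2.00, 10: 1.92, 20: 1.76, 30: 1.73}
    n = len(samples)
    mean = st.mean(samples)
    v = st.stdev(samples) / mean   # sample coefficient of variation
    return mean * (1 - kn[n] * v)

# Hypothetical shear-strength test results (kPa) from five specimens
xk = characteristic_value_5pct([30.0, 32.0, 28.0, 31.0, 29.0])
```

Note how strongly k_n penalizes small samples: with n = 3 the same scatter would push the characteristic value far below the mean, which is exactly the mechanism that disciplines the engineering assessment.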

Keywords: characteristic values, engineering judgement, Eurocode 7, statistical methods

Procedia PDF Downloads 272
6452 Application of Decline Curve Analysis to Depleted Wells in a Cluster and then Predicting the Performance of Currently Flowing Wells

Authors: Satish Kumar Pappu

Abstract:

The questions most frequently asked in the oil and gas industry are what the current production rate from a particular well is and what the approximate predicted life of that well is. These questions can be answered through forecasting of important realistic data such as flowing tubing head pressures (FTHP) and production decline curves, which are used to predict the future performance of a well in a reservoir. With the advent of directional drilling, cluster well drilling has gained much importance and has, in fact, revolutionized the oil and gas industry. An oil or gas reservoir can generally be described as a collection of several overlying, producing, and potentially producing sands into which a number of wells are drilled depending upon the in-place volume and several other important factors, both technical and economic in nature; in some sands only one well is drilled, and in some, more than one. The aim of this study is to derive important information from the data collected over a period of time at regular intervals on a depleted well in a reservoir sand and apply this information to predict the performance of other wells in that reservoir sand. Depleted wells are among the most common observations when an oil or gas field is visited, which makes the application of this study more realistic in nature.
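Decline curve analysis of the kind described is typically based on the Arps equations; a minimal sketch, with the parameters (which would in practice be fitted to the depleted well's history) assumed for illustration:

```python
from math import exp

def arps_rate(qi, di, t, b=0.0):
    """Arps decline curve: exponential when b = 0, hyperbolic otherwise.
    qi is the initial rate, di the initial decline (1/time), t the time."""
    if b == 0.0:
        return qi * exp(-di * t)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

# Assumed parameters, e.g. fitted to a depleted well in the same sand
qi, di, b = 1000.0, 0.15, 0.5          # rate units, 1/month, hyperbolic exponent
forecast = [arps_rate(qi, di, m, b) for m in range(0, 25, 6)]  # 2-year forecast
```

Once `b` and `di` are established from the depleted well, the same decline shape can be applied to the currently flowing wells in that sand, scaled by their own initial rates.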

Keywords: decline curve analysis, estimation of future gas reserves, reservoir sands, reservoir risk profile

Procedia PDF Downloads 409
6451 Energy Efficient Clustering with Adaptive Particle Swarm Optimization

Authors: Kumar Shashvat, Arshpreet Kaur, Rajesh Kumar, Raman Chadha

Abstract:

Wireless sensor networks have the principal characteristic of restricted energy, with the limitation that the energy of the nodes cannot be replenished. To increase the lifetime in this scenario, the WSN route for data transmission is chosen such that the energy consumed along the selected route is minimal. Such an energy-efficient network needs a sound infrastructure, because the infrastructure affects the network lifespan. Clustering is a technique in which nodes are grouped into disjoint, non-overlapping sets, and data are collected at the cluster head. In this paper, an Adaptive-PSO algorithm is proposed which forms energy-aware clusters by minimizing the cost of locating the cluster head. The main concern is the suitability of the swarms, addressed by adjusting the learning parameters of PSO. Particle Swarm Optimization converges quickly at the beginning of the search, but over the course of time it stagnates and may become trapped in local optima. In the suggested network model, the swarms are given the intelligence of spiders, which makes them capable of avoiding premature convergence and also helps them escape from local optima. A comparative analysis with traditional PSO shows that the new algorithm considerably enhances performance when multi-dimensional functions are considered.
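A plain PSO baseline, against which the adaptive variant is compared, can be sketched as follows. The sphere function stands in for the cluster-head placement cost, and all parameter values are conventional defaults, not the paper's; the spider-inspired adaptation of `w`, `c1`, `c2` during the run is the part this sketch omits.

```python
import random

def pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO minimizing f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function as a stand-in for the cluster-head placement cost
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=3)
```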

Keywords: particle swarm optimization, adaptive PSO, comparison between PSO and A-PSO, energy-efficient clustering

Procedia PDF Downloads 225
6450 High Order Block Implicit Multi-Step (Hobim) Methods for the Solution of Stiff Ordinary Differential Equations

Authors: J. P. Chollom, G. M. Kumleng, S. Longwap

Abstract:

The search for higher-order A-stable linear multi-step methods has long interested numerical analysts and has been pursued either through higher derivatives of the solution or by inserting additional off-step points, super-future points, and the like. Such methods are suitable for the solution of stiff differential equations, which exhibit characteristics that place a severe restriction on the choice of step size; only methods with large regions of absolute stability remain suitable for such equations. In this paper, high-order block implicit multi-step methods of the hybrid form up to order twelve have been constructed using the multi-step collocation approach by inserting one or more off-step points into the multi-step method. The accuracy and stability properties of the new methods are investigated and are shown to yield A-stable methods, a property desirable of methods suitable for the solution of stiff ODEs. The new high-order block implicit multistep methods, used as block integrators, are tested on stiff differential systems, and the results reveal that the new methods are efficient and compete favourably with the state-of-the-art MATLAB ode23 code.
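The step-size restriction motivating A-stability can be seen on the classic test problem y' = lambda*y with lambda = -1000: with step h = 0.01, explicit Euler's growth factor |1 + lambda*h| = 9 makes it blow up, while implicit Euler (A-stable, like the block methods above, though only first order) stays bounded at the same step.

```python
# Stiff test problem y' = -1000*y, y(0) = 1, integrated with step h = 0.01.
lam, h, steps = -1000.0, 0.01, 50
y_exp, y_imp = 1.0, 1.0
for _ in range(steps):
    y_exp = y_exp * (1 + lam * h)   # explicit Euler: amplified by |1-10| = 9
    y_imp = y_imp / (1 - lam * h)   # implicit Euler: damped by 1/11
# y_exp has exploded; y_imp has decayed toward the exact solution's zero limit
```

An A-stable method lets the step size be chosen for accuracy alone, which is the property the constructed block methods target at much higher order.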

Keywords: block linear multistep methods, high order, implicit, stiff differential equations

Procedia PDF Downloads 334
6449 Organizational Agility in 22 Districts of Tehran Municipality

Authors: Mehrnoosh Jafari, Zeinolabedin Amini Sabegh, Habibollah Azimian

Abstract:

Background: Today's variable and dynamic environment doubles the importance of using suitable solutions for confronting these changes in organizations. One of the best ways of coping with environmental changes is directing the organization towards agility. The current research aims at investigating the status of organizational agility in the Tehran Municipality (22 districts). Research Methodology: This research is applied in terms of its purpose and a descriptive survey in terms of data collection. A sample (n = 377) was selected from Tehran Municipality (22 districts) employees using a multistage sampling method (cluster and regular). Data were collected using a standard organizational agility questionnaire and were analyzed in SPSS software using inferential statistics, such as the one-sample t-test and the Friedman test, and descriptive statistics, such as the mean and median. Findings: Research findings showed that the organizational agility status of the organizations under study is relatively optimal, and competence has the highest priority in the ranking of organizational agility indexes. Conclusion: It is necessary that managers provide suitable conditions for promoting organizational agility in the organizations under study by identifying factors affecting change in organizational environments and using available potential for better coping with changes and higher flexibility and speed.

Keywords: organizational, municipality, employer, agility

Procedia PDF Downloads 334