Search results for: level sets
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13388

13298 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction

Authors: Priyadarsini Samal, Rajesh Singla

Abstract:

Many mobile games provide entertainment but also introduce stress to the human brain. The brain-computer interface (BCI) plays an important role in recognizing this mental stress, drawing on various neuroimaging approaches for analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to find hidden words in a grid, with levels chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including band powers with relative powers, event-related desynchronization, and statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two-level analysis, accuracies of 92% and 98% were achieved. In addition, the volunteers played another game of a similar nature. A regression model was designed for level prediction, in which the feature sets of the first and second games were used for testing and training, respectively, and an accuracy of 73% was found.
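
As an illustration of the classification step described in the abstract, the sketch below trains a support vector machine on a matrix of EEG feature vectors and reports cross-validated accuracy. The feature values are randomly generated placeholders, and the 16-feature layout and three stress classes are only assumed from the abstract, not taken from the authors' data.

```python
# Minimal sketch: SVM classification of EEG feature vectors into three stress levels.
# The feature matrix is synthetic; in the study, 16 features (band powers, relative
# powers, ERD, statistical measures) would be extracted from the recorded EEG.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_epochs, n_features = 300, 16          # assumed: one row per EEG epoch
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, 3, size=n_epochs)   # 0 = easy, 1 = medium, 2 = hard (stress levels)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```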

Keywords: brain computer interface, electroencephalogram, regression model, stress, word search

Procedia PDF Downloads 164
13297 Effects of the Different Recovery Durations on Some Physiological Parameters during 3 X 3 Small-Sided Games in Soccer

Authors: Samet Aktaş, Nurtekin Erkmen, Faruk Guven, Halil Taskin

Abstract:

This study aimed to determine the effects of 3 versus 3 small-sided games (SSG) with different recovery times on some physiological parameters in soccer players. Twelve soccer players from the Regional Amateur League volunteered for this study (mean±SD: age, 20.50±2.43 years; height, 177.73±4.13 cm; weight, 70.83±8.38 kg). Subjects performed soccer training five days per week. The protocol of the study was approved by the local ethics committee of the School of Physical Education and Sport, Selcuk University. The subjects were divided into teams of 3 players according to the Yo-Yo Intermittent Recovery Test. The field was 26 m wide and 34 m long. Subjects performed, twice in random order, a series of 3 bouts of 3-a-side SSGs with 3-min and 5-min recovery durations. In the SSGs, each set lasted 6 min. The percentage of maximal heart rate (%HRmax), blood lactate concentration (LA), and Rating of Perceived Exertion (RPE) scale points were collected before the SSGs and at the end of each set. Data were analyzed by analysis of variance (ANOVA) with repeated measures. Significant differences in %HRmax were found between the pre-SSG value and the 1st, 2nd, and 3rd sets in both the SSG with 3-min recovery duration and the SSG with 5-min recovery duration (p<0.05). Mean %HRmax in the SSG with 3-min recovery duration was significantly higher than in the SSG with 5-min recovery duration at both the 1st and 2nd sets (p<0.05). No significant difference was found between sets of either SSG in terms of LA (p>0.05). LA in the SSG with 3-min recovery duration was higher than in the SSG with 5-min recovery duration at the 2nd set (p<0.05). RPE in soccer players was not different between SSGs (p>0.05). In conclusion, this study demonstrates that exercise intensity in SSGs with 3-min recovery durations is higher than in SSGs with 5-min recovery durations.
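
The repeated-measures ANOVA used for %HRmax, LA, and RPE can be reproduced along the following lines. The sketch assumes a long-format table with one row per player, recovery condition, and measurement point; the values are simulated placeholders rather than the study's measurements.

```python
# Minimal sketch: repeated-measures ANOVA on %HRmax across recovery conditions and sets.
# Data are simulated placeholders with the same structure as the study design.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
players = range(1, 13)                      # 12 soccer players
recoveries = ["3min", "5min"]               # recovery duration between bouts
sets_ = ["pre", "set1", "set2", "set3"]     # measurement points

rows = []
for p in players:
    for rec in recoveries:
        for s in sets_:
            base = 60 if s == "pre" else 85        # rough %HRmax placeholder
            rows.append({"player": p, "recovery": rec, "set": s,
                         "hrmax_pct": base + rng.normal(0, 3)})
df = pd.DataFrame(rows)

# Two within-subject factors: recovery duration and set number.
res = AnovaRM(df, depvar="hrmax_pct", subject="player",
              within=["recovery", "set"]).fit()
print(res.anova_table)
```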

Keywords: small-sided games, soccer, heart rate, lactate

Procedia PDF Downloads 431
13296 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets in themselves and can now be accepted as a primary intellectual output of research. The quality and usefulness of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process; in addition, it requires experts in the area, who are mostly unavailable. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques have been compared using well-known measures, including Accuracy, Hamming Loss, Micro-F, and Macro-F. The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested, while the Classifier Chains method showed the worst performance. To recap, the benchmark achieved promising results; the preliminary exploratory data analysis performed on the collection suggests new directions for research and provides a baseline for future studies.
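
Two of the listed methods, Binary Relevance and Classifier Chains, are available in scikit-learn and can be benchmarked with Hamming loss and micro-/macro-averaged F1 as sketched below. The data here are synthetic multi-label samples standing in for the text-derived features of the actual collection, and the decision-tree base learner is an assumption, not the study's configuration.

```python
# Minimal sketch of the benchmarking idea: compare Binary Relevance and Classifier
# Chains on synthetic multi-label data using Hamming loss and micro-/macro-F1.
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.multioutput import MultiOutputClassifier, ClassifierChain
from sklearn.metrics import hamming_loss, f1_score

X, Y = make_multilabel_classification(n_samples=500, n_features=50,
                                       n_classes=7, n_labels=2, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

models = {
    "Binary Relevance": MultiOutputClassifier(DecisionTreeClassifier(random_state=0)),
    "Classifier Chains": ClassifierChain(DecisionTreeClassifier(random_state=0),
                                         order="random", random_state=0),
}
for name, model in models.items():
    Y_pred = model.fit(X_tr, Y_tr).predict(X_te)
    print(f"{name}: Hamming loss={hamming_loss(Y_te, Y_pred):.3f}, "
          f"micro-F1={f1_score(Y_te, Y_pred, average='micro'):.3f}, "
          f"macro-F1={f1_score(Y_te, Y_pred, average='macro'):.3f}")
```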

Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining

Procedia PDF Downloads 142
13295 Recombination Center Levels in Gold and Platinum Doped N-type Silicon for High-Speed Thyristor

Authors: Nam Chol Yu, GyongIl Chu, HoJong Ri

Abstract:

Using deep-level transient spectroscopy (DLTS) measurement techniques, we determined the dominant recombination center levels (defects A and B) in gold- and platinum-doped n-type silicon. The injection and temperature dependence of the Shockley-Read-Hall (SRH) carrier lifetime was also studied under low-level and high-level injection. The measurements show that the dominant level under low-level injection, located at EC-0.25 eV (A), is correlated with the Pt+G1 center, while the dominant level under high-level injection, located at EC-0.54 eV (B), is correlated with the Au+G4 center. Thus, A and B are the dominant levels controlling the carrier lifetime in gold- and platinum-doped n-type silicon.
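
For reference, the injection dependence of the SRH lifetime discussed above follows the standard Shockley-Read-Hall expression. The sketch below evaluates it for a single recombination level with illustrative parameter values (doping, trap depth, capture time constants) that are assumptions, not the measured Au/Pt level parameters of the paper; it simply shows how the lifetime shifts between its low- and high-injection limits.

```python
# Illustrative sketch: injection dependence of the SRH lifetime for one
# recombination level in n-type silicon. All parameter values are placeholders.
import numpy as np

kT = 0.02585          # eV at 300 K
ni = 9.65e9           # intrinsic carrier density of Si at 300 K, cm^-3
n0 = 1e15             # donor doping (n-type), cm^-3
p0 = ni**2 / n0       # equilibrium hole density

Et_minus_Ei = 0.0     # trap depth relative to midgap, eV (assumed)
tau_n0, tau_p0 = 1e-6, 1e-6   # capture time constants, s (assumed)

n1 = ni * np.exp(Et_minus_Ei / kT)
p1 = ni * np.exp(-Et_minus_Ei / kT)

dn = np.logspace(12, 17, 6)   # excess carrier density, cm^-3
tau = (tau_p0 * (n0 + n1 + dn) + tau_n0 * (p0 + p1 + dn)) / (n0 + p0 + dn)
for d, t in zip(dn, tau):
    print(f"dn = {d:.1e} cm^-3 -> tau_SRH = {t:.3e} s")
```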

Keywords: recombination center level, lifetime, carrier lifetime control, gold, platinum, silicon

Procedia PDF Downloads 35
13294 Survey of Selected Pathogenic Bacteria in Chickens from Rural Households in Limpopo Province

Authors: M. Lizzy Madiwani, Ignatious Ncube, Evelyn Madoroba

Abstract:

This study was designed to determine the distribution of pathogenic bacteria in household-raised chickens and to study their virulence and antibiotic profiles. For this purpose, 40 chickens were purchased from families in the Capricorn district and sacrificed for sampling. Tissues were cultured on different bacteriological media, followed by biotyping using Matrix-Assisted Laser Desorption Ionization-Time of Flight (MALDI-TOF). The disk diffusion test was performed to determine the antibiotic susceptibility profiles of these bacteria. E. coli and Salmonella were detected among the 160 tissue samples evaluated. Furthermore, pathogenic E. coli and Salmonella strains were characterized at the species level using polymerase chain reaction (PCR) assays with primer sets targeting selected genes of interest. The invA gene, a confirmatory gene of Salmonella, was detected in all the Salmonella isolates. The study revealed a high distribution of Salmonella and pathogenic E. coli in these chickens. Therefore, further studies on identification at the species level are highly recommended to inform management and sanitation practices that lower this prevalence. The antimicrobial susceptibility data generated from this study can be a valuable reference for veterinarians treating bacterial diseases in poultry.

Keywords: antimicrobial, Escherichia coli, pathogens, Salmonella

Procedia PDF Downloads 95
13293 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score

Authors: Jianfeng Hu

Abstract:

Personal authentication based on electroencephalography (EEG) signals is an important area of biometric technology, and more and more researchers have used EEG signals as a data source for biometrics. However, biometrics based on EEG signals also has some disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures, namely sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were used as the feature set. In a silhouette calculation, the distances from each data point to all other points within the same cluster and to all data points in the closest cluster are determined. Silhouettes thus provide a measure of how well a data point was classified when it was assigned to a cluster and of the separation between clusters. This renders silhouettes potentially well suited for assessing cluster quality in personal authentication methods. In this study, silhouette scores were used to assess the cluster quality of the k-means clustering algorithm and to compare the performance of each EEG dataset. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. Results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between electrodes for personal authentication (p<0.01); and (3) there is no significant difference in authentication performance among feature sets (except for PE). In conclusion, the combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.
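
A minimal sketch of the clustering-quality step is shown below: entropy-based feature vectors (placeholders here, rather than SE/FE/AE/PE values computed from real EEG) are clustered with k-means, and the silhouette score is used to compare different numbers of clusters.

```python
# Minimal sketch: assess k-means cluster quality on EEG entropy features with the
# silhouette score. Feature vectors are synthetic placeholders; in the study each
# vector would hold SE, FE, AE and PE values computed from a subject's EEG.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Pretend data: 22 subjects x 10 trials, 4 entropy features per trial.
X = np.vstack([rng.normal(loc=s, scale=0.3, size=(10, 4)) for s in range(22)])

for k in (2, 5, 10, 22):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k:2d}  silhouette={silhouette_score(X, labels):.3f}")
```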

Keywords: personal authentication, k-means clustering, electroencephalogram, EEG, silhouettes

Procedia PDF Downloads 256
13292 Investigating the Effects of Psychological and Socio-Cultural Factors on the Tendency of Villagers to Use E-Banking Services: Case Study of Agricultural Bank Branches in Ilam

Authors: Nahid Ehsani, Amir Hossein Rezvanfar

Abstract:

The main objective of this study is to investigate the psychological and socio-cultural factors affecting the tendency of villagers to use e-banking services. In terms of its objectives, the current paper is an applied study. The main data-gathering tool is a researcher-made questionnaire designed and administered on the basis of the conceptual background of the subject matter and the objectives and hypotheses of the study. The statistical population of this study includes all customers of rural branches of the Agricultural Bank in Ilam Province (N=82,885). From this population, 120 participants were selected using a sample size determination formula and studied through stratified random sampling. At the inferential statistics level, Spearman's correlation coefficients showed that socio-cultural and psychological factors had a significant impact on the extent of the villagers' tendency to use the Agricultural Bank's e-banking services at the 99% level. Furthermore, stepwise multiple regression analysis showed that the psychological and socio-economic factor sets together explained 50 percent of the variance of the dependent variable, namely the tendency of villagers to use e-banking services.
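
The two analysis steps described above can be sketched as follows: Spearman correlations between each factor score and the tendency score, followed by a simple forward stepwise selection over an OLS model. The questionnaire data are simulated and the variable names are assumptions introduced only for illustration.

```python
# Minimal sketch: Spearman correlations and a forward stepwise regression on
# simulated questionnaire scores. Variable names and data are illustrative only.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "psychological": rng.normal(size=n),
    "socio_cultural": rng.normal(size=n),
})
df["tendency"] = (0.5 * df["psychological"] + 0.4 * df["socio_cultural"]
                  + rng.normal(scale=0.7, size=n))

for col in ("psychological", "socio_cultural"):
    rho, p = spearmanr(df[col], df["tendency"])
    print(f"{col}: rho={rho:.2f}, p={p:.4f}")

# Forward stepwise selection by adjusted R^2.
remaining, selected = ["psychological", "socio_cultural"], []
while remaining:
    scores = {}
    for cand in remaining:
        X = sm.add_constant(df[selected + [cand]])
        scores[cand] = sm.OLS(df["tendency"], X).fit().rsquared_adj
    best = max(scores, key=scores.get)
    selected.append(best)
    remaining.remove(best)
    print(f"added {best}: adjusted R^2 = {scores[best]:.3f}")
```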

Keywords: e-banking, agricultural bank, tendency, socio-economic factors, psychological factors

Procedia PDF Downloads 508
13291 Forecasting the Sea Level Change in Strait of Hormuz

Authors: Hamid Goharnejad, Amir Hossein Eghbali

Abstract:

Recent investigations have demonstrated global sea level rise due to climate change impacts. This study investigates the effects of increasing water level in the Strait of Hormuz under climate change. The probable changes in sea level should be investigated in order to develop adaptation strategies. Climatic output data of a general circulation model (GCM) named CGCM3 under the climate change scenarios A1B and A2 were used. Among the different variables simulated by this model, those with maximum correlation with sea level changes in the study region and least redundancy among themselves were selected for sea level rise prediction using stepwise regression. A discrete wavelet artificial neural network (DWNN) model was developed to explore the relationship between climatic variables and sea level changes. In this model, wavelets were used to disaggregate the time series of input and output data into different components, and an ANN was then used to relate the disaggregated components of the predictors and predictands to each other. The results for the Shahid Rajae Station showed a sea level rise of 64 to 75 cm under scenario A1B and of 90 to 105 cm under scenario A2. Furthermore, the results showed a significant increase in sea level in the study region under climate change impacts, which should be incorporated into coastal area management.
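
The wavelet-ANN coupling can be sketched as follows: each predictor series is split into approximation and detail components via a discrete wavelet decomposition, and the components feed a neural network that predicts the sea level series. The data, wavelet choice ('db4'), decomposition level, and network size below are placeholders, not the settings used with the CGCM3 output.

```python
# Minimal sketch of a discrete-wavelet + ANN (DWNN) regressor: decompose each
# predictor series into wavelet components, then train an MLP on the stacked
# components. Data, wavelet ('db4'), level and network size are assumptions.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

def wavelet_components(x, wavelet="db4", level=3):
    """Return one reconstructed series per wavelet sub-band (same length as x)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    parts = []
    for i in range(len(coeffs)):
        masked = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        parts.append(pywt.waverec(masked, wavelet)[: len(x)])
    return np.column_stack(parts)

rng = np.random.default_rng(0)
t = np.arange(1024)
predictor = np.sin(2 * np.pi * t / 180) + 0.3 * rng.normal(size=t.size)  # e.g. a GCM variable
sea_level = 0.8 * np.sin(2 * np.pi * t / 180) + 0.1 * rng.normal(size=t.size)

X = wavelet_components(predictor)           # disaggregated predictor components
X_tr, X_te, y_tr, y_te = train_test_split(X, sea_level, test_size=0.3, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out samples:", round(ann.score(X_te, y_te), 3))
```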

Keywords: climate change scenarios, sea-level rise, strait of Hormuz, forecasting

Procedia PDF Downloads 241
13290 The Project Evaluation to Develop the Competencies, Capabilities, and Skills in Repairing Computers of People in Jompluak Local Municipality, Bang Khonthi District, Samut Songkram Province

Authors: Wilailuk Meepracha

Abstract:

The results of the evaluation of the project to develop the competencies, capabilities, and skills in repairing computers of people in Jompluak Local Municipality, Bang Khonthi District, Samut Songkram Province showed that the overall result was good (4.33). When considering each aspect, the highest was process evaluation (4.60), followed by product evaluation (4.50), and the lowest was the feeding factor (3.97). In detail: 1) The context aspect was rated high (4.23), with the highest item being the arrangement of the training situation (4.67), followed by the appropriateness of the target group (4.30), and the lowest being project cooperation (3.73). 2) The overall evaluation of the primary or feeding factor was high (4.23); the highest aspect was the capability of the trainers (4.47), followed by the suitability of the venue (4.33), while the lowest was the insufficient budget (3.47). 3) The average result of the process evaluation was very high (4.60); the highest aspect was follow-up supervision (4.70), followed by the responsibility of project staff (4.50), while the lowest was attention to the present situation and problems of the community (4.40). 4) The overall result of the product evaluation was very high (4.50); the highest aspect was the diversity of the activities and community integration (4.67), followed by achievement of the project target (4.63), while the lowest was the continuation and regularity of the activities (4.33). Regarding satisfaction with project management, 43.33% of the trainees reported a very high level, 40% a high level, and 16.67% a moderate level. Suggestions for the project concerned providing additional computer sets (37.78%) and a longer training period, especially for computer skills (43.48%).

Keywords: project evaluation, competency development, computer repairing capability, computer skills

Procedia PDF Downloads 280
13289 Journey to the East: The Story of Ghanaian Migrants in Guangzhou, China

Authors: Mark Kwaku Mensah Obeng

Abstract:

In the late 1990s and early 2000s, nationals of sub-Saharan Africa who had initially settled in the Middle East and parts of Southeast Asia moved to Guangzhou in response to the 1997/8 Asian financial crisis, in numbers never witnessed before. They were later joined by many more as the Chinese economy improved and as the economic relationship between China and Africa strengthened. This paper tells the story of identifiable sets of Ghanaians in Guangzhou, China in the 21st century. It details their respective characteristics and activities in China, their migratory trajectories, and their motivations for travelling to China. It also analyzes how they are coping with life in an unfamiliar destination. Finally, it attempts to predict the future of the Ghanaian community in China in terms of its level of community participation and integration.

Keywords: Africa in China, Ghana, motivation, Guangzhou

Procedia PDF Downloads 415
13288 Generalized Rough Sets Applied to Graphs Related to Urban Problems

Authors: Mihai Rebenciuc, Simona Mihaela Bibic

Abstract:

As a branch of modern mathematics, graph theory provides instruments for optimization and for solving practical problems in various fields such as economic networks, engineering, network optimization, the geometry of social action and, more generally, complex systems, including contemporary urban problems (path or transport efficiency, biourbanism, etc.). This paper studies the interconnection of urban networks, which can lead to the problem of simulating one digraph by another. The simulation may be univocal or, more generally, multivocal. The concepts of fragment and atom are very useful in the study of connectivity in the simulating digraph, including an alternative evaluation of k-connectivity. The rough set approach to (bi)digraphs proposed for the first time in this paper contributes to a significantly improved evaluation of k-connectivity. This rough set approach is based on generalized rough sets, whose basic facts are also presented in this paper.
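
A generalized (relation-based) rough set over a digraph can be illustrated with successor neighborhoods: the lower approximation of a vertex set X keeps the vertices whose out-neighborhood lies entirely in X, and the upper approximation keeps those with at least one successor in X. The sketch below shows this on a toy graph; it is a generic illustration of relation-based rough sets, not the specific construction of the paper.

```python
# Illustrative sketch: relation-based (generalized) rough set approximations on a
# digraph, using out-neighborhoods as the granules. Toy graph, not the paper's data.
graph = {                       # adjacency: vertex -> set of successors
    "a": {"b", "c"},
    "b": {"c"},
    "c": {"a"},
    "d": {"c", "e"},
    "e": {"e"},
}

def lower_approx(X, graph):
    """Vertices whose every successor lies in X (certainly related to X)."""
    return {v for v, succ in graph.items() if succ and succ <= X}

def upper_approx(X, graph):
    """Vertices with at least one successor in X (possibly related to X)."""
    return {v for v, succ in graph.items() if succ & X}

X = {"a", "c"}
print("lower approximation:", lower_approx(X, graph))
print("upper approximation:", upper_approx(X, graph))
print("boundary region:", upper_approx(X, graph) - lower_approx(X, graph))
```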

Keywords: (bi)digraphs, rough set theory, systems of interacting agents, complex systems

Procedia PDF Downloads 210
13287 The Grinding Influence on the Strength of Fan-Out Wafer-Level Packages

Authors: Z. W. Zhong, C. Xu, W. K. Choi

Abstract:

To build a thin fan-out wafer-level package, the package has to be ground down to a small thickness. In this work, the influence of the grinding processes on the strength of fan-out wafer-level packages was investigated. After different grinding processes, all specimens were placed on a three-point-bending fixture installed on a universal tester, and the strength of the fan-out wafer-level packages was measured. The experiments revealed that the average flexure strength increased as the surface roughness height of the tested fan-out wafer-level packages decreased. The grinding processes thus had a significant influence on the strength of the fan-out wafer-level packages investigated.
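
For context, the flexure strength obtained from a three-point-bending test of a rectangular specimen is commonly computed with the standard beam formula sigma = 3FL / (2bd^2). The sketch below applies this textbook formula with purely illustrative dimensions and load, which are not the package dimensions or results reported in the study.

```python
# Illustrative sketch: flexure strength from a three-point-bending test using the
# standard beam formula sigma = 3*F*L / (2*b*d^2). Numbers are placeholders.
def flexure_strength(force_n, span_mm, width_mm, thickness_mm):
    """Return flexure strength in MPa for a rectangular cross-section specimen."""
    return 3 * force_n * span_mm / (2 * width_mm * thickness_mm ** 2)

# Hypothetical thin FOWLP-like specimen: 40 N failure load, 40 mm support span,
# 10 mm width, 0.3 mm ground thickness.
print(f"{flexure_strength(40, 40, 10, 0.3):.0f} MPa")
```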

Keywords: FOWLP strength, surface roughness, three-point bending, grinding

Procedia PDF Downloads 253
13286 The Effects of Passive and Active Recoveries on Responses of Platelet Indices and Hemodynamic Variables to Resistance Exercise

Authors: Mohammad Soltani, Sajad Ahmadizad, Fatemeh Hoseinzadeh, Atefe Sarvestan

Abstract:

Recovery is an important variable in designing resistance exercise training. This study determined the effects of passive and active recovery on the responses of platelet indices and hemodynamic variables to resistance exercise. Twelve healthy subjects (six men and six women; age 25.4±2.5 years) performed two resistance exercise protocols (six exercises covering the upper and lower body) in two separate sessions one week apart. The first protocol included three sets of six repetitions at 80% of 1RM with 2 min of passive rest between sets and exercises, while the second protocol included three sets of six repetitions at 60% of 1RM, each followed by an active recovery of six repetitions of the same exercise at 20% of 1RM. The exercise volume was equalized. Three blood samples were taken before exercise, immediately after exercise, and after 1 hour of recovery, and analyzed for fibrinogen and platelet indices. Blood pressure (BP), heart rate (HR), and rate pressure product (RPP) were measured before exercise, immediately after exercise, and every 5 minutes during recovery. Data analysis showed a significant increase in systolic blood pressure (SBP), HR, RPP, and PLT in response to resistance exercise (P<0.05), and the changes in HR and RPP were significantly different between the two protocols (P<0.05). Furthermore, MPV and P-LCR did not change in response to resistance exercise, although significant reductions were observed after 1 h of recovery compared to the pre- and post-exercise values (P<0.05). No significant changes in fibrinogen or PDW were observed following either resistance exercise protocol (P>0.05), and no significant differences in platelet indices were found between the two protocols (P>0.05). In conclusion, resistance exercise induces changes in platelet indices and hemodynamic variables; these changes are not related to the type of recovery and return to normal levels after 1 h of recovery.

Keywords: hemodynamic variables, platelet indices, resistance exercise, recovery intensity

Procedia PDF Downloads 104
13285 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique whose aim is to extract the structure of the data, grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership value in the interval [0, 1]. In FCM clustering, the membership degrees are constrained by the condition that the sum of a data object's memberships over all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization approaches have therefore been applied to fuzzy c-means clustering; they introduce additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, in which the optimization problem aims to minimize the dissimilarity inside clusters. Our objective is to find an appropriate membership degree for each data object, because appropriate membership degrees lead to accurate clustering results. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
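
A common form of entropy-regularized fuzzy clustering replaces the fuzzifier with an entropy term, which yields closed-form membership updates of softmax type. The sketch below implements that generic scheme on synthetic 2-D data purely as an illustration; it is not the exact relative entropy model proposed in the paper, and the regularization weight gamma is arbitrary.

```python
# Minimal sketch of entropy-regularized fuzzy clustering: memberships are
# proportional to exp(-d_ik^2 / gamma) and normalized to sum to one per point,
# centers are membership-weighted means. Generic illustration only.
import numpy as np

def entropy_fcm(X, n_clusters=3, gamma=1.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # squared distances
        # Membership update (shifted by the row minimum for numerical stability).
        u = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / gamma)
        u /= u.sum(axis=1, keepdims=True)
        centers = (u.T @ X) / u.sum(axis=0)[:, None]                   # weighted means
    return u, centers

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, scale=0.4, size=(100, 2)) for c in ((0, 0), (3, 3), (0, 3))])
u, centers = entropy_fcm(X, n_clusters=3, gamma=0.5)
print("cluster centers:\n", centers.round(2))
```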

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 240
13284 Three-Stage Mining Metals Supply Chain Coordination and Product Quality Improvement with Revenue Sharing Contract

Authors: Hamed Homaei, Iraj Mahdavi, Ali Tajdin

Abstract:

One of the main concerns of miners is increasing the quality level of their products, because the price of mined metals depends on their quality level; however, increasing the quality level of these products has different costs at different levels of the supply chain, and these costs usually increase beyond the extractor level. This paper studies the coordination of a decentralized three-level supply chain with one supplier (extractor), one mineral processor, and one manufacturer, in which the cost of increasing the product quality level is higher at the processor than at the supplier, and higher at the manufacturer than at the processor. We identify the optimal product quality level for each supply chain member by designing a revenue sharing contract. Finally, numerical examples show that the designed contract not only increases the final product quality level but also provides a win-win condition for all supply chain members and increases the profit of the whole supply chain.

Keywords: three-stage supply chain, product quality improvement, channel coordination, revenue sharing

Procedia PDF Downloads 156
13283 Aligning Informatics Study Programs with Occupational and Qualifications Standards

Authors: Patrizia Poscic, Sanja Candrlic, Danijela Jaksic

Abstract:

The University of Rijeka, Department of Informatics participated in the Stand4Info project, co-financed by the European Union, with the main aim of aligning study programs with occupational and qualifications standards in the field of Informatics. A brief overview of our research methodology, goals and deliverables is given. Our main research and project objectives were: a) development of occupational standards, qualification standards and study programs based on the Croatian Qualifications Framework (CROQF), b) higher education quality improvement in the field of information and communication sciences, c) increasing the employability of students of information and communication technology (ICT) and science, and d) continuously improving the competencies of teachers in accordance with the principles of CROQF. CROQF is a reform instrument in the Republic of Croatia for regulating the system of qualifications at all levels through qualifications standards based on learning outcomes and following the needs of the labor market, individuals and society. The central elements of CROQF are learning outcomes - competences acquired by the individual through the learning process and proved afterward. The place of each acquired qualification is set by the level of the learning outcomes belonging to that qualification. The placement of qualifications at their respective levels allows the comparison and linking of different qualifications, as well as the linking of Croatian qualification levels to the levels of the European Qualifications Framework and the levels of the Qualifications Framework of the European Higher Education Area. This research produced three proposals for occupational standards at the undergraduate study level (System Analyst, Developer, ICT Operations Manager) and two at the graduate (master) level (System Architect, Business Architect). For each occupational standard, employers provided a list of key tasks and the competencies necessary to perform them. A set of competencies required for each particular job in the workplace was defined, and each set of competencies was described in more detail by its individual competencies. Based on the sets of competencies from the occupational standards, sets of learning outcomes were defined, and the competencies from the occupational standards were linked with learning outcomes. For each learning outcome, as well as for each set of learning outcomes, it was necessary to specify the verification method and the material and human resources. The task of the project was to suggest revisions and improvements of the existing study programs. It was necessary to analyze the existing programs and determine how they meet and fulfill the defined learning outcomes. This way, one could see: a) which learning outcomes from the qualifications standards are covered by existing courses, b) which learning outcomes have yet to be covered, c) whether they are covered by mandatory or elective courses, and d) whether some courses are unnecessary or redundant. Overall, the main research results are: a) completed proposals of qualification and occupational standards in the field of ICT, b) revised curricula of undergraduate and master study programs in ICT, c) a sustainable partnership and stakeholder association network, d) a knowledge network informing the public and stakeholders (teachers, students, and employers) about the importance of establishing CROQF, and e) teachers educated in innovative methods of teaching.

Keywords: study program, qualification standard, occupational standard, higher education, informatics and computer science

Procedia PDF Downloads 115
13282 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization

Authors: Agria Rhamdhan

Abstract:

WhatsApp, the most popular mobile messaging app, has been used as evidence in many criminal cases. Because mobile messaging generates large amounts of data, forensic investigation faces the challenge of large-data problems. The hardest part of finding this important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages; analyzing large sets of mobile messaging data in this way takes a great deal of time and effort. Our work offers a methodology based on forensic triage to reduce large data sets to manageable subsets that are easier to review in detail, and then presents the results through interactive visualization, highlighting important terms, entities, and relationships via intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
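
The ranking and topic-modeling step can be sketched as follows: TF-IDF scores surface the most distinctive terms per message, while LDA, fitted on raw term counts, groups messages into latent topics. The messages below are invented placeholders, not real WhatsApp evidence, and the two-topic setting is arbitrary.

```python
# Minimal sketch: rank terms with TF-IDF and group messages into topics with LDA.
# Messages are invented placeholders; a real triage would parse the WhatsApp
# message database extracted from the device.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "meet me at the warehouse tonight",
    "transfer the money to the usual account",
    "warehouse delivery delayed until friday",
    "send the account number again",
    "dinner with family tonight",
]

# TF-IDF: highest-weighted terms per message as a simple importance ranking.
tfidf = TfidfVectorizer(stop_words="english")
W = tfidf.fit_transform(messages)
terms = tfidf.get_feature_names_out()
for i, row in enumerate(W.toarray()):
    top = terms[row.argsort()[::-1][:2]]
    print(f"message {i}: top terms = {list(top)}")

# LDA on raw counts: two latent topics over the same corpus.
counts = CountVectorizer(stop_words="english").fit(messages)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts.transform(messages))
print("document-topic distribution:\n", doc_topics.round(2))
```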

Keywords: forensics, triage, visualization, WhatsApp

Procedia PDF Downloads 130
13281 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering

Authors: Yunus Doğan, Ahmet Durap

Abstract:

Coastal regions are among the places most heavily used by both natural processes and the growing population. In coastal engineering, the most valuable data concern wave behavior. The amount of such data becomes very large because observations take place over periods of hours, days, and months. In this study, wave spectrum analysis methods and standard statistical methods were used. The goal of this study is to discover profiles of different coastal areas by using these statistical methods and thus to obtain an instance-based data set from the big data for analysis with data mining algorithms. In the experimental studies, six sample data sets on wave behavior, obtained from 20-minute observations in Mersin Bay, Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other fields that collect big data, such as medicine.
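
One way to turn long wave records into an instance-based data set, as described above, is to compute summary statistics per observation window and then cluster the resulting instances. The sketch below does this with simulated wave-height series and k-means; the chosen statistics, window length, and site model are assumptions standing in for the study's summarization and clustering choices.

```python
# Minimal sketch: summarize raw wave records into per-window statistical instances,
# then cluster the instances to find similar coastal profiles. Simulated data.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def summarize(window):
    """One instance per 20-minute window: mean, std, max and significant height."""
    h = np.sort(window)
    return {"h_mean": window.mean(), "h_std": window.std(),
            "h_max": window.max(), "h_sig": h[-len(h) // 3:].mean()}

instances = []
for site in range(6):                            # six sample data sets / coastal areas
    for _ in range(50):                          # fifty 20-minute windows per site
        window = rng.gamma(shape=2 + site, scale=0.3, size=240)  # fake wave heights
        instances.append({"site": site, **summarize(window)})

df = pd.DataFrame(instances)
X = StandardScaler().fit_transform(df[["h_mean", "h_std", "h_max", "h_sig"]])
df["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(pd.crosstab(df["site"], df["cluster"]))
```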

Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods

Procedia PDF Downloads 337
13280 Simultaneous Optimization of Design and Maintenance through a Hybrid Process Using Genetic Algorithms

Authors: O. Adjoul, A. Feugier, K. Benfriha, A. Aoussat

Abstract:

In general, issues related to design and maintenance are considered independently, yet the decisions made in these two areas influence each other. Design for maintenance is considered an opportunity to optimize the life cycle cost of a product, particularly in the nuclear or aeronautical fields, where maintenance expenses represent more than 60% of life cycle costs. The design of large-scale systems starts with the product architecture, a choice of components in terms of cost, reliability, weight and other attributes corresponding to the specifications. On the other hand, the design must take maintenance into account, in particular by improving real-time monitoring of equipment through the integration of new technologies such as connected sensors and intelligent actuators. We noticed that the different approaches used in Design for Maintenance (DFM) methods are limited to the simultaneous characterization of the reliability and maintainability of a multi-component system. This article proposes a DFM method that assists designers in proposing dynamic maintenance for multi-component industrial systems. The term "dynamic" refers to the ability to integrate available monitoring data to adapt the maintenance decision in real time. The goal is to maximize the availability of the system at a given life cycle cost. This paper presents an approach for the simultaneous optimization of the design and maintenance of multi-component systems. Here the design is characterized by four decision variables for each component (reliability level, maintainability level, redundancy level, and level of monitoring data), and the maintenance is characterized by two decision variables (the dates of the maintenance stops and the maintenance operations to be performed on the system during these stops). The DFM model helps designers choose technical solutions for large-scale industrial products. Large-scale refers to complex multi-component industrial systems with long life cycles, such as trains, aircraft, etc. The method is based on a two-level hybrid algorithm for the simultaneous optimization of design and maintenance, using genetic algorithms. The first level selects a design solution for a given system, considering the life cycle cost and the reliability. The second level determines a dynamic and optimal maintenance plan to be deployed for that design solution. This level is based on the Maintenance Free Operating Period (MFOP) concept, which takes into account decision criteria such as total reliability, maintenance cost and maintenance time. Depending on the life cycle duration, the desired availability, and the desired business model (sales or rental), this tool provides visibility of overall costs and of the optimal product architecture.
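
The two-level structure can be sketched as a nested loop: an outer genetic algorithm searches over per-component design choices, and for each candidate design an inner routine picks the maintenance interval that minimizes life cycle cost. Everything below (the cost model, parameter values, and GA settings) is an invented toy model meant only to show that structure, not the authors' DFM tool or MFOP formulation.

```python
# Toy sketch of the two-level scheme: outer GA over component design levels,
# inner search over the maintenance interval for each candidate design.
# The cost model and all numbers are invented for illustration.
import random

random.seed(0)
N_COMP, HORIZON = 4, 10_000          # components, operating hours
LEVELS = (0, 1, 2)                   # design level per component (e.g. reliability grade)
PURCHASE = {0: 1.0, 1: 2.5, 2: 5.0}  # purchase cost per level
FAIL_RATE = {0: 5e-4, 1: 2e-4, 2: 5e-5}   # failures per hour per level

def maintenance_plan_cost(design, interval):
    """Life-cycle cost for one design and one periodic maintenance interval."""
    stops = HORIZON // interval
    failures = sum(FAIL_RATE[g] for g in design) * HORIZON * 100 / (100 + interval)
    return sum(PURCHASE[g] for g in design) + 3.0 * stops + 50.0 * failures

def evaluate(design):
    """Inner level: best maintenance interval and the resulting cost."""
    return min(maintenance_plan_cost(design, t) for t in (250, 500, 1000, 2000))

def crossover(a, b):
    cut = random.randrange(1, N_COMP)
    return a[:cut] + b[cut:]

def mutate(design):
    i = random.randrange(N_COMP)
    design = list(design)
    design[i] = random.choice(LEVELS)
    return tuple(design)

# Outer level: simple generational GA over design vectors.
pop = [tuple(random.choice(LEVELS) for _ in range(N_COMP)) for _ in range(20)]
for _ in range(30):
    pop.sort(key=evaluate)
    parents = pop[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    pop = parents + children
best = min(pop, key=evaluate)
print("best design levels:", best, "life-cycle cost:", round(evaluate(best), 2))
```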

Keywords: availability, design for maintenance (DFM), dynamic maintenance, life cycle cost (LCC), maintenance free operating period (MFOP), simultaneous optimization

Procedia PDF Downloads 86
13279 The Impact of the Great Irish Famine on Irish Mass Migration to the United States at the Turn of the Twentieth Century

Authors: Gayane Vardanyan, Gaia Narciso, Battista Severgnini

Abstract:

This paper investigates the long-run impact of the Great Irish Famine on emigration from Ireland at the turn of the twentieth century. To do so, we combine the 1901 and 1911 Irish Census data sets with the Ellis Island administrative records on Irish migrants to the United States. We find that the migrants were more likely to be Catholic, literate, unmarried, young, and Gaelic-speaking than those who stayed. Running individual-level specifications, our preliminary findings suggest that being born in a place where the Famine was more severe increases the probability of becoming a migrant in the long run. We also intend to explore the mechanisms through which this impact occurs.

Keywords: Great Famine, mass migration, long-run impact, mechanisms

Procedia PDF Downloads 216
13278 Fault Detection and Isolation of a Three-Tank System using Analytical Temporal Redundancy, Parity Space/Relation Based Residual Generation

Authors: A. T. Kuda, J. J. Dayya, A. Jimoh

Abstract:

This paper investigates a fault detection and isolation technique for measurement data sets from a three-tank system using analytical model-based temporal redundancy, in which residuals are generated with the parity equation/parity space approach. It also briefly outlines other approaches to model-based residual generation. The basic idea of parity space residual generation under temporal redundancy is the dynamic relationship between sensor outputs and actuator inputs (the input-output model). The residuals are then used to detect whether or not the system is faulty and to indicate the location of the fault when it is. The method obtains good results, detecting and isolating faults in the measurement data sets generated from the system.
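
The parity-space idea can be sketched for a discrete linear model: stack the outputs over a short window, build the extended observability matrix and the input Toeplitz matrix, take a parity matrix from the left null space of the observability matrix, and apply it to the measured window so that the unknown state cancels and only faults or noise remain in the residual. The matrices below describe a generic second-order system with one sensor, not a calibrated three-tank model.

```python
# Minimal sketch of parity-space residual generation for a discrete linear model
# (a generic 2-state system, not a calibrated three-tank model). The residual is
# near zero when the data match the model and deviates when a fault is injected.
import numpy as np
from scipy.linalg import null_space

A = np.array([[0.9, 0.1], [0.0, 0.8]])   # assumed state matrix
B = np.array([[0.0], [0.1]])             # assumed input matrix
C = np.array([[1.0, 0.0]])               # single level sensor
s = 3                                    # parity window: samples k..k+s

# Extended observability matrix O and input Toeplitz matrix T over the window.
O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(s + 1)])
T = np.zeros((s + 1, s + 1))
for i in range(1, s + 1):
    for j in range(i):
        T[i, j] = (C @ np.linalg.matrix_power(A, i - j - 1) @ B).item()

W = null_space(O.T).T                    # parity matrix: W @ O = 0

# Simulate the system; inject an additive sensor fault halfway through.
rng = np.random.default_rng(0)
n_steps, x = 200, np.zeros(2)
u = rng.normal(size=n_steps)
y = np.zeros(n_steps)
for k in range(n_steps):
    y[k] = (C @ x).item() + 0.01 * rng.normal() + (0.5 if k >= 100 else 0.0)
    x = A @ x + B.flatten() * u[k]

# Residual r(k) = W (Y_window - T U_window); its norm flags the fault.
for k in (50, 150):
    r = W @ (y[k:k + s + 1] - T @ u[k:k + s + 1])
    print(f"k={k}: residual norm = {np.linalg.norm(r):.3f}")
```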

Keywords: fault detection, fault isolation, disturbing influences, system failure, parity equation/relation, structured parity equations

Procedia PDF Downloads 271
13277 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques are used to reduce the data size and thereby improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets built on the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second, and most importantly, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
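
The divide-and-conquer idea can be sketched as follows: the training set is split into disjoint partitions (the "map" step), a condensed nearest neighbor pass keeps only the instances needed to classify each partition correctly with 1-NN, and the selected prototypes are merged (the "reduce" step). This is a plain single-machine illustration of the structure, not the FCNN rule itself, the FCNN-MR method, or an actual MapReduce job.

```python
# Single-machine sketch of the divide-and-conquer idea behind partition-based
# instance selection: run a condensed nearest neighbor (CNN) pass on each disjoint
# partition, then merge the selected prototypes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier

def condensed_nn(X, y):
    """Keep a subset of (X, y) that still classifies the partition with 1-NN."""
    keep = [np.flatnonzero(y == c)[0] for c in np.unique(y)]   # one seed per class
    changed = True
    while changed:
        changed = False
        knn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
        for i in range(len(X)):
            if i not in keep and knn.predict(X[i:i + 1])[0] != y[i]:
                keep.append(i)
                knn = KNeighborsClassifier(n_neighbors=1).fit(X[keep], y[keep])
                changed = True
    return X[keep], y[keep]

X, y = make_classification(n_samples=1200, n_features=10, n_informative=5, random_state=0)

# "Map": split into disjoint partitions and condense each independently.
parts = np.array_split(np.random.default_rng(0).permutation(len(X)), 4)
selected = [condensed_nn(X[idx], y[idx]) for idx in parts]

# "Reduce": merge the prototypes selected by every partition.
X_sel = np.vstack([xs for xs, _ in selected])
y_sel = np.concatenate([ys for _, ys in selected])
print(f"reduced {len(X)} instances to {len(X_sel)} prototypes "
      f"({100 * len(X_sel) / len(X):.1f}% of the data)")
```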

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 231
13276 Spectral Anomaly Detection and Clustering in Radiological Search

Authors: Thomas L. McCullough, John D. Hague, Marylesa M. Howard, Matthew K. Kiser, Michael A. Mazur, Lance K. McLean, Johanna L. Turk

Abstract:

Radiological search and mapping depend on the successful recognition of anomalies in large data sets that contain varied and dynamic backgrounds. We present a new algorithmic approach for real-time anomaly detection that is resistant to common detector imperfections, avoids the limitations of a source template library, and provides immediate, easily interpretable user feedback. The algorithm is based on a continuous wavelet transform for variance reduction and evaluates the deviation between a foreground measurement and a local background expectation using methods from linear algebra. We also present a technique for recognizing and visualizing spectrally similar clusters of data. This technique uses Laplacian Eigenmap manifold learning to perform dimensionality reduction, which preserves the geometric "closeness" of the data while maintaining sensitivity to outlying data. We illustrate the utility of both techniques on real-world data sets.
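
The two ideas can be sketched in a few lines: a continuous wavelet transform smooths the difference between a foreground spectrum and a background expectation before a deviation score is computed, and Laplacian Eigenmaps (sklearn's SpectralEmbedding) map a batch of spectra to a low-dimensional space where similar spectra group together. The spectra below are simulated Poisson counts and the wavelet scale and scoring are arbitrary choices, not the authors' algorithm.

```python
# Minimal sketch of the two ideas: (1) CWT-based smoothing of a foreground-minus-
# background spectrum before scoring its deviation, and (2) Laplacian Eigenmaps
# (SpectralEmbedding) to embed a batch of spectra. Simulated Poisson counts only.
import numpy as np
import pywt
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(0)
n_channels = 256
background_rate = 5.0 * np.exp(-np.arange(n_channels) / 120)   # smooth falling continuum

def simulate(with_source):
    rate = background_rate.copy()
    if with_source:
        rate[100:110] += 4.0                                    # a photopeak-like bump
    return rng.poisson(rate).astype(float)

# (1) Deviation score for one foreground spectrum against the background expectation.
foreground = simulate(with_source=True)
smoothed, _ = pywt.cwt(foreground - background_rate, scales=[8], wavelet="morl")
z = smoothed[0] / (np.sqrt(background_rate) + 1e-9)             # rough channel-wise score
print("max |z| over channels:", round(np.abs(z).max(), 1))

# (2) Embed a batch of spectra; source-bearing spectra separate from background-only ones.
spectra = np.array([simulate(with_source=(i % 4 == 0)) for i in range(40)])
coords = SpectralEmbedding(n_components=2, random_state=0).fit_transform(spectra)
print("embedding shape:", coords.shape)
```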

Keywords: radiological search, radiological mapping, radioactivity, radiation protection

Procedia PDF Downloads 669
13275 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets

Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille

Abstract:

3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high-quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e., color), beyond its geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to make the different data sets uniform, to improve the chromatic quality, and to highlight further details by balancing the point colors is presented.
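
One simple way to balance colors across two registered point clouds, offered here only as an illustration of the problem (the paper's own methodology is more elaborate), is per-channel statistics matching: shift and scale the RGB channels of one cloud so that their mean and standard deviation match those of a reference cloud.

```python
# Illustrative sketch (not the paper's method): per-channel mean/std color transfer
# that aligns the RGB statistics of one point cloud with a reference cloud.
import numpy as np

def match_colors(colors, reference):
    """Scale/shift each RGB channel of `colors` to the reference statistics."""
    src_mean, src_std = colors.mean(axis=0), colors.std(axis=0) + 1e-9
    ref_mean, ref_std = reference.mean(axis=0), reference.std(axis=0)
    return np.clip((colors - src_mean) / src_std * ref_std + ref_mean, 0.0, 1.0)

rng = np.random.default_rng(0)
laser_rgb = rng.uniform(0.2, 0.6, size=(1000, 3))       # e.g. laser scanner texture
photo_rgb = rng.uniform(0.3, 0.9, size=(1000, 3))       # e.g. photogrammetric texture
balanced = match_colors(photo_rgb, laser_rgb)
print("reference mean:", laser_rgb.mean(axis=0).round(2),
      "balanced mean:", balanced.mean(axis=0).round(2))
```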

Keywords: color models, cultural heritage, laser scanner, photogrammetry

Procedia PDF Downloads 254
13274 Flood Control Structures in the River Göta Älv to Protect Gothenburg City (Sweden) during the 21st Century: Preliminary Evaluation

Authors: M. Irannezhad, E. H. N. Gashti, U. Moback, B. Kløve

Abstract:

Climate change, through increasing concentrations of greenhouse gas emissions in the atmosphere, will result in a mean sea level rise of about +1 m by 2100. To prevent coastal floods resulting from sea level rise, different flood control structures have been built, e.g., the Thames Barrier on the River Thames in London (UK), with acceptable protection levels, at least so far. Gothenburg, located on the southwest coast of Sweden with the River Göta älv running through it, is one of the cities vulnerable to accelerated rises in mean sea level. Using a water level model developed in MATLAB, we evaluated a sea barrage in the River Göta älv as a flood control structure for protecting the city of Gothenburg during this century. Considering three operational scenarios for two barriers, one upstream and one downstream, the highest sea level was estimated at +2.95 m above the current mean sea level by 2100. To provide flood protection against such high sea levels, both barriers have to be closed. To prevent high water levels in the River Göta älv reservoir, the barriers would be opened when the sea level is low. The suggested flood control structures would successfully protect the city from flooding events during this century.

Keywords: climate change, flood control structures, Gothenburg, sea level rise, water level model

Procedia PDF Downloads 329
13273 Impact of Climate Change on Water Level and Properties of Gorgan Bay in the Southern Caspian Sea

Authors: Siamak Jamshidi

Abstract:

The Caspian Sea is the Earth's largest inland body of water. One of the most important issues related to the sea is its water level changes. For measuring and recording the Caspian Sea water level, there are at least three gauges and radar equipment at Anzali, Nowshahr, and Amirabad Ports along the southern boundary of the Caspian Sea. It seems that evaporation, hotter surface air temperatures and, in general, climate change are the main reasons for its water level fluctuations. Gorgan Bay, in the eastern part of the southern boundary of the Caspian Sea, is one of the areas affected by these water level fluctuations. Based on the results of field measurements near the Gorgan Bay mouth, temperature ranged between 24 °C and 28 °C and salinity was about 13.5 PSU in midsummer, while temperature ranged between 10 and 11.5 °C and salinity was mostly 15-16.5 PSU in midwinter. The decrease in the Caspian Sea water level and in river outflow are the two most important factors behind the increase in the water salinity of Gorgan Bay. The results of field observations showed that, due to atmospheric factors, climate change, and decreasing precipitation over the southern basin of the Caspian Sea during recent decades, the water level of the bay has fallen by around 0.5 m.

Keywords: Caspian Sea, Gorgan Bay, water level fluctuation, climate changes

Procedia PDF Downloads 142
13272 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emergent stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via maximum likelihood estimation. Using this technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, even though the observed noise follows a normal mixture distribution, because the financial data studied are not fully Gaussian. In addition, the Ornstein-Uhlenbeck process used in this work simulates the financial time series well, and the estimation algorithm scales to large data sets because it has good convergence properties.
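
The Ornstein-Uhlenbeck estimation step can be sketched by exploiting the fact that a discretely sampled OU process is an AR(1) series, so the maximum likelihood estimates of its parameters (under Gaussian noise) follow from a regression of each value on its predecessor. The log-volatility series below is simulated, not a real market series, and the parameter values are arbitrary.

```python
# Minimal sketch: calibrate an Ornstein-Uhlenbeck process to a (simulated)
# log-volatility series via its exact AR(1) representation, which coincides with
# maximum likelihood for Gaussian noise. Real data would be daily log-volatilities.
import numpy as np

rng = np.random.default_rng(0)
theta_true, mu_true, sigma_true, dt = 3.0, -2.0, 0.5, 1 / 252

# Simulate OU exactly: X_{t+dt} = a*X_t + mu*(1-a) + noise, with a = exp(-theta*dt).
a_true = np.exp(-theta_true * dt)
noise_sd = sigma_true * np.sqrt((1 - a_true**2) / (2 * theta_true))
x = np.empty(5000)
x[0] = mu_true
for t in range(1, len(x)):
    x[t] = a_true * x[t - 1] + mu_true * (1 - a_true) + noise_sd * rng.normal()

# Calibration: regress X_{t+1} on X_t, then back out (theta, mu, sigma).
X, Y = x[:-1], x[1:]
a_hat, b_hat = np.polyfit(X, Y, 1)
resid_var = np.var(Y - (a_hat * X + b_hat))
theta_hat = -np.log(a_hat) / dt
mu_hat = b_hat / (1 - a_hat)
sigma_hat = np.sqrt(resid_var * 2 * theta_hat / (1 - a_hat**2))
print(f"theta={theta_hat:.2f}, mu={mu_hat:.2f}, sigma={sigma_hat:.2f}")
```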

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 211
13271 From Manipulation to Citizen Control: A Case Study Revealing the Level of Participation in the Citizen Participatory Audit

Authors: Mark Jason E. Arca, Jay Vee R. Linatoc, Rex Francis N. Lupango, Michael Joe A. Ramirez

Abstract:

Participation promises an avenue for citizens to take part in governance, but it does not necessarily mean effective participation. The integration of participants in the decision-making process must be properly addressed to ensure effectiveness. This study explores the integration of participants in the decision-making process to reveal the level of participation in the Solid Waste Management audit done by the Citizen Participatory Audit (CPA), a program under the supervision of the Commission on Audit. Specifically, the study uses the experience of participation to identify emerging themes that help reveal the level of participation through the integrated ladder of participation. The researchers used key informant interviews to gather the necessary data from the actors of the program. The findings revealed that the level of participation present in the CPA is at the Placation level, a level below the program's targeted level of participation. The study also allowed the researchers to identify facilitating factors in the program that contributed to a better understanding of the practice of participation.

Keywords: citizen participation, culture of participation, ladder of participation, level of participation

Procedia PDF Downloads 376
13270 Recombination Center Levels in Gold and Platinum Doped N-Type Silicon

Authors: Nam Chol Yu, Kyong Il Chu

Abstract:

Using deep-level transient spectroscopy (DLTS) measurement techniques, we determined the dominant recombination center levels (defects A and B) in gold- and platinum-doped n-type silicon. The injection and temperature dependence of the Shockley-Read-Hall (SRH) carrier lifetime was also studied under low-level and high-level injection. The measurements show that the dominant level under low-level injection, located at EC-0.25 eV (A), is correlated with the Pt+G1 center, while the dominant level under high-level injection, located at EC-0.54 eV (B), is correlated with the Au+G4 center. Thus, A and B are the dominant levels controlling the carrier lifetime in gold- and platinum-doped n-type silicon.

Keywords: recombination center level, lifetime, carrier lifetime control, gold, platinum, silicon

Procedia PDF Downloads 124
13269 Structure of Consciousness According to Deep Systemic Constellations

Authors: Dmitry Ustinov, Olga Lobareva

Abstract:

The method of Deep Systemic Constellations is based on a phenomenological approach. Using the phenomenon of substitutive perception, it was established that human consciousness has a hierarchical structure, in which deeper levels govern more superficial ones (the reactive level, the energy or ancestral level, the spiritual level, the magical level, and deeper levels of consciousness). Every human possesses consciousness down to the spiritual level; however, deeper levels of consciousness are not found in every person. It was found that the spiritual level of consciousness is not homogeneous and has its own internal hierarchy of sublevels (the level of formation of spiritual values, the level of the 'inner observer', the level of the 'path', the level of 'God', etc.). The depth of a person's spiritual level defines the paradigm of all their internal processes and the main motives of their movement through life. Disturbances can occur at any level of consciousness. Disturbances at a deeper level cause disturbances at more superficial levels and are manifested in the daily life of a person in feelings, behavioral patterns, psychosomatics, etc. Without removing the deepest source of a disturbance, it is impossible to completely correct its manifestation in the present moment. Thus, a destructive pattern of feeling and behavior in the present moment can exist because of a disturbance at, for example, the spiritual level of a person (although in most cases the source is at the energy level). Psychological work with the superficial levels without removing the source of a disturbance cannot fully solve the problem. The method of Deep Systemic Constellations allows one to work effectively with the source of the problem located at any depth. The methodology has confirmed its effectiveness in working with more than a thousand people.

Keywords: constellations, spiritual psychology, structure of consciousness, transpersonal psychology

Procedia PDF Downloads 214