Search results for: data analysis
13226 Aircraft Selection Using Multiple Criteria Decision Making Analysis Method with Different Data Normalization Techniques
Authors: C. Ardil
Abstract:
This paper presents an original application of multiple criteria decision making analysis theory to the aircraft selection problem. The selection of an optimal, efficient and reliable fleet, network and operations planning policy is one of the most important factors in the aircraft selection problem. Given that decision making in aircraft selection involves the consideration of a number of conflicting criteria and possible solutions, such a selection can be considered as a multiple criteria decision making analysis problem. This study presents a new integrated approach to decision making that considers multiple criteria utility theory and maximal regret minimization theory methods as well as aircraft technical, economic, and environmental aspects. The multiple criteria decision making analysis method uses different normalization techniques to allow criteria to be aggregated with the qualitative and quantitative data of the decision problem. Therefore, selecting a suitable normalization technique for the model is also a challenge in providing data aggregation for the aircraft selection problem. To compare the impact of different normalization techniques on the decision problem, the vector, linear (sum), linear (max), and linear (max-min) data normalization techniques were identified to evaluate the aircraft selection problem. As a logical implication of the proposed approach, it enhances the decision making process by enabling the decision maker to: (i) use higher level knowledge regarding the selection of criteria weights and the proposed technique, (ii) estimate the ranking of an alternative under different data normalization techniques and integrated criteria weights after a posteriori analysis of the final rankings of alternatives. A set of commercial passenger aircraft were considered in order to illustrate the proposed approach. The results of the proposed approach were compared using Spearman's rho tests. An analysis of the final rank stability with respect to changes in criteria weights was also performed to assess the sensitivity of the alternative rankings obtained by the application of different data normalization techniques and the proposed approach.
Keywords: Normalization Techniques, Aircraft Selection, Multiple Criteria Decision Making, Multiple Criteria Decision Making Analysis, MCDMA
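As an illustration of the four normalization techniques named in the abstract, the following sketch applies vector, linear (sum), linear (max), and linear (max-min) normalization to the columns of a decision matrix. The sample matrix and the benefit-type orientation of the criteria are assumptions for illustration, not data from the paper.

import numpy as np

# Decision matrix: rows = aircraft alternatives, columns = criteria (illustrative values only).
X = np.array([
    [78.0, 0.82, 14200.0],
    [85.0, 0.79, 15100.0],
    [90.0, 0.85, 13800.0],
])

def vector_norm(col):
    # r_ij = x_ij / sqrt(sum_i x_ij^2)
    return col / np.sqrt(np.sum(col ** 2))

def linear_sum_norm(col):
    # r_ij = x_ij / sum_i x_ij
    return col / np.sum(col)

def linear_max_norm(col):
    # r_ij = x_ij / max_i x_ij  (benefit criterion assumed)
    return col / np.max(col)

def linear_max_min_norm(col):
    # r_ij = (x_ij - min_i x_ij) / (max_i x_ij - min_i x_ij)  (benefit criterion assumed)
    return (col - np.min(col)) / (np.max(col) - np.min(col))

for name, fn in [("vector", vector_norm), ("linear (sum)", linear_sum_norm),
                 ("linear (max)", linear_max_norm), ("linear (max-min)", linear_max_min_norm)]:
    R = np.apply_along_axis(fn, 0, X)  # normalize each criterion column independently
    print(name, "\n", np.round(R, 3))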
13225 A Security Cloud Storage Scheme Based Accountable Key-Policy Attribute-Based Encryption without Key Escrow
Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun
Abstract:
With the development of cloud computing, more and more users are starting to utilize cloud storage services. However, several issues exist: 1) the cloud server steals the shared data, 2) sharers collude with the cloud server to steal the shared data, 3) the cloud server tampers with the shared data, 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the advanced encryption standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. Moreover, the data are encrypted to protect privacy. We use hash algorithms to prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
Keywords: Cloud storage security, sharing storage, attributes, Hash algorithm.
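A minimal sketch of the hash-based integrity check described above: the client keeps a digest of each encrypted file and re-verifies it on download to detect tampering by the cloud server. The SHA-256 choice and the placeholder payload are assumptions for illustration; the paper's actual scheme also involves AES and WOKE-AKP-ABE, which are not reproduced here.

import hashlib

def digest(data: bytes) -> str:
    # SHA-256 digest of the (already encrypted) file content.
    return hashlib.sha256(data).hexdigest()

# On upload: the client stores the digest locally (or in a trusted log).
ciphertext = b"...AES-encrypted payload..."   # placeholder bytes for illustration
stored_digest = digest(ciphertext)

# On download: recompute and compare to detect tampering by the cloud server.
downloaded = ciphertext                        # what the cloud returns
if digest(downloaded) != stored_digest:
    raise ValueError("Integrity check failed: data were tampered with in the cloud")
print("Integrity check passed")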
13224 Classification of Right and Left-Hand Movement Using Multi-Resolution Analysis Method
Authors: Nebi Gedik
Abstract:
The aim of brain-computer interface studies on electroencephalogram (EEG) signals containing motor imagery is to extract effective features that provide the highest possible classification accuracy for the detection of the desired motor movement. However, achieving this goal is difficult because the most suitable frequency band and time frame vary from subject to subject. In this study, the classification success of two-feature data obtained from raw EEG signals and from the coefficients of a multi-resolution analysis method applied to the EEG signals was analyzed comparatively. The method was applied to the signals of several EEG channels (C3, Cz and C4) obtained from the EEG data set belonging to the publicly available BCI competition III.
Keywords: Motor imagery, EEG, wave atom transform, k-NN.
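A hedged sketch of the comparison workflow suggested by the abstract: classify motor-imagery trials with k-NN once using raw-signal features and once using multi-resolution coefficients. The use of PyWavelets and the per-level energy features are assumptions; the paper uses the wave atom transform, for which there is no standard Python implementation, and the data here are random placeholders.

import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_raw = rng.normal(size=(120, 256))      # 120 trials x 256 samples from one channel (placeholder data)
y = rng.integers(0, 2, size=120)         # 0 = left hand, 1 = right hand

def wavelet_features(trial):
    # Multi-resolution decomposition; keep per-level energies as a compact feature vector.
    coeffs = pywt.wavedec(trial, "db4", level=4)
    return np.array([np.sum(c ** 2) for c in coeffs])

X_mra = np.apply_along_axis(wavelet_features, 1, X_raw)

knn = KNeighborsClassifier(n_neighbors=5)
print("raw-signal features  :", cross_val_score(knn, X_raw, y, cv=5).mean())
print("multi-resolution feat:", cross_val_score(knn, X_mra, y, cv=5).mean())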
13223 A Social Cognitive Investigation in the Context of Vocational Training Performance of People with Disabilities
Authors: Majid A. AlSayari
Abstract:
The study reported here investigated social cognitive theory (SCT) in the context of Vocational Rehab (VR) for people with disabilities. The prime purpose was to increase knowledge of VR phenomena and make recommendations for improving VR services. The sample consisted of 242 persons with spinal cord injuries (SCI) who completed questionnaires. A further 32 participants were trainers. Analysis of the questionnaire data was carried out using factor analysis, multiple regression analysis, and thematic analysis. The analysis suggested that, in motivational terms, and consistent with research carried out in other academic contexts, self-efficacy was the best predictor of VR performance. The author concludes that VR self-efficacy predicted VR training performance.
Keywords: Social cognitive theory, vocational rehab, self-efficacy, proxy efficacy, people with disabilities.
13222 Analysis of Statistical Data on Social Resources Dimension of Occupational Status Attainment: A Rational Choice Approach
Authors: Oleg Demchenko
Abstract:
The aim of the present study is to analyze empirical research on the social resources dimension of the occupational status attainment process and relate it to the rational choice approach. The analysis suggests that the existing data on the strength-of-ties aspect of social resources is insufficient and does not allow any implication concerning the rational actor's behavior. However, the results concerning the work relation aspect are more encouraging.
Keywords: Social resources, status attainment, rational choice, weak ties, work-related ties.
13221 The Success of E-Collaborative in E-Commerce: The Study of B2C Business in Thailand
Authors: Wanida Suwunniponth
Abstract:
The objectives of this research were to study the influencing factors that contributed to the success of e-collaborative in the e-commerce of B2C (Business to Customer) business in Bangkok, Thailand. The influencing factors included organization, people, information technology and the process of e-collaborative. A questionnaire was used to collect data from 200 small e-commerce businesses, and path analysis was utilized as the tool for data analysis. The path analysis revealed that the factors concerning organization, people and information technology had an influence on the e-collaborative process and on the success of e-collaborative, whereas the e-collaborative process factor influenced its success. The findings suggested that B2C e-commerce businesses in Thailand should adopt an improvement approach in terms of managerial structure, leadership, staff skills and knowledge, and investment in information technology in order to achieve higher efficiency of the e-collaborative process, which would result in profit and competitive advantage.
Keywords: E-collaborative, E-commerce, B2C.
13220 An Evaluation Model for Semantic Enablement of Virtual Research Environments
Authors: Tristan O'Neill, Trina Myers, Jarrod Trevathan
Abstract:
The Tropical Data Hub (TDH) is a virtual research environment that provides researchers with an e-research infrastructure to congregate significant tropical data sets for data reuse, integration, searching, and correlation. However, researchers often require data and metadata synthesis across disciplines for cross-domain analyses and knowledge discovery. A triplestore offers a semantic layer to achieve a more intelligent method of search to support the synthesis requirements by automating latent linkages in the data and metadata. Presently, the benchmarks to aid the decision of which triplestore is best suited for use in an application environment like the TDH are limited to performance. This paper describes a new evaluation tool developed to analyze both features and performance. The tool comprises a weighted decision matrix to evaluate the interoperability, functionality, performance, and support availability of a range of integrated and native triplestores and to rank them according to the requirements of the TDH.
Keywords: Virtual research environment, Semantic Web, performance analysis, tropical data hub.
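A minimal sketch of a weighted decision matrix of the kind described above: candidate triplestores are scored against weighted requirement categories and ranked by weighted sum. The weights and scores are invented placeholders, not values from the paper; only the four category names come from the abstract.

import numpy as np

criteria = ["interoperability", "functionality", "performance", "support availability"]
weights = np.array([0.30, 0.25, 0.30, 0.15])         # assumed weights; must sum to 1
candidates = ["Triplestore A", "Triplestore B", "Triplestore C"]

# Scores on a 0-10 scale per criterion (rows = candidates), placeholder values.
scores = np.array([
    [7, 8, 6, 9],
    [9, 6, 8, 5],
    [6, 7, 9, 7],
], dtype=float)

totals = scores @ weights                             # weighted sum per candidate
for name, total in sorted(zip(candidates, totals), key=lambda t: -t[1]):
    print(f"{name}: {total:.2f}")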
13219 An Analysis on the Appropriateness and Effectiveness of CCTV Location for Crime Prevention
Authors: Tae-Heon Moon, Sun-Young Heo, Sang-Ho Lee, Youn-Taik Leem, Kwang-Woo Nam
Abstract:
This study aims to investigate the possibility of crime prevention through CCTV by analyzing the appropriateness of CCTV locations, i.e., whether cameras are installed in the hotspots of crime-prone areas, and by exploring the crime prevention effect and the displacement effect. The real crime and CCTV locations of the case city were converted into spatial data by using GIS. The data were analyzed by hotspot analysis and the weighted displacement quotient (WDQ). As study methods, the study first analyzed existing relevant studies to identify the trends of CCTV and crime studies based on big data from 1800 to 2014 and to understand the relation between CCTV and crime. Second, it investigated the current situation of nationwide CCTVs and analyzed the guidelines for CCTV installation and operation to draw attention to the problems and key points of CCTV use. Third, it investigated crime occurrence in the case areas and the current situation of CCTV installation in spatial terms, and analyzed the appropriateness and effectiveness of CCTV installation to suggest a rational installation of CCTV and a strategic direction for crime prevention. The results demonstrate that the installation of CCTV had no significant effect on crime prevention in the case area. This indicates that CCTV should be installed and managed in a more scientific way that reflects local crime situations. In terms of CCTV, methods of spatial analysis such as GIS, which can evaluate the installation effect, and methods of economic analysis such as cost-benefit analysis should be developed. In addition, these methods should be distributed to local governments across the nation for the appropriate installation and operation of CCTV. This study intended to find a design guideline for optimum CCTV installation. In this regard, this study is meaningful in that it will contribute to the creation of a safe city.
Keywords: CCTV, Safe City, Crime Prevention, Spatial Analysis.
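For readers unfamiliar with the weighted displacement quotient mentioned above, the sketch below shows one common formulation, comparing crime-count changes in the target (CCTV) area and a surrounding buffer area, each normalized by a control area. The counts are placeholders and the paper may use a variant of this formula, so treat this as an assumed illustration only.

def wdq(target_before, target_after, buffer_before, buffer_after,
        control_before, control_after):
    # Buffer-area change relative to the control area, divided by the
    # target-area change relative to the control area. A negative WDQ
    # suggests displacement into the buffer; a positive WDQ suggests
    # diffusion of benefits.
    buffer_term = buffer_after / control_after - buffer_before / control_before
    target_term = target_after / control_after - target_before / control_before
    return buffer_term / target_term

# Placeholder crime counts before/after CCTV installation.
print(round(wdq(target_before=120, target_after=90,
                buffer_before=60, buffer_after=70,
                control_before=300, control_after=310), 3))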
13218 Data Envelopment Analysis under Uncertainty and Risk
Authors: P. Beraldi, M. E. Bruni
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated by considering a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique by providing the decision maker with useful insights depending on his degree of risk aversion.
Keywords: DEA, Stochastic Programming, Ex-ante evaluation technique, Conditional Value at Risk.
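To make the deterministic baseline concrete, here is a hedged sketch of the classical input-oriented CCR multiplier model, the standard DEA formulation that the abstract builds on, solved as a linear program with SciPy. The input/output data are placeholders, and the paper's stochastic and risk-averse extensions are not reproduced here.

import numpy as np
from scipy.optimize import linprog

# Placeholder data: 4 decision making units, 2 inputs (X) and 1 output (Y).
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0], [5.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.5], [2.0]])
n, m = X.shape          # number of DMUs, number of inputs
s = Y.shape[1]          # number of outputs

def ccr_efficiency(k):
    # Variables: output weights u (length s) followed by input weights v (length m).
    c = np.concatenate([-Y[k], np.zeros(m)])                   # maximize u'y_k -> minimize -u'y_k
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)  # v'x_k = 1
    b_eq = [1.0]
    A_ub = np.hstack([Y, -X])                                  # u'y_j - v'x_j <= 0 for every DMU j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

for k in range(n):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")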
13217 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis
Authors: Amir Hajian, Sepehr Damavandinejadmonfared
Abstract:
In this paper, the issue of dimensionality reduction is investigated in finger vein recognition systems using kernel Principal Component Analysis (KPCA). One aspect of KPCA is to find the most appropriate kernel function for finger vein recognition, as there are several kernel functions that can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated. The dimension of the feature vector in PCA-based algorithms is of importance, especially when it comes to real-world applications and usage of such algorithms. A fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and to extract the features from them. A classifier is then applied to classify the data and make the final decision. We analyze KPCA (polynomial, Gaussian, and Laplacian kernels) in detail in this paper and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
Keywords: Biometrics, finger vein recognition, Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA).
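A hedged sketch of the kind of experiment the abstract describes, sweeping the KPCA output dimension for different kernels with scikit-learn and scoring a downstream classifier. The synthetic data stand in for finger vein feature vectors, and only the polynomial and Gaussian (RBF) kernels are run here; the Laplacian kernel is not a built-in KernelPCA option and would need a precomputed Gram matrix, as noted in a comment.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Placeholder feature vectors standing in for finger vein images.
X, y = make_classification(n_samples=300, n_features=64, n_informative=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Polynomial and Gaussian (RBF) kernels are built in; a Laplacian kernel can be supplied
# via kernel="precomputed" using sklearn.metrics.pairwise.laplacian_kernel.
for kernel in ["poly", "rbf"]:
    for n_components in (5, 10, 20, 40):
        kpca = KernelPCA(n_components=n_components, kernel=kernel)
        Z_tr = kpca.fit_transform(X_tr)
        Z_te = kpca.transform(X_te)
        acc = KNeighborsClassifier(n_neighbors=3).fit(Z_tr, y_tr).score(Z_te, y_te)
        print(f"kernel={kernel:4s}  dim={n_components:3d}  accuracy={acc:.3f}")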
13216 Pattern Recognition Techniques Applied to Biomedical Patterns
Authors: Giovanni Luca Masala
Abstract:
Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man and machine diagnostics, person identification and industrial inspection. The interest in improving classification systems for data analysis is independent of the context of application. In fact, in many studies it is often necessary to recognize and distinguish groups of various objects, which requires valid instruments capable of performing this task. The objective of this article is to show several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection (CADe) system that is able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help to distinguish between carriers of Thalassemia (or Mediterranean Anaemia) and healthy subjects.
Keywords: Computer Aided Detection, mammary tumor, pattern recognition, dissimilarity
13215 Real Time Multi-Sensory Force Sensing Mat for Sports Biomechanics and Human Gait Analysis
Authors: D. Gouwanda, S. M. N. A. Senanayake
Abstract:
This paper presents a real time force sensing instrument that is designed for human gait analysis purposes. It is capable of recording and monitoring ground reaction forces exerted by the human foot during various activities such as walking, running and jumping in real time. Overall, the system mainly consists of three elements: the force sensing mat, the signal conditioning circuit and the data acquisition device. The force sensing mat is the mat that contains an array of force sensing elements. To control and process the incoming signal from the force sensing mat, Force-Logger and Force-Reloader are developed using National Instruments LabVIEW. This paper describes the architecture of the force sensing mat, the signal conditioning circuit and the real time streaming of the incoming data from the force sensing mat. Additionally, a preliminary experimental dataset is presented in this paper.
Keywords: Force platform, force sensing resistor, human gait analysis.
13214 The Current Status of Middle Class Internet Use in China: An Analysis Based on the Chinese General Social Survey 2015 Data and Semi-Structured Investigation
Authors: Abigail Qian Zhou
Abstract:
In today's China, the well-educated middle class, with stable jobs and above-average income, is the driving force behind its Internet society. Through the analysis of data from the 2015 Chinese General Social Survey and 50 interviewees, this study investigates the current state of this group's internet usage. The findings of this study demonstrate that daily life among the members of this socioeconomic group is closely tied to the Internet. For the Chinese middle class, the Internet is used to socialize and to entertain themselves and others. It is also used to search for and share information as well as to build their identities. The empirical results of this study will provide a reference, supported by factual data, for enterprises seeking to target the Chinese middle class through online marketing efforts.
Keywords: China, internet use, middle class, network behavior, online marketing.
13213 Strategic Management Methods in Non-profit Making Organization
Authors: P. Řehoř, D. Holátová, V. Doležalová
Abstract:
This paper deals with an analysis of strategic management methods in non-profit making organizations in the Czech Republic. Strategic management represents an aggregate of methods and approaches that can be applied to managing organizations - in this article, organizations which associate owners and keepers of non-state forest properties. The authors use these methods of strategic management: stakeholder analysis, SWOT analysis and questionnaire inquiries. The questionnaire was distributed electronically via e-mail. In October 2013, data were obtained from a total of 84 questionnaires. Based on the results, the authors recommend the use of a confrontation strategy, which improves the competitiveness of non-profit making organizations.
Keywords: Strategic management, non-profit making organization, strategy analysis, SWOT analysis, strategy, competitiveness.
13212 SUPAR: System for User-Centric Profiling of Association Rules in Streaming Data
Authors: Sarabjeet Kaur Kochhar
Abstract:
With a surge of stream processing applications, novel techniques are required for the generation and analysis of association rules in streams. Traditional rule mining solutions cannot handle streams because they generally require multiple passes over the data and do not guarantee results in a predictable, small time. Though researchers have been proposing algorithms for the generation of rules from streams, there has not been much focus on their analysis. We propose association rule profiling, a user-centric process for analyzing association rules and attaching suitable profiles to them depending on their changing frequency behavior over a previous snapshot of time in a data stream. Association rule profiles provide insights into the changing nature of associations and can be used to characterize the associations. We discuss the importance of characteristics such as the predictability of linkages present in the data and propose a metric to quantify it. We also show how association rule profiles can aid in the generation of user-specific, more understandable and actionable rules. The framework is implemented as SUPAR: System for User-centric Profiling of Association Rules in streaming data. The proposed system offers the following capabilities: i) continuous monitoring of the frequency of streaming item-sets and detection of significant changes therein for association rule profiling; ii) computation of metrics for quantifying the predictability of associations present in the data; iii) user-centric control of the characterization process: the user can control the framework through a) constraint specification and b) non-interesting rule elimination.
Keywords: Data Streams, User subjectivity, Change detection, Association rule profiles, Predictability.
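A minimal sketch of the first capability listed above, monitoring item-set frequency over a sliding window of a stream and flagging significant changes between consecutive snapshots. The window size, change threshold, and transaction format are assumptions for illustration, not the paper's actual parameters.

from collections import Counter, deque
from itertools import combinations

WINDOW = 100        # transactions per snapshot (assumed)
THRESHOLD = 0.05    # absolute change in support that counts as "significant" (assumed)

def snapshot_supports(transactions):
    # Support of every 2-item-set in the current window.
    counts = Counter()
    for t in transactions:
        counts.update(combinations(sorted(set(t)), 2))
    return {k: v / len(transactions) for k, v in counts.items()}

window = deque(maxlen=WINDOW)
previous = {}

def process(transaction):
    # Feed one transaction; report item-sets whose support shifted notably.
    global previous
    window.append(transaction)
    if len(window) < WINDOW:
        return
    current = snapshot_supports(window)
    for itemset in set(previous) | set(current):
        delta = current.get(itemset, 0.0) - previous.get(itemset, 0.0)
        if abs(delta) >= THRESHOLD:
            print(f"{itemset}: support changed by {delta:+.3f}")
    previous = current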
13211 EFL Learners' Perceptions of Computer-Mediated Communication (CMC) to Facilitate Communication in a Foreign Language
Authors: Lin, Huifen, Fang, Yueh-chiu
Abstract:
This study explores the perceptions of English as a Foreign Language (EFL) learners on using computer-mediated communication technology in their learning of English. The data consist of observations of both synchronous and asynchronous communication that participants engaged in over a period of 4 months, which included online and offline communication protocols, open-ended interviews and reflection papers composed by participants. Content analysis of the interview data and the written documents listed above, as well as member checking and triangulation techniques, are the major data analysis strategies. The findings suggest that participants generally do not benefit from computer-mediated communication in terms of its effect on learning a foreign language. Participants regarded the nature of CMC as artificial, or pseudo-communication, that did not aid their authentic communication skills in English. The results of this study shed light on the insufficient and inconclusive findings that most quantitative CMC studies have previously generated.
Keywords: computer-mediated communication, EFL, writing.
13210 Modelling Silica Optical Fibre Reliability: A Software Application
Authors: I. Severin, M. Caramihai, R. El Abdi, M. Poulain, A. Avadanii
Abstract:
In order to assess optical fiber reliability in different environmental and stress conditions, series of tests are performed, simulating the overlapping of chemically and mechanically controlled varying factors. Each series of tests may be compared using statistical processing, i.e. Weibull plots. Due to the large amount of data to treat, a software application has proved useful for interpreting selected series of experiments as a function of the envisaged factors. The current paper presents a software application used in the storage, modelling and interpretation of experimental data gathered from optical fibre testing. The present paper strictly deals with the software part of the project (regarding the modelling, storage and processing of user-supplied data).
Keywords: Optical fibres, computer aided analysis, data models, data processing, graphical user interfaces.
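As a small illustration of the Weibull processing mentioned above, the sketch below fits a two-parameter Weibull distribution to a set of fibre failure stresses with SciPy and reports the shape and scale parameters that a Weibull plot would display. The failure data are invented placeholders, not results from the paper's test series.

import numpy as np
from scipy.stats import weibull_min

# Placeholder failure stresses (GPa) from one series of fibre tests.
failure_stress = np.array([4.8, 5.1, 5.3, 5.5, 5.6, 5.8, 5.9, 6.0, 6.2, 6.4])

# Two-parameter Weibull fit: location fixed at zero.
shape, loc, scale = weibull_min.fit(failure_stress, floc=0)
print(f"Weibull shape (modulus) m = {shape:.2f}, scale = {scale:.2f} GPa")

# Median-rank estimate of failure probability for a Weibull plot:
# ln(-ln(1 - F)) versus ln(stress) should be roughly linear with slope m.
sorted_stress = np.sort(failure_stress)
F = (np.arange(1, len(sorted_stress) + 1) - 0.3) / (len(sorted_stress) + 0.4)
print(np.column_stack([np.log(sorted_stress), np.log(-np.log(1 - F))]))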
13209 Ontology-Based Systemizing of the Science Information Devoted to Waste Utilizing by Methanogenesis
Authors: Ye. Shapovalov, V. Shapovalov, O. Stryzhak, A. Salyuk
Abstract:
Over the past decades, the amount of scientific information has been growing exponentially. It has become more complicated to process and systemize this amount of data. An approach to the systematization of scientific information on the production of biogas based on the ontological IT platform “T.O.D.O.S.” has been developed. It is proposed to select semantic characteristics of each work for their further introduction into the IT platform “T.O.D.O.S.”. An ontological graph with a ranking function for previous scientific research and for a system of selection of microorganisms has been worked out. These systems provide high performance in the management of scientific information.
Keywords: Ontology-based analysis, analysis of scientific data, methanogenesis, microorganism hierarchy, T.O.D.O.S.
13208 Biological Data Integration using SOA
Authors: Noura Meshaan Al-Otaibi, Amin Yousef Noaman
Abstract:
Nowadays, scientific data are inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. This research suggests the use of Service Oriented Architecture (SOA) to integrate biological data from different data sources. This work shows that SOA can solve the problems facing the integration process and that biologists can access biological data in an easier way. There are several methods to implement SOA, but web services are the most popular method. The Microsoft .NET Framework was used to implement the proposed architecture.
Keywords: Bioinformatics, Biological data, Data Integration, SOA and Web Services.
13207 Artificial Intelligence Techniques applied to Biomedical Patterns
Authors: Giovanni Luca Masala
Abstract:
Pattern recognition is the research area of Artificial Intelligence that studies the operation and design of systems that recognize patterns in data. Important application areas are image analysis, character recognition, fingerprint classification, speech analysis, DNA sequence identification, man and machine diagnostics, person identification and industrial inspection. The interest in improving classification systems for data analysis is independent of the context of application. In fact, in many studies it is often necessary to recognize and distinguish groups of various objects, which requires valid instruments capable of performing this task. The objective of this article is to show several Artificial Intelligence methodologies for data classification applied to biomedical patterns. In particular, this work deals with the realization of a Computer-Aided Detection (CADe) system that is able to assist the radiologist in identifying types of mammary tumor lesions. As an additional biomedical application of the classification systems, we present a study conducted on blood samples which shows how these methods may help to distinguish between carriers of Thalassemia (or Mediterranean Anaemia) and healthy subjects.
Keywords: Computer Aided Detection, mammary tumor, pattern recognition, thalassemia.
13206 Using TRACE, PARCS, and SNAP Codes to Analyze the Load Rejection Transient of ABWR
Authors: J. R. Wang, H. C. Chang, A. L. Ho, J. H. Yang, S. W. Chen, C. Shih
Abstract:
The purpose of this study is to analyze the load rejection transient of the ABWR by using the TRACE, PARCS, and SNAP codes. The study consists of several steps. First, the model of the ABWR is established using the TRACE, PARCS, and SNAP codes. Second, the key parameters are identified to further refine the TRACE/PARCS/SNAP model in the frame of a steady state analysis. Third, the TRACE/PARCS/SNAP model is used to perform the load rejection transient analysis. Finally, the FSAR data are used for comparison with the analysis results. The results of TRACE/PARCS are consistent with the FSAR data for the important parameters. This indicates that the TRACE/PARCS/SNAP model of the ABWR has good accuracy for the load rejection transient.
Keywords: ABWR, TRACE, PARCS, SNAP.
13205 Data Mining Techniques in Computer-Aided Diagnosis: Non-Invasive Cancer Detection
Authors: Florin Gorunescu
Abstract:
Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with the real time physiological measurements taken from the patient. This paper presents the benefits of using Data Mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. In the field of non-invasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows characterizing the difference between malignant and benign tumors. The main features of the EUSE sample movies are digitalized and summarized in vector form using exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE sample movie vector inputs in such a way that these intelligent systems are able to offer a very precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these Data Mining techniques illustrates the suitability and the reliability of this methodology in CAD.
Keywords: Endoscopic ultrasound elastography, exploratory data analysis, neural networks, non-invasive cancer detection.
13204 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study
Authors: Atif Zafar, Fan Haijun
Abstract:
A sequence of different reservoir engineering methods and tools for reservoir characterization and field development is presented in this paper. Real data from the Jin Gas Field of the L-Basin of Pakistan are used. The basic concept behind this work is to highlight the importance of well test analysis in a broader sense (i.e. reservoir characterization and field development), rather than merely determining permeability and skin parameters. Normally, for reservoir characterization we rely on well test analysis to some extent, but for field development planning, well test analysis has become a forgotten tool, specifically for the locations of new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development when the location of a new development well was marked only on the basis of G&G (geologic and geophysical) data. The seismic interpretation could not detect one of the boundaries (fault, sub-seismic fault, heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the model from the well test analysis played a very crucial role in proposing the location of the second well of the newly discovered field. The results from different methods of well test analysis of the Jin Gas Field are also integrated with and supported by other reservoir engineering tools, i.e. the material balance method and the volumetric method. In this way, a comprehensive workflow and algorithm are obtained for integrating well test analyses with geological and geophysical analyses for reservoir characterization and field development. On the basis of this workflow and algorithm, it was concluded that the proposed location of the new development well was not justified and that it should be located elsewhere, away from the southern direction.
Keywords: Field development, reservoir characterization, reservoir engineering, well test analysis.
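One of the supporting tools named above, the material balance method for a volumetric dry gas reservoir, can be sketched as a p/Z straight-line extrapolation: p/Z declines linearly with cumulative production, and the intercept with p/Z = 0 estimates the gas initially in place. The pressure and production figures below are placeholders, not Jin Gas Field data.

import numpy as np

# Placeholder surveillance data: cumulative gas produced Gp (Bscf) and
# the corresponding average reservoir pressure over gas deviation factor, p/Z (psia).
Gp = np.array([0.0, 5.0, 12.0, 20.0])                  # Bscf
p_over_z = np.array([4000.0, 3750.0, 3400.0, 3000.0])  # psia

# p/Z = (p/Z)_initial * (1 - Gp/G)  ->  linear in Gp; fit a straight line.
slope, intercept = np.polyfit(Gp, p_over_z, 1)
G = -intercept / slope                                  # Gp at which p/Z reaches zero
print(f"Estimated gas initially in place G = {G:.1f} Bscf")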
13203 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: Anomaly detection, autoencoder, data centers, deep learning.
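A hedged sketch of the per-sensor reconstruction idea described above: a small LSTM autoencoder is trained on normal windows of one sensor, and the reconstruction-error signal becomes the input to a downstream classifier. The layer sizes, window length, and synthetic data are assumptions; the paper's multi-autoencoder fusion and random forest stage are only hinted at in the closing comment.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

T = 32  # window length (assumed)
rng = np.random.default_rng(0)
normal_windows = rng.normal(0.0, 0.1, size=(500, T, 1))  # placeholder "normal" sensor windows

# One autoencoder for one sensor: LSTM encoder -> repeated code -> LSTM decoder.
autoencoder = keras.Sequential([
    keras.Input(shape=(T, 1)),
    layers.LSTM(16),
    layers.RepeatVector(T),
    layers.LSTM(16, return_sequences=True),
    layers.TimeDistributed(layers.Dense(1)),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(normal_windows, normal_windows, epochs=5, batch_size=32, verbose=0)

# Reconstruction error per window; anomalous windows reconstruct poorly.
test = np.concatenate([normal_windows[:10],
                       rng.normal(1.0, 0.1, size=(10, T, 1))])  # second half shifted = "anomalies"
errors = np.mean((test - autoencoder.predict(test, verbose=0)) ** 2, axis=(1, 2))
print(errors.round(4))
# In the paper, such error signals from several sensors are turned into features
# and classified with a random forest rather than a simple threshold.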
13202 STATISTICA Software: A State of the Art Review
Authors: S. Sarumathi, N. Shanthi, S. Vidhya, P. Ranjetha
Abstract:
Data mining is growing rapidly in popularity. The foremost aim of data mining methods is to extract, from a huge data set, information in forms that can be comprehended for further use. Data mining is a technology with rich potential that can support industries and businesses that collect the necessary information from their data to discover their customers' behavior. Several methods are available for extracting data, such as classification, clustering, association, discovery, and visualization, each of which has its own diverse algorithms for fitting an appropriate model to the data. STATISTICA mostly deals with very large groups of data that impose rigorous computational constraints; these challenges have led to the emergence of powerful STATISTICA Data Mining technologies. In this survey, an overview of the STATISTICA software is presented along with its significant features.
Keywords: Data Mining, STATISTICA Data Miner, Text Miner, Enterprise Server, Classification, Association, Clustering, Regression.
13201 Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant
Authors: J. R. Wang, S. W. Chen, Y. Chiang, W. S. Hsu, J. H. Yang, Y. S. Tseng, C. Shih
Abstract:
In this research, the HABIT analysis methodology was established for the Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under a CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criterion. This indicates that Maanshan NPP habitability can be maintained. Additionally, a sensitivity study of the parameters (wind speed, atmospheric stability classification, air temperature, and control room intake flow rate) was also performed in this research.
Keywords: PWR, HABIT, habitability, Maanshan.
13200 Road Accidents Bigdata Mining and Visualization Using Support Vector Machines
Authors: Usha Lokala, Srinivas Nowduri, Prabhakar K. Sharma
Abstract:
Useful information has been extracted from road accident data in the United Kingdom (UK), using data analytics methods, for avoiding possible accidents in rural and urban areas. This analysis makes use of several methodologies such as data integration, support vector machines (SVM), correlation machines and multinomial goodness. The entire datasets have been imported from the traffic department of the UK with due permission. The information extracted from these huge datasets forms a basis for several predictions, which in turn avoid unnecessary memory lapses. Since the data are expected to grow continuously over a period of time, this work primarily proposes a new framework model which can be trained and adapt itself to new data and make accurate predictions. This work also throws some light on the use of the SVM methodology for text classifiers built from the obtained traffic data. Finally, it emphasizes the uniqueness and adaptability of the SVM methodology as appropriate for this kind of research work.
Keywords: Road accident, machine learning, support vector machines.
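A minimal sketch of the SVM step described above, training a support vector classifier on tabular accident features to predict severity. The feature names, labels, and synthetic data are assumptions standing in for the UK traffic records, not the paper's actual dataset or results.

import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Placeholder features: [speed limit, hour of day, weather code, road type code]
X = rng.normal(size=(1000, 4))
y = rng.integers(0, 2, size=1000)   # 0 = slight, 1 = serious (illustrative labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_tr, y_tr)
print("hold-out accuracy:", round(model.score(X_te, y_te), 3))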
13199 Clustering Methods Applied to the Tracking of user Traces Interacting with an e-Learning System
Authors: Larbi Omar, Elberrichi Zakaria
Abstract:
Many research works have been carried out on the analysis of traces in digital learning environments. These studies produce large volumes of usage traces from the various actions performed by a user. However, to exploit these data and to compare and improve performance, several issues are raised. Several recent works deal with this problem. This research studied a series of questions about the format and description of the data to be shared. Our goal is to share thoughts on these issues by presenting our experience in the analysis of trace-based log files, comparing several automatic classification approaches applied to e-learning platforms. Finally, the obtained results are discussed.
Keywords: Classification, e-learning platform, log file, trace.
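A small sketch of the kind of automatic classification compared in the abstract, clustering per-user activity counts extracted from e-learning log files with k-means. The feature columns and the random data are assumptions, not the platform's actual log format.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder per-user features parsed from log files:
# [logins, pages viewed, forum posts, quiz attempts]
usage = rng.poisson(lam=[20, 150, 5, 8], size=(200, 4)).astype(float)

X = StandardScaler().fit_transform(usage)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

for cluster_id in range(3):
    members = usage[kmeans.labels_ == cluster_id]
    print(f"cluster {cluster_id}: {len(members)} users, "
          f"mean activity = {members.mean(axis=0).round(1)}")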
13198 LTE Performance Analysis in the City of Bogota Northern Zone for Two Different Mobile Broadband Operators over Qualipoc
Authors: Víctor D. Rodríguez, Edith P. Estupiñán, Juan C. Martínez
Abstract:
The evolution of mobile broadband technologies has allowed download rates for users to increase, considering the current services. The evaluation of technical parameters at the link level is of vital importance to validate the quality and veracity of the connection, thus avoiding large losses of data, time and productivity. Some of these failures may occur between the eNodeB (Evolved Node B) and the user equipment (UE), so the link between the end device and the base station can be observed. LTE (Long Term Evolution) is considered one of the IP-oriented mobile broadband technologies that work stably for data and for VoIP (Voice over IP) on those devices that have that feature. This research presents a technical analysis of the connection and channeling processes between the UE and the eNodeB with the TAC (Tracking Area Code) variables, and an analysis of performance variables (throughput, Signal to Interference and Noise Ratio (SINR)). Three measurement scenarios were proposed in the city of Bogotá using QualiPoc, where two operators were evaluated (Operator 1 and Operator 2). Once the data were obtained, an analysis of the variables was performed, determining that the data obtained for the transmission modes vary depending on the parameters BLER (Block Error Rate), performance and SNR (Signal-to-Noise Ratio). For both operators, differences in transmission modes were detected, and this is reflected in the quality of the signal. In addition, because the operators work on different frequencies, it can be seen that Operator 1, despite having spectrum in Band 7 (2600 MHz), is, like Operator 2, reassigning connections to a lower frequency band, AWS (1700 MHz); the difference in signal quality with respect to the data connections established by Operator 2 and the difference found in the transmission modes determined by the eNodeB for Operator 1 are remarkable.
Keywords: BLER, LTE, Network, Qualipoc, SNR.
13197 Proposal of Data Collection from Probes
Authors: M. Kebisek, L. Spendla, M. Kopcek, T. Skulavik
Abstract:
In our paper, we describe the security capabilities of data collection. Data are collected with probes located in the near and distant surroundings of the company. Considering the numerous obstacles, e.g. forests, hills and urban areas, the data collection is realized in several ways. The collection of data uses connections via wireless communication, the LAN network and the GSM network, and in certain areas data are collected by using vehicles. In order to ensure the connection to the server, most of the probes have the ability to communicate in several ways. Collected data are archived and subsequently used in supervisory applications. To ensure the collection of the required data, it is necessary to propose algorithms that will allow the probes to select a suitable communication channel.
Keywords: Communication, computer network, data collection, probe.
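A hedged sketch of the channel-selection idea mentioned above: a probe tries its available communication channels in order of preference and falls back to the next one when a link check fails. The channel names, the ordering, and the link-check function are illustrative assumptions, not the algorithm proposed in the paper.

import random

# Preferred order of channels available to a probe (assumed ordering).
CHANNELS = ["wireless", "LAN", "GSM", "vehicle pickup"]

def link_available(channel: str) -> bool:
    # Placeholder link check; a real probe would test connectivity to the server over this channel.
    return random.random() > 0.4

def select_channel():
    # Return the first usable channel, falling back down the preference list.
    for channel in CHANNELS:
        if link_available(channel):
            return channel
    return None  # no channel usable; buffer data locally until the next attempt

chosen = select_channel()
print("sending data via:", chosen if chosen else "none (data buffered)")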