Search results for: mathematical data analysis.

12913 Data Mining Using Learning Automata

Authors: M. R. Aghaebrahimi, S. H. Zahiri, M. Amiri

Abstract:

In this paper, a data miner based on learning automata, called LA-miner, is proposed. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm is built on function optimization using learning automata. Experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (and sometimes better than) that of Ant-miner (a data miner based on the ant colony optimization algorithm) and CN2 (a well-known data mining algorithm for classification).

Keywords: Data mining, Learning automata, Classification rules, Knowledge discovery.

12912 Secure and Efficient Transmission of Aggregated Data for Mobile Wireless Sensor Networks

Authors: A. Krishna Veni, R. Geetha

Abstract:

Wireless Sensor Networks (WSNs) are suitable for many real-world scenarios, and data aggregation techniques make data retrieval efficient. Many data aggregation techniques have been proposed, but most existing schemes are neither energy efficient nor secure. Moreover, existing techniques use a traditional clustering approach in which packet transmission is delayed because there is no proper scheduling. The presented system uses the Velocity Energy-efficient and Link-aware Cluster-Tree (VELCT) scheme, in which a Data Collection Tree (DCT) improves the lifetime of the network. The VELCT scheme and the construction of the DCT reduce delay and traffic, and the network lifetime is further increased by avoiding frequent changes in cluster topology. Secure and Efficient Transmission of Aggregated data (SETA) improves the security of data transmission by evaluating the trust value of the nodes prior to aggregation. Since SETA aggregates data only from trustworthy nodes, it transmits the data more securely and thereby improves the accuracy of the aggregated data.

Keywords: Aggregation, lifetime, network security, wireless sensor network.

12911 Design of a Fuzzy Feed-forward Controller for Monitor HAGC System of Cold Rolling Mill

Authors: S. Khosravi, A. Afshar, F. Barazandeh

Abstract:

In this study, we propose a novel monitor hydraulic automatic gauge control (HAGC) system based on a fuzzy feed-forward controller, which is used in the development of cold rolling mill automation systems to improve the quality of cold strip. According to the properties of the entry steel strip, such as its average yield stress, strip width, and desired exit thickness, the controller compensates for the exit thickness error. Traditional methods of adjusting the roller position cannot tolerate the variance in the entry steel strip. The proposed method uses a mathematical model of the system together with expert knowledge to perform this adjustment while minimizing the effect of the stated problem. In order to improve the speed of the controller in rejecting disturbances introduced by entry strip thickness variations, expert knowledge is added as a feed-forward term to the HAGC system. Simulation results for the application of the proposed controller to a real cold mill show that the exit strip quality is highly improved.

Keywords: Fuzzy feed-forward controller, monitor HAGC system, dynamic mathematical model, entry strip thickness deviation compensation

12910 Optimization of PEM Fuel Cell Biphasic Model

Authors: Boubekeur Dokkar, Nasreddine Chennouf, Noureddine Settou, Belkhir Negrou, Abdesslam Benmhidi

Abstract:

The optimal operation of a proton exchange membrane fuel cell (PEMFC) requires good management of the water, which is present in two forms, vapor and liquid. Moreover, for fuel cells to reach higher output they require the integration of accessories, which in turn need electrical power. In order to analyze fuel cell operation and the transport phenomena of the different species, a biphasic mathematical model is presented as a set of governing equations. The numerical solution of these conservation equations is computed with a Matlab program. A multi-criteria optimization, with weighting between two opposing objectives, is used to determine the compromise solutions between maximum output and minimal stack size. The obtained results are in good agreement with available literature data.

Keywords: Biphasic model, PEM fuel cell, optimization, simulation, species transport.

12909 Ranking Alternatives in Multi-Criteria Decision Analysis using Common Weights Based on Ideal and Anti-ideal Frontiers

Authors: Saber Saati Mohtadi, Ali Payan, Azizallah Kord

Abstract:

One of the most important issues in multi-criteria decision analysis (MCDA) is to determine the weights of the criteria so that all alternatives can be compared based on the collective performance of the criteria. In this paper, one of the popular methods in data envelopment analysis (DEA), known as common weights (CWs), is used to determine the weights in MCDA. Two frontiers, named the ideal and anti-ideal frontiers, are defined based on two newly proposed CWs models and are used instead of ideal and anti-ideal alternatives; the frontiers are more flexible than single ideal and anti-ideal alternatives. According to the optimal solutions of these two models, the distances of an alternative from the ideal and anti-ideal frontiers are derived. Then, a relative distance is introduced to measure the value of each alternative. The suggested models are linear and, despite the weight restrictions, remain feasible. An example is presented to explain the method and to compare it with the existing literature.

Keywords: Anti-ideal frontier, Common weights (CWs), Ideal frontier, Multi-criteria decision analysis (MCDA)

12908 Traffic Flow Prediction using Adaboost Algorithm with Random Forests as a Weak Learner

Authors: Guy Leshem, Ya'acov Ritov

Abstract:

Traffic management and information systems, which rely on a system of sensors, aim to describe urban traffic in real time using a set of parameters and to estimate those parameters. Although the state of the art focuses on data analysis, little has been done in the sense of prediction. In this paper, we describe a machine learning system for traffic flow management and control, applied to the traffic flow prediction problem. The new algorithm is obtained by using the Random Forests algorithm as a weak learner within the Adaboost algorithm. We show that our algorithm performs well on real data and, according to the Traffic Flow Evaluation model, enables us to estimate and predict whether or not there is congestion at a given time at road intersections.
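
As a rough sketch of the ensemble described above (Random Forests used as the weak learner inside AdaBoost), the snippet below uses scikit-learn on synthetic placeholder data; the features and congestion labels are invented for illustration, and the keyword name `estimator` assumes scikit-learn 1.2 or later (earlier versions call it `base_estimator`).

```python
# Sketch: AdaBoost with a small Random Forest as the weak learner.
# Synthetic data stands in for loop-detector features and congestion labels.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))                   # e.g. flow, occupancy, speed readings
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # placeholder "congested" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

weak = RandomForestClassifier(n_estimators=10, max_depth=3, random_state=0)
model = AdaBoostClassifier(estimator=weak, n_estimators=20, random_state=0)
model.fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, model.predict(X_te)))
```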

Keywords: Machine Learning, Boosting, Classification, Traffic Congestion, Data Collecting, Magnetic Loop Detectors, Signalized Intersections, Traffic Signal Timing Optimization.

12907 Feasibility Analysis Studies on New National R&D Programs in Korea

Authors: Seongmin Yim, Hyun-Kyu Kang

Abstract:

As part of its evaluation system for R&D programs, the Korean government has applied feasibility analysis since 2008. Various professionals have put forth great effort to keep pace with the high degree of freedom of R&D programs and to contribute to the evolution of the feasibility analysis. We analyze diverse R&D programs from various viewpoints, such as technology, policy, and economics, integrate the separate analyses, and finally arrive at a definite result: whether a program is feasible or unfeasible. This paper describes the concept and method of the feasibility analysis as a decision-making tool. The analysis unit and the content of each criterion, which are key elements in a comprehensive decision-making structure, are examined.

Keywords: Decision Making of New Government R&D Program, Feasibility Analysis Study

12906 An Attribute-Centre Based Decision Tree Classification Algorithm

Authors: Gökhan Silahtaroğlu

Abstract:

Decision tree algorithms have a very important place among the classification models of data mining. In the literature, algorithms use the entropy concept or the Gini index to form the tree. The shape of the classes and their closeness to each other are some of the factors that affect the performance of the algorithm. In this paper, we introduce a new decision tree algorithm which employs a data (attribute) folding method and the variation of the class variable over the branches to be created. A comparative performance analysis has been carried out between the proposed algorithm and C4.5.
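
For reference, the two split criteria mentioned in the abstract (entropy and the Gini index) can be computed as in the minimal sketch below; this is a generic illustration, not the authors' attribute-folding algorithm.

```python
# Minimal sketch of the two split criteria mentioned above: entropy and Gini index.
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(1.0 - (p ** 2).sum())

y = np.array(["a", "a", "b", "b", "b", "c"])
print(entropy(y), gini(y))   # ~1.459 bits, ~0.611
```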

Keywords: Classification, decision tree, split, pruning, entropy, gini.

12905 Systematic Analysis of Dynamic Association of Health Outcomes with Computer Usage for Office Staff

Authors: Xiaoshu Lu, Esa-Pekka Takala, Risto Toivonen

Abstract:

This paper systematically investigates the time-dependent health outcomes for office staff during computer work using the developed mathematical model. The model describes time-dependent health outcomes in multiple body regions associated with computer usage. The association is explicitly presented as a dose-response relationship which is parametrized by body-region parameters. Using the developed model, we perform extensive investigations of the health outcomes both statically and dynamically. We compare the at-risk body regions and provide various severity rankings of the discomfort rate changes with respect to computer-related workload dynamically for the study population. Application of the developed model reveals a wide range of findings; such a broad spectrum of investigations within a single report is lacking in the literature. Based upon the model analysis, the highest average severity levels of discomfort are found in the neck, shoulder, eyes, shoulder joint/upper arm, upper back, low back, and head. The biggest weekly changes of discomfort rates are in the eyes, neck, head, shoulder, shoulder joint/upper arm, and upper back. The fastest discomfort rate is found in the neck, followed by the shoulder, eyes, head, shoulder joint/upper arm, and upper back. Most of our findings are consistent with the literature, which demonstrates that the developed model and results are applicable and valuable and can be utilized to assess the correlation between the amount of computer-related workload and health risk.

Keywords: Computer-related workload, health outcomes, dynamic association, dose-response relationship, systematic analysis.

12904 A Robust Data Hiding Technique based on LSB Matching

Authors: Emad T. Khalaf, Norrozila Sulaiman

Abstract:

Many researchers are working on information hiding techniques, using different ideas and areas to hide their secret data. This paper introduces a robust technique for hiding secret data in an image based on LSB insertion and RSA encryption. The key idea of the proposed technique is to encrypt the secret data; the encrypted data are then converted into a bit stream and divided into a number of segments. The cover image is divided into the same number of segments. Each segment of data is compared with each segment of the image to find the best matching segment, in order to create a new random sequence of segments that is then inserted into the cover image. Experimental results show that the proposed technique has a high security level and produces better stego-image quality.
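
A minimal sketch of plain LSB embedding into a grayscale image array is given below; the RSA encryption step and the best-match segment search described in the abstract are omitted, and the payload is a placeholder.

```python
# Minimal LSB-insertion sketch: embed a byte payload into the least significant
# bits of a grayscale image. The RSA step and segment matching are omitted.
import numpy as np

def embed_lsb(cover: np.ndarray, payload: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("payload too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    return np.packbits(stego.flatten()[: n_bytes * 8] & 1).tobytes()

cover = np.random.default_rng(1).integers(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_lsb(cover, b"secret")     # in the paper this would be RSA ciphertext
print(extract_lsb(stego, 6))            # b'secret'
```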

Keywords: Steganography, LSB matching, RSA encryption, data segments.

12903 Tuberculosis Modelling Using Bio-PEPA Approach

Authors: Dalila Hamami, Baghdad Atmani

Abstract:

Modelling is a widely used tool to facilitate the evaluation of disease management. The interest of epidemiological models lies in their ability to explore hypothetical scenarios and provide decision makers with evidence to anticipate the consequences of disease incursion and impact of intervention strategies.

All models are, by nature, simplifications of more complex systems. Models that involve diseases can be classified into different categories depending on how they treat the variability, time, space, and structure of the population. Approaches range from simple deterministic mathematical models to complex, spatially explicit stochastic simulations.

Thus, epidemiological modelling is now a necessity for epidemiological investigations, surveillance, testing hypotheses, and generating the follow-up activities needed to perform complete and appropriate analyses.

The state of the art presented in the following allows us to position ourselves with respect to the most appropriate approaches for the epidemiological study.

Keywords: Bio-PEPA, Cellular automata, Epidemiological modelling, multi agent system, ordinary differential equations, PEPA, Process Algebra, Tuberculosis.

12902 Intelligent Temperature Controller for Water-Bath System

Authors: Om Prakash Verma, Rajesh Singla, Rajesh Kumar

Abstract:

Conventional controllers usually require a priori knowledge of a mathematical model of the process. Inaccuracy of the mathematical model degrades the performance of the process, especially for non-linear and complex control problems. The process used here is the water-bath system, which is widely used and non-linear to some extent. For the water-bath system, it is necessary to attain the desired temperature within a specified period of time while avoiding overshoot and absolute error, and with good temperature tracking capability; otherwise the process is disturbed.

To overcome the above difficulties, intelligent controllers, namely Fuzzy Logic (FL) and the Adaptive Neuro-Fuzzy Inference System (ANFIS), are proposed in this paper. The fuzzy controller is designed to work with knowledge in the form of linguistic control rules. However, the translation of these linguistic rules into the framework of fuzzy set theory depends on the choice of certain parameters for which no formal method is known. To design the ANFIS, a fuzzy inference system is combined with the learning capability of a neural network.

The analysis shows that ANFIS is best suited for adaptive temperature control of the above system. Compared to PID and FLC, ANFIS produces a stable control signal, with much better temperature tracking capability, almost zero overshoot, and minimum absolute error.

Keywords: PID Controller, FLC, ANFIS, Non-Linear Control System, Water-Bath System, MATLAB-7.

12901 On the Noise Distance in Robust Fuzzy C-Means

Authors: M. G. C. A. Cimino, G. Frosini, B. Lazzerini, F. Marcelloni

Abstract:

In recent decades, a number of robust fuzzy clustering algorithms have been proposed to partition data sets affected by noise and outliers. Robust fuzzy C-means (robust-FCM) is certainly one of the best known among these algorithms. In robust-FCM, noise is modeled as a separate cluster and is characterized by a prototype that has a constant distance δ from all data points. The distance δ determines the boundary of the noise cluster and is therefore a critical parameter of the algorithm. Though some approaches have been proposed to automatically determine the most suitable δ for a specific application, to date an efficient and fully satisfactory solution does not exist. The aim of this paper is to propose a novel method to compute the optimal δ based on the analysis of the distribution of the percentage of objects assigned to the noise cluster in repeated executions of robust-FCM with decreasing values of δ. The extremely encouraging results obtained on some data sets found in the literature are shown and discussed.
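
In robust-FCM the noise cluster is a virtual prototype lying at the same distance δ from every point. The sketch below, a simplified illustration and not the authors' full procedure, tracks the share of points whose largest fuzzy membership belongs to the noise cluster as δ decreases, using the standard membership update with fixed cluster centres.

```python
# Sketch: fraction of points effectively assigned to the noise cluster for
# decreasing noise distances delta. Cluster centres are fixed for illustration;
# a full robust-FCM run would re-estimate them at every iteration.
import numpy as np

def noise_fraction(X, centres, delta, m=2.0):
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)  # squared distances
    d2 = np.maximum(d2, 1e-12)
    w = (1.0 / d2) ** (1.0 / (m - 1.0))                # un-normalized regular memberships
    w_noise = (1.0 / delta ** 2) ** (1.0 / (m - 1.0))  # un-normalized noise membership
    # a point is counted in the noise cluster when its noise membership exceeds
    # its largest regular-cluster membership
    return float((w_noise > w.max(axis=1)).mean())

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (100, 2)),
               rng.normal(3, 0.3, (100, 2)),
               rng.uniform(-5, 8, (20, 2))])           # two tight clusters plus outliers
centres = np.array([[0.0, 0.0], [3.0, 3.0]])
for delta in (4.0, 2.0, 1.0, 0.5):
    print(delta, noise_fraction(X, centres, delta))
```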

Keywords: Noise prototype, robust fuzzy clustering, robust fuzzy C-means.

12900 Generalized Method for Estimating Best-Fit Vertical Alignments for Profile Data

Authors: Said M. Easa, Shinya Kikuchi

Abstract:

When the profile information of an existing road is missing or not up to date and the parameters of the vertical alignment are needed for engineering analysis, the engineer has to recreate the geometric design features of the road alignment using collected profile data. The profile data may be collected using traditional surveying methods, global positioning systems, or digital imagery. This paper develops a method that estimates the parameters of the geometric features that best characterize the existing vertical alignment in terms of tangents and the expressions of the curves, which may be symmetrical, asymmetrical, reverse, or complex vertical curves. The method is implemented using an Excel-based optimization that minimizes the differences between the observed profile and the profiles estimated from the equations of the vertical curve. The method uses a 'wireframe' representation of the profile that makes the proposed method applicable to all types of vertical curves. A secondary contribution of this paper is to introduce the properties of the equal-arc asymmetrical curve that has recently been developed in the highway geometric design field.
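
As a simplified, hedged illustration of the optimization described above, the sketch below fits a single symmetric parabolic vertical curve joining two tangents to noisy profile data by least squares; the asymmetrical, reverse, and complex cases covered by the paper are not handled.

```python
# Sketch: least-squares estimation of a symmetric parabolic vertical curve
# (start elevation y0, grades g1 and g2, curve start station xs, curve length L)
# from noisy profile data.
import numpy as np
from scipy.optimize import least_squares

def profile(params, x):
    y0, g1, g2, xs, L = params
    s = np.clip(x - xs, 0.0, L)              # distance travelled inside the curve
    before = np.minimum(x - xs, 0.0)         # on the entry tangent
    after = np.maximum(x - xs - L, 0.0)      # on the exit tangent
    return y0 + g1 * before + g1 * s + (g2 - g1) / (2 * L) * s ** 2 + g2 * after

rng = np.random.default_rng(0)
x = np.linspace(0, 600, 61)
true = (100.0, 0.03, -0.02, 200.0, 200.0)
y_obs = profile(true, x) + rng.normal(scale=0.05, size=x.size)   # simulated survey profile

fit = least_squares(lambda p: profile(p, x) - y_obs,
                    x0=(90.0, 0.0, 0.0, 150.0, 150.0),
                    bounds=([0.0, -0.2, -0.2, 0.0, 10.0],
                            [200.0, 0.2, 0.2, 600.0, 600.0]))
print(np.round(fit.x, 4))   # should recover values close to `true`
```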

Keywords: Optimization, parameters, data, reverse, spreadsheet, vertical curves

12899 A Phenomic Algorithm for Reconstruction of Gene Networks

Authors: Rio G. L. D'Souza, K. Chandra Sekaran, A. Kandasamy

Abstract:

The goal of gene expression analysis is to understand the processes that underlie the regulatory networks and pathways controlling inter-cellular and intra-cellular activities. Microarray datasets are extensively used for this purpose, and the scope of such analysis has broadened in recent times towards the reconstruction of gene networks and other holistic approaches of Systems Biology. Evolutionary methods are proving to be successful in such problems, and a number of such methods have been proposed. However, all these methods are based on processing of genotypic information. Towards this end, there is a need to develop evolutionary methods that address phenotypic interactions together with genotypic interactions. We present a novel evolutionary approach, called the Phenomic algorithm, wherein the focus is on phenotypic interaction. We use the expression profiles of genes to model the interactions between them at the phenotypic level. We apply this algorithm to the yeast sporulation dataset and show that the algorithm can identify gene networks with relative ease.

Keywords: Evolutionary computing, gene expression analysis, gene networks, microarray data analysis, phenomic algorithms.

12898 Application of the Experimental Planning Design to the Notched Precracked Tensile Fracture of Composite

Authors: N. Mahmoudi

Abstract:

Composite materials have important assets compared to traditional materials; they bring many functional advantages such as lightness and mechanical and chemical resistance. In the present study we examine the effect of a circular central notch and a precrack on the tensile fracture of two woven composite materials. The tensile tests were applied to a standardized specimen with a notch and a precrack (crack orientations of 0°, 45°, and 90°). These tensile tests were designed according to an experimental planning design of the 2³·3¹ type, requiring 24 experiments with three repetitions. By regression analysis, we obtained a mathematical model describing the maximum load as a function of the influential parameters (hole diameter, precrack length, and precrack orientation angle). The specimens precracked at 90° behave better than those with a precrack at 45°, and these in turn behave better than those with precracks oriented at 0°. In addition, the maximum load is inversely proportional to the notch size.

Keywords: Polymer matrix, Glasses, Fracture.

12897 Emotions Triggered by Children’s Literature Images

Authors: A. Breda, C. Cruz

Abstract:

The role of images/illustrations in communicating meanings and triggering emotions assumes an increasingly relevant role in contemporary texts, regardless of the age group for which they are intended or the nature of the texts that host them. It is no coincidence that children's books are full of illustrations and that the image/text ratio decreases as the age group grows. The vast majority of children's books can be considered as multimodal texts containing text and images/illustrations, interacting with each other, to provide the young reader with a broader and more creative understanding of the book's narrative. This interaction is very diverse, ranging from images/illustrations that are not essential for understanding the storytelling to those that contribute significantly to the meaning of the story. Usually, these books are also read by adults, namely by parents, educators, and teachers who act as mediators between the book and the children, explaining aspects that are or seem to be too complex for the child's context. It should be noted that there are books labeled as children's books, that are clearly intended for both children and adults. In this work, following a qualitative and interpretative methodology based on written productions, participant observation, and field notes, we will describe the perceptions of future teachers of the 1st cycle of basic education, attending a master’s degree at a Portuguese university, about the role of the image in literary and non-literary texts, namely in mathematical texts, and how these can constitute precious resources for emotional regulation and for the design of creative didactic situations. The analysis of the collected data allowed us to obtain evidence regarding the evolution of the participants' perception regarding the crucial role of images in children's literature, not only as an emotional regulator for young readers but also as a creative source for the design of meaningful didactical situations, crossing other scientific areas, other than the mother tongue, namely mathematics.

Keywords: Children’s literature, emotions, multimodal texts, soft skills.

12896 Digital Twin of Real Electrical Distribution System with Real Time Recursive Load Flow Calculation and State Estimation

Authors: Anosh Arshad Sundhu, Francesco Giordano, Giacomo Della Croce, Maurizio Arnone

Abstract:

Digital Twin (DT) is a technology that generates a virtual representation of a physical system or process, enabling real-time monitoring, analysis, and simulation. The DT of an Electrical Distribution System (EDS) can perform online analysis by integrating static and real-time data in order to show the current grid status, and predictions about the future status, to the Distribution System Operator (DSO), producers, and consumers. DT technology for an EDS also offers the DSO the opportunity to test hypothetical scenarios. This paper discusses the development of a DT of an EDS by means of a Smart Grid Controller (SGC) application, which is developed using open-source libraries and languages. The developed application can be integrated with the Supervisory Control and Data Acquisition (SCADA) system of any EDS to create the DT. The paper shows the performance of the tools developed inside the application, tested on a real EDS for grid observability, Smart Recursive Load Flow (SRLF) calculation, and state estimation of loads in MV feeders.

Keywords: Digital Twin, Distribution System Operator, Electrical Distribution System, Smart Grid Controller, Supervisory Control and Data Acquisition System, Smart Recursive Load Flow.

12895 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction

Authors: S. Anastasiou, C. Nathanailides

Abstract:

The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can help increase their loyalty. Published data for job satisfaction and indicators of the customer service of banks were gathered from relevant published works, which included data from five different countries. The customer and employee satisfaction scores of the different published works were transformed and normalized to a scale of 1 to 100. The data were analyzed, and a regression analysis of the two parameters was used to describe the link between employee satisfaction and customer satisfaction. Assuming that employee satisfaction has a significant influence on customer service and the resulting customer satisfaction, the reviewed data indicate that employee satisfaction contributes significantly to the level of customer satisfaction in the banking sector. There was a significant correlation between the two parameters (Pearson correlation, R² = 0.52, p < 0.05), so the published data support the hypothesis of a practical link between them. During the recent global economic crisis, the financial services sector was affected severely, and job security, remuneration, and recruitment of bank personnel were significantly reduced in many countries, including Greece. Nevertheless, modern organizations should always consider their personnel as capital, which is the driving force for success in the future. Appropriate human resource management policies can increase the level of job satisfaction of the personnel, with positive consequences for the level of customer satisfaction.
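
The regression step described above can be reproduced in outline as in the sketch below; the satisfaction scores are made-up placeholders on the 1-100 scale, not the reviewed published data.

```python
# Sketch: linear regression / correlation between employee and customer
# satisfaction scores (both normalized to a 1-100 scale).
# The numbers below are placeholders, not the reviewed published data.
from scipy import stats

employee_satisfaction = [62, 70, 75, 58, 80, 66, 72, 77, 69, 84]
customer_satisfaction = [65, 71, 78, 61, 83, 64, 75, 80, 70, 88]

res = stats.linregress(employee_satisfaction, customer_satisfaction)
print(f"slope={res.slope:.2f}, intercept={res.intercept:.2f}")
print(f"R^2={res.rvalue ** 2:.2f}, p-value={res.pvalue:.4f}")
```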

Keywords: Job satisfaction, job performance, customer service, banks, human resources management.

12894 A Prediction of Attractive Evaluation Objects Based On Complex Sequential Data

Authors: Shigeaki Sakurai, Makino Kyoko, Shigeru Matsumoto

Abstract:

This paper proposes a method that predicts attractive evaluation objects. In the learning phase, the method inductively acquires trend rules from complex sequential data. The data are composed of two types. One is numerical sequential data; each evaluation object has its own numerical sequential data. The other is text sequential data; each evaluation object is described in texts. The trend rules represent changes of numerical values related to evaluation objects. In the prediction phase, the method applies new text sequential data to the trend rules and evaluates which evaluation objects are attractive. This paper verifies the effect of the proposed method by using stock price sequences and news headline sequences, in which each stock brand corresponds to an evaluation object. The paper discusses the validity of the predicted attractive evaluation objects, the processing time of each phase, and possible application tasks.

Keywords: Trend rule, frequent pattern, numerical sequential data, text sequential data, evaluation object.

12893 Application of Building Information Modeling in Energy Management of Individual Departments Occupying University Facilities

Authors: Kung-Jen Tu, Danny Vernatha

Abstract:

To assist individual departments within universities in their energy management tasks, this study explores the application of Building Information Modeling in establishing the ‘BIM based Energy Management Support System’ (BIM-EMSS). The BIM-EMSS consists of six components: (1) sensors installed for each occupant and each equipment, (2) electricity sub-meters (constantly logging lighting, HVAC, and socket electricity consumptions of each room), (3) BIM models of all rooms within individual departments’ facilities, (4) data warehouse (for storing occupancy status and logged electricity consumption data), (5) building energy management system that provides energy managers with various energy management functions, and (6) energy simulation tool (such as eQuest) that generates real time 'standard energy consumptions' data against which 'actual energy consumptions' data are compared and energy efficiency evaluated. Through the building energy management system, the energy manager is able to (a) have 3D visualization (BIM model) of each room, in which the occupancy and equipment status detected by the sensors and the electricity consumptions data logged are displayed constantly; (b) perform real time energy consumption analysis to compare the actual and standard energy consumption profiles of a space; (c) obtain energy consumption anomaly detection warnings on certain rooms so that energy management corrective actions can be further taken (data mining technique is employed to analyze the relation between space occupancy pattern with current space equipment setting to indicate an anomaly, such as when appliances turn on without occupancy); and (d) perform historical energy consumption analysis to review monthly and annually energy consumption profiles and compare them against historical energy profiles. The BIM-EMSS was further implemented in a research lab in the Department of Architecture of NTUST in Taiwan and implementation results presented to illustrate how it can be used to assist individual departments within universities in their energy management tasks.

Keywords: Sensor, electricity sub-meters, database, energy anomaly detection.

12892 Main Control Factors of Fluid Loss in Drilling and Completion in Shunbei Oilfield by Unmanned Intervention Algorithm

Authors: Peng Zhang, Lihui Zheng, Xiangchun Wang, Xiaopan Kou

Abstract:

Quantitative research on the main control factors of lost circulation has received little consideration and has typically relied on a single data source. Using the Unmanned Intervention Algorithm to find the main control factors of lost circulation allows all measurable parameters to be adopted. The degree of lost circulation is characterized by the loss rate, taken as the objective function. Geological, engineering, and fluid data are used as layers, and 27 factors such as wellhead coordinates and Weight on Bit (WOB) are used as dimensions. Data classification is implemented to determine the independent variables of the function. The mathematical equation relating the loss rate to the 27 influencing factors is established by the multiple regression method, and the undetermined coefficient method is used to solve for the undetermined coefficients of the equation. Only three factors in the t-test are greater than the test value 40, and the F-test value is 96.557%, indicating that the correlation of the model is good. The funnel viscosity, final shear force, and drilling time were selected as the main control factors by the elimination method, the contribution rate method, and the functional method. The calculated values for the two wells used for verification differ from the actual values by -3.036 m³/h and -2.374 m³/h, with errors of 7.21% and 6.35%. The influence of engineering factors on the loss rate is greater than that of funnel viscosity and final shear force, and the influence of these three factors is less than that of geological factors. The best combination of funnel viscosity, final shear force, and drilling time is obtained through quantitative calculation. The minimum loss rate of lost circulation wells in the Shunbei area is 10 m³/h. It can be seen that man-made main control factors can only slow down the leakage but cannot fundamentally eliminate it. This is more in line with the characteristics of karst caves and fractures in the Shunbei fault-solution oil and gas reservoir.
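
A hedged sketch of the regression step (loss rate regressed on candidate factors, with t- and F-statistics used for screening) is given below using statsmodels; the data are synthetic, and only the three factor names follow the abstract.

```python
# Sketch: multiple regression of loss rate on candidate factors, with t- and
# F-statistics used to screen main control factors. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
X = pd.DataFrame({
    "funnel_viscosity": rng.normal(60, 8, n),
    "final_shear_force": rng.normal(12, 3, n),
    "drilling_time": rng.normal(30, 6, n),
})
loss_rate = (0.4 * X["funnel_viscosity"] - 0.8 * X["final_shear_force"]
             + 0.3 * X["drilling_time"] + rng.normal(0, 2, n))

model = sm.OLS(loss_rate, sm.add_constant(X)).fit()
print(model.tvalues)   # per-factor t-statistics
print(model.fvalue)    # overall F-statistic
```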

Keywords: Drilling fluid, loss rate, main controlling factors, Unmanned Intervention Algorithm.

12891 A Comparative Study of Fine Grained Security Techniques Based on Data Accessibility and Inference

Authors: Azhar Rauf, Sareer Badshah, Shah Khusro

Abstract:

This paper analyzes different techniques for the fine-grained security of relational databases with respect to two variables: data accessibility and inference. Data accessibility measures the amount of data available to the users after applying a security technique to a table. Inference is the proportion of information leakage after suppressing a cell containing secret data. A row containing a suppressed secret cell can become a security threat if an intruder generates useful information from the related visible information of the same row. This paper measures the data accessibility and inference associated with row-, cell-, and column-level security techniques. Cell-level security offers the greatest data accessibility, as it suppresses secret data only; on the other hand, there is a high probability of inference in cell-level security. Row- and column-level security techniques have the least data accessibility and inference. This paper introduces the cell plus innocent security technique, which utilizes the cell-level security method but additionally suppresses some innocent data, so that an intruder cannot assume that a suppressed cell necessarily contains secret data. Four variations of the technique, namely cell plus innocent 1/4, cell plus innocent 2/4, cell plus innocent 3/4, and cell plus innocent 4/4, have been introduced to suppress innocent data equal to 1/4, 2/4, 3/4, and 4/4 of the amount of true secret data inside the database. Results show that the new technique offers better control over data accessibility and inference compared to the state-of-the-art security techniques. The paper further discusses which combinations of these techniques can be used together and shows that the cell plus innocent 1/4, 2/4, and 3/4 techniques can be used as a replacement for cell-level security.

Keywords: Fine Grained Security, Data Accessibility, Inference, Row, Cell, Column Level Security.

12890 Spatial Mapping of Dengue Incidence: A Case Study in Hulu Langat District, Selangor, Malaysia

Authors: Er, A. C., Rosli, M. H., Asmahani A., Mohamad Naim M. R., Harsuzilawati M.

Abstract:

Dengue is a mosquito-borne infection that has peaked at an alarming rate in recent decades. It is found in tropical and sub-tropical climates. In Malaysia, dengue has been declared one of the national health threats to the public. This study aimed to map the spatial distribution of dengue cases in the district of Hulu Langat, Selangor via a combination of Geographic Information System (GIS) and spatial statistic tools. Data related to dengue were gathered from the various government health agencies. The locations of dengue cases were geocoded using a handheld Trimble Juno SB GPS. A total of 197 dengue cases occurring in 2003 were used in this study. The data were then aggregated to the sub-district level and converted into GIS format. The study also used population and demographic data as well as the boundary of Hulu Langat. To assess the spatial distribution of dengue cases, three spatial statistics methods (Moran's I, average nearest neighbor (ANN), and kernel density estimation) were applied together with spatial analysis in the GIS environment. These three indices were used to analyze the spatial distribution and average distance of dengue incidence and to locate the hot spots of dengue cases. The results indicated that the dengue cases were clustered (p < 0.01) when analyzed using Moran's I, with a z-score of 5.03. The results of the ANN analysis showed that the average nearest neighbor ratio is less than 1, namely 0.518755 (p < 0.0001). From this result, we can conclude that the dengue cases in Hulu Langat district exhibit a clustered pattern. The z-score for dengue incidence within the district is -13.0525 (p < 0.0001). It was also found that significant spatial autocorrelation of dengue incidence occurs at an average distance of 380.81 meters (p < 0.0001). Several locations, especially residential areas, were also identified as hot spots of dengue cases in the district.
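
For readers unfamiliar with the two clustering statistics reported above, the sketch below computes global Moran's I (with inverse-distance weights) and the average nearest neighbour ratio on synthetic point data; it is an illustration, not the authors' GIS workflow.

```python
# Sketch: global Moran's I (inverse-distance weights) and the average nearest
# neighbour (ANN) ratio on synthetic point data with attached case counts.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(50, 2))           # point locations (e.g. case addresses)
cases = rng.poisson(5, size=50).astype(float)    # attribute, e.g. case counts

d = cdist(pts, pts)
w = np.zeros_like(d)
w[d > 0] = 1.0 / d[d > 0]                        # inverse-distance spatial weights
z = cases - cases.mean()
moran_i = (len(cases) / w.sum()) * (w * np.outer(z, z)).sum() / (z ** 2).sum()

nn = np.sort(d + np.eye(len(pts)) * 1e9, axis=1)[:, 0]   # nearest-neighbour distances
area = 10.0 * 10.0                                       # area of the synthetic study window
ann_ratio = nn.mean() / (0.5 / np.sqrt(len(pts) / area))

print(f"Moran's I = {moran_i:.3f}, ANN ratio = {ann_ratio:.3f}")
```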

Keywords: Dengue, geographic information system (GIS), spatial analysis, spatial statistics

12889 Differentiation of Heart Rate Time Series from Electroencephalogram and Noise

Authors: V. I. Thajudin Ahamed, P. Dhanasekaran, Paul Joseph K.

Abstract:

Analysis of heart rate variability (HRV) has become a popular non-invasive tool for assessing the activities of the autonomic nervous system. Most of the methods were borrowed from techniques used for time series analysis. Currently used methods include time domain, frequency domain, geometrical, and fractal methods. A new technique, which searches for pattern repeatability in a time series, is proposed for quantifying heart rate (HR) time series. This set of indices, termed the pattern repeatability measure and the pattern repeatability ratio, is able to distinguish HR data clearly from noise and electroencephalogram (EEG) signals. The results of analysis using these measures give an insight into the fundamental difference between the composition of HR time series and that of EEG and noise.
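
As a point of comparison for the proposed indices, the sketch below implements sample entropy, one of the related regularity measures named in the keywords; the authors' own pattern repeatability measure and ratio are defined in the paper itself.

```python
# Sketch: sample entropy of a time series, a related regularity measure named in
# the keywords (not the authors' pattern repeatability indices).
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        np.fill_diagonal(d, np.inf)          # exclude self-matches
        return (d <= r).sum()
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(np.arange(300) / 10) + rng.normal(0, 0.01, 300)  # HR-like series
noise = rng.normal(0, 1, 300)
print(sample_entropy(rr), sample_entropy(noise))   # the regular series scores lower
```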

Keywords: Approximate entropy, heart rate variability, noise, pattern repeatability, and sample entropy.

12888 Assessment of Occupational Exposure and Individual Radio-Sensitivity in People Subjected to Ionizing Radiation

Authors: Oksana G. Cherednichenko, Anastasia L. Pilyugina, Sergey N. Lukashenko, Elena G. Gubitskaya

Abstract:

The estimation of accumulated radiation doses in people professionally exposed to ionizing radiation was performed using methods of biological (chromosomal aberration frequency in lymphocytes) and physical (radionuclide analysis in urine, whole-body radiation counter, individual thermoluminescent dosimeters) dosimetry. A group of 84 category "A" employees was investigated after their work in the territory of the former Semipalatinsk test site (Kazakhstan). The dose rate in some funnels exceeds 40 μSv/h. After radionuclide determination in urine using radiochemical and whole-body counting (WBC) methods, it was shown that the total effective dose of internal exposure of the personnel did not exceed 0.2 mSv/year, while the acceptable dose limit for staff is 20 mSv/year. The range of external radiation doses measured with individual thermoluminescent dosimeters was 0.3-1.406 µSv. The cytogenetic examination showed that the chromosomal aberration frequency in the staff was 4.27±0.22%, which is significantly higher than in people from the non-polluted settlement of Tausugur (0.87±0.1%) (p ≤ 0.01) and in citizens of Almaty (1.6±0.12%) (p ≤ 0.01). Chromosomal-type aberrations accounted for 2.32±0.16%, of which 0.27±0.06% were dicentrics and centric rings. The cytogenetic analysis of the group radiosensitivity of different types among the "professionals" (age, sex, ethnic group, epidemiological data) revealed no significant differences between the compared values. Using various techniques based on the frequency of dicentrics and centric rings, the average cumulative radiation dose for the group was calculated to be 0.084-0.143 Gy. To perform comparative individual dosimetry using physical and biological methods of dose assessment, calibration curves (including our own) and regression equations based on the general frequency of chromosomal aberrations, obtained after irradiation of blood samples by gamma radiation at a dose rate of 0.1 Gy/min, were used. Herewith, on the assumption of individual variation of the chromosomal aberration frequency (1-10%), the accumulated radiation dose varied between 0 and 0.3 Gy. The main problem in the interpretation of individual dosimetry results comes down to the different reactions of the subjects to irradiation, i.e. radiosensitivity, which dictates the need for a quantitative definition of this individual reaction and its consideration in the calculation of the received radiation dose. The entire examined contingent was assigned to groups based on the received dose and the detected cytogenetic aberrations. Radiosensitive individuals, at the lowest received dose in a year, showed the highest frequency of chromosomal aberrations (5.72%). In contrast, radioresistant individuals showed the lowest frequency of chromosomal aberrations (2.8%). The cohort in our research was distributed according to the criterion of radiosensitivity as follows: radiosensitive (26.2%), medium radiosensitivity (57.1%), radioresistant (16.7%). Herewith, the dispersion for radioresistant individuals is 2.3; for the group with medium radiosensitivity, 3.3; and for the radiosensitive group, 9. These data indicate the highest variation of the characteristic (reaction to radiation) in the group of radiosensitive individuals. People with medium radiosensitivity show a significant long-term correlation (0.66; n=48, β ≥ 0.999) between the dose values defined from the results of the cytogenetic analysis and the external radiation dose obtained with the help of thermoluminescent dosimeters.
Mathematical models based on the type of radiation dose violation according to the professionals' radiosensitivity level are offered.
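
Dose reconstruction from dicentric and centric ring frequencies, as described above, typically relies on a linear-quadratic calibration curve Y = C + alpha*D + beta*D^2. The sketch below inverts such a curve for an observed aberration frequency; the coefficients are purely illustrative, not the calibration curves used in the study.

```python
# Sketch: inverting a linear-quadratic calibration curve Y = C + alpha*D + beta*D^2
# to estimate an absorbed dose D from an observed dicentric + centric ring
# frequency Y. Coefficients are illustrative only.
import math

def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Return the positive root D of beta*D^2 + alpha*D + (c - y) = 0."""
    disc = alpha ** 2 - 4.0 * beta * (c - y)
    if disc < 0:
        raise ValueError("observed yield is below the background level")
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

observed_yield = 0.0027   # 0.27% dicentrics + centric rings, expressed per cell
print(f"estimated dose: {dose_from_yield(observed_yield):.3f} Gy")
```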

Keywords: Biodosimetry, chromosomal aberrations, ionizing radiation, radiosensitivity.

12887 Lexicon-Based Sentiment Analysis for Stock Movement Prediction

Authors: Zane Turner, Kevin Labille, Susan Gauch

Abstract:

Sentiment analysis is a broad and expanding field that aims to extract and classify opinions from textual data. Lexicon-based approaches are based on the use of a sentiment lexicon, i.e., a list of words each mapped to a sentiment score, to rate the sentiment of a text chunk. Our work focuses on predicting stock price change using a sentiment lexicon built from financial conference call logs. We present a method to generate a sentiment lexicon based upon an existing probabilistic approach. By using a domain-specific lexicon, we outperform traditional techniques and demonstrate that domain-specific sentiment lexicons provide higher accuracy than generic sentiment lexicons when predicting stock price change.
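
A minimal sketch of the lexicon-based scoring idea is shown below: the sentiment of a text chunk is the average score of the lexicon words it contains. The tiny lexicon is a placeholder, not the financial lexicon built in the paper.

```python
# Minimal sketch of lexicon-based scoring: average the scores of lexicon words
# found in a text chunk. The tiny lexicon below is a placeholder.
finance_lexicon = {"growth": 0.8, "beat": 0.6, "strong": 0.5,
                   "decline": -0.7, "miss": -0.6, "weak": -0.5}

def score(text: str, lexicon: dict) -> float:
    hits = [lexicon[w] for w in text.lower().split() if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

call_snippet = "revenue growth was strong although margins saw a slight decline"
print(score(call_snippet, finance_lexicon))   # positive overall score
```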

Keywords: Lexicon, sentiment analysis, stock movement prediction, computational finance.

12886 Manifold Analysis by Topologically Constrained Isometric Embedding

Authors: Guy Rosman, Alexander M. Bronstein, Michael M. Bronstein, Ron Kimmel

Abstract:

We present a new algorithm for nonlinear dimensionality reduction that consistently uses global information, and that enables understanding the intrinsic geometry of non-convex manifolds. Compared to methods that consider only local information, our method appears to be more robust to noise. Unlike most methods that incorporate global information, the proposed approach automatically handles non-convexity of the data manifold. We demonstrate the performance of our algorithm and compare it to state-of-the-art methods on synthetic as well as real data.

Keywords: Dimensionality reduction, manifold learning, multidimensional scaling, geodesic distance, boundary detection.

12885 Weka Based Desktop Data Mining as Web Service

Authors: Sujala.D.Shetty, S.Vadivel, Sakshi Vaghella

Abstract:

Data mining is the process of sifting through large volumes of data, analyzing the data from different perspectives, and summarizing it into useful information. One of the widely used desktop applications for data mining is the Weka tool, which is a collection of machine learning algorithms implemented in Java and open sourced under the General Public License (GPL). A web service is a software system designed to support interoperable machine-to-machine interaction over a network using SOAP messages. Unlike a desktop application, a web service is easy to upgrade, deliver, and access, and does not occupy any memory on the system. Keeping in mind the advantages of a web service over a desktop application, in this paper we demonstrate how this Java-based desktop data mining application can be implemented as a web service to support data mining across the internet.

Keywords: desktop application, Weka mining, web service

12884 Analysis of Air Quality in the Outdoor Environment of the City of Messina by an Application of the Pollution Index Method

Authors: G. Cannistraro, L. Ponterio

Abstract:

This paper reports an analysis of the outdoor air pollution of the urban centre of the city of Messina. The variations of the concentrations of the most critical pollutants (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions have been performed to represent the relations among the pollutants, and the differences between pollutant concentrations on weekends and weekdays were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied in the urban centre of the city. This index is based on the weighted mean of the concentrations of the most detrimental air pollutants with respect to their limit values for the protection of human health. The analyzed data on the polluting substances were collected by the Assessorship of the Environment of the Regional Province of Messina in the year 2004. A statistical analysis of the air quality index trends is also reported.
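
A hedged sketch of an index of this kind, a weighted mean of pollutant concentrations normalized by their health-protection limit values, is shown below; the weights, limits, and concentrations are illustrative and are not the values used in the paper.

```python
# Sketch: a pollution index built as a weighted mean of pollutant concentrations
# normalized by their limit values for the protection of human health.
# Weights, limit values, and concentrations are illustrative only.
limits = {"PM10": 50.0, "O3": 120.0, "CO": 10.0, "C6H6": 5.0}    # assumed limit values
weights = {"PM10": 0.35, "O3": 0.30, "CO": 0.15, "C6H6": 0.20}   # assumed weights (sum to 1)

def pollution_index(concentrations: dict) -> float:
    return sum(weights[p] * concentrations[p] / limits[p] for p in concentrations)

sample_day = {"PM10": 42.0, "O3": 95.0, "CO": 3.2, "C6H6": 2.1}
print(f"pollution index = {pollution_index(sample_day):.2f}")    # > 1 exceeds the weighted limits
```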

Keywords: Environmental pollution, Pollutant levels, Linear regression, Air Quality Index, Statistical analysis.
