Search results for: Continuous Data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8050

7690 Power Saving System in Green Data Center

Authors: Joon-young Jung, Dong-oh Kang, Chang-seok Bae

Abstract:

Power consumption in data centers is increasing rapidly because the number of data centers is growing and their scale is becoming larger. Reducing power consumption in the data center is therefore a key research topic. The peak power of a typical server is around 250 watts. When a server is idle, it continues to use around 60% of the power consumed when in use, though vendors are putting effort into reducing this "idle" power load. Servers tend to run at only around a 5% to 20% utilization rate, partly because of response-time concerns, and on average about 10% of the servers in a data center are unused. For these reasons, we propose a dynamic power management system to reduce power consumption in a green data center. Experimental results show that power consumption at idle time is reduced by about 55%.
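
As a back-of-envelope illustration of the figures quoted above (250 W peak, ~60% idle draw), the sketch below shows why putting idle servers to sleep saves power; the 10 W sleep-state draw and the server count are illustrative assumptions, not the authors' system.

```python
PEAK_W = 250.0              # typical server peak draw, from the abstract
IDLE_W = 0.60 * PEAK_W      # ~60% of active power even when idle
SLEEP_W = 10.0              # assumed draw in a managed low-power state

def idle_power(n_idle: int, managed: bool) -> float:
    """Total draw of n_idle idle servers, with or without power management."""
    return n_idle * (SLEEP_W if managed else IDLE_W)

print(idle_power(100, managed=False), "W unmanaged")  # 15000.0 W
print(idle_power(100, managed=True), "W managed")     # 1000.0 W
```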

Keywords: Data Center, Green IT, Management Server, Power Saving.

7689 Evaluation of the IMERG Product Performance at Estimating the Rainfall Properties in a Semi-Arid Region of Mexico

Authors: Eric Muñoz de la Torre, Julián González Trinidad, Efrén González Ramírez

Abstract:

Rain varies greatly in its duration, intensity, and spatial coverage, so it is important to have sub-daily rainfall data for various applications, including risk prevention; however, ground measurements are limited by the low and irregular density of rain gauges. An alternative to this problem is Satellite Precipitation Products (SPPs), such as IMERG, that use passive microwave and infrared sensors to estimate rainfall; however, these SPPs have to be validated before their application. The aim of this study is to evaluate the performance of the IMERG (Integrated Multi-satellitE Retrievals for Global Precipitation Measurement) final run V06B SPP in a semi-arid region of Mexico, using sub-daily data from four rain gauges for October 2019 and June to September 2021 and applying the Minimum Inter-event Time (MIT) criterion to separate unique rain events with a dry period of 10 hrs, for the purpose of evaluating the rainfall properties (depth, duration, and intensity). Point-to-pixel analysis and continuous, categorical, and volumetric statistical metrics were used. Results show that IMERG is capable of estimating the rainfall depth with a slight overestimation but is unable to identify the real duration and intensity of the rain events, showing moderate overestimations and underestimations, respectively. The study zone presented 80 to 85% convective rain events, the rest being stratiform rain events, classified by the depth magnitude variation of IMERG pixels and rain gauges. IMERG showed poorer performance at detecting the former but performed well at estimating stratiform rain events, which originate from cold fronts.
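
A minimal sketch of the MIT event-separation step described above, assuming rainfall records arrive as chronologically sorted (timestamp, depth-in-mm) tuples; the 10-hour threshold is from the abstract, while the data layout and the one-hour minimum duration are assumptions.

```python
from datetime import datetime, timedelta

MIT = timedelta(hours=10)  # dry-period threshold used in the study

def split_events(records):
    """records: chronologically sorted (datetime, rain_mm > 0) tuples."""
    events, current = [], [records[0]]
    for prev, cur in zip(records, records[1:]):
        if cur[0] - prev[0] >= MIT:    # dry gap long enough: close the event
            events.append(current)
            current = []
        current.append(cur)
    events.append(current)
    return events

def event_properties(event):
    depth = sum(mm for _, mm in event)                                 # mm
    hours = max((event[-1][0] - event[0][0]).total_seconds() / 3600.0, 1.0)
    return depth, hours, depth / hours          # depth, duration, mean intensity
```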

Keywords: IMERG, rainfall, rain gauge, remote sensing, statistical evaluation.

7688 Typical Day Prediction Model for Output Power and Energy Efficiency of a Grid-Connected Solar Photovoltaic System

Authors: Yan Su, L. C. Chan

Abstract:

A novel typical day prediction model has been built and validated using measured data from a grid-connected solar photovoltaic (PV) system in Macau. Unlike the conventional statistical method used in previous studies of PV systems, which obtains results by averaging nearby continuous points, the present typical day statistical method obtains the value at every minute of a typical day by averaging discontinuous points at the same minute on different days. This typical day statistical method based on discontinuous point averaging makes it possible to obtain Gaussian-shape dynamical distributions for solar irradiance and output power in a yearly or monthly typical day. Based on the yearly typical day statistical analysis, the maximum possible accumulated output energy in a year under on-site climate conditions and the corresponding optimal PV system running time are obtained. Periodic Gaussian-shape prediction models for solar irradiance, output energy, and system energy efficiency have been built, and their coefficients have been determined from the yearly, maximum, and minimum monthly typical day Gaussian distribution parameters, which are obtained by iterating for the minimum Root Mean Squared Deviation (RMSD). With the present model, the dynamical effects due to time differences within a day are kept, while the day-to-day uncertainty due to changing weather is smoothed but still included. The periodic Gaussian-shape correlations for solar irradiance, output power, and system energy efficiency compare favorably with data from the PV system in Macau and prove to be an improvement over previous models.
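
A minimal sketch of the discontinuous-point averaging and the Gaussian-shape model described above, assuming the measurements are arranged as a days-by-minutes array; the function names and array layout are illustrative, not the authors' code.

```python
import numpy as np

def typical_day(daily_series: np.ndarray) -> np.ndarray:
    """daily_series: shape (n_days, 1440). The typical-day value at minute m
    averages the same minute across days (discontinuous points), rather than
    neighbouring minutes within one day."""
    return np.nanmean(daily_series, axis=0)

def gaussian_model(t_min, amp, t0, sigma):
    """Gaussian-shape daily profile; coefficients are found by iterating for
    the minimum RMSD against the typical-day curve."""
    return amp * np.exp(-((t_min - t0) ** 2) / (2.0 * sigma ** 2))

def rmsd(model, data):
    return float(np.sqrt(np.nanmean((model - data) ** 2)))
```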

Keywords: Grid Connected, RMSD, Solar PV System, Typical Day.

7687 Effect of Modification and Expansion on Emergence of Cooperation in Demographic Multi-Level Donor-Recipient Game

Authors: Tsuneyuki Namekata, Yoko Namekata

Abstract:

It is known that the mean investment evolves from a very low initial value to some high level in the Continuous Prisoner's Dilemma. We examine how the cooperation level evolves from a low initial level to a high level in our Demographic Multi-level Donor-Recipient situation. In the Multi-level Donor-Recipient game, one player is randomly selected as a Donor and the other as a Recipient. The Donor has multiple cooperative moves and one defective move. A cooperative move means the Donor pays some cost for the Recipient to receive some benefit; the more cooperative the move the Donor takes, the higher the cost the Donor pays and the higher the benefit the Recipient receives. The defective move has no effect on either player. Two consecutive Multi-level Donor-Recipient games, one as a Donor and the other as a Recipient, can be viewed as a discrete version of the Continuous Prisoner's Dilemma. In the Demographic Multi-level Donor-Recipient game, players are initially distributed spatially. In each period, players play multiple Multi-level Donor-Recipient games against other players. A player leaves offspring if possible and dies when his accumulated payoff becomes negative or his lifespan expires. Cooperative moves are necessary for the survival of the whole population. Initially, only a low-level cooperative move and the defective move are available in players' strategies. A player may modify and expand his strategy through his recent experiences or practices. We distinguish several player types with respect to modification and expansion. We show, by agent-based simulation, that introducing only the modification increases the emergence rate of cooperation, that introducing both the modification and the expansion increases it further, and that a high level of cooperation does emerge in our Demographic Multi-level Donor-Recipient game.
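
A toy sketch of a single Multi-level Donor-Recipient interaction as described above; the per-level cost and benefit values are illustrative assumptions (benefit exceeding cost, so cooperation pays off socially).

```python
COST_PER_LEVEL = 1.0     # assumed cost to the Donor per cooperation level
BENEFIT_PER_LEVEL = 3.0  # assumed benefit to the Recipient per level

def play(donor_payoff: float, recipient_payoff: float, level: int):
    """One game: level 0 is the defective move (no effect on either player);
    higher levels cost the Donor more and benefit the Recipient more."""
    return (donor_payoff - COST_PER_LEVEL * level,
            recipient_payoff + BENEFIT_PER_LEVEL * level)
```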

Keywords: Agent-based simulation, donor-recipient game, emergence of cooperation, spatial structure, TFT, TF2T.

7686 A Mesh Free Moving Node Method To Analyze Flow Through Spirals of Orbiting Scroll Pump

Authors: I. Banerjee, A. K. Mahendra, T. K. Bera, B. G. Chandresh

Abstract:

The scroll pump belongs to the category of positive displacement pumps and can be used for continuous pumping of gases at low pressure, apart from general vacuum applications. The shape of the volume occupied by the gas moves and deforms continuously as the spiral orbits. To capture flow features in such a domain, where mesh deformation varies with time in a complicated manner, a meshless solver was found to be very useful. The Least Squares Kinetic Upwind Method (LSKUM) is a kinetic-theory-based meshfree Euler solver working on an arbitrary distribution of points, in which upwinding is enforced at the molecular level through the kinetic flux vector splitting (KFVS) scheme. In the present study we extend LSKUM to moving-node viscous flow applications. This new code for moving-node viscous flow, LSKUM-NS-MN, is validated on the standard pitching airfoil test case. Simulations of flow through the scroll pump using the LSKUM-NS-MN code agree well with the experimental pumping speed data.
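
The least-squares derivative evaluation at the heart of meshfree solvers of this family can be sketched as below: at each node, spatial derivatives are estimated from a least-squares fit over the node's cloud of neighbouring points. This is a first-order illustration, not the LSKUM-NS-MN implementation.

```python
import numpy as np

def ls_gradient(x0, y0, f0, xs, ys, fs):
    """First-order least-squares estimate of (df/dx, df/dy) at node (x0, y0)
    from function values fs at neighbouring points (xs, ys)."""
    A = np.column_stack([np.asarray(xs) - x0, np.asarray(ys) - y0])  # offsets
    b = np.asarray(fs) - f0                                          # differences
    g, *_ = np.linalg.lstsq(A, b, rcond=None)
    return g  # approximate [df/dx, df/dy]
```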

Keywords: Least Squares, Moving node, Pitching, Spirals.

7685 MATLAB-Based Graphical User Interface (GUI) for Data Mining as a Tool for Environment Management

Authors: M. Awawdeh, A. Fedi

Abstract:

The application of data mining to environmental monitoring has become crucial for a number of tasks related to emergency management. Over recent years, many tools have been developed for decision support systems (DSS) for emergency management. In this article a graphical user interface (GUI) for an environmental monitoring system is presented. This interface allows accomplishing (i) data collection and observation and (ii) extraction for data mining. This tool may be the basis for future development along the lines of the open source software paradigm.

Keywords: Data Mining, Environmental data, Mathematical Models, Matlab Graphical User Interface.

7684 GeoSEMA: A Modelling Platform, Emerging "GeoSpatial-based Evolutionary and Mobile Agents"

Authors: Mohamed Dbouk, Ihab Sbeity

Abstract:

Spatial and mobile computing are evolving. This paper describes a smart modeling platform called "GeoSEMA". This approach tends to model multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, other dimensions may characterize spatial agents, e.g. discrete-continuous time and agent behaviors. GeoSEMA is seen as a devoted design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive, and mobile space constituents, and a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.

Keywords: Location-Trajectory management, GIS, Mobile-Moving Objects/Agents, Multipurpose/Spatiotemporal data, Multi-Agent Systems.

7683 Impact of ISO 9000 on Time-based Performance: An Event Study

Authors: Chris K. Y. Lo, Andy C. L. Yeung, T. C. Edwin Cheng

Abstract:

ISO 9000 is the most popular and widely adopted meta-standard for quality and operational improvements. However, only limited empirical research has examined the impact of ISO 9000 on operational performance based on objective and longitudinal data. To reveal any causal relationship between the adoption of ISO 9000 and operational performance, we examined the timing and magnitude of change in time-based performance as a result of ISO 9000 adoption. We analyzed the changes in operating cycle, inventory days, and accounts receivable days before and after the implementation of ISO 9000 in 695 publicly listed manufacturing firms. We found that ISO 9000 certified firms shortened their operating cycle time by 5.28 days one year after the implementation of ISO 9000. In the long run (3 years after certification), certified firms showed continuous improvement in time-based efficiency and experienced an operating cycle time 11 days shorter than that of non-certified firms, an average improvement of 6.5% in operating cycle time. Both inventory days and accounts receivable days showed similar significant improvements after the implementation of ISO 9000.
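
For reference, the time-based metrics compared in this event study can be computed from standard financial definitions, sketched below under the usual 365-day convention; this is an assumption about the construction, not the authors' exact procedure.

```python
def inventory_days(avg_inventory: float, cogs: float) -> float:
    return 365.0 * avg_inventory / cogs            # days inventory outstanding

def receivable_days(avg_receivables: float, revenue: float) -> float:
    return 365.0 * avg_receivables / revenue       # days sales outstanding

def operating_cycle(avg_inventory, cogs, avg_receivables, revenue):
    """Operating cycle = inventory days + accounts receivable days."""
    return (inventory_days(avg_inventory, cogs)
            + receivable_days(avg_receivables, revenue))
```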

Keywords: ISO 9000, Operating Cycle, Time-based efficiency.

7682 Principal Component Analysis using Singular Value Decomposition of Microarray Data

Authors: Dong Hoon Lim

Abstract:

A series of microarray experiments produces observations of differential expression for thousands of genes across multiple conditions. Principal component analysis (PCA) has been widely used in multivariate data analysis to reduce the dimensionality of the data in order to simplify subsequent analysis and allow for summarization of the data in a parsimonious manner. PCA, which can be implemented via a singular value decomposition (SVD), is useful for the analysis of microarray data. As an application of PCA using SVD, we use the DNA microarray data for the small round blue cell tumors (SRBCT) of childhood by Khan et al. (2001). To decide the number of components that account for a sufficient amount of information, we draw a scree plot. The biplot, a graphical display associated with PCA, reveals important features, exhibiting the relationships between variables and also the relationships of variables with observations.
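
A minimal sketch of PCA via SVD as applied to an expression matrix, assuming rows are samples and columns are genes; the orientation and centering choices are assumptions, and the SRBCT data itself is not reproduced here.

```python
import numpy as np

def pca_svd(X: np.ndarray):
    """X: rows are samples, columns are genes. Returns PC scores, the
    variance ratios used for the scree plot, and the loadings (biplot axes)."""
    Xc = X - X.mean(axis=0)                      # center each gene
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U * s                               # principal component scores
    var_ratio = s ** 2 / np.sum(s ** 2)          # scree plot heights
    return scores, var_ratio, Vt
```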

Keywords: Principal component analysis, singular value decomposition, microarray data, SRBCT.

7681 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring

Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek

Abstract:

In the semiconductor manufacturing process, large amounts of data are collected from various sensors in multiple facilities. The collected sensor data have several different characteristics due to variables such as product type, former processes, and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data to detect out-of-control states of processes. When data with these different characteristics are used as inputs to SQC, the variation of the data increases, wide control limits are required, and the performance in detecting out-of-control states decreases. Therefore, it is necessary to separate similar data groups from mixed data for more accurate process control. In this paper, we propose a regression tree using a split algorithm based on the Pearson distribution system to handle non-normal distributions in a parametric method. The regression tree finds similar properties of data across different variables. Experiments using real semiconductor manufacturing process data show improved fault detection ability.

Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.

7680 A Simulated Design and Analysis of a Solar Thermal Parabolic Trough Concentrator

Authors: Fauziah Sulaiman, Nurhayati Abdullah, Balbir Singh Mahinder Singh

Abstract:

In recent years Malaysia has included renewable energy as an alternative fuel to help diversify the country's energy reliance on oil, natural gas, coal, and hydropower, with biomass and solar energy gaining priority. The scope of this paper is to look at the design procedures and analysis of a solar thermal parabolic trough concentrator by simulation, utilizing meteorological data from several parts of Malaysia. Parameters including the aperture area, the diameter of the receiver, and the working fluid may be varied to optimize the design. The aperture area is determined by considering the width and the length of the concentrator, whereas the geometric concentration ratio (CR) is obtained by considering the width of the concentrator and the diameter of the receiver. Three types of working fluid are investigated. Theoretically, concentration ratios can be very high, in the range of 10 to 40,000, depending on the optical elements used and continuous tracking of the sun. However, a thorough analysis is essential, as discussed in this paper: optical precision and thermal analysis must be carried out to evaluate the performance of the parabolic trough concentrator, as the theoretical CR is not the only factor that should be considered.
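
One common definition of the geometric concentration ratio for a trough with a cylindrical receiver is sketched below (aperture width over receiver circumference); whether the authors subtract the shaded receiver width is not stated, so treat the exact form as an assumption.

```python
import math

def concentration_ratio(aperture_width_m: float, receiver_diameter_m: float) -> float:
    """Geometric CR = aperture width / receiver circumference (cylindrical absorber)."""
    return aperture_width_m / (math.pi * receiver_diameter_m)

print(concentration_ratio(2.5, 0.05))  # 2.5 m wide trough, 5 cm absorber -> ~15.9
```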

Keywords: Parabolic trough concentrator, Concentration ratio, Intercept factor, Efficiency.

7679 Speech Data Compression using Vector Quantization

Authors: H. B. Kekre, Tanuja K. Sarode

Abstract:

Transforms, which are lossy algorithms, are mostly used for speech data compression. Such algorithms are tolerable for speech data compression since the loss in quality is not perceived by the human ear. However, vector quantization (VQ) has the potential to give more data compression while maintaining the same quality. In this paper we propose a speech data compression algorithm using the vector quantization technique. We have used the VQ algorithms LBG, KPE, and FCG. The results table shows the computational complexity of these three algorithms. We also introduce a new performance parameter, Average Fractional Change in Speech Sample (AFCSS). Our FCG algorithm gives far better performance considering mean absolute error, AFCSS, and complexity as compared to the others.
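
The abstract names AFCSS but does not reproduce its formula; one plausible reading of "average fractional change in speech sample" is sketched below as an assumption, not the authors' definition.

```python
import numpy as np

def afcss(original: np.ndarray, reconstructed: np.ndarray) -> float:
    """Assumed reading of AFCSS: mean of |x - x_hat| / |x| over nonzero samples."""
    mask = original != 0                 # skip silent samples to avoid divide-by-zero
    frac = np.abs(original[mask] - reconstructed[mask]) / np.abs(original[mask])
    return float(frac.mean())
```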

Keywords: Vector Quantization, Data Compression, Encoding, Speech Coding.

7678 Ontology and CDSS Based Intelligent Health Data Management in Health Care Server

Authors: Eun-Jung Ko, Hyung-Jik Lee, Jeun-Woo Lee

Abstract:

In a ubiquitous healthcare environment, a user's health data are transferred to a remote healthcare server by the user's wearable system or mobile phone. These collected health data should be managed and analyzed in the healthcare server so that the caregiver or the user can monitor the user's physiological state. In this paper, we designed and developed an intelligent healthcare server that manages the user's health data using a CDSS and an ontology. Our system can analyze the user's health data semantically using the CDSS and ontology, and report the results derived from the user's raw physiological data to the user and caregiver.

Keywords: u-healthcare, CDSS, healthcare server, health data, ontology.

7677 A Genetic Algorithm for Clustering on Image Data

Authors: Qin Ding, Jim Gasvoda

Abstract:

Clustering is the process of subdividing an input data set into a desired number of subgroups so that members of the same subgroup are similar and members of different subgroups have diverse properties. Many heuristic algorithms have been applied to the clustering problem, which is known to be NP-hard. Genetic algorithms have been used in a wide variety of fields to perform clustering; however, the technique normally has a long running time with respect to input set size. This paper proposes an efficient genetic algorithm for clustering on very large data sets, especially image data sets. The genetic algorithm uses the most time-efficient techniques along with preprocessing of the input data set. We test our algorithm on both artificial and real image data sets, both of large size. The experimental results show that our algorithm outperforms the k-means algorithm in terms of running time as well as the quality of the clustering.
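
An illustrative genetic-algorithm clustering loop in the spirit described above, with a chromosome encoding k cluster centers and fitness equal to the negative within-cluster sum of squared distances (the k-means objective the paper compares against); the operators and parameters are assumptions, not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(centers, X):
    """Negative within-cluster sum of squared distances (k-means objective)."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return -np.sum(d.min(axis=1) ** 2)

def evolve(X, k=3, pop=20, gens=50, sigma=0.1):
    """Chromosome = k centers sampled from the data; mutate-and-select loop."""
    P = [X[rng.choice(len(X), size=k, replace=False)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=lambda c: -fitness(c, X))            # best chromosomes first
        elite = P[: pop // 2]
        P = elite + [c + rng.normal(0, sigma, c.shape) for c in elite]  # mutate
    P.sort(key=lambda c: -fitness(c, X))
    return P[0]                                         # best set of k centers
```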

Keywords: Clustering, data mining, genetic algorithm, image data.

7676 A Holistic Framework for Unifying Data Security and Management in Modern Enterprises

Authors: Ashly Joseph

Abstract:

Modern businesses struggle to secure and manage their data properly as the volume and complexity of that data expand exponentially. Through the use of a multi-layered defense strategy, a centralized management platform, and cutting-edge technologies such as AI, this paper presents a comprehensive framework for integrating data security and management. The constraints of current data protection and management strategies, technological advancements, and the evolving threat landscape are all examined. The paper suggests best practices for implementing integrated data security and governance models, placing an emphasis on ongoing adaptation. The advantages include a strengthened security posture, simpler procedures, lower costs, and reduced complexity. Challenges such as skill shortages, antiquated systems, and cultural obstacles are also examined. Security executives and Chief Information Security Officers are given practical advice on how to evaluate, plan, and put in place strong data-centric security and management capabilities. The goal of the paper is to provide a thorough study of the data security and management landscape and to arm contemporary businesses with the knowledge they need to protect their data assets proactively.

Keywords: Data security, security management, cloud computing, cybersecurity, data governance, security architecture, data management.

7675 Post Mining: Discovering Valid Rules from Different-Sized Data Sources

Authors: R. Nedunchezhian, K. Anbumani

Abstract:

A big organization may have multiple branches spread across different locations. Processing data from these branches becomes a huge task when innumerable transactions take place. Also, branches may be reluctant to forward their data for centralized processing but are ready to pass on their association rules. Local mining, however, may generate a large number of rules, and it is not practically possible for all local data sources to be of the same size. A model is proposed for discovering valid rules from different-sized data sources, where the valid rules are high-weighted rules. These rules can be obtained from the high-frequency rules generated at each of the data sources. A data source selection procedure is considered in order to synthesize rules efficiently. Support equalization is another proposed method, which focuses on eliminating low-frequency rules at the local sites themselves, thus reducing the number of rules by a significant amount.
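
A minimal sketch of one plausible synthesis scheme for the setting described above: each source votes with a weight proportional to its transaction count, and a rule is valid when its weighted support clears a threshold. The paper's exact weighting is not reproduced here.

```python
def synthesize(sources):
    """sources: list of (n_transactions, {rule: local_support}) pairs.
    A rule's global support is the source-size-weighted mean of its
    local supports."""
    total = sum(n for n, _ in sources)
    global_support = {}
    for n, rules in sources:
        w = n / total                         # larger sources carry more weight
        for rule, sup in rules.items():
            global_support[rule] = global_support.get(rule, 0.0) + w * sup
    return global_support

def valid_rules(sources, min_weighted_support=0.1):
    return {r: s for r, s in synthesize(sources).items()
            if s >= min_weighted_support}
```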

Keywords: Association rules, multiple data stores, synthesizing, valid rules.

7674 RFID-ready Master Data Management for Reverse Logistics

Authors: Jincheol Han, Hyunsun Ju, Jonghoon Chun

Abstract:

Sharing consistent and correct master data among disparate applications in a reverse-logistics chain has long been recognized as an intricate problem. Although a master data management (MDM) system can surely assume that responsibility, applications that need to cooperate with it must comply with the proprietary query interfaces provided by the specific MDM system. In this paper, we present an RFID-ready MDM system which makes master data readily available to any participating application in a reverse-logistics chain. We propose an RFID-wrapper as part of our MDM; it acts as a gateway between any data retrieval request and the query interfaces that process it. With the RFID-wrapper, any participating application in a reverse-logistics chain can easily retrieve master data in a way that is analogous to the retrieval of any other RFID-based logistics transactional data.

Keywords: Reverse Logistics, Master Data Management, RFID.

7673 Dynamic Models versus Frailty Models for Recurrent Event Data

Authors: Entisar A. Elgmati

Abstract:

Recurrent event data is a special type of multivariate survival data. Dynamic and frailty models are among the approaches that deal with this kind of data. A comparison between these two models is studied using the empirical standard deviation of the standardized martingale residual processes as a way of assessing the fit of the two models, based on the Aalen additive regression model. We found that both approaches take heterogeneity into account and produce residual standard deviations close to each other, both in the simulation study and in the real data set.

Keywords: Dynamic, frailty, misspecification, recurrent events.

7672 Assessing and Evaluating the Course Outcomes of Control Systems Course Mapping Complex Engineering Problem Solving Issues and Associated Knowledge Profiles with the Program Outcomes

Authors: Muhibul Haque Bhuyan

Abstract:

In the current context, engineering program educators need to think about how undergraduate engineering students can develop concepts and complex engineering problem-solving skills through various complex engineering activities in their courses. However, most educators face challenges in assessing and evaluating these skills. In this study, detailed assessment and evaluation methods for an undergraduate Electrical and Electronic Engineering (EEE) program are stated using the Outcome-Based Education (OBE) approach. For this purpose, a final-year course titled Control Systems has been selected. The assessment and evaluation approach, course contents, course objectives, course outcomes (COs), and their mapping to the program outcomes (POs) with complex engineering problems and activities via the knowledge profiles, performance indicators, assessment rubrics, CO and PO attainment data, and other statistics are reported for a student cohort of the Control Systems course registered by students of the BSc in EEE program in the Spring 2021 semester at the EEE Department of Southeast University (SEU). It is found that the target benchmark was achieved by the students of the course. Several recommendations for the continuous quality improvement (CQI) process are also provided.
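
A typical CO attainment computation in an OBE setting can be sketched as below; the benchmark threshold is an illustrative assumption, not SEU's published rubric.

```python
def co_attainment(scores, benchmark=0.6):
    """scores: per-student fractional marks on the items mapped to one CO.
    Returns the percentage of students meeting the benchmark."""
    met = sum(1 for s in scores if s >= benchmark)
    return 100.0 * met / len(scores)

print(co_attainment([0.8, 0.55, 0.7, 0.9, 0.4]))  # -> 60.0 (% attaining the CO)
```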

Keywords: Complex engineering problem, knowledge profiles, OBE, control systems course, COs, PIs, POs, assessment rubrics.

7671 The Effects of Electrical Muscle Stimulation (EMS) towards Male Skeletal Muscle Mass

Authors: Mohd Faridz Ahmad, Amirul Hakim Hasbullah

Abstract:

Electrical Muscle Stimulation (EMS) has been introduced and has gained increasing global attention for its usefulness. Continuous application of EMS may lead to an increase in muscle mass and, indirectly, an increase in strength. This approach can be used as an alternative to help people, especially those living a sedentary lifestyle, improve their muscle activity without having to go through heavy workout sessions. Therefore, this study investigated the effectiveness of a 5-week EMS training program intervention on male body composition. It was a quasi-experimental design, held at the Impulse Studio Bangsar, which examined the effects of EMS training on skeletal muscle mass among the subjects. Fifteen subjects (n = 15) were selected to assist in this study. The demographic data showed that the average age of the subjects was 43.07 ± 9.90 years, height 173.4 ± 9.09 cm, and weight 85.79 ± 18.07 kg. Results showed that there was a significant difference in skeletal muscle mass (p = 0.01 < 0.05), upper body (p = 0.01 < 0.05), and lower body (p = 0.00 < 0.05). Therefore, the null hypothesis is rejected. In conclusion, the application of EMS can increase muscle size and strength. This method has been proven able to improve athlete strength and thus may be implemented in the sports science area of knowledge.

Keywords: Body composition, EMS, skeletal muscle mass, strength.

7670 Issues and Architecture for Supporting Data Warehouse Queries in Web Portals

Authors: Minsoo Lee, Yoon-kyung Lee, Hyejung Yoon, Soo-kyung Song, Sujeong Cheong

Abstract:

Data warehousing tools have become very popular, and many of them have now moved to Web-based user interfaces to make the tools easier to access and use. The next step is to enable these tools to be used within a portal framework. The portal framework consists of pages with several small windows that contain individual data warehouse query results. There are several issues to consider when designing the architecture for a portal-enabled data warehouse query tool, some of which need special techniques to overcome the limitations imposed by the nature of data warehouse queries. Issues such as single sign-on, query result caching and sharing, customization, scheduling, and authorization need to be considered. This paper discusses these issues and suggests an architecture to support data warehouse queries within Web portal frameworks.
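
A minimal sketch of the query-result caching and sharing idea: identical warehouse queries issued from different portal windows share one cached result until a time-to-live expires. The class and TTL value are illustrative assumptions, not the paper's architecture.

```python
import time

class QueryCache:
    """Share one result per distinct query string across portal windows."""

    def __init__(self, ttl_s: float = 300.0):
        self.ttl, self.store = ttl_s, {}

    def get(self, sql, run):
        """Return cached rows for sql, or execute run(sql) and cache them."""
        hit = self.store.get(sql)
        if hit is not None and time.time() - hit[0] < self.ttl:
            return hit[1]                       # fresh cache hit, no re-query
        rows = run(sql)
        self.store[sql] = (time.time(), rows)
        return rows
```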

Keywords: Data Warehousing tools, data warehousing queries, web portal frameworks.

7669 A Probabilistic Reinforcement-Based Approach to Conceptualization

Authors: Hadi Firouzi, Majid Nili Ahmadabadi, Babak N. Araabi

Abstract:

Conceptualization strengthens intelligent systems in generalization skill, effective knowledge representation, real-time inference, and managing uncertain and indefinite situations, in addition to facilitating knowledge communication for learning agents situated in the real world. Concept learning introduces a way of abstraction by which the continuous state space is formed into entities called concepts, which are connected to the action space and thus illustrate, in some manner, the complex action space. Of the computational concept learning approaches, action-based conceptualization is favored because of its simplicity and its mirror neuron foundations in neuroscience. In this paper, a new biologically inspired concept learning approach based on a probabilistic framework is proposed. This approach exploits and extends the mirror neuron's role in conceptualization for a reinforcement learning agent in nondeterministic environments. In the proposed method, instead of building a huge numerical knowledge base, the concepts are learned gradually from rewards through interaction with the environment. Moreover, the probabilistic formation of the concepts is employed to deal with the uncertain and dynamic nature of real problems, in addition to providing the ability to generalize. These characteristics as a whole distinguish the proposed learning algorithm from both a pure classification algorithm and typical reinforcement learning. Simulation results show advantages of the proposed framework in terms of convergence speed as well as generalization and asymptotic behavior, because both successful and failed attempts are utilized through received rewards. Experimental results, on the other hand, show the applicability and effectiveness of the proposed method in continuous and noisy environments for a real robotic task, such as maze navigation, as well as the benefits of implementing an incremental learning scenario in artificial agents.

Keywords: Concept learning, probabilistic decision making, reinforcement learning.

7668 Gas Lift Optimization to Improve Well Performance

Authors: Mohamed A. G. H. Abdalsadig, Amir Nourian, G. G. Nasr, Meisam Babaie

Abstract:

Gas lift optimization is becoming more important nowadays in the petroleum industry. A proper lift optimization can reduce the operating cost, increase the net present value (NPV), and maximize recovery from the asset. A widely accepted definition of gas lift optimization is to obtain the maximum output under specified operating conditions. Gas lift, a costly and indispensable means to recover oil from deep reservoirs, entails solving gas lift optimization problems. Gas lift optimization is a continuous process, and there are two levels of production optimization. Total field optimization involves optimizing the surface facilities and the injection rate, which can be achieved with standard software tools. Well-level optimization can be achieved by optimizing well parameters such as the point of injection, injection rate, and injection pressure. All these aspects have been investigated and are presented in this study using experimental data and the PROSPER simulation program. The results show that the wellhead pressure has a large influence on gas lift performance, and also prove that a smart gas lift valve can be used to improve gas lift performance by controlling gas injection downhole. Obtaining the optimum gas injection rate is important because excessive gas injection reduces the production rate and consequently increases the operating cost.
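
A toy sketch of well-level optimization as described: scan candidate injection rates and keep the one maximizing production value net of gas cost. The response curve, prices, and function names are illustrative stand-ins for simulator output (e.g. from PROSPER), not the authors' model.

```python
def optimal_injection(rates, production_of, oil_value=70.0, gas_cost=5.0):
    """Pick the injection rate maximizing production value net of gas cost.
    production_of(r) stands in for a simulator-derived lift response curve."""
    return max(rates, key=lambda r: oil_value * production_of(r) - gas_cost * r)

# Made-up concave response: beyond ~48 rate units, extra gas hurts net value.
best = optimal_injection(range(0, 101), lambda r: 100.0 + 2.0 * r - 0.02 * r * r)
print(best)
```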

Keywords: Optimization, production rate, reservoir pressure effect, gas injection rate effect, gas injection pressure.

7667 Data Mining Using Learning Automata

Authors: M. R. Aghaebrahimi, S. H. Zahiri, M. Amiri

Abstract:

In this paper a data miner based on learning automata, called LA-miner, is proposed. The LA-miner extracts classification rules from data sets automatically. The proposed algorithm is established on function optimization using learning automata. The experimental results on three benchmarks indicate that the performance of the proposed LA-miner is comparable with (and sometimes better than) that of Ant-miner (a data miner based on the ant colony optimization algorithm) and CN2 (a well-known data mining algorithm for classification).
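
For context, the classical linear reward-inaction (L_RI) update that underlies learning-automata methods is sketched below; how LA-miner maps actions to rule terms is not reproduced here, so this is background, not the paper's algorithm.

```python
import random

def lri_step(p, chosen, rewarded, a=0.1):
    """Linear reward-inaction: on reward, shift probability toward the chosen
    action; on penalty, leave the probabilities unchanged."""
    if rewarded:
        p = [pi + a * (1 - pi) if i == chosen else (1 - a) * pi
             for i, pi in enumerate(p)]
    return p

p = [0.25, 0.25, 0.25, 0.25]               # four candidate actions
choice = random.choices(range(4), weights=p)[0]
p = lri_step(p, choice, rewarded=True)      # probabilities still sum to 1
```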

Keywords: Data mining, Learning automata, Classification rules, Knowledge discovery.

7666 Annual Changes in Some Qualitative Parameters of Groundwater in Shirvan Plain North East of Iran

Authors: Hadi Ghorbani, Samira Mohammadi Sadabad

Abstract:

Shirvan is located on a plain in Northern Khorasan province, north east of Iran, and has a semiarid to temperate climate. Annual changes in some qualitative parameters of groundwater, such as electrical conductivity, total dissolved solids, and chloride concentration, which have increased over ten continuous years, were investigated. Fourteen groundwater sources, including deep as well as semi-deep wells, were sampled and analyzed using standard methods. The trends in the obtained data over these years were analyzed, and the effects of different factors on the changes in electrical conductivity, chloride concentration, and total dissolved solids were clarified. The results showed that some qualitative parameters increased over the 10-year period, leading to a decrease in water quality. The results also showed that growth in the urban population as well as extensive industrialization in the studied area are the most important factors influencing groundwater quality. Furthermore, a decrease in water quantity is also evident, due to greater water utilization and the occurrence of droughts in the region in recent years.

Keywords: Chloride, Electrical Conductivity, Shirvan, Total Dissolved Solids.

7665 Secure and Efficient Transmission of Aggregated Data for Mobile Wireless Sensor Networks

Authors: A. Krishna Veni, R. Geetha

Abstract:

Wireless Sensor Networks (WSNs) are suitable for many scenarios in the real world. The retrieval of data is made efficient by data aggregation techniques. Many techniques for data aggregation have been offered, but most of the existing schemes are neither energy efficient nor secure. The existing techniques use the traditional clustering approach, in which there is a delay during packet transmission since there is no proper scheduling. The presented system uses the Velocity Energy-efficient and Link-aware Cluster-Tree (VELCT) scheme, in which a Data Collection Tree (DCT) improves the lifetime of the network. The VELCT scheme and the construction of the DCT reduce delay and traffic. The network lifetime can be increased by avoiding frequent changes in cluster topology. Secure and Efficient Transmission of Aggregated data (SETA) improves the security of the data transmission via the trust values of the nodes prior to the aggregation of data. Since SETA considers only data from trustworthy nodes for aggregation, it is more secure in transmitting the data, thereby improving the accuracy of the aggregated data.
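
A minimal sketch of trust-filtered aggregation in the spirit of SETA: only readings from nodes whose trust value clears a threshold enter the aggregate. The trust computation itself, and the threshold, are assumptions.

```python
def aggregate(readings, trust, threshold=0.5):
    """readings and trust: dicts keyed by node id. Returns the mean of the
    readings from trusted nodes only, or None if no node qualifies."""
    vals = [v for n, v in readings.items() if trust.get(n, 0.0) >= threshold]
    return sum(vals) / len(vals) if vals else None

# Node "c" is untrusted, so its outlier reading is excluded from the mean.
print(aggregate({"a": 20.1, "b": 19.8, "c": 55.0},
                {"a": 0.9, "b": 0.8, "c": 0.2}))  # -> 19.95
```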

Keywords: Aggregation, lifetime, network security, wireless sensor network.

7664 Development of Greenhouse Analysis Tools for Home Agriculture Project

Authors: M. Amir Abas, M. Dahlui

Abstract:

This paper presents the development of analysis tools for the Home Agriculture project. The tools are required for monitoring the condition of a greenhouse and involve two components: measurement hardware and a data analysis engine. The measurement hardware measures environmental parameters such as temperature, humidity, air quality, and dust, while the analysis tool is used to analyse and interpret the integrated data against weather conditions, quality of health, irradiance, quality of soil, etc. The current development of the tools is complete for the off-line data recording technique: the data is saved on an MMC and transferred via ZigBee to the Environment Data Manager (EDM) for analysis. The EDM converts the raw data and plots three combination graphs. It has been applied to monitoring three months of measured irradiance, temperature, and humidity data for the greenhouse.

Keywords: Monitoring, Environment, Greenhouse, Analysis tools.

7663 A Robust Data Hiding Technique based on LSB Matching

Authors: Emad T. Khalaf, Norrozila Sulaiman

Abstract:

Many researchers are working on information hiding techniques, using different ideas and areas to hide their secret data. This paper introduces a robust technique for hiding secret data in an image, based on LSB insertion and RSA encryption. The key idea of the proposed technique is to encrypt the secret data. The encrypted data is then converted into a bit stream and divided into a number of segments, and the cover image is divided into the same number of segments. Each segment of data is compared with each segment of the image to find the best-matching segment, in order to create a new random sequence of segments to be inserted into the cover image. Experimental results show that the proposed technique has a high security level and produces better stego-image quality.
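
A minimal sketch of the LSB embedding and extraction steps described above, with the segment-matching step simplified to sequential embedding; the payload bits are assumed to be the RSA-encrypted data already converted to a bit stream.

```python
def embed_lsb(pixels: list[int], bits: list[int]) -> list[int]:
    """pixels: 0-255 grayscale values; bits: encrypted payload, one per pixel."""
    out = pixels[:]
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b        # overwrite the least significant bit
    return out

def extract_lsb(pixels: list[int], n_bits: int) -> list[int]:
    """Read the LSBs back; RSA decryption of the bit stream follows."""
    return [p & 1 for p in pixels[:n_bits]]
```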

Keywords: Steganography, LSB matching, RSA encryption, data segments.

7662 Partnering with Stakeholders to Secure Digitization of Water

Authors: Sindhu Govardhan, Kenneth G. Crowther

Abstract:

Modernisation of the water sector is leading to increased connectivity and integration of emerging technologies with traditional ones, leading to new security risks. The convergence of Information Technology (IT) with Operational Technology (OT) results in solutions that are spread across larger geographic areas, increasingly consist of interconnected Industrial Internet of Things (IIOT) devices and software, rely on the integration of legacy with modern technologies, and use complex supply chain components, leading to complex architectures and communication paths. The result is that multiple parties collectively own and operate these emergent technologies, threat actors find new paths to exploit, and traditional cybersecurity controls are inadequate. Our approach is to explicitly identify and draw data flows that cross trust boundaries between the owners and operators of various aspects of these emerging and interconnected technologies. On these data flows, we layer potential attack vectors to create a frame of reference for evaluating possible risks against connected technologies. Finally, we identify where existing controls, mitigations, and other remediations exist across industry partners (e.g., suppliers, product vendors, integrators, water utilities, and regulators). From these, we are able to understand potential gaps in security, the roles in the supply chain that are most likely to effectively remediate those security gaps, and test cases to evaluate and strengthen security across these partners. This informs a "shared responsibility" solution that recognises that security is multi-layered and requires collaboration to be successful. This shared-responsibility security framework improves visibility, understanding, and control across the entire supply chain, particularly for those water utilities that are accountable for safe and continuous operations.

Keywords: Cyber security, shared responsibility, IIOT, threat modelling.

7661 Comprehensive Analysis of Data Mining Tools

Authors: S. Sarumathi, N. Shanthi

Abstract:

Due to fast technological innovation, a tremendous amount of data is being accumulated all over the world in every domain, such as pattern recognition, machine learning, spatial data mining, image analysis, fraud analysis, the World Wide Web, etc. This makes it essential to develop tools for data mining functionalities. The major aim of this paper is to analyze various tools that are used to build resourceful analytical or descriptive models for handling large amounts of information more efficiently and in a user-friendly way. In this survey, the diverse tools are illustrated with their extensive technical paradigms, graphical interfaces, and inbuilt multipath algorithms, which make them very useful for handling significant amounts of data.

Keywords: Classification, Clustering, Data Mining, Machine learning, Visualization.
