Search results for: time series data.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12639


11619 Novel Delay-Dependent Stability Criteria for Uncertain Discrete-Time Stochastic Neural Networks with Time-Varying Delays

Authors: Mengzhuo Luo, Shouming Zhong

Abstract:

This paper investigates the problem of exponential stability for a class of uncertain discrete-time stochastic neural networks with time-varying delays. By constructing a suitable Lyapunov-Krasovskii functional and combining stochastic stability theory with the free-weighting matrix method, a delay-dependent exponential stability criterion is obtained in terms of LMIs. Compared with some previous results, the new conditions obtained in this paper are less conservative. Finally, two numerical examples are given to show the usefulness of the derived results.
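
As a loose illustration of the Lyapunov machinery behind such criteria, restricted to the delay-free deterministic special case rather than the paper's delay-dependent LMI condition, the sketch below checks exponential stability of a linear discrete-time system x_{k+1} = A x_k by solving the discrete Lyapunov equation with SciPy; the matrix A is an assumed example.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Example system matrix (assumed for illustration only).
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])

# Solve A^T P A - P = -Q with Q = I.
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)

# If P is symmetric positive definite, V(x) = x^T P x is a Lyapunov function
# and the delay-free system x_{k+1} = A x_k is exponentially stable.
eigs = np.linalg.eigvalsh(P)
print("eigenvalues of P:", eigs)
print("exponentially stable:", bool(np.all(eigs > 0)))
```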

Keywords: Delay-dependent stability, Neural networks, Time-varying delay, Linear matrix inequality (LMI).

11618 Data Mining Techniques in Computer-Aided Diagnosis: Non-Invasive Cancer Detection

Authors: Florin Gorunescu

Abstract:

Diagnosis can be achieved by building a model of a certain organ under surveillance and comparing it with real-time physiological measurements taken from the patient. This paper presents the benefits of using data mining techniques in computer-aided diagnosis (CAD), focusing on cancer detection, in order to help doctors make optimal decisions quickly and accurately. Among non-invasive diagnosis techniques, endoscopic ultrasound elastography (EUSE) is a recent elasticity imaging technique that allows the difference between malignant and benign tumors to be characterized. Digitizing and summarizing the main features of the EUSE sample movies in vector form is carried out using exploratory data analysis (EDA). Neural networks are then trained on the corresponding EUSE feature vectors in such a way that these intelligent systems are able to offer a precise and objective diagnosis, discriminating between benign and malignant tumors. A concrete application of these data mining techniques illustrates the suitability and reliability of this methodology in CAD.

Keywords: Endoscopic ultrasound elastography, exploratory data analysis, neural networks, non-invasive cancer detection.

11617 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data

Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi

Abstract:

Edgeworth approximation, bootstrap, and Monte Carlo simulation have a considerable impact on the results obtained for many of the problems under study. In this paper, we treat a financial case concerning the effect that the components of the cash flow of one of the most successful businesses in the world, namely its financing, operating, and investing activities, have on the cash and cash equivalents at the end of the three-month period. To get a better view of this case, we built a vector autoregression (VAR) model and then generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulation, and residual bootstrap, based on the standard errors of each series created. The generated results show common tendencies for the three methods applied, which consequently verifies the advantage of the three methods in the optimization of a model that contains many variables.
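
As a rough illustration of the modeling step only (with simulated data standing in for the cash-flow components, and without the Edgeworth or bootstrap refinements), the sketch below fits a VAR and computes impulse responses with statsmodels; a residual bootstrap of the impulse responses would resample the fitted residuals and refit the model.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulated three-variable system standing in for the cash-flow components (illustration only).
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.1, 0.0],
              [0.0, 0.4, 0.2],
              [0.1, 0.0, 0.3]])
n = 120
x = np.zeros((n, 3))
for t in range(1, n):
    x[t] = A @ x[t - 1] + rng.normal(scale=0.5, size=3)
data = pd.DataFrame(x, columns=["financing", "operating", "investing"])

# Fit a VAR with the lag order chosen by AIC, then generate impulse responses.
results = VAR(data).fit(maxlags=4, ic="aic")
irf = results.irf(periods=8)
print("selected lag order:", results.k_ar)
print(irf.irfs.shape)   # (periods + 1, n_equations, n_equations)
```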

Keywords: Autoregression, Bootstrap, Edgeworth Expansion, Monte Carlo Method.

11616 Stability and Bifurcation Analysis of a Discrete Gompertz Model with Time Delay

Authors: Yingguo Li

Abstract:

In this paper, we consider a discrete Gompertz model with time delay. First, the stability of the equilibrium of the system is investigated by analyzing the characteristic equation. By choosing the time delay as a bifurcation parameter, we prove that Neimark-Sacker bifurcations occur when the delay passes a sequence of critical values. The direction and stability of the Neimark-Sacker bifurcation are determined using normal forms and centre manifold theory. Finally, some numerical simulations are given to verify the theoretical analysis.
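
The abstract does not state the model equations, so purely as an illustration the sketch below iterates one commonly used delayed discrete Gompertz form, x_{n+1} = x_n exp(r (1 - ln(x_{n-tau}) / ln K)), whose linearization about the positive equilibrium x* = K is u_{n+1} = u_n - (r / ln K) u_{n-tau}; increasing the delay tau past a critical value destabilizes the equilibrium and sustained oscillations appear, which is the qualitative signature of a Neimark-Sacker bifurcation.

```python
import numpy as np

def gompertz_delay(r, K, tau, x0=2.0, n_steps=3000):
    """Iterate the assumed delayed discrete Gompertz map
    x_{n+1} = x_n * exp(r * (1 - ln(x_{n-tau}) / ln(K)))."""
    x = np.full(n_steps, x0)
    for n in range(tau, n_steps - 1):
        x[n + 1] = x[n] * np.exp(r * (1.0 - np.log(x[n - tau]) / np.log(K)))
    return x

r, K = 0.5, 10.0
for tau in (2, 10):
    tail = gompertz_delay(r, K, tau)[-200:]
    # For small delay the orbit settles at x* = K; for large delay it does not,
    # and the last 200 iterates keep oscillating around the equilibrium.
    print(f"tau = {tau:2d}: range of last 200 iterates = [{tail.min():.3f}, {tail.max():.3f}]")
```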

Keywords: Gompertz system, Neimark-Sacker bifurcation, stability, time delay.

11615 A Study for the Effect of Fire Initiated Location on Evacuation Success Rate

Authors: Jin A Ryu, Ga Ye Kim, Hee Sun Kim

Abstract:

As the number of fire accidents gradually rises, many studies on evacuation have been reported. Previous studies have mostly focused on evaluating the safety of evacuation and the risk of fire in particular buildings, but the effects of various parameters on evacuation have hardly been studied. Therefore, this paper aims at observing evacuation time under the effect of the fire initiation location. In this study, evacuation simulations are performed on a 5-floor building located in Seoul, South Korea, using the Fire Dynamics Simulator with Evacuation (FDS+Evac) program. Only the fourth and fifth floors are modeled, with the assumption that fire starts in a room located on the fourth floor. The parameter varied in the evacuation simulations is the location of fire initiation, in order to observe evacuation time and safety. Results show that the closer the fire initiation location is to the exit, the more time is taken to evacuate; the case with the fire initiation location nearest the exit has the lowest ratio of successfully evacuated occupants to total occupants. In addition, for safety evaluation, the evacuation time calculated from the computer simulation model is compared with the tolerable evacuation time according to the Japanese code, and all cases are completed within the tolerable evacuation time. This study allows evacuation time to be predicted under various fire conditions and can be used to evaluate evacuation appropriateness and the fire safety of buildings.

Keywords: Evacuation safety, Evacuation simulation, FDS+Evac, Time.

11614 Adsorption of Lead(II) and Cadmium(II) Ions from Aqueous Solutions by Adsorption on Activated Carbon Prepared from Cashew Nut Shells

Authors: S. Tangjuank, N. Insuk , J. Tontrakoon , V. Udeye

Abstract:

Cashew nut shells were converted into activated carbon powders using KOH activation plus CO2 gasification at 1027 K. With an increase in both the impregnation ratio and the activation time, a mesoporous structure developed rapidly, with the mesopore volume ratio increasing from 20% to 28% and from 27% to 45% for activated carbons with KOH-to-char ratios of 1 and 4, respectively. Activated carbons derived from a KOH/char ratio of 1 and CO2 gasification times from 20 to 150 minutes exhibited BET surface areas increasing from 222 to 627 m2 g-1, and those derived from a KOH/char ratio of 4 with activation times from 20 to 150 minutes exhibited high BET surface areas from 682 to 1026 m2 g-1. The adsorption of lead(II) and cadmium(II) ions was investigated, and this adsorbent exhibited excellent adsorption for both ions. Maximum adsorption reached 99.61% at pH 6.5 and 98.87% under optimum conditions. The experimental data were fitted with the Freundlich and Langmuir isotherm models. The maximum capacities for Pb2+ and Cd2+ ions were found to be 28.90 mg g-1 and 14.29 mg g-1, respectively.
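
For context, the sketch below fits the Langmuir isotherm q = q_max K_L C / (1 + K_L C) to equilibrium data with SciPy; the concentration and uptake values are invented for illustration and are not the measurements reported here.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K_L):
    """Langmuir isotherm: uptake q as a function of equilibrium concentration C."""
    return q_max * K_L * C / (1.0 + K_L * C)

# Hypothetical equilibrium data (mg/L and mg/g), for illustration only.
C_e = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
q_e = np.array([8.1, 14.9, 20.3, 24.6, 27.0, 28.2])

(q_max, K_L), _ = curve_fit(langmuir, C_e, q_e, p0=[30.0, 0.1])
print(f"q_max = {q_max:.2f} mg/g, K_L = {K_L:.3f} L/mg")
```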

Keywords: Activated carbon, cashew nut shell, heavy metals, adsorption.

11613 A New Scheme for Improving the Quality of Service in Heterogeneous Wireless Network for Data Stream Sending

Authors: Ebadollah Zohrevandi, Rasoul Roustaei, Omid Moradtalab

Abstract:

In this paper, we first consider quality-of-service problems in heterogeneous wireless networks for sending video data, whose real-time requirements are pronounced. We then present a method for ensuring end-to-end quality of service at the application-layer level for adaptive transmission of video data over heterogeneous wireless networks. To do this, mechanisms in different layers are used: the stop mechanism, the adaptation mechanism, and graceful degradation at the application layer; a multi-level congestion feedback mechanism in the network layer; and a connection cut-off decision mechanism in the link layer. Finally, the presented method and the achieved improvement are simulated and evaluated in the NS-2 software.

Keywords: Congestion, Handoff, Heterogeneous wireless networks, Adaptation mechanism, Stop mechanism, Graceful degrade.

11612 Signed Approach for Mining Web Content Outliers

Authors: G. Poonkuzhali, K.Thiagarajan, K.Sarukesi, G.V.Uma

Abstract:

The emergence of the Internet has brewed a revolution in information storage and retrieval. As most of the data on the web are unstructured and contain a mix of text, video, audio, etc., there is a need to mine information to cater to the specific needs of users without losing important hidden information. Thus, developing user-friendly, automated tools for providing relevant information quickly has become a major challenge in web mining research. Most of the existing web mining algorithms have concentrated on finding frequent patterns while neglecting the less frequent ones that are likely to contain outlying data such as noise, irrelevant data, and redundant data. This paper mainly focuses on a signed approach with full-word matching against an organized domain dictionary for mining web content outliers. The signed approach yields the relevant web documents as well as the outlying web documents. As the dictionary is organized by the number of characters in a word, searching and retrieving documents takes less time and less space.
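
A toy sketch of the general idea, and not the authors' exact algorithm, is shown below: each word of a document is matched in full against a domain dictionary organized by word length, matches contribute a positive sign and non-matches a negative one, and documents whose average sign falls below zero are flagged as content outliers; the dictionary and documents are invented for illustration.

```python
# Hypothetical domain dictionary, organized by word length for faster full-word lookup.
domain_words = {"series", "forecast", "trend", "seasonal", "autocorrelation", "lag"}
dictionary_by_length = {}
for w in domain_words:
    dictionary_by_length.setdefault(len(w), set()).add(w)

def signed_score(document: str) -> float:
    """Average sign over words: +1 if the full word is in the domain dictionary, -1 otherwise."""
    words = [w.lower() for w in document.split() if w.isalpha()]
    if not words:
        return -1.0
    signs = [1 if w in dictionary_by_length.get(len(w), set()) else -1 for w in words]
    return sum(signs) / len(signs)

docs = {
    "d1": "seasonal trend and lag autocorrelation of the series",
    "d2": "cheap watches best price buy now",
}
for name, text in docs.items():
    score = signed_score(text)
    print(name, f"{score:+.2f}", "outlier" if score < 0 else "relevant")
```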

Keywords: Outliers, Relevant document, Signed Approach, Web content mining, Web documents.

11611 Data-Driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in that it encompasses startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: Startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship.

11610 A Reinforcement Learning Approach for Evaluation of Real-Time Disaster Relief Demand and Network Condition

Authors: Ali Nadi, Ali Edrissi

Abstract:

Relief demand and transportation link availability are the essential information needed for every natural disaster operation, yet this information is not at hand once a disaster strikes. In related works, relief demand and network condition have been evaluated using prediction methods. Nevertheless, predictions tend to over- or underestimate due to uncertainties and may lead to a failed operation. Therefore, in this paper a stochastic programming model is proposed to evaluate real-time relief demand and network condition at the onset of a natural disaster. To address the time sensitivity of the emergency response, the proposed model uses reinforcement learning to optimize the total relief assessment time. The proposed model is tested on a real-size network problem. The simulation results indicate that the proposed model performs well in collecting real-time information.

Keywords: Disaster management, real-time demand, reinforcement learning, relief demand.

11609 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal

Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha

Abstract:

Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry. Industry engineers continually improve upon PDC drill bit designs and hydraulic conditions, and optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit based on the field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations, and post-processing. The results from the OpenFOAM solver are then compared with those of the ANSYS Fluent software, and the data from both software programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, nozzle radial position, and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.

Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit.

11608 Review of a Real-Time Infectious Waste Management System Using QR Code

Authors: Hiraku Nunomiya, Takuo Ichiju, Yoshiyuki Higuchi

Abstract:

In the management of industrial waste, conversion from paper invoices to electronic forms is currently under way in developed countries. Difficulties in such computerization include the lack of synchronization between the actual goods and the corresponding data managed by the server. Consequently, a system has been developed that attaches a QR code to the waste material. The code is read at each stage, from discharge until disposal, and progress at each stage can be easily reported. This system can be linked with the Japanese public digital authentication service for waste, taking advantage of its strengths, and can be used to submit reports to the regulatory authorities. Its usefulness was confirmed by a verification test, and the system has been put into actual practice.
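
As a small illustration of the tagging step only (not the authors' system or the Japanese manifest service), the sketch below encodes a hypothetical manifest record into a QR code image with the Python qrcode package; readers at each handling stage would scan the code and report the same identifier back to the server.

```python
import json
import qrcode  # pip install qrcode[pil]

# Hypothetical manifest record for one container of infectious waste (illustration only).
manifest = {
    "manifest_id": "M-2017-000123",
    "waste_type": "infectious",
    "discharger": "Example Hospital",
    "stage": "discharge",
}

# Encode the record as JSON inside a QR code; each handling stage scans and
# reports the same ID so the server stays synchronized with the physical goods.
img = qrcode.make(json.dumps(manifest))
img.save("manifest_M-2017-000123.png")
print("QR code written, image size:", img.size)
```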

Keywords: Infectious Waste, Electronic Manifest, Real Time Management, QR code.

11607 On Constructing Approximate Convex Hull

Authors: M. Zahid Hossain, M. Ashraful Amin

Abstract:

Convex hull algorithms have been extensively studied in the literature, principally because of their wide range of applications in different areas. This article presents an efficient algorithm to construct an approximate convex hull from a set of n points in the plane in O(n + k) time, where k is the approximation error control parameter. The proposed algorithm is suitable for applications that prefer to reduce computation time in exchange for accuracy, such as animation and interaction in computer graphics, where rapid, real-time graphics rendering is indispensable.
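
The abstract does not spell out the algorithm, so as an illustration of the general strip-based idea behind O(n + k) hull approximations (in the spirit of Bentley, Faust, and Preparata, and not necessarily the authors' method), the sketch below keeps only the extreme points of k vertical strips and runs an exact monotone-chain hull on that small surviving set.

```python
import numpy as np

def approx_convex_hull(points, k=32):
    """Approximate convex hull: bucket points into k vertical strips, keep the
    lowest/highest point per strip plus the global x-extremes, then run an
    exact hull on the O(k) surviving points."""
    pts = np.asarray(points, dtype=float)
    x_min, x_max = pts[:, 0].min(), pts[:, 0].max()
    strip = np.minimum(((pts[:, 0] - x_min) / (x_max - x_min + 1e-12) * k).astype(int), k - 1)
    keep = set()
    for s in range(k):
        idx = np.flatnonzero(strip == s)
        if idx.size:
            keep.add(int(idx[np.argmin(pts[idx, 1])]))
            keep.add(int(idx[np.argmax(pts[idx, 1])]))
    keep.update([int(pts[:, 0].argmin()), int(pts[:, 0].argmax())])
    return monotone_chain(pts[sorted(keep)])

def monotone_chain(pts):
    """Andrew's monotone chain exact hull on a small point set."""
    pts = sorted(map(tuple, pts))
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and (h[-1][0]-h[-2][0])*(p[1]-h[-2][1]) - (h[-1][1]-h[-2][1])*(p[0]-h[-2][0]) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = half(pts), half(list(reversed(pts)))
    return lower[:-1] + upper[:-1]

rng = np.random.default_rng(1)
cloud = rng.normal(size=(100000, 2))
print(len(approx_convex_hull(cloud, k=32)), "approximate hull vertices")
```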

Keywords: Convex hull, Approximation algorithm, Computational geometry, Linear time.

11606 Trust and Reliability for Public Sector Data

Authors: Klaus Stranacher, Vesna Krnjic, Thomas Zefferer

Abstract:

The public sector holds large amounts of data on various areas such as social affairs, economy, or tourism. Various initiatives such as Open Government Data or the EU Directive on public sector information aim to make these data available to public and private service providers. Requirements for the provision of public sector data are defined by legal and organizational frameworks. Surprisingly, the defined requirements hardly cover security aspects such as integrity or authenticity. In this paper we discuss the importance of these missing requirements and present a concept for assuring the integrity and authenticity of provided data based on electronic signatures. We show that our concept is perfectly suitable for the provisioning of unaltered data, and that it can be extended to data that need to be anonymized before provisioning by incorporating redactable signatures. Our proposed concept enhances the trust and reliability of provided public sector data.
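
As a minimal illustration of the baseline case, a conventional signature over an unaltered record rather than the redactable scheme discussed here, the sketch below signs and verifies a data record with Ed25519 using the Python cryptography package; the record contents are invented.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The data provider signs the published record once.
record = b'{"dataset": "tourism-2013", "rows": 14231}'
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(record)

# Any consumer can verify integrity and authenticity with the public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, record)
    print("record is authentic and unaltered")
except InvalidSignature:
    print("record was modified or the signature is invalid")
```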

Keywords: Trusted Public Sector Data, Integrity, Authenticity, Reliability, Redactable Signatures.

11605 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline M. R. Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be feasible and convenient with small images and little data, it becomes difficult and prone to errors when large databases of images must be processed, and the patterns may differ across the image area depending on many characteristics (drilling strategy, rock components, rock strength, etc.). In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after a period of use in which parts of borehole images corresponding to tension regions and breakout areas are manually marked, to automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classifier methods in order to achieve different knowledge dataset configurations.

Keywords: Brazil, classifiers, data-mining, image segmentation, oil well visualization.

11604 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately available in the IoT, and through internal communication and interaction, meaningful objects provide real-time services to users. Therefore, services with appropriate decision-making have become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for inference. The statistical analysis was conducted to achieve the following objectives: first, to define user behaviors and predict user behavior routes with the environment model in order to analyze user purposes; second, to construct the hierarchical Hidden Markov Model according to the logic framework and establish the sequential intensity among behaviors, so as to become acquainted with the use and activity fabric of the intelligent environment; and third, to establish the intensity of the relation between objects and the probability of their being used, an indicator that can describe the possible limitations of the mechanism. As the process is recorded in the system created in this study, these data can be reused to adjust the procedure of intelligent design services.
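
For reference, a minimal sketch of the flat-HMM building block (not the full hierarchical model used in this study) is given below: the forward algorithm computes the likelihood of an observed sequence of object-use events under assumed transition and emission probabilities; all numbers are illustrative.

```python
import numpy as np

# Two hidden activities (e.g. "cooking", "resting") and three observable object events
# (e.g. "stove", "fridge", "sofa"); probabilities are made up for illustration.
A = np.array([[0.8, 0.2],        # activity transition probabilities
              [0.3, 0.7]])
B = np.array([[0.6, 0.3, 0.1],   # emission: P(object event | activity)
              [0.1, 0.2, 0.7]])
pi = np.array([0.5, 0.5])        # initial activity distribution

def forward_likelihood(obs):
    """Forward algorithm: P(observation sequence) under the HMM (A, B, pi)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

sequence = [0, 1, 0, 2, 2]       # stove, fridge, stove, sofa, sofa
print(f"P(sequence) = {forward_likelihood(sequence):.6f}")
```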

Keywords: Behavior, big data, hierarchical Hidden Markov Model, intelligent object.

11603 Analysis of Relation between Unlabeled and Labeled Data to Self-Taught Learning Performance

Authors: Ekachai Phaisangittisagul, Rapeepol Chongprachawat

Abstract:

Obtaining labeled data in supervised learning is often difficult and expensive, and thus the trained learning algorithm tends to overfit due to the small number of training data. As a result, some researchers have focused on using unlabeled data, which do not necessarily follow the same generative distribution as the labeled data, to construct high-level features for improving performance on supervised learning tasks. In this paper, we investigate the impact of the relationship between unlabeled and labeled data on classification performance. Specifically, we apply different unlabeled datasets, with different degrees of relation to the labeled data, to a handwritten digit classification task based on the MNIST dataset. Our experimental results show that the higher the degree of relation between unlabeled and labeled data, the better the classification performance. Although unlabeled data drawn from a generative distribution completely different from that of the labeled data provide the lowest classification performance, we still achieve high classification performance. This expands the applicability of supervised learning algorithms that exploit unsupervised learning.
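
A toy version of the self-taught learning pipeline is sketched below under stated substitutions: scikit-learn's small 8x8 digits dataset stands in for MNIST, and an MLP trained to reconstruct unlabeled inputs serves as a crude autoencoder whose hidden layer provides the high-level feature for a supervised classifier trained on a small labeled set.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Small stand-in for MNIST: the 8x8 digits dataset shipped with scikit-learn.
X, y = load_digits(return_X_y=True)
X = X / 16.0
X_unlab, X_lab, _, y_lab = train_test_split(X, y, test_size=300, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X_lab, y_lab, test_size=0.5, random_state=0)

# "Autoencoder": an MLP trained to reconstruct the unlabeled inputs; its hidden
# layer serves as the learned high-level feature.
ae = MLPRegressor(hidden_layer_sizes=(32,), activation="relu", max_iter=500, random_state=0)
ae.fit(X_unlab, X_unlab)

def encode(X):
    # Hidden-layer activations (ReLU) of the reconstruction network.
    return np.maximum(0.0, X @ ae.coefs_[0] + ae.intercepts_[0])

# Supervised classifier trained on the small labeled set, using the learned features.
clf = LogisticRegression(max_iter=1000).fit(encode(X_train), y_train)
print("accuracy with self-taught features:", clf.score(encode(X_test), y_test))
```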

Keywords: Autoencoder, high-level feature, MNIST dataset, self-taught learning, supervised learning.

11602 Experimental Investigation of Natural Frequency and Forced Vibration of Euler-Bernoulli Beam under Displacement of Concentrated Mass and Load

Authors: Aref Aasi, Sadegh Mehdi Aghaei, Balaji Panchapakesan

Abstract:

This work aims to evaluate the free and forced vibration of a beam with two end joints subjected to a concentrated moving mass and a load, using the Euler-Bernoulli method. The natural frequency is calculated for different locations of the concentrated mass and load on the beam. The analytical results are verified by the experimental data. The variation of natural frequency as a function of the location of the mass, the effect of the forcing frequency on the vibration amplitude, and the displacement amplitude versus time are investigated. It is found that as the concentrated mass moves toward the center of the beam, the natural frequency of the beam and the relative error between experimental and analytical data decrease. There is a close resemblance between the analytical data and the experimental observations.
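
As a back-of-the-envelope cross-check of the reported trend (not the authors' full Euler-Bernoulli solution), the sketch below uses a single-degree-of-freedom approximation for a simply supported beam carrying a concentrated mass M at position a: the static stiffness at the load point is k = 3EIL/(a^2 b^2) with b = L - a, so f = (1/2 pi) sqrt(k/M), which decreases as the mass approaches mid-span; the beam properties are assumed values and the beam's own mass is neglected.

```python
import numpy as np

# Assumed beam properties (steel, rectangular section), for illustration only.
E = 210e9                 # Young's modulus [Pa]
L = 1.0                   # span [m]
width, height = 0.03, 0.005
I = width * height**3 / 12   # second moment of area [m^4]
M = 0.5                   # concentrated mass [kg] (beam mass neglected)

def natural_frequency(a):
    """SDOF estimate from the static stiffness at the load point of a simply supported beam."""
    b = L - a
    k = 3.0 * E * I * L / (a**2 * b**2)
    return np.sqrt(k / M) / (2.0 * np.pi)    # [Hz]

for a in (0.1, 0.2, 0.3, 0.4, 0.5):
    print(f"mass at a = {a:.1f} m -> f = {natural_frequency(a):.1f} Hz")
```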

Keywords: Euler-Bernoulli beam, natural frequency, forced vibration, experimental setup.

11601 Comparative Study on Swarm Intelligence Techniques for Biclustering of Microarray Gene Expression Data

Authors: R. Balamurugan, A. M. Natarajan, K. Premalatha

Abstract:

Microarray gene expression data play a vital role in biological processes, gene regulation, and disease mechanisms. A bicluster in gene expression data is a subset of genes showing consistent patterns under a subset of conditions, and finding a bicluster is an optimization problem. In recent years, swarm intelligence techniques have become popular because many real-world problems are increasingly large, complex, and dynamic. By reason of the size and complexity of these problems, it is necessary to find an optimization technique whose efficiency is measured by finding a near-optimal solution within a reasonable amount of time. In this paper, the algorithmic concepts of the Particle Swarm Optimization (PSO), Shuffled Frog Leaping (SFL), and Cuckoo Search (CS) algorithms are analyzed for four benchmark gene expression datasets. The experimental results show that CS outperforms PSO and SFL on three datasets, and SFL gives better performance on one dataset. This work also determines the biological relevance of the biclusters with Gene Ontology in terms of function, process, and component.
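
A fitness measure commonly minimized by such swarm-based biclustering searches is the Cheng-Church mean squared residue (MSR); the sketch below computes it for a planted coherent bicluster and for a random block of a synthetic expression matrix (the data and index sets are illustrative, not the benchmark datasets used here).

```python
import numpy as np

def mean_squared_residue(expr, rows, cols):
    """Cheng-Church MSR of the sub-matrix expr[rows, cols]; lower means more coherent."""
    sub = expr[np.ix_(rows, cols)]
    residue = sub - sub.mean(axis=1, keepdims=True) - sub.mean(axis=0, keepdims=True) + sub.mean()
    return float((residue ** 2).mean())

rng = np.random.default_rng(42)
expr = rng.normal(size=(50, 30))                    # synthetic expression matrix
# Plant a coherent additive bicluster: row effect + column effect + small noise.
expr[:10, :8] = rng.normal(size=(10, 1)) + rng.normal(size=(1, 8)) + 0.1 * rng.normal(size=(10, 8))

planted = mean_squared_residue(expr, np.arange(10), np.arange(8))
random_block = mean_squared_residue(expr, np.arange(10, 20), np.arange(8, 16))
print(f"planted bicluster MSR = {planted:.3f}, random block MSR = {random_block:.3f}")
```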

Keywords: Particle swarm optimization, Shuffled frog leaping, Cuckoo search, biclustering, gene expression data.

11600 A New Time Dependent, High Temperature Analytical Model for the Single-electron Box in Digital Applications

Authors: M.J. Sharifi

Abstract:

Several models have been introduced so far for the single-electron box (SEB), all of which were restricted to the DC response and/or the low-temperature limit. In this paper we introduce, for the first time, a time-dependent, high-temperature analytical model for the SEB. The DC behavior of the introduced model is verified against the SIMON software, and its time behavior is verified against a newly published paper regarding the step response of the SEB.

Keywords: Single electron box, SPICE, SIMON, Time-dependent, Circuit model.

11599 Towards Development of Solution for Business Process-Oriented Data Analysis

Authors: M. Klimavicius

Abstract:

This paper proposes a modeling methodology for the development of a data analysis solution. The author introduces an approach to address data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with differing levels of model elaboration, and it makes it possible to gather requirements and display them to business users in an easy manner.

Keywords: Data warehouse, data analysis, business process management.

11598 Development of a Remote Testing System for Performance of Gas Leakage Detectors

Authors: Gyoutae Park, Woosuk Kim, Sangguk Ahn, Seungmo Kim, Minjun Kim, Jinhan Lee, Youngdo Jo, Jongsam Moon, Hiesik Kim

Abstract:

In this research, we designed a remote system to test parameters of gas detectors, such as gas concentration and initial response time. This testing system can measure two gas instruments simultaneously. First of all, we assembled an experimental jig with a square structure, whose parts include a glass flask, two high-quality cameras, and two Ethernet modems for transmitting data. This remote gas detector testing system extracts numerals from video of the detectors' LCD readouts while the gas concentration is continuously varied. The extracted numeral data are sent to a laptop computer through the Ethernet modems, and the numerical data, together with the gas concentrations and the measured initial response times, are recorded and graphed. Our remote testing system can be applied to a wide range of gas detector tests and is intended for certification both domestically and internationally.

Keywords: Gas leakage detector, inspection instrument, extracting numerals, concentration.

11597 Aggregation Scheduling Algorithms in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In wireless sensor networks, which consist of tiny sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental readings and aggregates the data toward a designated destination called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption, due to the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum-latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For the problem, two interference models, the graph model and the more realistic physical interference model known as Signal-to-Interference-plus-Noise-Ratio (SINR), have been adopted with different power models, uniform power and non-uniform power (with or without power control), and different antenna models, omni-directional and directional. In this survey article, as the problem has been proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in various models, using latency as the performance measure.
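
To make the scheduling objective concrete, the sketch below runs a naive greedy MLAS heuristic (not one of the approximation algorithms surveyed) under the graph interference model: nodes are scheduled level by level up a BFS aggregation tree, and a node may transmit in a timeslot only after all its children have transmitted and if no other transmission in that slot collides at a receiver; the topology is an illustrative grid.

```python
import networkx as nx

# Illustrative topology: 4x4 grid graph; node (0, 0) is the sink.
G = nx.grid_2d_graph(4, 4)
sink = (0, 0)
tree = nx.bfs_tree(G, sink)                       # aggregation tree, edges point away from sink
parent = {v: u for u, v in tree.edges()}
children = {v: set(tree.successors(v)) for v in tree.nodes()}

def conflicts(u, v):
    """Graph-model interference: simultaneous senders u and v collide if either
    sender is within range of the other's receiver, or they share a receiver."""
    pu, pv = parent[u], parent[v]
    return pu == pv or G.has_edge(u, pv) or G.has_edge(v, pu)

pending = {v for v in G if v != sink}             # nodes that still have to transmit
done = set()
slots = 0
while pending:
    ready = [v for v in pending if children[v] <= done]
    chosen = []
    for v in ready:                               # greedy maximal non-conflicting set
        if all(not conflicts(v, w) for w in chosen):
            chosen.append(v)
    done |= set(chosen)
    pending -= set(chosen)
    slots += 1
print(f"aggregation finished in {slots} timeslots for {G.number_of_nodes()} nodes")
```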

Keywords: Data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional.

11596 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data

Authors: Saeid Gharechelou, Ryutaro Tateishi

Abstract:

Earthquakes are inevitable catastrophic natural disasters. Damage to buildings and man-made structures, where most human activities occur, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the case of the Kathmandu valley, which was severely shaken by the 2015 Nepal earthquake. Although many existing studies have produced optical-data-based estimates or suggested the combined use of optical and SAR data for improved accuracy, finding cloud-free optical images when they are urgently needed is not assured. Therefore, this research specializes in developing a SAR-based technique with the target of rapid and accurate geospatial reporting. Considering the limited time available in a post-disaster situation, it offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. The pre-seismic, co-seismic, and post-seismic InSAR coherence was used to detect changes in the damaged area. In addition, ground truth data collected in the field were applied to the optical data through random forest classification for detection of the damaged area, and were used to assess the accuracy of the supervised classification approach. A higher accuracy was obtained from the optical data than from the optical-SAR integration. Because the availability of cloud-free images when urgently needed for an earthquake event is not assured, further research on improving SAR-based damage detection is suggested. Availability of very accurate damage information is essential for channeling the rescue and emergency operations, and it is expected that quick reporting of the post-disaster damage situation, quantified by the rapid earthquake assessment, should assist in channeling those operations and in informing the public about the scale of damage.
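
A highly simplified sketch of the coherence-change detection step is given below: synthetic rasters stand in for the pre-seismic and co-seismic Sentinel-1A coherence, and pixels whose coherence drops by more than an illustrative threshold are flagged as potentially damaged; all values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (200, 200)

# Synthetic coherence maps in [0, 1] standing in for InSAR products (illustration only).
coh_pre = np.clip(rng.normal(0.7, 0.1, shape), 0, 1)          # pre-seismic pair
coh_co = coh_pre.copy()
damaged_truth = np.zeros(shape, dtype=bool)
damaged_truth[80:120, 60:140] = True                           # a synthetic damaged block
coh_co[damaged_truth] = np.clip(rng.normal(0.25, 0.1, damaged_truth.sum()), 0, 1)

# Flag pixels where coherence drops by more than an (illustrative) threshold.
drop = coh_pre - coh_co
damage_mask = drop > 0.3
recall = (damage_mask & damaged_truth).sum() / damaged_truth.sum()
print(f"flagged pixels: {damage_mask.sum()}, recall on synthetic damage: {recall:.2f}")
```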

Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid monitoring, 2015-Nepal earthquake.

11595 Preliminary Overview of Data Mining Technology for Knowledge Management System in Institutions of Higher Learning

Authors: Muslihah Wook, Zawiyah M. Yusof, Mohd Zakree Ahmad Nazri

Abstract:

Data mining has been integrated into application systems to enhance the quality of the decision-making process. This study focuses on the integration of data mining technology and Knowledge Management Systems (KMS), due to the ability of data mining technology to create useful knowledge from large volumes of data, while KMSs vitally support the creation and use of knowledge. The integration of data mining technology and KMS is popularly used in business for enhancing and sustaining organizational performance. However, there is a lack of studies applying data mining technology and KMS in the education sector, particularly to students' academic performance, even though this could reflect the performance of institutions of higher learning (IHLs). Realizing its importance, this study seeks to integrate data mining technology and KMS to promote effective management of knowledge within IHLs. Several concepts from the literature are adapted to propose a new integrative data mining technology and KMS framework for an IHL.

Keywords: Data mining, Institutions of Higher Learning, Knowledge Management System, Students' academic performance.

11594 Vehicle Routing Problem with Mixed Fleet of Conventional and Heterogenous Electric Vehicles and Time Dependent Charging Costs

Authors: Ons Sassi, Wahiba Ramdane Cherif-Khettaf, Ammar Oulamara

Abstract:

In this paper, we consider the vehicle routing problem with a mixed fleet of conventional and heterogeneous electric vehicles and time-dependent charging costs, denoted VRP-HFCC, in which a set of geographically scattered customers have to be served by a mixed fleet composed of a heterogeneous fleet of Electric Vehicles (EVs), having different battery capacities and operating costs, and Conventional Vehicles (CVs). We include the possibility of charging EVs at the available charging stations during the routes in order to serve all customers. Each charging station offers a charging service with a known charger technology and time-dependent charging costs, and is also subject to operating time window constraints. EVs are not necessarily compatible with all available charging technologies, and partial charging is allowed. Intermittent charging at the depot is also allowed, provided that constraints related to the electricity grid are satisfied. The objective is to minimize the number of employed vehicles first and then minimize the total travel and charging costs. In this study, we present a Mixed Integer Programming model and develop a Charging Routing Heuristic and a Local Search Heuristic based on the Inject-Eject routine with different insertion methods. All heuristics are tested on real data instances.

Keywords: charging problem, electric vehicle, heuristics, local search, optimization, routing problem.

11593 Towards a Secure Storage in Cloud Computing

Authors: Mohamed Elkholy, Ahmed Elfatatry

Abstract:

Cloud computing has emerged as a flexible computing paradigm that has reshaped the Information Technology map. However, cloud computing has brought about a number of security challenges as a result of the physical distribution of computational resources and the limited control that users have over the physical storage. This situation raises many security challenges for data integrity and confidentiality as well as authentication and access control. This work proposes a security mechanism for data integrity that allows a data owner to be aware of any modification that takes place to their data. The data integrity mechanism is integrated with an extended Kerberos authentication that ensures authorized access control. The proposed mechanism protects data confidentiality even if the data are stored on untrusted storage. The proposed mechanism has been evaluated against different types of attacks and has proved efficient in protecting cloud data storage from various malicious attacks.
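
As a minimal illustration of the integrity idea only (not the paper's Kerberos-integrated mechanism), the sketch below lets a data owner keep an HMAC tag for an object placed on untrusted storage and detect any modification when the object is read back; the key and data are illustrative.

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)                  # kept by the data owner, never by the storage provider

def make_tag(data: bytes) -> bytes:
    """Integrity tag computed by the owner before uploading the object."""
    return hmac.new(key, data, hashlib.sha256).digest()

original = b"patient-records-batch-42"
tag = make_tag(original)                       # stored locally (or alongside the object)

# Later, the object comes back from the untrusted cloud storage ...
returned = b"patient-records-batch-42"         # try altering this to see detection
tampered = not hmac.compare_digest(make_tag(returned), tag)
print("object modified!" if tampered else "object intact")
```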

Keywords: Access control, data integrity, data confidentiality, Kerberos authentication, cloud security.

11592 Thailand National Biodiversity Database System with webMathematica and Google Earth

Authors: W. Katsarapong, W. Srisang, K. Jaroensutasinee, M. Jaroensutasinee

Abstract:

The National Biodiversity Database System (NBIDS) has been developed for collecting Thai biodiversity data. The goal of this project is to provide advanced tools for querying, analyzing, modeling, and visualizing patterns of species distribution for researchers and scientists. NBIDS records two types of datasets: biodiversity data and environmental data. Biodiversity data comprise species presence data and species status. The attributes of biodiversity data can be further classified into two groups: universal and project-specific attributes. Universal attributes are common to all of the records, e.g., X/Y coordinates, year, and collector name. Project-specific attributes are unique to one or a few projects, e.g., flowering stage. Environmental data include atmospheric data, hydrology data, soil data, and land cover data collected using GLOBE protocols. We have developed web-based tools for data entry. Google Earth KML and ArcGIS were used as tools for map visualization, and webMathematica was used for simple data visualization as well as for advanced data analysis and visualization, e.g., spatial interpolation and statistical analysis. NBIDS will be used by park rangers at Khao Nan National Park and by researchers.

Keywords: GLOBE protocol, Biodiversity, Database System, ArcGIS, Google Earth and webMathematica.

11591 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Based on the fact that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modeling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e., integrated behavior of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected central and east European countries. The empirical analysis demonstrates that the Markov regime-switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
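
For context, a sketch of the single-regime baseline is given below: a GARCH(1,1) model fitted with the Python arch package to simulated returns that contain a volatility break; on such break-contaminated data the estimated persistence alpha + beta tends to lie close to one, the excessive-persistence symptom that the Markov switching specification addresses (the series is simulated, not the CEE stock-market data analyzed here).

```python
import numpy as np
from arch import arch_model  # pip install arch

# Simulated daily returns with a structural break in volatility (illustration only).
rng = np.random.default_rng(3)
calm = rng.normal(scale=0.8, size=1500)      # low-volatility regime
crisis = rng.normal(scale=2.5, size=500)     # high-volatility regime (e.g. a financial crisis)
returns = np.concatenate([calm, crisis])

# Single-regime GARCH(1,1) with constant mean; on break-contaminated data the
# estimated persistence alpha + beta tends to be close to 1 (integrated behavior).
res = arch_model(returns, vol="GARCH", p=1, q=1, mean="Constant").fit(disp="off")
print(res.params)
print("persistence alpha + beta =", res.params["alpha[1]"] + res.params["beta[1]"])
```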

Keywords: Central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities.

11590 An Experimental Study on Autoignition of Wood

Authors: Tri Poespowati

Abstract:

Experiments were conducted to characterize the fire properties of wood exposed to a certain external heat flux and under a variety of wood moisture contents. Six kinds of Indonesian wood (keruing, sono, cemara, kamper, pinus, and mahoni) were exposed to radiant heat from a conical heater, resulting in the appearance of a stable flame on the wood surface caused by spontaneous ignition. A K-type thermocouple was used to measure the wood surface temperature, and temperature histories were recorded throughout each experiment at 1 s intervals using a TC-08. Data on first ignition time and temperature, end ignition time and temperature, and charring rate were successfully collected. It was found that the ignition temperature and charring rate depend on the moisture content of the wood.

Keywords: Fire properties, moisture content, wood, charring rate.
