Search results for: Nuclear Data Evaluation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9239


8549 Secure Data Aggregation Using Clusters in Sensor Networks

Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik

Abstract:

Wireless sensor networks can be deployed in both hostile and military environments. A primary goal in the design of wireless sensor networks is lifetime maximization, constrained by the energy capacity of batteries. One well-known method of reducing energy consumption in such networks is data aggregation. Providing efficient data aggregation while preserving data privacy is a challenging problem in wireless sensor network research. In this paper, we present a privacy-preserving data aggregation scheme for additive aggregation functions. The Cluster-based Private Data Aggregation (CPDA) scheme leverages a clustering protocol and the algebraic properties of polynomials, and has the advantage of incurring less communication overhead. The goal of our work is to bridge the gap between collaborative data collection by wireless sensor networks and data privacy. We present simulation results for our scheme and compare its performance to TAG, a typical data aggregation scheme that provides no data privacy protection. The results show the efficacy and efficiency of our scheme.
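Illustrative sketch (not the authors' CPDA construction, which relies on polynomials): the snippet below shows the simpler additive-secret-sharing idea behind privacy-preserving summation inside one cluster. Function and variable names are hypothetical.

```python
import random

def private_cluster_sum(values, modulus=10**9):
    # shares[i][j]: additive share node i sends to node j; the shares of v_i sum to v_i (mod m),
    # so no node ever sees another node's raw reading.
    n = len(values)
    shares = []
    for v in values:
        row = [random.randrange(modulus) for _ in range(n - 1)]
        row.append((v - sum(row)) % modulus)
        shares.append(row)
    # Node j adds up every share addressed to it and reports only this partial sum.
    partials = [sum(shares[i][j] for i in range(n)) % modulus for j in range(n)]
    # The cluster head combines the partial sums to recover the exact cluster total.
    return sum(partials) % modulus

readings = [17, 23, 5, 41]
assert private_cluster_sum(readings) == sum(readings)
```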

Keywords: Aggregation, Clustering, Query Processing.

8548 Intelligibility of Cued Speech in Video

Authors: P. Heribanová, J. Polec, S. Ondrušová, M. Hosťovecký

Abstract:

This paper discusses cued speech recognition methods in videoconferencing. Cued speech is a specific gesture language used for communication between deaf people. We define criteria for sentence intelligibility based on the answers of test subjects (deaf people). In our tests we use 30 sample videos coded with the H.264 codec at various bit-rates and various cueing speeds. Additionally, we define criteria for consonant sign recognizability in the single-handed finger alphabet (dactyl), by analogy with acoustics. We use another 12 sample videos coded with the H.264 codec at various bit-rates in four different video formats. To interpret the results we apply the standard scale for subjective video quality evaluation and the percentage-based evaluation of intelligibility used in acoustics. From the results we derive minimum coded bit-rate recommendations for each spatial resolution.

Keywords: cued speech, intelligibility, logatom, video

8547 Classification Influence Index and its Application for k-Nearest Neighbor Classifier

Authors: Sejong Oh

Abstract:

Classification is an important topic in machine learning and bioinformatics, and many datasets have been introduced for classification tasks. A dataset contains multiple features, and the quality of these features influences the classification accuracy achievable on the dataset; the classification power of individual features differs. In this study, we propose the Classification Influence Index (CII) as an indicator of the classification power of each feature. CII enables evaluation of the features in a dataset and improves classification accuracy through transformation of the dataset. In experiments using CII with the k-nearest neighbor classifier on real datasets, we confirmed that the proposed index provides a meaningful improvement in classification accuracy.
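Illustrative sketch: the abstract does not give the CII formula, so the snippet below scores each feature by the cross-validated accuracy of a k-nearest neighbor classifier trained on that feature alone, a simple stand-in for a per-feature classification-power index.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Score each feature by the accuracy a k-NN classifier reaches when it sees
# that feature alone; higher scores indicate more classification power.
X, y = load_iris(return_X_y=True)
knn = KNeighborsClassifier(n_neighbors=5)
scores = [cross_val_score(knn, X[:, [j]], y, cv=5).mean() for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]
print("feature ranking (best first):", ranking, "scores:", np.round(scores, 3))
```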

Keywords: accuracy, classification, dataset, data preprocessing

8546 A Product Development for Green Logistics Model by Integrated Evaluation of Design and Manufacturing and Green Supply Chain

Authors: Yuan-Jye Tseng, Yen-Jung Wang

Abstract:

A product development for green logistics model using the fuzzy analytic network process (ANP) method is presented for evaluating the relationships among product design, manufacturing activities, and the green supply chain. In the product development stage, there can be alternative ways of designing the detailed components to satisfy the design concept and product requirements. Different design alternatives may involve different manufacturing activities, and the manufacturing activities in turn affect the green supply chain of the components and product. In this research, a fuzzy ANP evaluation model is presented for evaluating the criteria in product design, manufacturing activities, and the green supply chain. Comparison matrices for evaluating the criteria across the three groups are established, and the total relational values between the groups represent their relationships and effects. In application, the total relational values can be used to evaluate the design alternatives and support decision-making when selecting a suitable design case and green supply chain. An example product is illustrated, showing that the model is useful for integrated evaluation of design, manufacturing, and the green supply chain for the purpose of product development for green logistics.
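Illustrative sketch: the paper uses a fuzzy ANP with comparison matrices across the three groups; the snippet below shows only the crisp eigenvector weighting step that such models build on, with hypothetical judgment values.

```python
import numpy as np

def priority_vector(pairwise):
    """Principal-eigenvector weights of a (crisp) pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    return principal / principal.sum()

# Hypothetical 3x3 comparison of design / manufacturing / green-supply-chain criteria.
A = np.array([[1.0, 3.0, 0.5],
              [1/3, 1.0, 0.25],
              [2.0, 4.0, 1.0]])
w = priority_vector(A)
print(np.round(w, 3))  # relative importance weights, summing to 1
```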

Keywords: Supply chain management, green supply chain, product development for logistics, fuzzy analytic network process.

8545 A New Protocol for Concealed Data Aggregation in Wireless Sensor Networks

Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie

Abstract:

Wireless sensor networks (WSNs) consist of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. It is very important to implement a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computing and storage overhead low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key for sending and receiving concealed data among themselves. Because data aggregation within each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot inject false data into the network. To evaluate the performance of the proposed protocol, we present computational models that show its performance and low overhead.

Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.

8544 Energy Efficient Reliable Cooperative Multipath Routing in Wireless Sensor Networks

Authors: Gergely Treplan, Long Tran-Thanh, Janos Levendovszky

Abstract:

In this paper, a reliable cooperative multipath routing algorithm is proposed for data forwarding in wireless sensor networks (WSNs). In this algorithm, data packets are forwarded towards the base station (BS) through a number of paths, using a set of relay nodes, and the Rayleigh fading model is used to calculate the evaluation metric of links. Reliability is guaranteed by selecting an optimal relay set with which the probability of correct packet reception at the BS exceeds a predefined threshold; the proposed scheme therefore ensures reliable packet transmission to the BS. Furthermore, energy efficiency is achieved at the same time by energy balancing, i.e. minimizing the energy consumption of the bottleneck node of the routing path. This work also demonstrates that the proposed algorithm outperforms existing algorithms in extending the longevity of the network for a given reliability requirement. The obtained results thus make reliable path selection with minimum energy consumption possible in real time.
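Illustrative sketch (not the paper's optimal relay-set selection or its energy-balancing objective): per-link delivery probability under Rayleigh fading and a greedy choice of relays until a target reliability is met, assuming independent path failures; the field names and thresholds are hypothetical.

```python
import math

def link_reliability(mean_snr_db, threshold_snr_db):
    """P(SNR > threshold) for a Rayleigh-faded link (exponentially distributed SNR)."""
    mean = 10 ** (mean_snr_db / 10.0)
    thr = 10 ** (threshold_snr_db / 10.0)
    return math.exp(-thr / mean)

def select_relays(candidates, target_reliability, threshold_snr_db=5.0):
    """Greedily add the most reliable candidate paths until the probability that
    at least one path delivers the packet exceeds the target."""
    chosen, p_fail = [], 1.0
    for relay in sorted(candidates, key=lambda c: -link_reliability(c["snr_db"], threshold_snr_db)):
        chosen.append(relay["id"])
        p_fail *= 1.0 - link_reliability(relay["snr_db"], threshold_snr_db)
        if 1.0 - p_fail >= target_reliability:
            break
    return chosen, 1.0 - p_fail

relays = [{"id": "A", "snr_db": 8}, {"id": "B", "snr_db": 12}, {"id": "C", "snr_db": 6}]
print(select_relays(relays, target_reliability=0.99))
```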

Keywords: wireless sensor networks, reliability, cooperative routing, Rayleigh fading model, energy balancing

8543 IMDC: An Image-Mapped Data Clustering Technique for Large Datasets

Authors: Faruq A. Al-Omari, Nabeel I. Al-Fayoumi

Abstract:

In this paper, we present a new algorithm for clustering data in large datasets using image processing approaches. First, the dataset is mapped onto a binary image plane. The synthesized image is then processed using efficient image processing techniques to cluster the data in the dataset. The algorithm thus avoids an exhaustive search to identify clusters; it considers only a small subset of the data that contains critical boundary information sufficient to identify the contained clusters. Compared to available data clustering techniques, the proposed algorithm produces results of similar quality while outperforming them in execution time and storage requirements.
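Illustrative sketch of the image-mapping idea, assuming 2-D data: points are rasterized onto a binary grid, nearby pixels are merged by dilation, and connected components become clusters. The grid size, dilation amount and noise threshold are hypothetical parameters, not the paper's.

```python
import numpy as np
from scipy import ndimage

def image_mapped_clusters(points, grid=64, min_size=50):
    """Map 2-D points onto a binary grid, bridge small gaps by dilation, and
    label connected components as clusters; tiny components count as noise."""
    pts = np.asarray(points, dtype=float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    cells = np.floor((pts - lo) / (hi - lo + 1e-12) * (grid - 1)).astype(int)
    image = np.zeros((grid, grid), dtype=bool)
    image[cells[:, 0], cells[:, 1]] = True
    dilated = ndimage.binary_dilation(image, iterations=2)
    labels_img, n = ndimage.label(dilated)
    sizes = ndimage.sum(dilated, labels_img, index=range(1, n + 1))
    valid = {i + 1 for i, s in enumerate(sizes) if s >= min_size}
    point_labels = labels_img[cells[:, 0], cells[:, 1]]
    point_labels = np.where(np.isin(point_labels, list(valid)), point_labels, 0)
    return point_labels, len(valid)

rng = np.random.default_rng(0)
data = np.vstack([rng.normal((2, 2), 0.3, (400, 2)), rng.normal((6, 6), 0.3, (400, 2))])
labels, k = image_mapped_clusters(data)
print("clusters found:", k)
```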

Keywords: Data clustering, Data mining, Image-mapping, Pattern discovery, Predictive analysis.

8542 Universities Strategic Evaluation Using Balanced Scorecard

Authors: M. D. Nayeri, M. M. Mashhadi, K. Mohajeri

Abstract:

Defining the strategic position of an organization within its industry environment is one of the basic and most important phases of strategic planning, to the extent that one of the fundamental schools of strategic planning is the strategic positioning school. In today's knowledge-based economy and dynamic environment, such positioning is essential for universities as centers of education, knowledge creation and knowledge-worker development. To date, various models with different approaches to strategic positioning have been used to define strategic position in different industries. The Balanced Scorecard, as one of the powerful models for strategic positioning, analyzes all aspects of the organization evenly. In this paper, considering the strength of the BSC in strategic evaluation, it is used to analyze the environmental position of the best Iranian business schools. The results could be used in developing strategic plans for these schools as well as for other Iranian management and business schools.

Keywords: Strategic planning, Strategic position, Balanced Scorecard, Higher education institutions.

8541 The New Method of Concealed Data Aggregation in Wireless Sensor: A Case Study

Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie

Abstract:

Wireless sensor networks (WSNs) consist of many sensor nodes placed in unattended environments, such as military sites, in order to collect important information. It is very important to implement a secure protocol that prevents the forwarding of forged data and the modification of aggregated data, while keeping the delay and the communication, computing and storage overhead low. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key for sending and receiving concealed data among themselves. Because data aggregation within each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot inject false data into the network. To evaluate the performance of the proposed protocol, we present computational models that show its performance and low overhead.

Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.

8540 A Super-Efficiency Model for Evaluating Efficiency in the Presence of Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

In many cases, there is a time lag between the consumption of inputs and the production of outputs, and this time lag effect should be considered when evaluating the performance of organizations. Recently, a couple of DEA models were developed to take the time lag effect into account in the efficiency evaluation of research activities. The multi-period input (MpI) and multi-period output (MpO) models are integrated models that calculate simple efficiency while considering the time lag effect. However, these models cannot discriminate between efficient DMUs because of the nature of the basic DEA model, in which efficiency scores are capped at 1; efficient DMUs cannot be distinguished because their efficiency scores are identical. This paper therefore suggests a super-efficiency model for efficiency evaluation under the consideration of the time lag effect, based on the MpO model. A case example using a long-term research project is given to compare the suggested model with the MpO model.
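Illustrative sketch: a plain input-oriented CCR super-efficiency LP, in which the unit under evaluation is excluded from its own reference set so that efficient DMUs can score above 1 and be ranked. It omits the multi-period time-lag structure of the MpO-based model, and the data are toy values.

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, k):
    """Input-oriented CCR super-efficiency score of DMU k."""
    n, m = X.shape          # n DMUs, m inputs
    s = Y.shape[1]          # s outputs
    others = [j for j in range(n) if j != k]
    # decision variables: [theta, lambda_j for j != k]
    c = np.zeros(1 + len(others))
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):      # sum_j lambda_j * x_ij - theta * x_ik <= 0
        A_ub.append([-X[k, i]] + [X[j, i] for j in others]); b_ub.append(0.0)
    for r in range(s):      # -sum_j lambda_j * y_rj <= -y_rk
        A_ub.append([0.0] + [-Y[j, r] for j in others]); b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + len(others)))
    return res.x[0]

X = np.array([[2.0], [3.0], [4.0]])      # single input per DMU (toy data)
Y = np.array([[2.4], [3.0], [3.5]])      # single output per DMU
print([round(super_efficiency(X, Y, k), 3) for k in range(3)])  # efficient DMUs exceed 1
```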

Keywords: DEA, Super-efficiency, Time Lag.

8539 Development of a Miniature and Low-Cost IoT-Based Remote Health Monitoring Device

Authors: Sreejith Jayachandran, Mojtaba Ghodsi, Morteza Mohammadzaheri

Abstract:

The modern, busy world runs on new embedded technologies based on computers and software, yet some people are unable to monitor their health condition and attend regular medical check-ups. Some postpone check-ups due to a lack of time and convenience, while others skip these regular evaluations and medical examinations because of large medical bills and hospital expenses. In this research, we present a telemonitoring device capable of monitoring, checking, and evaluating the health status of the human body remotely through the internet, for all kinds of users. The remote health monitoring device is a microcontroller-based embedded unit. Various sensors in this device are connected to the human body, and with the help of an Arduino UNO board the required analogue data are collected from the sensors. The microcontroller on the Arduino board converts the collected analogue data into digital data and transfers the information to the cloud, where it is stored; the processed digital data are also displayed instantly on the LCD attached to the device. By accessing the cloud storage with a username and password, the person's health care team, doctors and other health staff can retrieve these data for assessment and follow-up, and family members or guardians can use them to stay aware of the patient's current health status. Moreover, the system is connected to a GPS module, so in emergencies the care team can locate the patient carrying the device. The setup continuously evaluates and transfers the data to the cloud, and the user can predefine a normal value range for the evaluation; for example, normal blood pressure is universally taken to be between 80 and 120 mmHg. Similarly, the Remote Health Monitoring System (RHMS) allows the ranges of other values, referred to as normal coefficients, to be fixed. This miniature IoT-based system measures 11×10×10 cm³, weighs only 500 g and consumes just 10 mW. The smart monitoring system can be manufactured for 100 GBP (British Pound Sterling); it facilitates communication between patients and health systems, and can also be employed for numerous other uses, including communication applications in the aerospace and transportation sectors.

Keywords: Embedded Technology, Telemonitoring system, Microcontroller, Arduino UNO, Cloud storage, GPS, RHMS, Remote Health Monitoring System, Alert system.

8538 Automatic Map Simplification for Visualization on Mobile Devices

Authors: Hang Yu

Abstract:

The visualization of geographic information on mobile devices has become popular with the widespread use of the mobile Internet. The mobility of these devices brings much convenience to people's lives: through their add-on location-based services, people can access timely information relevant to their tasks. However, visual analysis of geographic data on mobile devices presents several challenges due to the small display and restricted computing resources; these limitations on screen size and resources may impair the usability of visualization applications. In this paper, a variable-scale visualization method is proposed to handle the challenge of the small mobile display. By merging multiple scales of information into a single image, the viewer is able to focus on a region of interest while keeping a good grasp of the surrounding context; this essentially visualizes the map through a fisheye lens. However, the fisheye lens induces undesirable geometric distortion in the periphery, which renders the information there meaningless. The proposed solution is to apply map generalization, which removes excessive information around the periphery, together with an automatic smoothing process that corrects the distortion while keeping the local topology consistent. The proposed method is evaluated on both artificial and real geographical data.
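Illustrative sketch of the fisheye mapping the method starts from, using the standard graphical-fisheye magnification function g(x) = (d+1)x/(dx+1) on the normalized distance from the focus; it does not include the paper's map generalization or topology-preserving smoothing, and the grid data are hypothetical.

```python
import numpy as np

def fisheye(points, focus, d=4.0):
    """Radial fisheye distortion around a focus point; d controls the distortion strength."""
    pts = np.asarray(points, dtype=float)
    offsets = pts - focus
    r = np.linalg.norm(offsets, axis=1, keepdims=True)
    r_max = r.max() if r.max() > 0 else 1.0
    g = (d + 1) * (r / r_max) / (d * (r / r_max) + 1)   # magnified near the focus
    scale = np.ones_like(r)
    nz = r[:, 0] > 0
    scale[nz] = g[nz] * r_max / r[nz]
    return focus + offsets * scale

grid = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
print(np.round(fisheye(grid, focus=np.array([2.0, 2.0])), 2)[:5])
```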

Keywords: Map simplification, visualization, mobile devices.

8537 Motion-Based Detection and Tracking of Multiple Pedestrians

Authors: A. Harras, A. Tsuji, K. Terada

Abstract:

Tracking of moving people has gained great importance due to rapid technological advances in the field of computer vision. The objective of this study is to design a motion-based method for detecting and tracking multiple pedestrians walking in different, random directions. In the proposed method, a Gaussian mixture model (GMM) is used to detect moving persons in image sequences; it adapts to changes in the scene such as varying illumination and objects that frequently start and stop moving. Background noise in the scene is eliminated by applying morphological operations, and the motion of the tracked people is estimated using the Kalman filter, which predicts the tracked location in each frame and determines the likelihood of each detection. For evaluation we used a benchmark data set recorded by a stationary side-wall camera. The scenes are taken on a street and include up to eight people in front of the camera in two different scenes, with durations of 53 and 35 seconds, respectively. When pedestrians walk in close proximity, the proposed method achieves a detection ratio of 87% and a tracking ratio of 77%. When they are farther apart from each other, the detection ratio increases to 90% and the tracking ratio to 79%.
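Illustrative sketch with OpenCV: GMM background subtraction, morphological cleanup and a single Kalman filter track. The input file name is hypothetical, and the full multi-pedestrian data association used in the paper is omitted for brevity.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("street.mp4")            # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

kf = cv2.KalmanFilter(4, 2)                     # state: x, y, vx, vy; measurement: x, y
kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                           # GMM foreground mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove background noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    prediction = kf.predict()                                # predicted position this frame
    big = [c for c in contours if cv2.contourArea(c) > 500]
    if big:
        x, y, w, h = cv2.boundingRect(max(big, key=cv2.contourArea))
        kf.correct(np.array([[x + w / 2], [y + h / 2]], np.float32))
cap.release()
```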

Keywords: Automatic detection, tracking, pedestrians.

8536 Peakwise Smoothing of Data Models using Wavelets

Authors: D Sudheer Reddy, N Gopal Reddy, P V Radhadevi, J Saibaba, Geeta Varadan

Abstract:

Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular smoothing method, and its generalization led to the development of the Savitzky-Golay filter. Many window-based smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also yields smoothed data, and wavelets in particular smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology, called peakwise smoothing, that smooths the data to any desired level without losing the major peak features.
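Illustrative sketch of the wavelet-thresholding smoothing the paper builds on (not the peakwise variant it proposes), using PyWavelets with a universal soft threshold; the test signal is synthetic.

```python
import numpy as np
import pywt

def wavelet_smooth(signal, wavelet="db4", level=4, scale=1.0):
    """Smooth a 1-D signal by soft-thresholding its detail wavelet coefficients
    (universal threshold estimated from the finest-scale details)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thr = scale * sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

t = np.linspace(0, 1, 1024)
noisy = np.exp(-((t - 0.5) ** 2) / 0.002) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
smoothed = wavelet_smooth(noisy)
print(float(noisy.std()), float(smoothed.std()))
```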

Keywords: smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.

8535 Development of Researcher Knowledge in Mathematics Education: Towards a Confluence Framework

Authors: I. Kontorovich, R. Zazkis

Abstract:

We present a framework for the development of researcher knowledge in conducting a study in mathematics education. The key components of the framework are: knowledge germane to conducting a particular study, processes of knowledge accumulation, and catalyzing filters that influence a researcher's decision making. The components of the framework originated from a confluence of constructs and theories in mathematics education, higher education and sociology. Drawing on a self-reflective interview with a leading researcher in mathematics education, Professor Michèle Artigue, we illustrate how the framework can be utilized in data analysis. Criteria for evaluating the framework are discussed.

Keywords: Community of practice, knowledge development, mathematics education research, researcher knowledge.

8534 Adopting Flocks of Birds Approach to Predator for Anomalies Detection on Industrial Control Systems

Authors: M. Okeke, A. Blyth

Abstract:

Industrial Control Systems (ICS), such as Supervisory Control And Data Acquisition (SCADA), can be found in many different critical infrastructures, from nuclear management to utilities, medical equipment, power, waste, and engine management on ships and planes. The role SCADA plays in critical infrastructure has resulted in a call to secure these systems: many lives depend on them for daily activities, and the attack vectors are becoming more sophisticated. Hence, the security of ICS is vital, as a malfunction might result in huge risk. This paper describes how applying the Prey Predator (PP) approach observed in flocks of birds could enhance the detection of malicious activities in ICS. The PP approach describes how these animals detect predators in groups or flocks by following a few simple rules; they are not necessarily very intelligent animals, but their way of solving complex problems such as detection through cooperation, coordination and communication is worth emulating. This paper emulates the flocking behaviour birds use to detect predators. The PP approach adopts a six-nearest-birds rule for detecting any predator, with local and global bests based on individual detection as well as group detection. The PP algorithm was designed following a MapReduce methodology that applies a Split Detection Convergence (SDC) approach.
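Illustrative sketch: the PP/SDC algorithm is not detailed in the abstract, so the snippet below scores each observation by its mean distance to its six nearest neighbours, a plain nearest-neighbour reading of the "six nearest birds" rule; the telemetry data are hypothetical.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def six_neighbour_scores(readings, k=6):
    """Anomaly score of each observation as its mean distance to its k nearest
    neighbours; isolated points (potential 'predators') score high."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(readings)   # +1 skips the point itself
    dist, _ = nn.kneighbors(readings)
    return dist[:, 1:].mean(axis=1)

rng = np.random.default_rng(1)
normal_traffic = rng.normal(0.0, 1.0, size=(200, 3))          # hypothetical ICS telemetry
anomaly = np.array([[8.0, 8.0, 8.0]])
scores = six_neighbour_scores(np.vstack([normal_traffic, anomaly]))
print("highest-scoring index:", int(np.argmax(scores)))       # expected: the injected anomaly
```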

Keywords: Industrial control systems, prey predator, SCADA, SDC.

8533 An Efficient Data Mining Approach on Compressed Transactions

Authors: Jia-Yu Dai, Don-Lin Yang, Jungpin Wu, Ming-Chuan Hung

Abstract:

In an era of knowledge explosion, the volume of data grows rapidly day by day. Since data storage is a limited resource, reducing the data space used in the process becomes a challenging issue. Data compression provides a good solution that can lower the required space. Data mining has found many useful applications in recent years because it helps users discover interesting knowledge in large databases; however, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process, but both lack the ability to decompress the data to their original state and to improve data mining performance. In this research, a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships between transactions to merge related transactions and builds a quantification table to prune the candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.

Keywords: Association rule, data mining, merged transaction, quantification table.

8532 Evaluating and Selecting Optimization Software Packages: A Framework for Business Applications

Authors: Waleed Abohamad, Amr Arisha

Abstract:

Owing to the fact that optimization of business processes is a crucial requirement for navigating, surviving and even thriving in today's volatile business environment, this paper presents a framework for selecting the best-fit optimization package for solving complex business problems. The complexity level of the problem and/or the use of unsuitable optimization software can lead to biased solutions of the optimization problem. Accordingly, the proposed framework identifies a number of relevant factors (e.g. decision variables, objective functions, and modeling approach) to be considered during the evaluation and selection process; the application domain, problem specifications, and available accredited optimization approaches are also taken into account. The output of the framework is a recommendation of one or two optimization packages believed to provide the best results for the underlying problem. In addition, a set of guidelines and recommendations on how managers can conduct an effective optimization exercise is discussed.

Keywords: Complex Business Problems, Optimization, Selection Criteria, Software Evaluation.

8531 Integrating Security Indifference Curve to Formal Decision Evaluation

Authors: Anon Yantarasri, Yachai Limpiyakorn

Abstract:

Decisions are regularly made during a project and in daily life. Some decisions are critical and have a direct impact on project or human success, so formal evaluation is required, especially for crucial decisions, to arrive at the optimal solution among the alternatives that address an issue. According to microeconomic theory, all people's decisions can be modeled as indifference curves. The proposed approach supports formal analysis and decision-making by constructing an indifference curve model from previous experts' decision criteria. This knowledge, embedded in the system, can be reused to help naïve users select an alternative solution to a similar problem. Moreover, the method is flexible enough to cope with an unlimited number of factors influencing the decision. In preliminary experiments, the selected alternatives accurately matched the experts' decisions.

Keywords: Decision Analysis and Resolution, Indifference Curve, Multi-criteria Decision Making.

8530 Weigh-in-Motion Data Analysis Software for Developing Traffic Data for Mechanistic Empirical Pavement Design

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

Currently, there are few user-friendly Weigh-in-Motion (WIM) data analysis software packages that can produce the traffic input data required by the recently developed AASHTOWare pavement Mechanistic-Empirical (ME) design software, and those that exist have only rudimentary Quality Control (QC) processes, so they cannot properly deal with erroneous WIM data. As pavement performance is highly sensitive to the quality of WIM data, a more refined QC process on the raw WIM data is strongly recommended to obtain good results. This study develops user-friendly software that can produce traffic input for the ME design software. The software takes the raw data (class and weight data) collected from the WIM station and processes it with a sophisticated QC procedure. Traffic data such as traffic volume, traffic distribution and axle load spectra can be obtained from this software and used directly in the ME design software.
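Illustrative sketch of the filter-then-bin pattern such software applies: implausible weight records are dropped and the remaining single-axle weights are binned into an axle load spectrum. Column names, QC limits and the toy records are hypothetical, not the paper's QC rules.

```python
import pandas as pd

# Hypothetical raw WIM weight records.
records = pd.DataFrame({
    "vehicle_class": [9, 9, 5, 9, 9],
    "gross_weight_kips": [78.0, 212.0, 16.0, 65.0, 80.0],   # 212 is an obvious sensor error
    "single_axle_kips": [12.1, 40.0, 9.8, 11.0, 12.6],
})

# Basic QC: drop records outside plausible weight ranges.
clean = records[(records["gross_weight_kips"].between(10, 160)) &
                (records["single_axle_kips"].between(1, 50))]

# Axle load spectrum: share of single axles falling in 2-kip bins (ME design input).
bins = pd.interval_range(start=0, end=52, freq=2)
spectrum = (pd.cut(clean["single_axle_kips"], bins)
              .value_counts(normalize=True)
              .sort_index())
print(spectrum[spectrum > 0])
```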

Keywords: Weigh-in-motion, software, axle load spectra, traffic distribution, AASHTOWare.

8529 An Evaluation of Drivers in Implementing Sustainable Manufacturing in India: Using DEMATEL Approach

Authors: D. Garg, S. Luthra, A. Haleem

Abstract:

Due to growing concern about environmental and social consequences throughout the world, a need has been felt to incorporate sustainability concepts into conventional manufacturing. This paper attempts to identify and evaluate drivers for implementing sustainable manufacturing in the Indian context. Nine possible drivers for the successful implementation of sustainable manufacturing were identified from an extensive review. The Decision Making Trial and Evaluation Laboratory (DEMATEL) approach was then used to evaluate these drivers and categorize them into cause and effect groups. Five drivers (Societal Pressure and Public Concerns; Regulations and Government Policies; Top Management Involvement, Commitment and Support; Effective Strategies and Activities towards Socially Responsible Manufacturing; and Market Trends) fall into the cause group, and four drivers (Holistic View in Manufacturing Systems; Supplier Participation; Building a Sustainable Culture in the Organization; and Corporate Image and Benefits) fall into the effect group. "Societal Pressure and Public Concerns" was found to be the most critical driver, and "Corporate Image and Benefits" the least critical, or most easily influenced, driver for implementing sustainable manufacturing in the Indian context. This paper may help practitioners better understand these drivers and their priorities for the effective implementation of sustainable manufacturing.
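Illustrative sketch of the standard DEMATEL computation (normalization, total-relation matrix, cause/effect split); the 4x4 direct-influence matrix is hypothetical, not the paper's nine-driver data.

```python
import numpy as np

def dematel(direct_relation):
    """Standard DEMATEL steps: normalize the direct-relation matrix, compute the
    total-relation matrix T = N (I - N)^-1, then split factors into cause
    (D - R > 0) and effect (D - R < 0) groups."""
    A = np.asarray(direct_relation, dtype=float)
    N = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalization
    T = N @ np.linalg.inv(np.eye(len(A)) - N)               # total-relation matrix
    D, R = T.sum(axis=1), T.sum(axis=0)                     # dispatched / received influence
    return D + R, D - R                                     # prominence, relation

# Hypothetical 4x4 direct-influence scores (0-4 scale) among four drivers.
A = [[0, 3, 2, 4],
     [2, 0, 1, 3],
     [1, 2, 0, 2],
     [1, 1, 1, 0]]
prominence, relation = dematel(A)
for i, r in enumerate(relation):
    print(f"driver {i}: {'cause' if r > 0 else 'effect'} (D-R = {r:.2f})")
```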

Keywords: Drivers, Decision Making Trial and Evaluation Laboratory (DEMATEL), India, Sustainable Manufacturing (SM).

8528 Survey of Access Controls in Cloud Computing

Authors: Monirah Alkathiry, Hanan Aljarwan

Abstract:

Cloud computing is one of the most significant technologies the world deals with, used in different sectors for different purposes and with different capabilities. The cloud faces various challenges in securing data from unauthorized access or modification; consequently, security risks and threat levels have greatly increased. Therefore, cloud service providers (CSPs) and users need secure mechanisms that ensure data are kept secret and safe from any disclosure or exploit. For this reason, CSPs need a number of techniques and technologies to manage and secure access to cloud services in order to achieve security goals such as confidentiality, integrity, and identity and access management (IAM). This paper reviews and explores the various access controls implemented in a cloud environment to achieve different security purposes. The methodology followed in this survey was to assess, evaluate, and compare these access control mechanisms and technologies on the basis of factors such as the security goals they achieve, usability, and cost-effectiveness. The assessment showed that the technology used in an access control affects the security goals it achieves, and that no single access control method achieves all security goals. Consequently, such a comparison can help decision-makers choose access controls that properly meet their requirements.

Keywords: Access controls, cloud computing, confidentiality, identity and access management.

8527 TFRank: An Evaluation of Users Importance with Fractal Views in Social Networks

Authors: Fei Hao, Hai Wang

Abstract:

One of the research issues in social network analysis is evaluating the position or importance of users in social networks. As information diffusion in social networks evolves, it is difficult to evaluate the importance of users using traditional approaches. In this paper, we propose an evaluation approach for user importance with a fractal view of social networks. In this approach, both the global importance (fractal importance) and the local importance (topological importance) of nodes are considered; the basic idea is that the larger the product of a node's fractal importance and topological importance, the more important the node. We devise an algorithm called TFRank corresponding to the proposed approach and evaluate it experimentally. The experimental results demonstrate that TFRank correlates highly with the PageRank and potential-ranking algorithms, showing the effectiveness and advantages of our approach.

Keywords: TFRank, Fractal Importance, Topological Importance, Social Network

8526 Maintenance Function's Performance Evaluation Using Adapted Balanced Scorecard Model

Authors: A. Bakhtiar, B. Purwanggono, N. Metasari

Abstract:

PT XYZ is a bottled drinking water company. To preserve its production resources so that they can be utilized well, the company has implemented a maintenance management system, which plays an important role in the company's profitability and is one of the factors influencing overall company performance. Yet up to now, the company has never measured the contribution of maintenance activities to its performance. In this work, performance evaluation is carried out with a Balanced Scorecard model adapted to the maintenance function context. The model includes six perspectives: innovation and growth, production, maintenance, environment, customer, and finance. Actual performance measurement is done through the Analytic Hierarchy Process and the Objective Matrix. From the research, we conclude that the company's maintenance function falls into the moderate performance category; however, some indicators have high priority but low performance, namely the customers' complaint rate, the work lateness rate, and Return on Investment.

Keywords: Maintenance, performance, balanced scorecard, objective matrix.

8525 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data

Authors: Sedigheh Mirzaei S., Debasis Sengupta

Abstract:

Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered vital for fitting a parametric model to individual-specific data and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate through simulations, in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method with data collected by the Indian Statistical Institute.
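Illustrative sketch, assuming the commonly cited Preece-Baines model 1 parameterization: simulated longitudinal heights for one individual are fitted with scipy's curve_fit. The parameter values are hypothetical, and the paper's combination of longitudinal and cross-sectional data is not shown.

```python
import numpy as np
from scipy.optimize import curve_fit

def preece_baines1(t, h1, h_theta, s0, s1, theta):
    """Preece-Baines model 1: adult size h1, size h_theta at age theta,
    and two rate constants s0 < s1 controlling the pre- and post-spurt slopes."""
    return h1 - 2.0 * (h1 - h_theta) / (np.exp(s0 * (t - theta)) + np.exp(s1 * (t - theta)))

# Simulate noisy longitudinal heights for one child and recover the parameters.
rng = np.random.default_rng(0)
true = dict(h1=175.0, h_theta=163.0, s0=0.1, s1=1.2, theta=14.0)
ages = np.arange(2, 19, 0.5)
heights = preece_baines1(ages, **true) + rng.normal(0.0, 0.5, ages.size)
popt, _ = curve_fit(preece_baines1, ages, heights,
                    p0=[180.0, 160.0, 0.1, 1.0, 13.0], maxfev=10000)
print(np.round(popt, 2))   # estimates of h1, h_theta, s0, s1, theta
```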

Keywords: Preece-Baines growth model, MCMC method, Mixed effect model

8524 An Interval-Based Multi-Attribute Decision Making Approach for Electric Utility Resource Planning

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

This paper presents an interval-based multi-attribute decision making (MADM) approach that supports the decision process under imprecise information. The proposed decision methodology is based on the linear additive utility function model but extends the problem formulation with a measure of composite utility variance. A sample study concerning the evaluation of electricity generation expansion strategies is provided, showing how imprecise data may affect the choice of the best solution and how a set of alternatives acceptable to the decision maker (DM) may be identified with a certain confidence.
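Illustrative sketch, assuming interval-valued attribute utilities treated as independent uniform variables: the mean and variance of the linear additive composite utility then follow directly. The paper's composite utility variance measure may be defined differently, and the weights and intervals below are hypothetical.

```python
import numpy as np

def interval_additive_utility(weights, intervals):
    """Linear additive utility U = sum_i w_i * u_i with each attribute utility
    known only as an interval [a_i, b_i]; u_i is treated as Uniform(a_i, b_i)."""
    w = np.asarray(weights, dtype=float)
    a, b = np.asarray(intervals, dtype=float).T
    mean = np.sum(w * (a + b) / 2.0)
    var = np.sum(w ** 2 * (b - a) ** 2 / 12.0)   # Var of Uniform(a, b) is (b-a)^2/12
    return mean, var

# Hypothetical alternative scored on three attributes (utilities scaled to [0, 1]).
weights = [0.5, 0.3, 0.2]
intervals = [(0.6, 0.8), (0.4, 0.9), (0.7, 0.7)]
mean, var = interval_additive_utility(weights, intervals)
print(round(mean, 3), round(var, 4))
```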

Keywords: Decision Making, Power Generation, Electric Utilities, Resource Planning.

8523 Semantic Support for Hypothesis-Based Research from Smart Environment Monitoring and Analysis Technologies

Authors: T. S. Myers, J. Trevathan

Abstract:

Improvements in the data fusion and data analysis phases of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community exploring efficient methods for the reuse, correlation and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the knowledge inferred by the Semantic Reef system are ingested into the Tropical Data Hub for data discovery, reuse, curation and publication.

Keywords: Information architecture, Semantic technologies, Sensor networks, Ontologies.

8522 Considering Aerosol Processes in Nuclear Transport Package Containment Safety Cases

Authors: Andrew Cummings, Rhianne Boag, Sarah Bryson, Gordon Turner

Abstract:

Packages designed for the transport of radioactive material must satisfy rigorous safety regulations specified by the International Atomic Energy Agency (IAEA). Higher Activity Waste (HAW) transport packages have to maintain containment of their contents during normal and accident conditions of transport (NCT and ACT). To ensure the containment criteria are satisfied, these packages are required to be leak-tight in all transport conditions so as to meet allowable activity release rates. Package design safety reports are the safety cases that provide the claims, evidence and arguments demonstrating that packages meet the regulations; once approved by the competent authority (in the UK this is the Office for Nuclear Regulation), a licence to transport radioactive material is issued for the package(s). The standard approach to demonstrating containment in the RWM transport safety case is set out in BS EN ISO 12807. In this document, a method for measuring a leak rate from the package is explained by way of a small interspace test volume situated between two O-ring seals on the underside of the package lid. The interspace volume is pressurised and a pressure drop is measured; a small interspace test volume makes the method more sensitive, enabling the measurement of smaller leak rates. By ascertaining the activity of the contents, identifying a releasable fraction of material and treating that fraction of material as a gas, allowable leak rates for NCT and ACT are calculated. Adherence to the basic safety principles of ISO 12807 is very pessimistic, but it is current practice in the demonstration of transport safety and is accepted by the UK regulator. It is UK government policy that management of HAW will be through geological disposal, and it is proposed that intermediate level waste be transported to the geological disposal facility (GDF) in large cuboid packages. This poses a challenge for containment demonstration because such packages will have long seals and therefore large interspace test volumes. There is also uncertainty in the releasable fraction of material within the package ullage space, because the waste may be in many different forms, which makes it difficult to define the fraction of material released by the waste package. Additionally, because of the large interspace test volume, measuring the calculated leak rates may not be achievable. For this reason, a justification for a lower releasable fraction of material is sought. This paper considers the use of aerosol processes to reduce the releasable fraction for both NCT and ACT. It reviews the basic coagulation and removal processes and applies the dynamic aerosol balance equation. The proposed solution includes only the most well-understood physical processes, namely Brownian coagulation and gravitational settling. Other processes have been eliminated either on the basis that they would serve to reduce the release to the environment further (pessimistically, in keeping with the essence of nuclear transport safety cases) or that they are not credible in the conditions of transport considered.
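Illustrative sketch of the dynamic aerosol balance restricted to the two retained processes, under a monodisperse, well-mixed, continuum-regime simplification with assumed particle size, density and fall height; the safety-case treatment works with full size distributions rather than this single-size approximation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Monodisperse, well-mixed simplification of the dynamic aerosol balance:
#   dN/dt = -K*N^2 - (v_s/H)*N
# with Brownian coagulation (continuum regime) and Stokes gravitational settling.
k_B, T, mu, g = 1.38e-23, 293.0, 1.8e-5, 9.81      # SI units, air at about 20 C
d, rho_p, H = 1.0e-6, 2.0e3, 0.5                   # particle diameter, density, fall height (assumed)

K = 8.0 * k_B * T / (3.0 * mu)                     # coagulation coefficient, m^3/s
v_s = rho_p * d ** 2 * g / (18.0 * mu)             # Stokes settling velocity, m/s

def balance(t, N):
    return -K * N ** 2 - (v_s / H) * N

N0 = 1.0e12                                        # assumed initial number concentration, 1/m^3
sol = solve_ivp(balance, (0.0, 3600.0), [N0], t_eval=[0, 600, 1800, 3600])
for t, N in zip(sol.t, sol.y[0]):
    print(f"t = {t:5.0f} s   N = {N:.3e} m^-3")
```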

Keywords: Aerosol processes, Brownian coagulation, gravitational settling, transport regulations.

8521 Surveying Earthquake Vulnerabilities of District 13 of Kabul City, Afghanistan

Authors: Mohsen Mohammadi, Toshio Fujimi

Abstract:

The high population and irregular urban development of Kabul city, Afghanistan's capital, are among the factors that increase its vulnerability to earthquake disasters, on top of its location in a highly seismic region; an earthquake there could lead to widespread economic loss and casualties. This study aims to evaluate earthquake risks in Kabul's 13th district based on scientific data. The research data, which include hazard curves for Kabul, vulnerability curves, and a questionnaire survey conducted by sampling in district 13, are combined to develop risk curves. To estimate potential casualties, we used a set of M parameters in a model developed by Coburn and Spence. The results indicate that, in the worst-case scenario, more than 90% of district 13, which comprises mostly residential buildings, is exposed to high risk; a nighttime earthquake could lead to nearly 1,000 million USD in economic losses and 120 thousand casualties (equal to 25.88% of the 13th district's population). To reduce the risk, we propose the reconstruction of the most vulnerable buildings, which are primarily adobe and masonry buildings. A comparison of the risk reduction achieved by reconstructing adobe versus masonry buildings indicates that rebuilding adobe buildings would be more effective.

Keywords: Earthquake risk evaluation, Kabul, mitigation, vulnerability.

8520 Data Migration between Document-Oriented and Relational Databases

Authors: Bogdan Walek, Cyril Klimes

Abstract:

Current tools for data migration between document-oriented and relational databases have several disadvantages. We propose a new approach for data migration between document-oriented and relational databases in which the relational schema of the target relational database is automatically created from a collection of XML documents during migration. The proposed approach is verified on data migration between the document-oriented database IBM Lotus Notes/Domino and a relational database implemented in the MySQL relational database management system (RDBMS).
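Illustrative sketch of deriving a relational schema from a collection of XML documents (union of child-element tags as columns, one row per document); sqlite3 stands in for MySQL here, the documents are hypothetical, and the paper's richer mapping from Lotus Notes documents is not reproduced.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML documents exported from a document-oriented store.
docs = [
    "<customer><name>Alice</name><city>Oslo</city></customer>",
    "<customer><name>Bob</name><email>bob@example.com</email></customer>",
]
roots = [ET.fromstring(d) for d in docs]
# Relational schema: the union of child-element tags becomes the column set.
columns = sorted({child.tag for root in roots for child in root})

conn = sqlite3.connect(":memory:")
conn.execute(f"CREATE TABLE customer ({', '.join(c + ' TEXT' for c in columns)})")
for root in roots:
    row = [root.findtext(c) for c in columns]        # None (NULL) where an element is absent
    placeholders = ", ".join("?" * len(columns))
    conn.execute(f"INSERT INTO customer ({', '.join(columns)}) VALUES ({placeholders})", row)
print(conn.execute("SELECT * FROM customer").fetchall())
```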

Keywords: data migration, database, document-oriented database, XML, relational schema
