Search results for: data reliability
7676 Random Projections for Dimensionality Reduction in ICA
Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi
Abstract:
In this paper, we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular, we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a particular Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original size d, that guarantees a narrow confidence interval for this estimator at a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori from the observations alone. Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable dimensionality reduction is in fact very high, that it preserves the quality of the decomposition, and that it impressively speeds up FastICA. On the other hand, a set of signals on which the estimated reduction rate is greater than 1 exhibits bad decomposition results if reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.
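A minimal sketch of the idea, assuming a Gaussian Johnson-Lindenstrauss-style projection and scikit-learn's FastICA with its 'cube' (kurtosis-type) contrast; the sources, dimensions and reduction rate ρ = k/d below are illustrative, not the paper's:

```python
# Hedged sketch: project high-dimensional mixtures into a k-dimensional
# space with a JL-style random matrix, then run kurtosis-based FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
S_true = rng.laplace(size=(1000, 8))          # 8 non-Gaussian sources
A = rng.standard_normal((8, 512))             # mixing into d = 512 observations
X = S_true @ A                                # observed data, shape (1000, 512)

d = X.shape[1]
k = 64                                        # reduced size, so rho = k / d = 0.125
R = rng.standard_normal((d, k)) / np.sqrt(k)  # JL-style Gaussian projection
X_red = X @ R                                 # reduced data set

ica = FastICA(n_components=8, fun="cube", random_state=0)  # 'cube' ~ kurtosis
S_est = ica.fit_transform(X_red)              # estimated independent components
print(S_est.shape)                            # (1000, 8)
```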
7675 Simulation Model for Predicting Dengue Fever Outbreak
Authors: Azmi Ibrahim, Nor Azan Mat Zin, Noraidah Sahari Ashaari
Abstract:
Dengue fever is prevalent in Malaysia, with numerous cases, including mortality, recorded over the years. Public education on the prevention of the disease through various means has been carried out, alongside the enforcement of legal measures to eradicate Aedes mosquitoes, the dengue vector. Hence, other means need to be explored, such as predicting the seasonal peak period of a dengue outbreak and identifying the climate factors contributing to the increase in the number of mosquitoes. A simulation model can be employed for this purpose. In this study, we created a system dynamics simulation to predict the spread of dengue outbreaks in Hulu Langat, Selangor, Malaysia. The prototype was developed using the STELLA 9.1.2 software. The main data inputs are rainfall, temperature and dengue cases. Data analysis from the graph showed that dengue cases can be predicted accurately using the two main variables, rainfall and temperature. However, the model will be further tested over a longer time period to ensure its accuracy, reliability and efficiency as a prediction tool for dengue outbreaks.
Keywords: dengue fever, prediction, system dynamics, simulation
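A toy stock-and-flow sketch of the system dynamics approach described above (not the authors' STELLA model; all coefficients are illustrative assumptions): a mosquito-population stock is driven by rainfall and temperature, and reported cases track the stock.

```python
# Toy system dynamics model: rainfall/temperature -> mosquito stock -> cases.
import numpy as np

weeks = 52
rainfall = 100 + 80 * np.sin(np.linspace(0, 2 * np.pi, weeks))   # mm/week
temperature = 27 + 3 * np.sin(np.linspace(0, 2 * np.pi, weeks))  # deg C

mosquitoes = np.zeros(weeks)
cases = np.zeros(weeks)
mosquitoes[0] = 1000.0

for t in range(1, weeks):
    breeding = 0.02 * rainfall[t] * max(temperature[t] - 20.0, 0.0)  # inflow
    deaths = 0.3 * mosquitoes[t - 1]                                 # outflow
    mosquitoes[t] = mosquitoes[t - 1] + breeding - deaths
    cases[t] = 0.01 * mosquitoes[t]               # reported cases ~ stock size

print("predicted peak week:", int(cases.argmax()))
```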
7674 Alternating Current Photovoltaic Module Model
Authors: Irtaza M. Syed, Kaamran Raahemifar
Abstract:
This paper presents the modeling of an Alternating Current (AC) Photovoltaic (PV) module using Matlab/Simulink. The proposed AC-PV module model is simple, realistic, and application oriented. The model is derived at the module level, as opposed to the cell level, directly from the information provided by the manufacturer's data sheet. The DC-PV module, MPPT control, BC, VSI and LC filter are all treated as a single unit. The model accounts for variations in both irradiance and temperature. The proposed AC-PV module model is simulated and the results are compared with the numbers projected by the datasheet to validate the model's accuracy and effectiveness. Implementation and results demonstrate the simplicity and accuracy, as well as the reliability, of the model.
Keywords: AC PV Module, Datasheet, Matlab/Simulink, PV modeling.
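A minimal sketch of a datasheet-driven, module-level PV calculation (an illustration under stated assumptions, not the paper's Simulink model; the datasheet numbers and fill-factor approximation are hypothetical):

```python
# Approximate module power from datasheet values: short-circuit current
# scales with irradiance plus a temperature coefficient; open-circuit
# voltage shifts with temperature; power ~ fill factor * I * V.
def pv_module_power(g, t_cell,
                    i_sc=8.21, v_oc=32.9,           # STC datasheet values (A, V)
                    alpha_i=0.0032, beta_v=-0.123,  # temp. coefficients (A/K, V/K)
                    fill_factor=0.74):
    i = i_sc * (g / 1000.0) + alpha_i * (t_cell - 25.0)
    v = v_oc + beta_v * (t_cell - 25.0)
    return max(fill_factor * i * v, 0.0)            # output power in watts

print(pv_module_power(800.0, 45.0))  # e.g. 800 W/m^2 irradiance, 45 deg C cell
```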
7673 Application of IED to Condition Based Maintenance of Medium Voltage GCB/VCB
Authors: Ming-Ta Yang, Jyh-Cherng Gu, Chun-Wei Huang, Jin-Lung Guan
Abstract:
Time-based maintenance (TBM) is conventionally applied by power utilities to maintain circuit breakers (CBs), transformers, bus bars and cables, which may result in under-maintenance or over-maintenance. As the information and communication technology (ICT) industry develops, the maintenance policies of many power utilities have gradually changed from TBM to condition-based maintenance (CBM) to improve system operating efficiency, operating cost and power supply reliability. This paper discusses the feasibility of using intelligent electronic devices (IEDs) to construct a CB CBM management platform. CBs in power substations can be monitored using IEDs with additional logic configuration and wire connections. The CB monitoring data can be sent through an intranet to a control center and be analyzed and integrated by the Elipse Power Studio software. Finally, a human-machine interface (HMI) of a supervisory control and data acquisition (SCADA) system can be designed to construct a CBM management platform that provides maintenance decision information for maintenance personnel, management personnel and CB manufacturers.
Keywords: Circuit breaker, Condition-based maintenance, Intelligent electronic device, Time-based maintenance, SCADA.
7672 Performance Improvement of Information System of a Banking System Based on Integrated Resilience Engineering Design
Authors: S. H. Iranmanesh, L. Aliabadi, A. Mollajan
Abstract:
Integrated resilience engineering (IRE) is capable of returning banking systems to the normal state under severe economic circumstances. In this study, the information system of a large bank (with several branches) is assessed and optimized under severe economic conditions. Data envelopment analysis (DEA) models are employed to achieve the objective of this study. Nine IRE factors are considered to be the outputs, and a dummy variable is defined as the input of the DEA models. A standard questionnaire is designed and distributed among executive managers, who are considered the decision-making units (DMUs). The reliability and validity of the questionnaire are examined based on Cronbach's alpha and a t-test. The most appropriate DEA model is determined based on average efficiency and a normality test. It is shown that the proposed integrated design provides higher efficiency than the conventional RE design. Results of sensitivity and perturbation analysis indicate that self-organization, fault tolerance, and reporting culture account for about 50 percent of the total weight.
Keywords: Banking system, data envelopment analysis, DEA, integrated resilience engineering, IRE, performance evaluation, perturbation analysis.
7671 Comparative Analysis of Diverse Collection of Big Data Analytics Tools
Authors: S. Vidhya, S. Sarumathi, N. Shanthi
Abstract:
Over the past era, many efforts and studies have been carried out to develop proficient tools for performing various tasks in big data. Recently, big data has received a great deal of publicity, and for good reason. Because the collections of datasets involved are large and complex, they are difficult to process with traditional data processing applications, which makes the development of dedicated big data tools all the more necessary. The main aim of big data analytics is to apply advanced analytic techniques to very large, diverse datasets, ranging in size from terabytes to zettabytes, and of diverse types, structured or unstructured, batch or streaming. Big data is useful for data sets whose size or type is beyond the capability of traditional relational databases to capture, manage and process with low latency. These challenges have led to the emergence of powerful big data tools. In this survey, a varied collection of big data tools is described and compared in terms of salient features.
Keywords: Big data, Big data analytics, Business analytics, Data analysis, Data visualization, Data discovery.
7670 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict which conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who have considered it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trend functions, and its distribution, by transforming the diffusion process into a Wiener process as shown in Ricciardi's theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance in applications to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler. Given the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trend functions, two-parameter Weibull density function.
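As a hedged illustration of such a process (the drift form, parameters and noise structure below are assumptions, not the paper's exact model), an Euler-Maruyama simulation of a diffusion whose trend is proportional to a two-parameter Weibull density:

```python
# Euler-Maruyama: X_{t+dt} = X_t + X_t * f_W(t) * dt + sigma * X_t * dW_t,
# where f_W is the two-parameter Weibull pdf supplying the trend.
import numpy as np

def weibull_pdf(t, shape, scale):
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * np.exp(-z ** shape)

rng = np.random.default_rng(1)
T, n = 10.0, 1000
dt = T / n
shape_, scale_, sigma = 2.0, 4.0, 0.1    # hypothetical parameters

x = np.empty(n + 1)
x[0] = 1.0
for i in range(n):
    drift = x[i] * weibull_pdf(i * dt + 1e-12, shape_, scale_)
    x[i + 1] = x[i] + drift * dt + sigma * x[i] * np.sqrt(dt) * rng.standard_normal()
print(x[-1])                             # one simulated sample path endpoint
```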
7669 Fuzzy Logic Based Determination of Battery Charging Efficiency Applied to Hybrid Power System
Authors: Priyanka Paliwal, N. P. Patidar, R. K. Nema
Abstract:
Battery storage systems are emerging as an essential component of hybrid power systems based on renewable energy resources such as solar and wind, in order to make these sources dispatchable. Accurate modeling of the battery storage system is essential to ensure optimal planning of hybrid power systems incorporating battery storage. The majority of system planning studies involving battery storage assume battery charging efficiency to be constant. However, a strong correlation exists between battery charging efficiency and battery state of charge (SOC). In this work, a fuzzy logic based model is presented for determining battery charging efficiency corresponding to a particular SOC. In order to demonstrate the efficacy of the proposed approach, reliability evaluation studies are carried out for a hypothetical autonomous hybrid power system located in Jaisalmer, Rajasthan, India. The impact of considering battery charging efficiency as a function of state of charge is compared against the assumption of fixed battery charging efficiency for three different configurations: wind-storage, solar-storage and wind-solar-storage.
Keywords: Battery Storage, Charging efficiency, Fuzzy Logic, Hybrid Power System, Reliability
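A minimal fuzzy-logic sketch of the SOC-to-efficiency mapping (the membership breakpoints and rule consequents are illustrative assumptions, not the authors' rule base):

```python
# Map state of charge (SOC, %) to charging efficiency with three fuzzy
# sets (low/medium/high SOC) and a weighted-average defuzzifier.
def tri(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def charging_efficiency(soc):
    mu = [tri(soc, -1, 0, 50),      # low SOC
          tri(soc, 20, 50, 80),     # medium SOC
          tri(soc, 50, 100, 101)]   # high SOC
    eff = [0.95, 0.88, 0.75]        # rule consequents: efficiency drops near full
    return sum(m * e for m, e in zip(mu, eff)) / sum(mu)

print(charging_efficiency(65.0))    # ~0.83 with these illustrative numbers
```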
7668 Global Kinetics of Direct Dimethyl Ether Synthesis Process from Syngas in Slurry Reactor over a Novel Cu-Zn-Al-Zr Slurry Catalyst
Authors: Zhen Chen, Haitao Zhang, Weiyong Ying, Dingye Fang
Abstract:
The direct synthesis of dimethyl ether (DME) from syngas in slurry reactors is considered promising because of its advantages in heat transfer. In this paper, the influences of operating conditions (temperature, pressure and weight hourly space velocity) on the conversion of CO and the selectivity of DME and methanol were studied in a stirred autoclave over a Cu-Zn-Al-Zr slurry catalyst, which is far more suitable for the liquid phase DME synthesis process than commercial bifunctional catalysts. A Langmuir-Hinshelwood type global kinetics model for liquid phase direct DME synthesis, based on methanol synthesis models and a methanol dehydration model, was developed by fitting our experimental data. The model parameters were estimated with a MATLAB program based on general genetic algorithms and the Levenberg-Marquardt method; the model fits the experimental data well, and its reliability was verified by statistical tests and residual error analysis.
Keywords: alcohol/ether fuel, Cu-Zn-Al-Zr slurry catalyst, global kinetics, slurry reactor
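A sketch of the Levenberg-Marquardt fitting step, using a toy power-law rate expression and synthetic data (the paper couples genetic algorithms with Levenberg-Marquardt in MATLAB; SciPy's method="lm" shows the LM refinement only):

```python
# Fit hypothetical global-rate parameters (k, a, b) by least squares.
import numpy as np
from scipy.optimize import least_squares

def rate_model(params, p_co, p_h2):
    k, a, b = params
    return k * p_co ** a * p_h2 ** b          # toy rate law, not the paper's

rng = np.random.default_rng(2)
p_co = rng.uniform(0.5, 3.0, 30)              # partial pressures (arbitrary units)
p_h2 = rng.uniform(0.5, 3.0, 30)
r_obs = rate_model([0.8, 1.0, 2.0], p_co, p_h2) * (1 + 0.03 * rng.standard_normal(30))

fit = least_squares(lambda p: rate_model(p, p_co, p_h2) - r_obs,
                    x0=[1.0, 0.5, 1.5], method="lm")   # Levenberg-Marquardt
print(fit.x)                                  # recovered (k, a, b)
```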
7667 Determinants of Service Quality on Thai Passengers’ Repeated Purchase of Domestic Flight Service with Thai Airways International
Authors: Nattapong Techarattanased
Abstract:
This research paper aimed to identify the determinants of airline service quality underlying passengers’ repeated purchase of service. The population of this study was Thai passengers flying domestic flights with Thai Airways, from which a sample of 300 was drawn. These 300 respondents participated in the research by completing a questionnaire. An analysis of mean scores and multiple regression revealed that perceived service quality in terms of tangible elements, reliability, responsiveness, assurance and empathy determined repeated purchase of flight service at a high level. Moreover, the reliability and responsiveness factors predicted passengers’ repeated purchase of flight service, explaining 30.6 percent of the variance. The findings suggest that Thai Airways may consider developing its route network and fleet strategy, as well as its aircraft and seat specifications, to meet passengers’ needs and requirements. Passengers’ level of satisfaction could also be maximized by offering service value through various kinds of special deals and programs, whereas a value-added pricing strategy should be considered in order to differentiate from, and beat, other leading airline competitors.
Keywords: Service Quality, Repeated Purchase.
7666 An Improved Tie Force Method for Progressive Collapse Resistance of Precast Concrete Cross Wall Structures
Authors: M. Tohidi, J. Yang, C. Baniotopoulos
Abstract:
Progressive collapse of buildings typically occurs when abnormal loading conditions cause local damage, which leads to a chain reaction of failures and, ultimately, catastrophic collapse. The tie force (TF) method is one of the main design approaches for progressive collapse. As the TF method is a simplified method, further investigations of its reliability are necessary. This study aims to develop an improved TF method for designing cross wall structures against progressive collapse. To this end, the pullout behavior of strands in grout was first analyzed; then, by considering the tie force-slip relationship in the friction stage together with the catenary action mechanism, a comprehensive analytical method was developed. The reliability of this approach is verified against the experimental results of concrete block pullout tests and full-scale floor-to-floor joint tests undertaken by the Portland Cement Association (PCA). Discrepancies in the tie force between the analytical results and codified specifications suggest deficiencies in the TF method; hence, an improved model based on the analytical results is proposed to address this concern.
Keywords: Cross wall, progressive collapse, tie force method, catenary, analytical.
7665 The Reliability of Management Earnings Forecasts in IPO Prospectuses: A Study of Managers’ Forecasting Preferences
Authors: Maha Hammami, Olfa Benouda Sioud
Abstract:
This study investigates the reliability of management earnings forecasts with reference to two ingredients: verifiability and neutrality. Specifically, we examine the biasedness (or accuracy) of management earnings forecasts and the company-specific characteristics that can be associated with accuracy. Based on a sample of 102 IPO prospectuses published for admission to NYSE Euronext Paris from 2002 to 2010, we find that these forecasts are on average optimistic and that two of the five test variables, earnings variability and financial leverage, are significant in explaining ex post bias. Acknowledging the possibility that the bias is the result of managers’ forecasting behavior, we then examine whether managers decide to under-predict, over-predict or forecast accurately for self-serving purposes. In particular, we examine the role of financial distress, operating performance, ownership by insiders and the state of the economy in influencing managers’ forecasting preferences. We find that managers of distressed firms seem to over-predict future earnings, and that when managers are given more stock options, they tend to under-predict future earnings. Finally, we conclude that management earnings forecasts are affected by an intentional bias due to managers’ forecasting preferences.
Keywords: Intentional bias, management earnings forecasts, neutrality, verifiability.
7664 Multi-labeled Data Expressed by a Set of Labels
Authors: Tetsuya Furukawa, Masahiro Kuzunishi
Abstract:
Collected data must be organized to be utilized efficiently, and hierarchical classification is an efficient approach to organizing data. When data are classified into multiple categories or annotated with a set of labels, users request multi-labeled data by giving a set of labels. There are several interpretations of the data expressed by a set of labels. This paper discusses which data are expressed by a set of labels by introducing orders on sets of labels, and shows that there are four types of orders, characterized by whether the labels of the expressed data include every label of the given set within the range of that set. Desirable properties of the orders, namely that data are also expressed by a higher set of labels and that different sets of labels express different data, are discussed.
Keywords: Classification Hierarchies, Multi-labeled Data, Multiple Classification, Orders of Sets of Labels
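As an illustration of two of these interpretations (a hedged reading of the abstract, not the paper's formal definitions), set inclusion gives natural "expresses" predicates over label sets:

```python
# Two ways a query set of labels can select multi-labeled data.
def matches_all(item_labels: set, query: set) -> bool:
    """Item is selected if it carries every label in the query set."""
    return query <= item_labels

def matches_within(item_labels: set, query: set) -> bool:
    """Item is selected if all of its labels fall within the query set."""
    return item_labels <= query

data = {"doc1": {"ml", "stats"}, "doc2": {"ml"}, "doc3": {"ml", "stats", "db"}}
query = {"ml", "stats"}
print([k for k, v in data.items() if matches_all(v, query)])      # doc1, doc3
print([k for k, v in data.items() if matches_within(v, query)])   # doc1, doc2
```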
7663 Evaluating Factors Influencing Information Quality in Large Firms
Authors: B. E. Narkhede, S. K. Mahajan, B. T. Patil, R. D. Raut
Abstract:
Information quality is a major performance measure for the Enterprise Resource Planning (ERP) system of any firm. This study identifies various critical success factors for information quality. The effects of critical success factors such as project management, reengineering efforts and interdepartmental communications on information quality are analyzed using a multiple regression model. Quantitative data are collected from respondents in various firms through a structured questionnaire assessing information quality, project management, reengineering efforts and interdepartmental communications. The validity and reliability of the data are ensured using techniques such as factor analysis and the computation of Cronbach’s alpha. This study establishes the relative importance of each critical success factor. The findings suggest that, among the various factors influencing information quality, careful reengineering efforts are the most influential. This paper gives managers and practitioners clear insight into the relative importance of the critical success factors influencing information quality, so that they can formulate a strategy at the beginning of an ERP system implementation.
Keywords: Enterprise resource planning, information systems, multiple regression, information quality.
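For reference, the Cronbach's alpha reliability check mentioned above can be computed directly from an item-score matrix (the data below are synthetic; alpha of at least 0.7 is a common acceptability threshold):

```python
# Cronbach's alpha: (k / (k - 1)) * (1 - sum(item variances) / total variance).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]                             # number of questionnaire items
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(100, 1))                 # shared construct
scores = latent + 0.5 * rng.normal(size=(100, 5))  # 5 correlated items
print(round(cronbach_alpha(scores), 3))
```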
7662 A Review of Survey Methodology Employed in IT Outsourcing
Authors: B. Terzioglu, E.S.K. Chan
Abstract:
The purpose of this paper is to provide an overview of the methodological aspects of information technology outsourcing (ITO) surveys, in an attempt to improve data quality and reporting in survey research. It is based on a review of thirty articles on ITO surveys and focuses on two commonly explored dimensions of ITO, namely what is outsourced and why ITO should take place. This study highlights weaknesses in ITO surveys, including the lack of a clear definition of the population, lack of information regarding the sampling method used, failure to cite the response rate, no information pertaining to pilot testing of the survey instrument, and absence of information on internal validity in the use or reporting of surveys. This study represents an attempt, limited in scope, to point out shortfalls in the use of survey methodology in ITO, and thus to raise awareness among researchers so as to enhance the reliability of survey findings.
Keywords: ITO, information technology outsourcing, survey methodology
7661 Evaluating the Effectiveness of Memory Overcommit Techniques on KVM-based Hosting Platform
Authors: Chin-Hung Li
Abstract:
Determining how many virtual machines a Linux host can run is a challenge. One of the tough missions is to find the balance among performance, density and usability. The KVM hypervisor has become the most popular open source full virtualization solution; it supports several ways of running guests with more memory than the host really has. Given the large differences between minimum and maximum guest memory requirements, this paper presents initial results on same-page merging, ballooning and live migration techniques, aiming at optimum memory usage on a KVM-based cloud platform. Given the design of the initial experiments, the resulting data are a worthwhile reference for system administrators. These experiments conclude that each method offers a different reliability tradeoff.
Keywords: Kernel-based Virtual Machine, Overcommit, Virtualization.
7660 Availability Analysis of a Power Plant by Computer Simulation
Authors: Mehmet Savsar
Abstract:
The reliability and availability of power stations are extremely important for achieving the required level of power generation. In particular, in the hot desert climate of Kuwait, reliable power generation is extremely important because of cooling requirements at temperatures exceeding 50 degrees centigrade. In this paper, a particular power plant, the Sabiya Power Plant, which has 8 steam turbine and 13 gas turbine stations, is studied in detail; extensive data are collected, and the availability of the station units is determined. Furthermore, a simulation model is developed and used to analyze the effects of different maintenance policies on the availability of these stations. The results show that significant improvements in power plant availability can be achieved if appropriate maintenance policies are implemented.
Keywords: Power plants, steam turbines, gas turbines, maintenance, availability, simulation.
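A hedged sketch of the kind of simulation involved, assuming exponential failure and repair times rather than the paper's plant data: Monte Carlo estimation of a single unit's steady-state availability under a given repair policy.

```python
# Availability ~ uptime / horizon; analytically MTBF / (MTBF + MTTR).
import random

def simulate_availability(mtbf, mttr, horizon=100_000.0, seed=0):
    rng = random.Random(seed)
    t, uptime = 0.0, 0.0
    while t < horizon:
        run = rng.expovariate(1.0 / mtbf)      # time to next failure (h)
        repair = rng.expovariate(1.0 / mttr)   # repair duration (h)
        uptime += min(run, horizon - t)
        t += run + repair
    return uptime / horizon

print(simulate_availability(mtbf=2000.0, mttr=48.0))  # ~0.977 expected
```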
7659 Cooperative Energy Efficient Routing for Wireless Sensor Networks in Smart Grid Communications
Authors: Ghazi AL-Sukkar, Iyad Jafar, Khalid Darabkh, Raed Al-Zubi, Mohammed Hawa
Abstract:
Smart grids employ wireless sensor networks for control and monitoring. Sensors are characterized by limitations in processing power, energy supply and memory space, which require particular attention in the design of routing and data management algorithms. Since most routing algorithms for sensor networks focus on finding energy-efficient paths to prolong the lifetime of the network, the power of the sensors on those paths depletes quickly, and consequently the network becomes incapable of monitoring events in some parts of its target area. Consequently, the design of routing protocols should consider not only energy-efficient paths but also energy-efficient algorithms in general. In this paper, we propose an energy-efficient routing protocol for wireless sensor networks that does not rely on any location information system. The reliability and efficiency of this protocol have been demonstrated by simulation studies in which we compare it to legacy protocols. Our simulation results show that these algorithms scale well with network size and density.
Keywords: Data-centric storage, Dynamic Address Allocation, Sensor networks, Smart Grid Communications.
7658 The Comparison of Data Replication in Distributed Systems
Authors: Iman Zangeneh, Mostafa Moradi, Ali Mokhtarbaf
Abstract:
The necessity of the ever-increasing use of distributed data in computer networks is obvious to all. One technique performed on distributed data to increase efficiency and reliability is data replication. In this paper, after introducing this technique and its advantages, we examine some dynamic data replication strategies. We examine their characteristics under various scenarios and then propose some suggestions for their improvement.
Keywords: data replication, data hiding, consistency, dynamic data replication strategy
7657 Integrating Life Cycle Uncertainties for Evaluating a Building Overall Cost
Authors: M. Arja, G. Sauce, B. Souyri
Abstract:
Overall cost is a significant consideration in any decision-making process. Although many studies have been carried out on overall cost in construction, few have treated the uncertainties of real life cycle development. On the basis of several case studies, a feedback process was performed on the historical data of the studied buildings. This process made it possible to identify some factors causing uncertainty during the operational period. As a result, the research proposes a new method for assessing the overall cost during part of a building's life cycle, taking into account the building's actual value, its end-of-life value and the influence of the identified life cycle uncertainty factors. The findings are a step towards a higher level of reliability in overall cost evaluation that takes account of some usually unexpected uncertainty factors.
Keywords: Asset management, building life cycle uncertainty, building value, overall cost.
7656 Scale Development for Measuring E-Service Quality in Banking
Authors: Vivek Agrawal, Vikas Tripathi, Nitin Seth
Abstract:
This study examines several critical dimensions of e-service quality overlooked in the existing literature and proposes a model and instrument framework for measuring customer perceived e-service quality in the banking sector. The initial design was derived by content analysis from a pool of instrument dimensions and items in the existing literature. Based on focus group discussion, nine dimensions were extracted. An exploratory factor analysis approach was applied to data from a survey of 323 respondents. The instrument has been designed specifically for the banking sector, with research data collected from bank customers who use electronic banking in a developing economy. A nine-factor instrument is proposed to measure e-service quality, and the instrument has been checked for reliability. The sample and setting limit the applicability of the instrument across economies and service categories; future research must be conducted to check its validity. This instrument can help bankers in developing economies like India to measure e-service quality and make improvements. The present study offers a systematic procedure that provides insight into the conceptual and empirical comprehension of customer perceived e-service quality and its constituents.
Keywords: Testing, instrument, e-service quality, factor analysis.
7655 Understanding Barriers to Sports Participation as a Means of Achieving Sustainable Development in Michael Otedola College of Primary Education
Authors: Osifeko Olalekan Remigious, Osifeko Christiana Osikorede, Folarin Bolanle Eunice, Olugbenga Adebola Shodiya
Abstract:
During these difficult economic times, nations are looking for ways to improve their finances and to preserve the environment, the socio-political climate and the educational institutions that are needed to grow their economies and sustain their development. Sport is one of the means through which sustainable development can be achieved. The purpose of this study was to examine and understand barriers to participation in sport. A total of 1,025 students were purposively selected from five schools (School of Arts and Social Sciences, School of Languages, School of Education, School of Sciences and School of Vocational and Technical Education) in Michael Otedola College of Primary Education (MOCPED). A questionnaire with a tested reliability coefficient of 0.71 was used for data collection, and the collected data were analyzed using a descriptive survey research design. The findings showed that sports facilities, funding and lecture schedules were significant barriers to sports participation. It was recommended that sports facilities be provided by the Lagos State government.
Keywords: MOCPED sports, sustainable development, sports participation, state government.
7654 Secure Block-Based Video Authentication with Localization and Self-Recovery
Authors: Ammar M. Hassan, Ayoub Al-Hamadi, Yassin M. Y. Hasan, Mohamed A. A. Wahab, Bernd Michaelis
Abstract:
Because of the great advances in multimedia technology, digital multimedia is vulnerable to malicious manipulation. In this paper, a public key, self-recovery, block-based video authentication technique is proposed which can not only precisely localize alterations but also recover the missing data with high reliability. In the proposed block-based technique, multiple description coding (MDC) is used to generate two codes (two descriptions) for each block. Although one block description is enough to rebuild an altered block, the altered block is rebuilt with better quality from the two descriptions, so using MDC increases the reliability of data recovery. A block signature is computed using a cryptographic hash function, and a doubly linked chain is utilized to embed copies of the block signature and the block descriptions into the LSBs of distant blocks and the block itself. The doubly linked chain scheme gives the proposed technique the capability to thwart vector quantization attacks. In the proposed technique, anyone can check the authenticity of a given video using the public key. The experimental results show that the proposed technique is reliable for detecting, localizing and recovering alterations.
Keywords: Authentication, hash function, multiple description coding, public key encryption, watermarking.
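A much-simplified sketch of the block-signature idea (hypothetical block size and partner mapping; the paper additionally uses MDC descriptions, public-key signatures and a doubly linked chain): hash each block's MSBs and embed the truncated digest into the LSBs of a distant block.

```python
# Embed a 64-bit SHA-256 digest of each 8x8 block into a distant block's LSBs.
import hashlib
import numpy as np

def embed_block_signatures(frame: np.ndarray, block: int = 8) -> np.ndarray:
    out = frame.copy()
    h, w = frame.shape
    per_row = w // block
    n_blocks = (h // block) * per_row
    for idx in range(n_blocks):
        r, c = divmod(idx, per_row)
        src = frame[r*block:(r+1)*block, c*block:(c+1)*block] & 0xFE  # MSBs only
        bits = np.unpackbits(np.frombuffer(
            hashlib.sha256(src.tobytes()).digest()[:8], dtype=np.uint8))
        # Partner block half the frame away: a crude stand-in for the chain.
        pr, pc = divmod((idx + n_blocks // 2) % n_blocks, per_row)
        blk = out[pr*block:(pr+1)*block, pc*block:(pc+1)*block]
        flat = blk.flatten()
        flat[:64] = (flat[:64] & 0xFE) | bits                         # write LSBs
        blk[:, :] = flat.reshape(block, block)
    return out

frame = np.random.default_rng(4).integers(0, 256, (64, 64), dtype=np.uint8)
marked = embed_block_signatures(frame)  # verification re-hashes the MSBs
```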
7653 Implementation of an IoT Sensor Data Collection and Analysis Library
Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee
Abstract:
Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to the users themselves; moreover, when the data are accumulated and analyzed, far more varied information can be extracted. In addition, the development and dissemination of boards such as the Arduino and the Raspberry Pi have made it possible to easily test various sensors and to collect sensor data directly using database application tools such as MySQL. These directly collected data can be used for various kinds of research and are useful as input for data mining. However, there are many difficulties in using such boards to collect data, especially when the user is not a computer programmer or is using them for the first time. Even when data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems.
Keywords: Clustering, data mining, DBSCAN, k-means, k-medoids, sensor data.
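A brief sketch of the kind of analysis such a library would support, using the k-means and DBSCAN algorithms listed in the keywords (the readings are synthetic and the parameter choices illustrative, not the library's API):

```python
# Cluster synthetic (temperature, humidity) sensor readings.
import numpy as np
from sklearn.cluster import KMeans, DBSCAN

rng = np.random.default_rng(5)
readings = np.vstack([rng.normal([22.0, 40.0], 1.0, (100, 2)),   # regime A
                      rng.normal([30.0, 65.0], 1.0, (100, 2))])  # regime B

kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(readings)
dbscan_labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(readings)
print(set(kmeans_labels), set(dbscan_labels))  # DBSCAN marks outliers as -1
```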
7652 Malaysian Multi-Ethnic Discrimination Scale: Preliminary Factor and Psychometric Analysis
Authors: Chua Bee Seok, Shamsul Amri Baharuddin, Rosnah Ismail, Ferlis Bahari, Jasmine Adela Mutang, Lailawati Madlan, Asong Joseph
Abstract:
The aims of this study were to determine the factor structure and psychometric properties (i.e., reliability and convergent validity) of the Malaysian Multi-Ethnic Discrimination Scale (MMEDS). The scale consists of 71 items measuring the experience, strategies used and consequences of ethnic discrimination. A sample of 649 university students from a higher education institution in Malaysia was asked to complete the MMEDS, as well as the Perceived Ethnic and Racial Discrimination by Peers Scale. The exploratory factor analysis of ethnic discrimination experience extracted two factors, labeled ‘unfair treatment’ (15 items) and ‘denial of the ethnic right’ (12 items), which accounted for 60.92% of the total variance. The two subscales demonstrated clear reliability, with internal consistency above .70. The convergent validity of the scale was supported by an expected pattern of correlations: positive and significant correlations between the unfair treatment and denial of the ethnic right scores and the score on the Perceived Ethnic and Racial Discrimination by Peers Scale. The results suggest that the MMEDS is a reliable and valid measure. However, further studies need to be carried out on other sample groups to validate the scale.
Keywords: Factor structure, psychometric properties, exploratory factor analysis.
7651 Performance Analysis of a Combined Ordered Successive and Interference Cancellation Using Zero-Forcing Detection over Rayleigh Fading Channels in MIMO Systems
Authors: Jamal R. Elbergali
Abstract:
Multiple Input Multiple Output (MIMO) systems are wireless systems with multiple antenna elements at both ends of the link. Wireless communication systems demand high data rates and spectral efficiency with increased reliability. MIMO systems have become popular techniques for achieving these goals, because increased data rates are possible through spatial multiplexing and diversity. Spatial Multiplexing (SM) is used to achieve higher throughput than diversity. In this paper, we propose Zero-Forcing (ZF) detection using a combination of Ordered Successive Interference Cancellation (OSIC) and Zero-Forcing with Interference Cancellation (ZF-IC). The proposed method uses OSIC based on Signal-to-Noise Ratio (SNR) ordering to obtain the estimate of the last symbol; the estimated symbol is then used as an input to the ZF-IC stage. We analyze the Bit Error Rate (BER) performance of the proposed MIMO system over a Rayleigh fading channel, using the Binary Phase Shift Keying (BPSK) modulation scheme. The results show better performance than previous methods.
Keywords: SNR, BER, BPSK, MIMO, Modulation, Zero-Forcing (ZF), OSIC, ZF-IC, Spatial Multiplexing (SM).
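A compact sketch of the detection chain (flat Rayleigh channel, BPSK, hypothetical 4x4 dimensions; a simplified reading of the scheme, not the authors' exact receiver):

```python
# ZF detection with SNR-ordered successive interference cancellation.
import numpy as np

rng = np.random.default_rng(6)
nt, nr = 4, 4
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
s = rng.choice([-1.0, 1.0], nt)                      # BPSK symbols
y = H @ s + 0.1 * (rng.standard_normal(nr) + 1j * rng.standard_normal(nr))

s_hat = np.zeros(nt)
remaining = list(range(nt))
while remaining:
    W = np.linalg.pinv(H[:, remaining])              # ZF equalizer
    # Highest post-ZF SNR = smallest noise amplification = smallest row norm.
    best = int(np.argmin((np.abs(W) ** 2).sum(axis=1)))
    k = remaining[best]
    s_hat[k] = np.sign((W @ y)[best].real)           # slice BPSK symbol
    y = y - H[:, k] * s_hat[k]                       # cancel detected stream
    remaining.pop(best)

print(s, s_hat)                                      # transmitted vs detected
```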
7650 Use of Chlorophyll Meters to Assess In-Season Wheat Nitrogen Fertilizer Requirements in the Southern San Joaquin Valley
Authors: Brian H. Marsh
Abstract:
Nitrogen fertilizer is the most used, and often the most mismanaged, nutrient input. Nitrogen management has tremendous implications for crop productivity, quality and environmental stewardship. Sufficient nitrogen is needed for optimum yield and quality, but soil and in-season plant tissue testing for nitrogen status is a time-consuming and expensive process. Real-time sensing of plant nitrogen status can be a useful tool in managing nitrogen inputs. The objectives of this project were to assess the reliability of remotely sensed, non-destructive plant nitrogen measurements compared to wet chemistry data from sampled plant tissue, to develop in-season nitrogen recommendations based on remotely sensed data for improved nitrogen use efficiency, and to assess the potential for determining yield and quality from remotely sensed data. Very good correlations were observed between early-season remotely sensed crop nitrogen status and plant nitrogen concentrations, and hence with subsequent in-season fertilizer recommendations. The transmittance/absorbance type meters gave the most accurate readings. The early in-season fertilizer recommendation would be to apply 40 kg of nitrogen per hectare, plus 15 kg per hectare for each unit of difference between the crop and a reference area as measured with the SPAD meter, or 25 kg plus 13 kg per hectare for each unit of difference as measured with the CCM 200. Once the crop was sufficiently fertilized, meter readings became inconclusive and were of no benefit for determining nitrogen status, silage yield and quality, or grain yield and protein.
Keywords: Wheat, nitrogen fertilization, chlorophyll meter.
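The recommendation rule quoted above can be written directly as a formula, N (kg/ha) = base + slope × (reference reading − crop reading), with the meter constants taken from the abstract (the example readings are hypothetical):

```python
# SPAD: 40 kg N/ha base + 15 kg per unit difference; CCM 200: 25 + 13.
def n_recommendation(reference, crop, meter="SPAD"):
    base, slope = {"SPAD": (40.0, 15.0), "CCM200": (25.0, 13.0)}[meter]
    return base + slope * max(reference - crop, 0.0)

print(n_recommendation(42.0, 38.5, "SPAD"))    # 40 + 15 * 3.5 = 92.5 kg N/ha
print(n_recommendation(25.0, 22.0, "CCM200"))  # 25 + 13 * 3.0 = 64.0 kg N/ha
```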
7649 Health Monitoring of Power Transformers by Dissolved Gas Analysis using Regression Method and Study the Effect of Filtration on Oil
Authors: Anjali Chatterjee, Nirmal Kumar Roy
Abstract:
Economically, transformers constitute one of the largest investments in a power system. For this reason, transformer condition assessment and management is a high priority task. If a transformer fails, it has a significant negative impact on revenue and service reliability. Monitoring the state of health of power transformers has traditionally been carried out using laboratory Dissolved Gas Analysis (DGA) tests performed at periodic intervals on oil samples collected from the transformers. DGA of transformer oil is the single best indicator of a transformer's overall condition and is a universal practice today, which started somewhere in the 1960s. Failure can occur in a transformer for different reasons; some failures can be limited or prevented by maintenance. Oil filtration is one of the methods to remove the dissolved gases and prevent the deterioration of the oil. In this paper, we analyse the DGA data by regression and predict future gas concentrations in the oil. We present a comparative study of different traditional regression methods and the errors generated by their predictions. With the help of these data, we can deduce the health of the transformer by finding the type of fault that has occurred or will occur in the future. In addition, the effect of filtration on transformer health is highlighted by calculating the probability of failure of a transformer with and without oil filtration.
Keywords: Power transformers, Dissolved Gas Analysis, regression method, filtration, oil.
7648 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity and veracity, and that come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver integrated (big) data related services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we drew ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.
Keywords: Big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review.
7647 Deployment of Beyond 4G Wireless Communication Networks with Carrier Aggregation
Authors: Bahram Khan, Anderson Rocha Ramos, Rui R. Paulo, Fernando J. Velez
Abstract:
With the growing demand for a new blend of applications, users' dependency on the Internet is increasing day by day. Mobile Internet users pay close attention to their own experience, especially in terms of communication reliability, high data rates and service stability on the move. This increase in demand is causing saturation of the existing radio frequency bands. To address these challenges, researchers are investigating the best approaches. Carrier Aggregation (CA) is one of the newest innovations and seems able to fulfill future spectrum demands; it is also one of the most important features of Long Term Evolution-Advanced (LTE-Advanced). To meet the upcoming International Mobile Telecommunications-Advanced (IMT-Advanced) requirements (1 Gb/s peak data rate), the CA scheme presented by 3GPP sustains a high data rate using widespread frequency bandwidth of up to 100 MHz. This paper highlights technical issues such as the aggregation structure, its implementations, deployment scenarios, control signaling techniques, and the challenges facing CA in LTE-Advanced, with consideration of backward compatibility. A performance evaluation in macro-cellular scenarios through a simulation approach is also presented, showing the benefits of applying CA and low-complexity multi-band schedulers for service quality and system capacity enhancement; it concludes that the enhanced multi-band scheduler is less complex than the general multi-band scheduler and performs better for cell radii longer than 1800 m (at a PLR threshold of 2%).
Keywords: Component carrier, carrier aggregation, LTE-Advanced, scheduling, spectrum management.