Search results for: data portal
24151 Applying Hybrid Graph Drawing and Clustering Methods on Stock Investment Analysis
Authors: Mouataz Zreika, Maria Estela Varua
Abstract:
Stock investment decisions are often made based on current events in the global economy and on the analysis of historical data. Visual representation can help investors gain a deeper understanding of, and better insight into, stock market trends more efficiently. The trend analysis is based on long-term data collection. The study adopts a hybrid method that combines a clustering algorithm with a force-directed graph drawing algorithm to overcome the scalability problem of visualizing large data sets. The method reveals potential relationships between stocks and determines the strength of their connectivity, giving investors an additional view of stock relationships for reference. Information derived from the visualization also helps them make informed decisions. The experimental results show that the proposed method produces aesthetically pleasing visualizations with clearer views of connectivity and edge weights.
Keywords: clustering, force-directed, graph drawing, stock investment analysis
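A minimal sketch of the hybrid idea described above, not taken from the paper: stocks become nodes, pairwise return correlations become edge weights, a modularity-based clustering groups strongly connected stocks, and a force-directed (spring) layout positions them. The tickers, correlation threshold, and synthetic returns are illustrative assumptions.

```python
# Hybrid clustering + force-directed layout sketch for a stock-correlation graph.
# All data below are synthetic; in practice `returns` would hold historical daily returns.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
market = rng.standard_normal(250)                       # common market factor
returns = {t: 0.8 * market + 0.6 * rng.standard_normal(250)
           for t in ["AAA", "BBB", "CCC", "DDD", "EEE"]}

# Build a weighted graph: edge weight = absolute correlation between two stocks' returns.
G = nx.Graph()
tickers = list(returns)
G.add_nodes_from(tickers)
for i, a in enumerate(tickers):
    for b in tickers[i + 1:]:
        rho = float(np.corrcoef(returns[a], returns[b])[0, 1])
        if abs(rho) > 0.2:                              # keep only reasonably connected pairs
            G.add_edge(a, b, weight=abs(rho))

# Clustering step: group strongly connected stocks to tame visual complexity.
clusters = list(greedy_modularity_communities(G, weight="weight"))

# Force-directed step: spring layout uses edge weights as attraction strengths.
pos = nx.spring_layout(G, weight="weight", seed=42)

for k, cluster in enumerate(clusters):
    print(f"cluster {k}: {sorted(cluster)}")
print({n: np.round(p, 2) for n, p in pos.items()})
```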
Procedia PDF Downloads 304

24150 Clinical and Laboratory Diagnosis of Malaria in Surat Thani, Southern Thailand
Authors: Manas Kotepui, Chatree Ratcha, Kwuntida Uthaisar
Abstract:
Malaria infection is still considered a major public health problem in Thailand. In this study, retrospective data on patients in Surat Thani Province, Southern Thailand, during 2012-2015 were retrieved and analyzed. The data include demographic data, clinical characteristics, and laboratory diagnoses. Statistical analyses were performed to describe frequencies, proportions, data tendencies, and group comparisons. A total of 395 malaria patients were found. Most patients were male (253 cases, 64.1%). Most patients (262 cases, 66.3%) were admitted between 6 am and 11.59 am. Three hundred and fifty-five patients (97.5%) were positive for P. falciparum. Hemoglobin, hematocrit, and MCHC were significantly different between P. falciparum and P. vivax infections (P value < 0.05). During 2012-2015, the prevalence of malaria was highest in 2013. Neutrophils, lymphocytes, and monocytes changed significantly in patients with fever ≤ 3 days compared with patients with fever > 3 days. This information will guide the understanding of the pathogenesis and characteristics of malaria infection in Southern Thailand.
Keywords: prevalence, malaria, Surat Thani, Thailand
Procedia PDF Downloads 277

24149 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data
Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan
Abstract:
Clinical data analysis and forecasting have made great contributions to disease control, prevention, and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we target binary imbalanced datasets, in which the positive samples constitute only a small minority. We investigate two meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine each of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole. The other is to split the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former approach do not scale to larger data, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and shorter running times compared with the brute-force method.
Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data
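A minimal sketch of the segment-wise rebalancing idea, not the paper's full swarm-optimized method: the dataset is split into segments and SMOTE is applied to each one. The two tuned parameters are assumed to be SMOTE's sampling ratio and k_neighbors; in the adaptive scheme a swarm optimizer would select them per segment, whereas here they are fixed.

```python
# Segment-wise SMOTE rebalancing sketch on a synthetic imbalanced dataset.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=5000, weights=[0.95, 0.05], random_state=0)

def rebalance_in_segments(X, y, n_segments=5, ratio=0.5, k=5):
    """Split the data into segments and oversample each one separately."""
    Xs, ys = [], []
    for X_seg, y_seg in zip(np.array_split(X, n_segments), np.array_split(y, n_segments)):
        # In the adaptive scheme, a swarm optimizer would pick (ratio, k) for this segment.
        sm = SMOTE(sampling_strategy=ratio, k_neighbors=k, random_state=0)
        X_res, y_res = sm.fit_resample(X_seg, y_seg)
        Xs.append(X_res)
        ys.append(y_res)
    return np.vstack(Xs), np.concatenate(ys)

X_bal, y_bal = rebalance_in_segments(X, y)
print("minority fraction before:", y.mean(), "after:", y_bal.mean())
```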
Procedia PDF Downloads 444

24148 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation
Authors: Rizwan Rizwan
Abstract:
This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL utilizes three distinct architectures to ensure privacy is never compromised. It involves training individual edge devices and aggregating their models on a server without sharing raw data. This approach not only yields secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. To address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme that combines an adaptive gradient descent strategy with a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background-knowledge attacks.
Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats
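A minimal sketch of one federated round combining gradient clipping and Gaussian noise (the differential privacy side) with an AdaGrad-style adaptive step size (the adaptive gradient descent side). The clip norm, noise scale, quadratic client losses, and learning-rate rule are illustrative assumptions, not the scheme reviewed in the paper.

```python
# Federated round sketch: clip per-client gradients, add Gaussian DP noise, aggregate,
# then apply a per-coordinate adaptive learning rate on the server.
import numpy as np

rng = np.random.default_rng(0)
dim, n_clients = 10, 5
w = np.zeros(dim)                        # global model
accum = np.zeros(dim)                    # accumulated squared gradients (adaptive LR state)
clip_norm, noise_std, base_lr = 1.0, 0.5, 0.1
targets = [rng.normal(size=dim) for _ in range(n_clients)]   # each client's local optimum

for rnd in range(50):
    noisy_grads = []
    for t in targets:
        g = w - t                                            # gradient of 0.5*||w - t||^2
        g = g / max(1.0, np.linalg.norm(g) / clip_norm)      # clip to bound sensitivity
        g = g + rng.normal(scale=noise_std, size=dim)        # Gaussian noise for privacy
        noisy_grads.append(g)
    g_avg = np.mean(noisy_grads, axis=0)                     # server aggregates client updates
    accum += g_avg ** 2
    lr = base_lr / (np.sqrt(accum) + 1e-8)                   # adaptive per-coordinate step size
    w -= lr * g_avg

print("distance to mean target:", np.linalg.norm(w - np.mean(targets, axis=0)))
```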
Procedia PDF Downloads 34

24147 Data Security in Cloud Storage
Authors: Amir Rashid
Abstract:
Today is the era of innovation, and cloud computing is becoming an everyday technology, offering remarkable services and features on the go with rapid elasticity. This platform has taken business computing into an innovative dimension where clients interact and operate through service provider web portals. Initially, the trust relationship between client and service provider remained a big question, but with the invention of several cryptographic paradigms it is becoming common in everyday business. This research work proposes a solution for building a cloud storage service with respect to data security, addressing public cloud infrastructure where the trust relationship between client and service provider matters a lot. To give clients strong assurance of data security, this paper proposes a layer of cryptographic primitives combining several architectures in order to achieve that goal. A survey has been conducted to determine the benefits such an architecture would provide to both clients and service providers, as well as recent developments in cryptography specific to cloud storage.
Keywords: data security in cloud computing, cloud storage architecture, cryptographic developments, token key
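A minimal sketch of the general client-side encryption pattern such an architecture builds on, not the paper's specific layer of primitives: data are encrypted before upload so the provider only ever stores ciphertext. The `upload_to_cloud` function is a hypothetical placeholder, not a real provider API.

```python
# Client-side encryption before upload using symmetric Fernet (AES-based, authenticated)
# from the `cryptography` package.
from cryptography.fernet import Fernet

def upload_to_cloud(name: str, blob: bytes) -> None:
    """Placeholder for the provider's object-storage call."""
    print(f"uploading {name}: {len(blob)} encrypted bytes")

key = Fernet.generate_key()          # kept by the client, never sent to the provider
f = Fernet(key)

plaintext = b"quarterly-report.xlsx contents"
ciphertext = f.encrypt(plaintext)    # authenticated encryption (tamper-evident)
upload_to_cloud("quarterly-report.xlsx", ciphertext)

# Later, after downloading the object, the client decrypts locally with the same key.
assert f.decrypt(ciphertext) == plaintext
```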
Procedia PDF Downloads 297

24146 Fuzzy Total Factor Productivity by Credibility Theory
Authors: Shivi Agarwal, Trilok Mathur
Abstract:
This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. TFP change has been widely studied with crisp input and output variables; however, in some cases the input and output data of decision-making units (DMUs) can only be measured with uncertainty. Such data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). The fuzzy DEA (FDEA) model is solved using credibility theory, and its results are used to measure TFP change for fuzzy input and output variables. Finally, numerical examples are presented to illustrate the proposed method. The suggested methodology can be utilized for performance evaluation of DMUs and helps to assess their level of integration. The methodology can also be applied to rank the DMUs, identify the DMUs that are lagging behind, and make recommendations on how they can improve their performance to bring them on par with the other DMUs.
Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index
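For reference, the crisp Malmquist productivity index that the fuzzy version generalizes is usually written in geometric-mean form as below (notation is ours, not the paper's): D^t denotes the period-t distance function estimated by DEA, and values above one indicate productivity growth between periods t and t+1.

```latex
M\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right)
  = \left[
      \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)}
      \cdot
      \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
    \right]^{1/2}
```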
Procedia PDF Downloads 368

24145 What the Future Holds for Social Media Data Analysis
Authors: P. Wlodarczak, J. Soar, M. Ally
Abstract:
The dramatic rise in the use of social media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they have bought, write about their interests, share ideas, or give their opinions and views on political issues. There is growing interest among organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services, or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators derived from historic SM data are represented as time series and correlated with a variety of real-world phenomena such as the outcome of elections, the development of financial indicators, box office revenue, and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of analysis methods using opinion mining and machine learning techniques.
Keywords: social media, text mining, knowledge discovery, predictive analysis, machine learning
Procedia PDF Downloads 428

24144 Development of Automatic Laser Scanning Measurement Instrument
Authors: Chien-Hung Liu, Yu-Fen Chen
Abstract:
This study used a triangulation laser probe and a three-axis mobile platform for surface measurement, programmed the system, and applied it to real-time statistical analysis of the measured data. The system integration was designed as follows: the triangulation laser probe performs non-contact measurement by scattering or reflection, the captured signals are transferred to the computer through RS-232, and RS-485 is used to control the three-axis platform for wide-range measurement. The data captured by the laser probe are assembled into a 3D surface. This study constructed an optical measurement application program based on the visual programming language concept. First, the signals are transmitted to the computer through RS-232/RS-485, and then the signals are stored and recorded in a graphical interface in real time. The program analyzes the incoming messages, produces appropriate presentation graphs and data processing, provides users with friendly graphical interfaces and monitoring of the data processing state, and indicates graphically whether the current data are normal. The major functions of the measurement system developed in this study are thickness measurement, statistical process control (SPC), surface smoothness analysis, and trend line calculation. A result report can be generated and printed promptly. This study successfully measured different heights and surfaces, performed on-line data analysis and processing effectively, and developed a man-machine interface for users to operate.
Keywords: laser probe, non-contact measurement, triangulation measurement principle, statistical process control, LabVIEW
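A minimal sketch of the SPC step only (X-bar/R control limits on thickness readings), not the instrument software itself. The readings are simulated here; in the real system they would arrive over RS-232, and the subgroup size, tolerance, and constant A2 = 0.577 (standard for subgroups of five) are assumptions for illustration.

```python
# X-bar / R chart sketch for thickness readings from the scanner.
import numpy as np

rng = np.random.default_rng(1)
readings = rng.normal(loc=2.50, scale=0.02, size=100)    # simulated thickness values (mm)

subgroup_size = 5
A2 = 0.577                                # standard X-bar chart constant for subgroups of 5
groups = readings.reshape(-1, subgroup_size)

xbar = groups.mean(axis=1)                               # subgroup means
ranges = groups.max(axis=1) - groups.min(axis=1)         # subgroup ranges
center = xbar.mean()
ucl = center + A2 * ranges.mean()                        # upper control limit
lcl = center - A2 * ranges.mean()                        # lower control limit

out_of_control = np.flatnonzero((xbar > ucl) | (xbar < lcl))
print(f"center={center:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}  flagged subgroups={out_of_control}")
```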
Procedia PDF Downloads 362

24143 An Optimized Association Rule Mining Algorithm
Authors: Archana Singh, Jyoti Agarwal, Ajay Rana
Abstract:
Data mining is an efficient technology for discovering patterns in large databases. Association rule mining (ARM) techniques are used to find correlations between the various item sets in a database, and these correlations are used in decision making and pattern analysis. In recent years, the problem of finding association rules in large datasets has been addressed by many researchers. Various research papers on ARM were first studied and analyzed to understand the existing algorithms. The Apriori algorithm is the basic ARM algorithm, but it requires many database scans. The DIC algorithm needs fewer database scans but uses a complex lattice data structure. The main focus of this paper is to propose a new optimized algorithm (the Friendly Algorithm) and compare its performance with the existing algorithms. A data set is used to find frequent itemsets and association rules with the help of the existing and proposed algorithms, and it has been observed that the proposed algorithm finds all the frequent itemsets and essential association rules with fewer database scans than the existing algorithms. The proposed algorithm uses an optimized data structure, namely a graph with an adjacency matrix.
Keywords: association rules, data mining, dynamic item set counting, FP-growth, friendly algorithm, graph
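A minimal sketch of the adjacency-matrix idea for pair-level rules, not the paper's Friendly Algorithm: a single scan fills a co-occurrence matrix (item counts on the diagonal, pair counts off-diagonal), from which frequent pairs and their rule confidences follow without rescanning the database. The toy transactions and thresholds are invented.

```python
# One-scan co-occurrence matrix and pair-level association rules.
from itertools import combinations
import numpy as np

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
    {"bread", "butter", "milk"},
]
items = sorted(set().union(*transactions))
idx = {it: i for i, it in enumerate(items)}

# Single database scan: item counts on the diagonal, pair co-occurrences off-diagonal.
M = np.zeros((len(items), len(items)), dtype=int)
for t in transactions:
    for it in t:
        M[idx[it], idx[it]] += 1
    for a, b in combinations(sorted(t), 2):
        M[idx[a], idx[b]] += 1
        M[idx[b], idx[a]] += 1

min_support, min_conf, n = 0.4, 0.6, len(transactions)
for a, b in combinations(items, 2):
    support = M[idx[a], idx[b]] / n
    if support >= min_support:
        conf_ab = M[idx[a], idx[b]] / M[idx[a], idx[a]]   # confidence of rule a -> b
        if conf_ab >= min_conf:
            print(f"{a} -> {b}  support={support:.2f}  confidence={conf_ab:.2f}")
```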
Procedia PDF Downloads 422

24142 Failure Statistics Analysis of China’s Spacecraft in Full-Life
Authors: Xin-Yan Ji
Abstract:
Historical failure data for spacecraft are very useful for improving spacecraft design and test philosophies and for reducing flight risk. A study of spacecraft failure data was performed, constituting the most comprehensive statistics on spacecraft in China. 2593 on-orbit failures and 1298 ground failures that occurred on 150 spacecraft launched from 2000 to 2016 were identified and collected, covering navigation satellites, communication satellites, remote sensing, deep space exploration, and manned spaceflight platforms. In this paper, the failures are analyzed to compare different spacecraft subsystems and estimate their impact on the mission; the development of spacecraft in China is then evaluated in terms of design, software, workmanship, management, parts, and materials. Finally, the lessons learned from past years show that electrical and mechanical failures account for the largest share, and that the key to reducing in-orbit failures is improved design technology, sufficient redundancy, adequate space environment protection measures, and adequate ground testing.
Keywords: spacecraft anomalies, anomalies mechanism, failure cause, spacecraft testing
Procedia PDF Downloads 119

24141 Advances in Fiber Optic Technology for High-Speed Data Transmission
Authors: Salim Yusif
Abstract:
Fiber optic technology has revolutionized telecommunications and data transmission, providing unmatched speed, bandwidth, and reliability. This paper presents the latest advancements in fiber optic technology, focusing on innovations in fiber materials, transmission techniques, and network architectures that enhance the performance of high-speed data transmission systems. Key advancements include the development of ultra-low-loss optical fibers, multi-core fibers, advanced modulation formats, and the integration of fiber optics into next-generation network architectures such as Software-Defined Networking (SDN) and Network Function Virtualization (NFV). Additionally, recent developments in fiber optic sensors are discussed, extending the utility of optical fibers beyond data transmission. Through comprehensive analysis and experimental validation, this research offers valuable insights into the future directions of fiber optic technology, highlighting its potential to drive innovation across various industries.
Keywords: fiber optics, high-speed data transmission, ultra-low-loss optical fibers, multi-core fibers, modulation formats, coherent detection, software-defined networking, network function virtualization, fiber optic sensors
Procedia PDF Downloads 63

24140 Data Science Inquiry to Manage Football Referees’ Careers
Authors: Iñaki Aliende, Tom Webb, Lorenzo Escot
Abstract:
There is concern about the global decrease in the number of football referees. A study in Spain analyzed the factors affecting a referee's career over the past 30 years through a survey of 758 referees. The results showed the impact of factors such as threats, education, initial vocation, and dependents on a referee's career. To improve the situation, the federation needs to provide better information, support young referees, monitor referees, and raise public awareness of violence toward referees. The study also produced a comprehensive model that federations can use to enhance their officiating policies by means of data-driven techniques, and that can serve other federations seeking to improve referees' careers.
Keywords: data science, football referees, sport management, sport careers, survival analysis
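A minimal sketch of the survival-analysis angle flagged in the keywords: a Kaplan-Meier estimate of how long referees remain in the role, with still-active referees treated as censored observations. The durations and censoring flags below are invented, not the study's data.

```python
# Kaplan-Meier estimate of referee career duration using the `lifelines` package.
from lifelines import KaplanMeierFitter

career_years = [1, 2, 2, 3, 5, 5, 6, 8, 10, 12, 15, 20]   # years officiating
still_active = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 1]       # 1 = still refereeing (censored)
event_observed = [1 - c for c in still_active]            # 1 = dropped out

kmf = KaplanMeierFitter()
kmf.fit(durations=career_years, event_observed=event_observed, label="referee careers")

print(kmf.survival_function_)        # estimated probability of remaining a referee over time
print("median career length:", kmf.median_survival_time_)
```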
Procedia PDF Downloads 101

24139 Towards the Management of Cybersecurity Threats in Organisations
Authors: O. A. Ajigini, E. N. Mwim
Abstract:
Cybersecurity is the protection of computers, programs, networks, and data from attack, damage, unauthorised or unintended access, change, or destruction. Organisations collect, process and store their confidential and sensitive information on computers and transmit this data across networks to other computers. Moreover, the advent of internet technologies has led to various cyberattacks resulting in dangerous consequences for organisations. Therefore, with the increase in the volume and sophistication of cyberattacks, there is a need to develop models and make recommendations for the management of cybersecurity threats in organisations. This paper reports on various threats that cause malicious damage to organisations in cyberspace and provides measures on how these threats can be eliminated or reduced. The paper explores various aspects of protection measures against cybersecurity threats such as handling of sensitive data, network security, protection of information assets and cybersecurity awareness. The paper posits a model and recommendations on how to manage cybersecurity threats in organisations effectively. The model and the recommendations can then be utilised by organisations to manage the threats affecting their cyberspace. The paper provides valuable information to assist organisations in managing their cybersecurity threats and hence protect their computers, programs, networks and data in cyberspace. The paper aims to assist organisations to protect their information assets and data from cyberthreats as part of the contributions toward community engagement.
Keywords: confidential information, cyberattacks, cybersecurity, cyberspace, sensitive information
Procedia PDF Downloads 260

24138 Programming without Code: An Approach and Environment to Conditions-On-Data Programming
Authors: Philippe Larvet
Abstract:
This paper presents the concept of an object-based programming language in which tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. Following the object paradigm, data are still embedded inside objects as variable-value couples, but object methods are expressed in the form of logical propositions ('conditions on data', or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. To implement this approach, a central inference engine runs over the objects, examining them one after another and collecting all the CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (left of the '=>' sign) is the premise and the right part is the conclusion. Premises are evaluated and conclusions are fired; conclusions modify the variable-value couples of the object, and the engine moves on to the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through samples, the paper also presents several hints for implementing a simple mechanism able to process this 'COD language'. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical and long-to-learn languages, engineers and specialists can easily simulate and validate the functioning of complex systems.
Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation
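A minimal sketch of such a COD engine, assuming Python callables stand in for the paper's textual COD syntax: each object carries variable-value couples plus (premise, conclusion) rules, and a central engine repeatedly fires the conclusions whose premises hold until the data stop changing. The tank example encodes "level > 80 AND pump = 'off' => alarm = True".

```python
# Tiny conditions-on-data (COD) inference engine sketch.
class CodObject:
    def __init__(self, name, data, rules):
        self.name = name
        self.data = data        # variable-value couples
        self.rules = rules      # list of (premise, conclusion) callables over self.data

def run_engine(objects, max_passes=10):
    """Examine objects one after another, firing conclusions whose premises hold."""
    for _ in range(max_passes):
        changed = False
        for obj in objects:
            for premise, conclusion in obj.rules:
                if premise(obj.data):
                    before = dict(obj.data)
                    conclusion(obj.data)
                    changed |= obj.data != before
        if not changed:          # stable state reached, stop iterating
            break

tank = CodObject(
    "tank",
    {"level": 85, "pump": "off", "alarm": False},
    [(lambda d: d["level"] > 80 and d["pump"] == "off",
      lambda d: d.update(alarm=True))],
)
run_engine([tank])
print(tank.data)    # {'level': 85, 'pump': 'off', 'alarm': True}
```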
Procedia PDF Downloads 225

24137 Applying Theory of Self-Efficacy in Intelligent Transportation Systems by Potential Usage of Vehicle as a Sensor
Authors: Aby Nesan Raj, Sumil K. Raj, Sumesh Jayan
Abstract:
The objective of the study is to formulate a self-regulation model that enhances the usage of intelligent transportation systems by drawing on the theory of self-efficacy. The core logic of the self-regulation model monitors driver behavior based on situations related to the various sources of self-efficacy, namely enactive mastery, vicarious experience, verbal persuasion, and physiological arousal, in addition to vehicle data. For this study, four kinds of vehicle data are considered: speed, drowsiness, diagnostic data, and surround camera views. These data are fed to the self-regulation model for evaluation. The oddness measure, which is the output of the self-regulation model, is fed to the intelligent transportation systems, where appropriate actions are taken. These actions include warnings to the user as well as input to the related transportation systems. It is also observed that using the vehicle as a sensor reduces wasteful or duplicated resource utilization. Altogether, this approach enhances the intelligence of transportation systems, especially in terms of safety, productivity, and environmental performance.
Keywords: emergency management, intelligent transportation system, self-efficacy, traffic management
Procedia PDF Downloads 246

24136 Airborne SAR Data Analysis for Impact of Doppler Centroid on Image Quality and Registration Accuracy
Authors: Chhabi Nigam, S. Ramakrishnan
Abstract:
This paper presents an analysis of airborne Synthetic Aperture Radar (SAR) data to study the impact of the Doppler centroid on image quality and geocoding accuracy from the perspective of Stripmap-mode data acquisition. Although in Stripmap mode the radar beam points 90 degrees broadside (side-looking), a shift in the Doppler centroid is inevitable due to platform motion. Inaccurate estimation of the Doppler centroid leads to poor image quality and image mis-registration. The effect of the Doppler centroid is analyzed in this paper using multiple data sets collected from an airborne platform. Occurrences of ghost (ambiguous) targets and their power levels have been analyzed, as they impact the appropriate choice of PRF. The effect of aircraft attitude (roll, pitch, and yaw) on the Doppler centroid is also analyzed with the collected data sets. The various stages of the Range Doppler Algorithm (RDA) used for image formation in Stripmap mode, namely range compression, Doppler centroid estimation, azimuth compression, and range cell migration correction, are analyzed to find the performance limits and the dependence of the imaging geometry on the final image. The ability of Doppler centroid estimation to enhance imaging accuracy for registration is also illustrated. The paper also discusses the processing of low-squint SAR data and the challenges and performance limits imposed by the imaging geometry and platform dynamics on the final image quality metrics. Finally, the effect on various terrain types, including land, water, and bright scatterers, is presented.
Keywords: ambiguous target, Doppler centroid, image registration, airborne SAR
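A minimal sketch of a classic Doppler-centroid estimator (the average cross-correlation of successive azimuth samples), applied to synthetic data rather than the paper's airborne data. Sign conventions and scaling vary between processors, so treat the PRF, centroid value, and noise level as illustrative assumptions.

```python
# Doppler centroid estimation from the average cross-correlation of consecutive pulses.
import numpy as np

prf = 1500.0                          # pulse repetition frequency (Hz), assumed
true_fdc = 230.0                      # true Doppler centroid to recover (Hz)
n_pulses, n_range = 4096, 64

rng = np.random.default_rng(0)
t = np.arange(n_pulses) / prf
# Azimuth signal per range bin: complex exponential at the centroid frequency plus noise.
signal = np.exp(2j * np.pi * true_fdc * t)[:, None]
data = signal + 0.5 * (rng.standard_normal((n_pulses, n_range))
                       + 1j * rng.standard_normal((n_pulses, n_range)))

# Average cross-correlation between consecutive pulses, summed over range bins.
accc = np.sum(data[1:] * np.conj(data[:-1]))
fdc_est = prf * np.angle(accc) / (2.0 * np.pi)   # valid as long as |fdc| < PRF/2

print(f"estimated Doppler centroid: {fdc_est:.1f} Hz (true {true_fdc} Hz)")
```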
Procedia PDF Downloads 218

24135 Multimodal Optimization of Density-Based Clustering Using Collective Animal Behavior Algorithm
Authors: Kristian Bautista, Ruben A. Idoy
Abstract:
A bio-inspired metaheuristic algorithm based on the theory of collective animal behavior (CAB) was integrated with density-based clustering, modeled as a multimodal optimization problem. The algorithm was tested on the synthetic, Iris, Glass, Pima, and Thyroid data sets in order to measure its effectiveness relative to a CDE-based clustering algorithm. Preliminary testing revealed that one of the parameter settings used was ineffective for clustering, prompting further investigation. It was found that fine-tuning the distance δ3, which determines the extent to which a given data point will be clustered, helped improve the quality of the cluster output. Although the modification of δ3 significantly improved the solution quality and cluster output of the algorithm, the results suggest that there is no difference between the population means of the solutions obtained using the original and modified parameter settings for any of the data sets; that is, either setting can be used to obtain the best global and local animal positions. The results also suggest that the CDE-based clustering algorithm outperforms the CAB-density clustering algorithm on all data sets. Nevertheless, the CAB-density clustering algorithm remains a good clustering algorithm, because it correctly identified the number of classes of some data sets more frequently over thirty trial runs and with a much smaller standard deviation, showing potential for clustering high-dimensional data sets. The researchers therefore recommend further investigation of the post-processing stage of the algorithm.
Keywords: clustering, metaheuristics, collective animal behavior algorithm, density-based clustering, multimodal optimization
Procedia PDF Downloads 234

24134 Multiphase Coexistence for Aqueous System with Hydrophilic Agent
Authors: G. B. Hong
Abstract:
Liquid-liquid equilibrium (LLE) data were measured for the ternary mixtures of water + 1-butanol + butyl acetate and the quaternary mixtures of water + 1-butanol + butyl acetate + glycerol at atmospheric pressure and 313.15 K. In addition, isothermal vapor-liquid-liquid equilibrium (VLLE) data were determined experimentally at 333.15 K. The region of heterogeneity is found to increase as the hydrophilic agent (glycerol) is introduced into the aqueous mixtures. The experimental data are correlated with the NRTL model. The values predicted from the solution model, with model parameters determined from the constituent binaries, are also compared with the experimental values.
Keywords: LLE, VLLE, hydrophilic agent, NRTL
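For reference, a minimal statement of the NRTL activity-coefficient model used for the correlation, written for a binary pair in our own notation (the paper fits the multicomponent form): τ12 and τ21 are the fitted binary interaction parameters, α the non-randomness factor, and ln γ2 follows by swapping indices.

```latex
\ln \gamma_1 = x_2^{2} \left[
    \tau_{21} \left( \frac{G_{21}}{x_1 + x_2 G_{21}} \right)^{2}
    + \frac{\tau_{12}\, G_{12}}{\left( x_2 + x_1 G_{12} \right)^{2}}
  \right],
\qquad
G_{12} = \exp\!\left( -\alpha \tau_{12} \right),
\quad
G_{21} = \exp\!\left( -\alpha \tau_{21} \right)
```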
Procedia PDF Downloads 245

24133 ISMARA: Completely Automated Inference of Gene Regulatory Networks from High-Throughput Data
Authors: Piotr J. Balwierz, Mikhail Pachkov, Phil Arnold, Andreas J. Gruber, Mihaela Zavolan, Erik van Nimwegen
Abstract:
Understanding the key players and interactions in the regulatory networks that control gene expression and chromatin state across different cell types and tissues in metazoans remains one of the central challenges in systems biology. Our laboratory has pioneered a number of methods for automatically inferring core gene regulatory networks directly from high-throughput data by modeling gene expression (RNA-seq) and chromatin state (ChIP-seq) measurements in terms of genome-wide computational predictions of regulatory sites for hundreds of transcription factors and micro-RNAs. These methods have now been completely automated in an integrated webserver called ISMARA, which allows researchers to analyze their own data by simply uploading RNA-seq or ChIP-seq data sets, and which provides results in an integrated web interface as well as in downloadable flat-file form. For any data set, ISMARA infers the key regulators in the system, their activities across the input samples, the genes and pathways they target, and the core interactions between the regulators. We believe that by empowering experimental researchers to apply cutting-edge computational systems biology tools to their data in a completely automated manner, ISMARA can play an important role in developing our understanding of regulatory networks across metazoans.
Keywords: gene expression analysis, high-throughput sequencing analysis, transcription factor activity, transcription regulation
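A minimal sketch of the kind of linear "motif activity" model behind such inference: expression of each gene in each sample is modeled as a sum over regulatory motifs of predicted site counts times unknown motif activities, fitted here with plain ridge regression on synthetic data. ISMARA's actual model, normalization, and priors differ; this only illustrates the structure.

```python
# Motif-activity model sketch: E (genes x samples) ~= N (genes x motifs) @ A (motifs x samples).
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_motifs, n_samples = 500, 20, 6

N = rng.poisson(1.0, size=(n_genes, n_motifs)).astype(float)   # predicted site counts
A_true = rng.normal(size=(n_motifs, n_samples))                # hidden motif activities
E = N @ A_true + 0.1 * rng.normal(size=(n_genes, n_samples))   # observed expression

lam = 1.0                                                      # ridge penalty
A_hat = np.linalg.solve(N.T @ N + lam * np.eye(n_motifs), N.T @ E)

print("recovered activities, correlation with truth:",
      np.corrcoef(A_hat.ravel(), A_true.ravel())[0, 1].round(3))
```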
Procedia PDF Downloads 67

24132 The Power of the Proper Orthogonal Decomposition Method
Authors: Charles Lee
Abstract:
The Proper Orthogonal Decomposition (POD) technique has been used as a model reduction tool in many applications in engineering and science. In principle, one begins with an ensemble of data, called snapshots, collected from an experiment or from laboratory results. The beauty of the POD technique is that, when applied, the entire data set can be represented by the smallest number of orthogonal basis elements. It is this capability that allows us to reduce the complexity and dimensions of many physical applications. Mathematical formulations and numerical schemes for the POD method are discussed, along with applications in NASA's Deep Space Large Antenna Arrays, satellite image reconstruction, cancer detection with DNA microarray data, maximizing stock returns, and medical imaging.
Keywords: reduced-order methods, principal component analysis, cancer detection, image reconstruction, stock portfolios
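A minimal sketch of the standard snapshot formulation (not any of the specific applications above): stack the snapshots as columns, take the SVD, keep the leading left singular vectors as the orthogonal basis, and truncate by energy fraction. The synthetic snapshots and the 99% energy cutoff are illustrative choices.

```python
# POD via the SVD of a snapshot matrix, with energy-based truncation and reconstruction error.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
# Synthetic snapshots: two dominant spatial modes with random amplitudes plus small noise.
snapshots = np.column_stack([
    np.sin(np.pi * x) * rng.normal() + 0.3 * np.sin(3 * np.pi * x) * rng.normal()
    + 0.01 * rng.normal(size=x.size)
    for _ in range(50)
])

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = s**2 / np.sum(s**2)
r = int(np.searchsorted(np.cumsum(energy), 0.99) + 1)   # modes capturing 99% of the energy

basis = U[:, :r]                                        # reduced orthogonal basis
reconstructed = basis @ (basis.T @ snapshots)           # project and reconstruct
err = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
print(f"kept {r} of {len(s)} modes, relative reconstruction error {err:.2e}")
```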
Procedia PDF Downloads 87

24131 A Reflection of the Contemporary Life of Urban People Through Mixed Media Art
Authors: Van Huong Mai, Kanokwan Nithiratphat, Adool Booncham
Abstract:
The Movement of Contemporary Life had two purposes: to study the movement and development of modern life, and to create visual art, namely paintings of apartment buildings executed in mixed media (digital printing and acrylic painting on canvas), conveying the rapid pace of modern life and evoking diverse movements in the viewer's feelings. The creative process drew on field data, documentary data, and influences from other creative works. The data were analyzed in terms of theme, form, technique, and process in order to realize the concept and the special character of the pieces.
Keywords: movement, contemporary life, visual art, acrylic painting, digital art, urban space
Procedia PDF Downloads 100

24130 Mining Educational Data to Support Students’ Major Selection
Authors: Kunyanuth Kularbphettong, Cholticha Tongsiri
Abstract:
This paper aims to create a model to help students choose a specialization track when majoring in computer science at Suan Sunandha Rajabhat University. The objective of this research is to develop a recommendation system that uses data mining techniques to analyze knowledge and derive decision rules. Such relationships can be used to demonstrate the reasonableness of a student's track choice as well as to support his or her decision, and the system was verified by experts in the field. The sample consisted of computer science students who used the system and completed a questionnaire measuring satisfaction. The system was found to be satisfactory by both the experts and the students.
Keywords: data mining technique, decision support system, knowledge and decision rules, education
Procedia PDF Downloads 426

24129 SPBAC: A Semantic Policy-Based Access Control for Database Query
Authors: Aaron Zhang, Alimire Kahaer, Gerald Weber, Nalin Arachchilage
Abstract:
Access control is an essential safeguard for the security of enterprise data; it controls users' access to information resources and ensures the confidentiality and integrity of those resources [1]. Research shows that the more common types of access control have shortcomings [2]. In this direction, to improve existing access control, we have studied the current technologies in the field of data security, investigated previous data access control policies and their problems in depth, identified the existing deficiencies, and proposed a new extension structure, SPBAC. The SPBAC extension proposed in this paper aims to combine policy-based access control (PBAC) with semantics to provide logically connected, real-time data access by establishing associations between enterprise data through semantics. Our design combines policies with linked data through semantics to create a 'semantic link', so that access control is no longer per-database and users in each role are granted access based on the instance policy. We improve the SPBAC implementation by constructing policies and defined attributes through the XACML specification, which is designed to extend the original XACML model. While providing the relevant design solutions, this paper aims to continue studying the feasibility and subsequent implementation of the related work at a later stage.
Keywords: access control, semantic policy-based access control, semantic link, access control model, instance policy, XACML
Procedia PDF Downloads 95

24128 A Regression Analysis Study of the Applicability of Side Scan Sonar based Safety Inspection of Underwater Structures
Authors: Chul Park, Youngseok Kim, Sangsik Choi
Abstract:
This study developed an electric jig for underwater structure inspection in order to overcome the difficulty of applying side scan sonar to underwater inspection, and analyzed correlations in the empirical data in order to enhance sonar data resolution. The electric jig was developed for applying towed sonar to underwater structure inspection, since it was difficult to inspect a cross-section with towed equipment alone. With the electric jig, it was possible to shorten the inspection time by over 20% compared with conventional towed side scan sonar, and to inspect the desired cross-section through accurate angle control. An indoor test conducted to enhance sonar data resolution showed that the water depth, the distance from the underwater structure, and the imaging angle influence the resolution and data quality. Based on the data accumulated through field experience, multiple regression analysis was conducted on the correlations between these three variables. As a result, a relational equation for sonar operation as a function of water depth was derived.
Keywords: underwater structure, SONAR, safety inspection, resolution
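A minimal sketch of the multiple-regression step described above: fitting resolution as a linear function of water depth, stand-off distance, and imaging angle by least squares. The numerical values and units below are invented placeholders, not the study's measurements.

```python
# Multiple regression of sonar resolution on depth, distance, and imaging angle.
import numpy as np

# columns: water depth (m), distance to structure (m), imaging angle (deg)
X = np.array([
    [3.0, 2.0, 10.0],
    [3.0, 4.0, 20.0],
    [5.0, 2.0, 15.0],
    [5.0, 6.0, 30.0],
    [8.0, 4.0, 20.0],
    [8.0, 8.0, 35.0],
])
resolution = np.array([4.1, 5.0, 4.8, 6.6, 5.9, 7.8])   # e.g. cm per pixel (illustrative)

A = np.column_stack([np.ones(len(X)), X])               # add intercept term
coef, *_ = np.linalg.lstsq(A, resolution, rcond=None)

b0, b_depth, b_dist, b_angle = coef
print(f"resolution ~= {b0:.2f} + {b_depth:.2f}*depth + {b_dist:.2f}*distance + {b_angle:.2f}*angle")
```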
Procedia PDF Downloads 266

24127 Enhanced Imperialist Competitive Algorithm for the Cell Formation Problem Using Sequence Data
Authors: S. H. Borghei, E. Teymourian, M. Mobin, G. M. Komaki, S. Sheikh
Abstract:
The imperialist competitive algorithm (ICA) is a recent meta-heuristic method, inspired by social evolution, for solving NP-hard problems. ICA is a population-based algorithm that has achieved great performance in comparison with other meta-heuristics. This study develops an enhanced ICA approach to solve the cell formation problem (CFP) using sequence data. In addition to the conventional ICA, an enhanced version, namely EICA, applies local search techniques to add more intensification aptitude and to embed the features of exploration and intensification more successfully. Suitable performance measures are used to compare the proposed algorithms with some other powerful solution approaches in the literature. Likewise, to check the proficiency of the algorithms, forty test problems are presented. Five benchmark problems have sequence data, and the others are based on 0-1 matrices modified into sequence-based problems. The computational results elucidate the efficiency of the EICA in solving CFP problems.
Keywords: cell formation problem, group technology, imperialist competitive algorithm, sequence data
Procedia PDF Downloads 455

24126 Establishment of Bit Selective Mode Storage Covert Channel in VANETs
Authors: Amarpreet Singh, Kimi Manchanda
Abstract:
To provide security in the VANET (vehicular ad hoc network) scenario, a covert storage channel is implemented through the data transmitted between the sender and the receiver. Covert channels are logical links used for communication while hiding secure data from intruders. This paper addresses the establishment of bit-selective-mode covert storage channels in VANETs. In this scenario, the data are transmitted in two modes, the normal mode and the covert mode. During communication between vehicles, bits can be controlled through the optional bits of the IPv6 header format. The implementation is carried out with the help of a network simulator.
Keywords: covert mode, normal mode, VANET, OBU, on-board unit
Procedia PDF Downloads 368

24125 Enhancing Temporal Extrapolation of Wind Speed Using a Hybrid Technique: A Case Study in West Coast of Denmark
Authors: B. Elshafei, X. Mao
Abstract:
The demand for renewable energy is increasing significantly, and major investments are being directed to the wind power generation industry as a leading source of clean energy. The wind energy sector is entirely dependent on the prediction of wind speed, which by its nature is highly stochastic and random. This study employs deep multi-fidelity Gaussian process regression to predict wind speeds over medium-term time horizons. Data from the RUNE experiment on the west coast of Denmark were provided by the Technical University of Denmark and represent the wind speed across the study area for the period between December 2015 and March 2016. The study investigates the effect of pre-processing the data by denoising the signal using the empirical wavelet transform (EWT) and of engaging the vector components of wind speed to increase the number of input data layers for data fusion using deep multi-fidelity Gaussian process regression (GPR). The outcomes were compared using the root mean square error (RMSE), and the results demonstrated a significant increase in prediction accuracy: strategies that use the vector components of the wind speed as additional predictors produce more accurate predictions than strategies that ignore them, reflecting the importance of including all sub-data and of pre-processing the signals for wind speed forecasting models.
Keywords: data fusion, Gaussian process regression, signal denoise, temporal extrapolation
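A minimal sketch of the regression-and-score step only: a Gaussian process fitted on past wind speeds and evaluated with RMSE on a held-out horizon. A single-fidelity scikit-learn GPR on synthetic data stands in for the paper's deep multi-fidelity model and its EWT-denoised RUNE measurements; kernel choices and the train/test split are assumptions.

```python
# Gaussian process forecast of wind speed with RMSE scoring on a held-out horizon.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 300)[:, None]                 # time (arbitrary units)
wind = 8.0 + 2.0 * np.sin(1.3 * t.ravel()) + 0.5 * rng.standard_normal(t.shape[0])

split = 240                                              # train on the past, test on the future
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.3)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t[:split], wind[:split])

pred, std = gpr.predict(t[split:], return_std=True)      # mean forecast and uncertainty
rmse = np.sqrt(mean_squared_error(wind[split:], pred))
print(f"forecast RMSE on held-out horizon: {rmse:.2f} m/s")
```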
Procedia PDF Downloads 137

24124 Deadline Missing Prediction for Mobile Robots through the Use of Historical Data
Authors: Edwaldo R. B. Monteiro, Patricia D. M. Plentz, Edson R. De Pieri
Abstract:
Mobile robotics is gaining an increasingly important role in modern society. Several tasks that are potentially dangerous or laborious for humans are assigned to mobile robots, which are increasingly capable. Many of these tasks need to be performed within a specified period, i.e., they must meet a deadline. Missing the deadline can result in financial and/or material losses. Mechanisms for predicting missed deadlines are fundamental because corrective actions can then be taken to avoid or minimize the resulting losses. In this work, we propose a simple but reliable deadline-missing prediction mechanism for mobile robots based on historical data, and we use the Pioneer 3-DX robot, one of the most popular robots in academia, for experiments and simulations.
Keywords: deadline missing, historical data, mobile robots, prediction mechanism
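A minimal sketch of a historical-data predictor in this spirit, not the paper's mechanism: the probability of missing the deadline for the current run is estimated from the empirical distribution of past completion times for the same task, scaled by the remaining fraction of work. All numbers and the decision threshold are invented.

```python
# Deadline-miss probability estimated from the empirical distribution of past run times.
import numpy as np

history_s = np.array([118, 121, 125, 119, 130, 127, 122, 124, 135, 120])   # past run times (s)
deadline_s = 128.0
elapsed_s = 40.0                     # time already spent on the current run
progress = 0.3                       # fraction of the task completed so far

# Scale each historical run to the remaining 70% of the task and add the elapsed time.
predicted_totals = elapsed_s + (1.0 - progress) * history_s
p_miss = float(np.mean(predicted_totals > deadline_s))

print(f"estimated probability of missing the deadline: {p_miss:.0%}")
if p_miss > 0.5:
    print("trigger corrective action (e.g. replan the route)")
```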
Procedia PDF Downloads 402

24123 The Intention to Use Telecare in People of Fall Experience: Application of Fuzzy Neural Network
Authors: Jui-Chen Huang, Shou-Hsiung Cheng
Abstract:
This study examined the willingness to use telecare among people in Taiwan who had experienced a fall in the previous three months. The study adopted convenience sampling and a structured questionnaire to collect data, and it was based on the definitions and constructs of the Health Belief Model (HBM). The HBM comprises seven constructs: perceived benefits (PBs), perceived disease threat (PDT), perceived barriers to taking action (PBTA), external cues to action (ECUE), internal cues to action (ICUE), attitude toward using (ATT), and behavioral intention to use (BI). This study adopted a fuzzy neural network (FNN) as an effective modeling method. It shows the dependence of ATT on PB, PDT, PBTA, ECUE, and ICUE: the training and testing RMSE (root mean square error) values are 0.028 and 0.166 for the FNN, respectively, compared with 0.828 and 0.578 for the regression model. On the other hand, for the dependence of ATT on BI, the training and testing RMSE values are 0.050 and 0.109 for the FNN, respectively, compared with 0.529 and 0.571 for the regression model. The results show that the FNN method outperforms regression analysis and is an effective and viable approach.
Keywords: fall, fuzzy neural network, health belief model, telecare, willingness
Procedia PDF Downloads 202

24122 Effect of Viscous Dissipation on 3-D MHD Casson Flow in Presence of Chemical Reaction: A Numerical Study
Authors: Bandari Shanker, Alfunsa Prathiba
Abstract:
The influence of viscous dissipation on three-dimensional MHD Casson fluid flow in two perpendicular directions past a linearly stretching sheet, in the presence of a chemical reaction, is explored in this work. For special cases, self-similar solutions are obtained and compared with the available data. As the Eckert number increases, the thermal boundary layer thickens. The current findings are observed to be in good accord with the existing data. Non-dimensional velocities and stress distributions are obtained in both directions. The relevant data are graphed and explained quantitatively in relation to changes in the Casson fluid parameter as well as the other fluid flow parameters.
Keywords: viscous dissipation, 3-D Casson flow, chemical reaction, Eckert number
Procedia PDF Downloads 194