Search results for: data storage
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25976


25076 Improved K-Means Clustering Algorithm Using RHadoop with Combiner

Authors: Ji Eun Shin, Dong Hoon Lim

Abstract:

Data clustering is a common technique used in data analysis and has many applications, such as artificial intelligence, pattern recognition, economics, ecology, psychiatry and marketing. K-means clustering is a well-known clustering algorithm that aims to cluster a set of data points into a predefined number of clusters. In this paper, we implement the K-means algorithm on the MapReduce framework with RHadoop to make the clustering method applicable to large-scale data. RHadoop is a collection of R packages that allow users to manage and analyze data with Hadoop. The main idea is to introduce a combiner as a function of our map output to decrease the amount of data that needs to be processed by the reducers. The experimental results demonstrated that the K-means algorithm using RHadoop can scale well and efficiently process large data sets on commodity hardware. We also showed that our K-means algorithm using RHadoop with a combiner was faster than the regular algorithm without a combiner as the size of the data set increased.
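The paper's implementation is written with RHadoop; the Python sketch below only illustrates the combiner idea it describes: the map stage assigns each point to its nearest centroid, a combiner pre-aggregates partial sums and counts per centroid within each data split, and the reducer merges these small partial records into new centroids, so far less data crosses the shuffle. The function names and the in-memory "splits" standing in for HDFS blocks are illustrative assumptions, not the paper's code.

```python
import numpy as np

def assign(points, centroids):
    """Map stage: key each point by the index of its nearest centroid."""
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def combine(points, keys, k):
    """Combiner: emit (partial sum, count) per centroid for one split,
    so the reducer receives k small records instead of every point."""
    out = {}
    for j in range(k):
        mask = keys == j
        if mask.any():
            out[j] = (points[mask].sum(axis=0), int(mask.sum()))
    return out

def reduce_step(partials, centroids):
    """Reducer: merge partial sums from all splits into new centroids."""
    k, dim = centroids.shape
    sums, counts = np.zeros((k, dim)), np.zeros(k)
    for part in partials:
        for j, (s, c) in part.items():
            sums[j] += s
            counts[j] += c
    new = centroids.copy()
    nz = counts > 0
    new[nz] = sums[nz] / counts[nz, None]
    return new

# toy run: three in-memory "splits" stand in for HDFS blocks
rng = np.random.default_rng(0)
splits = [rng.normal(loc=c, size=(100, 2)) for c in ((0, 0), (5, 5), (0, 5))]
centroids = rng.normal(size=(3, 2))
for _ in range(10):
    partials = [combine(s, assign(s, centroids), 3) for s in splits]
    centroids = reduce_step(partials, centroids)
print(centroids)
```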

Keywords: big data, combiner, K-means clustering, RHadoop

Procedia PDF Downloads 415
25075 Kinetics of Growth Rate of Microalga: The Effect of Carbon Dioxide Concentration

Authors: Retno Ambarwati Sigit Lestari

Abstract:

Microalga is one of the organisms that can be considered an ideal and promising raw material for bioenergy production, because the lipid content of microalga is relatively high. Microalga is an aquatic organism that produces complex organic compounds from inorganic molecules using carbon dioxide as a carbon source and sunlight for energy supply. Microalga-based CO₂ fixation has potential advantages over other carbon capture and storage approaches, such as wide distribution, high photosynthetic rate, good environmental adaptability, and ease of operation. The rates of growth and CO₂ capture of microalga are influenced by CO₂ concentration and light intensity. This study quantitatively investigates the effects of CO₂ concentration on the rates of growth and CO₂ capture of a type of microalga cultivated in bioreactors. The work includes laboratory experiments as well as mathematical modelling. The mathematical models were solved numerically, and the accuracy of the model was tested against the experimental data. The proposed mathematical model was found to describe the growth and CO₂ capture of the microalga well quantitatively, so that the effects of CO₂ concentration can be observed.
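The abstract does not give the model equations, so the following is only a minimal sketch of the kind of growth-kinetics model that could be used: logistic biomass growth whose specific rate depends on CO₂ concentration through a Haldane-type term (saturation at low CO₂, inhibition at high CO₂). All parameter values below are placeholders, not the study's fitted values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters (illustrative only, not fitted values from the paper)
mu_max = 1.2    # 1/day, maximum specific growth rate
K_s    = 0.5    # %, half-saturation constant for CO2
K_i    = 15.0   # %, inhibition constant (growth drops at high CO2)
X_max  = 3.0    # g/L, carrying capacity of the photobioreactor

def growth(t, X, C):
    """Haldane-type dependence of growth on CO2 concentration C (%),
    combined with logistic limitation by biomass density X (g/L)."""
    mu = mu_max * C / (K_s + C + C**2 / K_i)
    return mu * X * (1.0 - X / X_max)

t_eval = np.linspace(0, 10, 50)
for C in (2.0, 5.0, 10.0):                     # inlet CO2 concentrations to compare
    sol = solve_ivp(growth, (0, 10), [0.05], args=(C,), t_eval=t_eval)
    print(f"CO2 {C:4.1f}% -> biomass after 10 days: {sol.y[0, -1]:.2f} g/L")
```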

Keywords: microalga, CO₂ concentration, photobioreactor, mathematical model

Procedia PDF Downloads 113
25074 Empirical Correlation for Measurement of Thermal Diffusivity of Spherical Shaped Food Products under Forced Convection Environment

Authors: M. Riaz, Inamur Rehman, Abhishek Sharma

Abstract:

The present work develops an experimental method for determining the variation of thermal diffusivity with temperature for selected regular-shaped solid fruits and vegetables subjected to forced convection cooling. Experimental investigations were carried out on the chosen samples (potato and brinjal), which are approximately of spherical geometry. The variation of temperature within the food product was measured at several locations from the centre to the skin, under a forced convection environment, using a deep freezer maintained at -10°C. The method uses the one-dimensional Fourier equation applied to regular shapes. For this, the experimental temperature data obtained from cylindrical and spherical shaped products during pre-cooling were utilised. Such temperature and thermal diffusivity profiles can be readily used with other information, such as degradation rate, to evaluate thermal treatments based on cold-air cooling methods for the storage of perishable food products.
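For reference, the one-dimensional Fourier equation for a sphere and the standard one-term transient solution from which thermal diffusivity is commonly back-calculated in pre-cooling experiments are sketched below; this is the textbook relation behind such methods, not necessarily the authors' exact correlation.

```latex
\frac{\partial T}{\partial t}
  = \frac{\alpha}{r^{2}}\,\frac{\partial}{\partial r}\!\left(r^{2}\,\frac{\partial T}{\partial r}\right),
\qquad
\theta_{0} = \frac{T_{c}(t)-T_{\infty}}{T_{i}-T_{\infty}}
  \approx C_{1}\exp\!\left(-\lambda_{1}^{2}\,\frac{\alpha\,t}{r_{0}^{2}}\right)
\;\Rightarrow\;
\alpha \approx -\,\frac{r_{0}^{2}}{\lambda_{1}^{2}}\,
  \frac{\mathrm{d}\,\ln\theta_{0}}{\mathrm{d}t}
```

Here T_c is the centre temperature, T_∞ the cooling-air temperature, T_i the initial temperature, r_0 the product radius, and C_1, λ_1 the one-term constants fixed by the Biot number; the diffusivity follows from the slope of ln θ_0 versus time measured in the experiment.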

Keywords: thermal diffusivity, skin temperature, precooling, forced convection, regular shaped

Procedia PDF Downloads 446
25073 Acid Fuchsin Dye Based PMMA Film for Holographic Investigations

Authors: G. Vinitha, A. Ramalingam

Abstract:

In view of a possible application in optical data storage devices, the diffraction grating efficiency of an organic dye, Acid Fuchsin, doped in a PMMA matrix was studied under excitation with a CW diode-pumped Nd:YAG laser at 532 nm. The open-aperture Z-scan of the dye-doped polymer displayed saturable absorption, and the closed-aperture Z-scan of the samples exhibited negative nonlinearity. The diffraction efficiency of the grating is the ratio of the first-order diffracted power to the incident read beam power. The dye-doped polymer films were found to be good recording media. It is observed that the formation of gratings strongly depends on the concentration of dye in the polymer film, the intensity ratio of the writing beams and the angle between the writing beams. It has been found that efficient writing can be achieved at an angle of 20° and when the intensity ratio of the writing beams is unity.

Keywords: diffraction efficiency, nonlinear optical material, saturable absorption, surface-relief-gratings

Procedia PDF Downloads 289
25072 Spherical Harmonic Based Monostatic Anisotropic Point Scatterer Model for RADAR Applications

Authors: Eric Huang, Coleman DeLude, Justin Romberg, Saibal Mukhopadhyay, Madhavan Swaminathan

Abstract:

High performance computing (HPC) based emulators can be used to model the scattering from multiple stationary and moving targets for RADAR applications. These emulators rely on the RADAR Cross Section (RCS) of the targets being available in complex scenarios. Representing the RCS using tables generated from electromagnetic (EM) simulations is oftentimes cumbersome, leading to large storage requirements. This paper proposes a spherical harmonic based anisotropic scatterer model to represent the RCS of complex targets. The problem of finding the locations and reflection profiles of all scatterers can be formulated as a linear least-squares problem with a special sparsity constraint. This paper solves this problem using a modified Orthogonal Matching Pursuit algorithm. The results show that the spherical harmonic based scatterer model can effectively represent the RCS data of complex targets.
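The paper's modification of Orthogonal Matching Pursuit is not spelled out in the abstract, so the sketch below is only the standard OMP baseline for a sparsity-constrained least-squares problem: greedily pick the dictionary column most correlated with the residual, then re-fit by least squares on the selected support. The variable names and the random test problem are illustrative assumptions.

```python
import numpy as np

def omp(A, y, sparsity):
    """Standard Orthogonal Matching Pursuit: recover a sparse x with y ≈ A @ x."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(sparsity):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # re-fit coefficients on the selected support by least squares
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# toy problem: 20 candidate scatterer locations/profiles, 3 of them active
rng = np.random.default_rng(1)
A = rng.normal(size=(60, 20))            # measurement / dictionary matrix
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -0.8, 2.0]    # reflection strengths of active scatterers
y = A @ x_true + 0.01 * rng.normal(size=60)
x_hat = omp(A, y, sparsity=3)
print(np.flatnonzero(np.abs(x_hat) > 0.1))   # indices of recovered scatterers
```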

Keywords: RADAR, RCS, high performance computing, point scatterer model

Procedia PDF Downloads 178
25071 Framework for Integrating Big Data and Thick Data: Understanding Customers Better

Authors: Nikita Valluri, Vatcharaporn Esichaikul

Abstract:

With the popularity of data-driven decision making on the rise, this study provides an alternative outlook on the process of decision-making. Combining quantitative and qualitative methods rooted in the social sciences, an integrated framework is presented that delivers a more robust and efficient approach to data-driven decision-making with respect to not only Big data but also 'Thick data', a new form of qualitative data. In support of this, an example from the retail sector is illustrated where the framework is put into action to yield insights and leverage business intelligence. An interpretive approach is used to analyze and glean insights from the findings of both the quantitative and the qualitative data. Using traditional point-of-sale data as well as an understanding of customer psychographics and preferences, data mining techniques are applied along with qualitative methods (such as grounded theory, ethnomethodology, etc.). This study's final goal is to establish the framework as a basis for a holistic solution encompassing both the Big and Thick aspects of any business need. The proposed framework is an enhancement of the traditional data-driven decision-making approach, which is mainly dependent on quantitative data for decision-making.

Keywords: big data, customer behavior, customer experience, data mining, qualitative methods, quantitative methods, thick data

Procedia PDF Downloads 142
25070 Correlation between Polysaccharides Molecular Weight Changes and Pectinases Gene Expression during Papaya Ripening

Authors: Samira B. R. Prado, Paulo R. Melfi, Beatriz T. Minguzzi, João P. Fabi

Abstract:

Fruit softening is the main change that occurs during papaya (Carica papaya L.) ripening. It is characterized by the depolymerization of cell wall polysaccharides, especially the pectic fractions, which causes cell wall disassembly. However, it is uncertain how the modification of the two main pectin polysaccharide fractions (the water-soluble fraction, WSF, and the oxalate-soluble fraction, OSF) accounts for fruit softening. The aim of this work was to correlate molecular weight changes of the WSF and OSF with the gene expression of pectin-solubilizing enzymes (pectinases) during papaya ripening. Papaya fruits obtained from a producer were harvested and stored under specific conditions. The fruits were divided into five groups according to days after harvesting. Cell walls from all groups of papaya pulp were isolated and fractionated (WSF and OSF). Expression profiles of pectinase genes were obtained according to the MIQE guidelines (Minimum Information for publication of Quantitative real-time PCR Experiments). The results showed an increased yield and a decreased molecular weight throughout ripening for both the WSF and the OSF. Gene expression data support that papaya softening is achieved by polygalacturonase (PG) up-regulation, whose action might have been facilitated by the constant action of pectinesterases (PMEs). Moreover, the BGAL1 gene was up-regulated during ripening with simultaneous galactose release, suggesting that galactosidases (GALs) could also account for pulp softening. The data suggest that the solubilization of galacturonans and the depolymerization of cell wall components were caused mainly by the action of PGs and GALs.

Keywords: carica papaya, fruit ripening, galactosidases, plant cell wall, polygalacturonases

Procedia PDF Downloads 409
25069 Concentrated Whey Protein Drink with Orange Flavor: Protein Modification and Formulation

Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh

Abstract:

The application of whey protein in the drink industry to enhance the nutritional value of products is important. However, the gelation of the protein during thermal treatment and shelf life limits its application. Therefore, the main goal of this research was to manufacture a highly concentrated whey protein orange drink with an appropriate shelf life. To this end, whey protein was hydrolyzed from 5 to 30% (in 5% intervals, six stages), and the thermal stability of samples with a 10% protein concentration was tested under acidic conditions (T = 90 °C, pH = 4.2, 5 minutes) and neutral conditions (T = 120 °C, pH = 6.7, 20 minutes). Furthermore, to study the shelf life of the heat-treated samples over 4 months at 4 and 24 °C, time sweep rheological tests were performed. Under neutral conditions, the 5 to 20% hydrolyzed samples showed gelling during thermal treatment, whereas under acidic conditions this happened only in the 5 to 10% hydrolyzed samples. This phenomenon could be related to differences in the hydrodynamic radius and zeta potential of samples with different levels of hydrolysis under acidic and neutral conditions. To study the gelation of the heat-resistant protein solutions during shelf life, time sweep analyses were performed for 4 months at 7-day intervals. A crossover was observed for all heat-resistant neutral samples at both storage temperatures, while in the heat-resistant acidic samples with degrees of hydrolysis of 25 and 30 percent at 4 and 20 °C it was not seen. It can be concluded that the latter samples were stable during heat treatment and 4 months of storage, which makes them a good choice for manufacturing high-protein drinks. The Scheffe polynomial model and numerical optimization were employed for modeling and optimizing the high-protein orange drink formula. The Scheffe model significantly predicted the overall acceptance index (p-value < 0.05) of the sensory analysis. The coefficient of determination (R2) of 0.94, the adjusted coefficient of determination (R2Adj) of 0.90, the insignificance of the lack-of-fit test and an F value of 64.21 showed the accuracy of the model. Moreover, the coefficient of variation (CV) was 6.8%, which suggests the replicability of the experimental data. A desirability of 0.89 was achieved, which indicates the high accuracy of the optimization. The optimum formulation was found to be: modified whey protein solution (65.30%), natural orange juice (33.50%), stevia sweetener (0.05%), orange peel oil (0.15%) and citric acid (1%). It is worth mentioning that this study provides an appropriate model for the application of whey protein in the drink industry without bitter flavor or gelation during heat treatment and shelf life.
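As a rough illustration of the Scheffé mixture-model step (not the authors' actual design, data, or fitted coefficients), the sketch below fits a quadratic Scheffé polynomial for an overall acceptance index to made-up mixture points by least squares; the design points and sensory scores are placeholders invented for the example.

```python
import numpy as np

def scheffe_quadratic_design(X):
    """Design matrix for a quadratic Scheffé mixture model:
    columns are x_i and all pairwise products x_i * x_j (no intercept)."""
    n, q = X.shape
    cols = [X[:, i] for i in range(q)]
    cols += [X[:, i] * X[:, j] for i in range(q) for j in range(i + 1, q)]
    return np.column_stack(cols)

# made-up mixture proportions: (whey protein solution, orange juice, sweetener/acid blend)
X = np.array([
    [0.70, 0.29, 0.01],
    [0.60, 0.39, 0.01],
    [0.65, 0.33, 0.02],
    [0.55, 0.43, 0.02],
    [0.75, 0.24, 0.01],
    [0.62, 0.36, 0.02],
    [0.68, 0.30, 0.02],
])
y = np.array([7.1, 6.4, 7.8, 6.0, 6.9, 7.2, 7.6])   # made-up sensory acceptance scores

D = scheffe_quadratic_design(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)        # fit the mixture model
y_hat = D @ beta
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(beta, 2))
print("R^2:", round(1 - ss_res / ss_tot, 3))
```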

Keywords: crossover, orange beverage, protein modification, optimization

Procedia PDF Downloads 55
25068 Consumer Protection Law for Mobile Commerce Users as a Global Effort to Improve Business in Indonesia

Authors: Rina Arum Prastyanti

Abstract:

Information technology has changed the ways of transacting and enabled new opportunities in business transactions. Consumers of m-commerce face several problems: given the small screen and limited storage capacity, consumers have difficulty accessing full information about the products on offer and the forms of transactions; children need to be protected from various forms of excess supply and usage as well as from errors in accessing and disseminating personal data; and there are more complex problems concerning agreements and dispute resolution that can protect consumers and assure the security of personal data. No less important are the payment risks and the protection of payment-related personal information, which also require a solution. The purposes of this study are 1) to describe the phenomenon of the use of mobile commerce in Indonesia, 2) to determine the forms of legal protection for consumers using mobile commerce, and 3) to identify the right type of law to provide legal protection for consumers using mobile commerce. This is descriptive qualitative research based on normative law, using primary and secondary data sources collected through library research. The analysis technique used is deductive analysis. Growing mobile technology, more affordable prices and low rates resulting from provider competition have also increased the number of mobile users; Indonesia ranks fourth in the world in mobile phone users, with the number of mobile phones estimated at around 250.1 million for a population of 237,556,363. The Indonesian form of legal protection for the use of mobile commerce is still only a part of Law No. 11 of 2008 on Information and Electronic Transactions, and until now there is no rule of law that specifically regulates mobile commerce. A legal protection model that can be applied to protect consumers of mobile commerce should ensure that consumers get information about the potential security and privacy challenges they may face in m-commerce and the measures that can be used to limit the risk; encourage the development of security measures and built-in security features; encourage mobile operators to implement data security policies and measures to prevent unauthorized transactions; and provide redress methods that are appropriate in both timeliness and effectiveness when consumers suffer financial loss.

Keywords: mobile commerce, legal protection, consumer, effectiveness

Procedia PDF Downloads 351
25067 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method of applying Independent Topic Analysis (ITA) to a growing collection of documents. The number of documents has been increasing since the spread of the Internet, and ITA was proposed as one method to analyze such document data. ITA extracts independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing number of documents, because ITA must use all of the document data, so its temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts independent topics from an increasing number of documents. Incremental ITA updates the independent topics when new document data are added, after the independent topics have been extracted from the previous data. Finally, we show the results of applying Incremental ITA to benchmark datasets.
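ITA itself and its incremental update rule are not reproduced here; the sketch below only shows the underlying idea of extracting "independent topics" by applying ICA to a term-document representation, using scikit-learn's FastICA on a TF-IDF matrix. The toy corpus and the number of topics are illustrative assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA

docs = [
    "stock market prices fell as investors sold shares",
    "the football team won the championship game",
    "new vaccine trial shows strong immune response",
    "central bank raises interest rates to curb inflation",
    "the striker scored twice in the final match",
    "researchers report promising cancer drug results",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs).toarray()        # documents x terms matrix

ica = FastICA(n_components=3, random_state=0)
doc_topic = ica.fit_transform(X)             # document loadings on independent topics
topic_term = ica.components_                 # topic x term matrix

terms = np.array(vec.get_feature_names_out())
for k, row in enumerate(topic_term):
    top = terms[np.argsort(np.abs(row))[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")
```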

Keywords: text mining, topic extraction, independent, incremental, independent component analysis

Procedia PDF Downloads 292
25066 Open Data for e-Governance: Case Study of Bangladesh

Authors: Sami Kabir, Sadek Hossain Khoka

Abstract:

Open Government Data (OGD) refers to all data produced by government which are accessible in a reusable way, free of cost, by common people with access to the Internet. In line with the “Digital Bangladesh” vision of the Bangladesh government, the concept of open data has been gaining momentum in the country. Opening all government data in a digital and customizable format from a single platform can enhance e-governance, which will make government more transparent to the people. This paper presents an in-progress case study of an OGD portal by the Bangladesh Government intended to link decentralized data. The initiative aims to facilitate e-services for citizens through this one-stop web portal. The paper further discusses ways of collecting data in digital format from relevant agencies with a view to making it publicly available through this single point of access. Further, a possible layout of this web portal is presented.

Keywords: e-governance, one-stop web portal, open government data, reusable data, web of data

Procedia PDF Downloads 336
25065 Device for Reversible Hydrogen Isotope Storage with Aluminum Oxide Ceramic Case

Authors: Igor P. Maximkin, Arkady A. Yukhimchuk, Victor V. Baluev, Igor L. Malkov, Rafael K. Musyaev, Damir T. Sitdikov, Alexey V. Buchirin, Vasily V. Tikhonov

Abstract:

Minimization of tritium diffusion leakage when developing devices that handle tritium-containing media is a key problem whose solution will allow an essential enhancement of radiation safety and the minimization of diffusion losses of expensive tritium. One of the ways to solve this problem is to use Al₂O₃ high-strength non-porous ceramic as the structural material of the bed body. This alumina ceramic offers high strength characteristics, but its main advantages are low hydrogen permeability (compared with the structural materials currently used) and high dielectric properties. The latter enables direct induction heating of a hydride-forming metal without essential heating of the pressure and containment vessel. The use of alumina ceramic and induction heating allows: - essential reduction of tritium extraction time; - several orders of magnitude reduction of tritium diffusion leakage; - more complete extraction of tritium from metal hydrides due to its higher heating, up to melting, in the event of final disposal of the device. The paper presents computational and experimental results for a tritium bed designed to absorb 6 liters of tritium. Titanium was used as the hydrogen isotope sorbent. Results of hydrogen release kinetics from the hydride-forming metal, as well as strength and cyclic service life tests, are reported. Recommendations are also provided for the practical use of the given bed type.

Keywords: aluminum oxide ceramic, hydrogen pressure, hydrogen isotope storage, titanium hydride

Procedia PDF Downloads 388
25064 Numerical Investigation of Solid Subcooling on a Low Melting Point Metal in Latent Thermal Energy Storage Systems Based on Flat Slab Configuration

Authors: Cleyton S. Stampa

Abstract:

This paper addresses, through a numerical approach, the perspectives of using low melting point metals (LMPMs) as phase change materials (PCMs) in latent thermal energy storage (LTES) units. This is a new class of PCMs that has been one of the most promising alternatives for LTES, because these materials present high thermal conductivity and elevated heat of fusion per unit volume. The chosen type of LTES consists of several horizontal parallel slabs filled with PCM. The heat transfer fluid (HTF) circulates through the channel formed between each two consecutive slabs in a laminar regime by forced convection. The study deals with the LTES charging process (heat storing) using pure gallium as the PCM, and it considers heat conduction in the solid phase during melting driven by natural convection in the melt. The transient heat transfer problem is analyzed in one arbitrary slab under the influence of the HTF. The mathematical model to simulate the isothermal phase change is based on a volume-averaged enthalpy method, which is successfully verified by comparing its predictions with experimental data from works available in the pertinent literature. Regarding the convective heat transfer problem in the HTF, it is assumed that the flow is thermally developing, whereas the velocity profile is already fully developed. The study aims to determine the effect of solid subcooling on the melting rate through comparisons with the melting process of a solid that starts to melt from its fusion temperature. In order to best understand this effect in a metallic compound, as is the case of pure gallium, the study also evaluates, under the same conditions established for the gallium, the melting processes of commercial paraffin wax (an organic compound) and of calcium chloride hexahydrate (CaCl₂·6H₂O, an inorganic compound). The present work adopts the best options that have been established by several researchers in their parametric studies of this type of LTES, which lead to high values of thermal efficiency. Regarding the geometric aspects, the gap of the channel formed by two consecutive slabs and the thickness and length of the slab are considered; regarding the HTF, the type of fluid, the mass flow rate, and the inlet temperature are considered.
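The volume-averaged enthalpy method mentioned above is sketched below in a deliberately simplified form: one-dimensional, conduction-only melting of a slab with an isothermal phase change, explicit time stepping, and gallium-like placeholder properties. It omits the natural convection in the melt and the conjugate coupling with the HTF channel, so it is only meant to show how temperature and liquid fraction are recovered from the enthalpy field.

```python
import numpy as np

# Gallium-like placeholder properties (illustrative, not the paper's data)
rho, cp, k = 6000.0, 400.0, 30.0       # kg/m3, J/(kg K), W/(m K)
L_f, T_m   = 80_000.0, 29.8            # J/kg latent heat, degC melting point
alpha      = k / (rho * cp)

nx, length = 30, 0.01                  # nodes, slab half-thickness (m)
dx = length / nx
dt = 0.4 * dx**2 / alpha               # explicit stability limit (with margin)

T_wall, T_init = 45.0, 20.0            # hot wall (HTF side) and initial (subcooled) temperature
H = np.full(nx, rho * cp * T_init)     # volumetric enthalpy per node (J/m3)

def temperature_and_fraction(H):
    """Recover temperature and liquid fraction from the volumetric enthalpy."""
    H_s = rho * cp * T_m               # enthalpy at the start of melting
    H_l = H_s + rho * L_f              # enthalpy at the end of melting
    T = np.where(H < H_s, H / (rho * cp),
        np.where(H > H_l, T_m + (H - H_l) / (rho * cp), T_m))
    f = np.clip((H - H_s) / (rho * L_f), 0.0, 1.0)
    return T, f

t_end = 120.0                          # simulated charging time (s)
for _ in range(int(t_end / dt)):
    T, f = temperature_and_fraction(H)
    Tb = np.concatenate(([T_wall], T, [T[-1]]))     # Dirichlet hot wall, adiabatic far side
    H += dt * k * (Tb[2:] - 2.0 * Tb[1:-1] + Tb[:-2]) / dx**2
T, f = temperature_and_fraction(H)
print(f"after {t_end:.0f} s: mean liquid fraction = {f.mean():.2f}")
```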

Keywords: flat slab, heat storing, pure metal, solid subcooling

Procedia PDF Downloads 130
25063 Resource Framework Descriptors for Interestingness in Data

Authors: C. B. Abhilash, Kavi Mahesh

Abstract:

Human beings are the most advanced species on earth, largely because of the ability to communicate and share information via human language. In today's world, a huge amount of data is available on the web in text format. This has also resulted in the generation of big data in structured and unstructured formats. In general, the data is in textual form, which is highly unstructured. To get insights and actionable content from this data, we need to incorporate the concepts of text mining and natural language processing. In our study, we mainly focus on interesting data, from which interesting facts are generated for the knowledge base. The approach is to derive analytics from the text via the application of natural language processing. Using the semantic web Resource Description Framework (RDF), we generate triples from the given data and derive the interesting patterns. The methodology also illustrates data integration using RDF for reliable, interesting patterns.
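A minimal sketch of generating RDF triples from extracted facts with the Python rdflib package and then querying them; the example namespace, the toy facts, and the simple "interestingness" filter are illustrative assumptions, not the authors' knowledge base.

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/facts/")
g = Graph()
g.bind("ex", EX)

# toy facts extracted from text: (entity, attribute, value)
facts = [
    ("Bangalore", "population_millions", 12.3),
    ("Bangalore", "rank_it_exports", 1),
    ("Mysore", "population_millions", 0.9),
]
for city, attr, value in facts:
    s = EX[city]
    g.add((s, RDF.type, EX.City))          # type triple
    g.add((s, EX[attr], Literal(value)))   # attribute triple

# a simple "interestingness" query: entities that rank first in some attribute
q = """
SELECT ?city ?attr WHERE {
    ?city ?attr ?v .
    FILTER(?v = 1)
}
"""
for row in g.query(q):
    print(row.city, row.attr)
```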

Keywords: RDF, interestingness, knowledge base, semantic data

Procedia PDF Downloads 145
25062 Private Coded Computation of Matrix Multiplication

Authors: Malihe Aliasgari, Yousef Nejatbakhsh

Abstract:

The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data is dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are a few slow or delay-prone processors that can bottleneck the entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as a sufficient number of processors, depending on the minimum distance of the code, have completed their operations. Matrix-matrix multiplication on practically big data sets faces computational and memory-related difficulties, which is why such operations are carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y; this is a fundamental building block of many science and engineering fields such as machine learning, image and signal processing, wireless communication, and optimization. Non-secure and secure matrix multiplication are studied. We consider the setup in which the identity of the matrix of interest should be kept private from the workers, and we obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their task before the master server can recover the product W. We also consider the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by trivially concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
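The paper's PSGPD construction is not reproduced here; the sketch below shows only the basic ingredient of coded distributed computation it builds on: an MDS-style code over row blocks of X so that the product XY can be recovered from any k of n workers, tolerating n - k stragglers. The privacy and secrecy layers (secret sharing, PIR) are omitted, and all names and sizes are illustrative assumptions.

```python
import numpy as np

def encode(X, n, k):
    """Split X into k row blocks and combine them into n coded blocks with a
    Vandermonde generator, so any k workers' results suffice for decoding."""
    blocks = np.split(X, k, axis=0)                   # each block: (rows/k) x cols
    alphas = np.arange(1, n + 1, dtype=float)
    G = np.vander(alphas, k, increasing=True)         # n x k generator matrix
    return [sum(G[w, i] * blocks[i] for i in range(k)) for w in range(n)], G

def decode(worker_results, worker_ids, G, k):
    """Recover the k uncoded block products from any k workers' coded products."""
    Gk = G[worker_ids, :]                             # k x k submatrix, invertible
    coded = np.stack([worker_results[w] for w in worker_ids])
    uncoded = np.einsum("kw,wij->kij", np.linalg.inv(Gk), coded)
    return np.concatenate(list(uncoded), axis=0)

rng = np.random.default_rng(0)
X, Y = rng.normal(size=(8, 6)), rng.normal(size=(6, 5))
n, k = 5, 4                                           # 5 workers, any 4 suffice

coded_blocks, G = encode(X, n, k)
results = {w: coded_blocks[w] @ Y for w in range(n)}  # each worker multiplies its block by Y
survivors = [0, 2, 3, 4]                              # worker 1 is a straggler and never returns
W = decode(results, survivors, G, k)
print(np.allclose(W, X @ Y))                          # True: product recovered without worker 1
```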

Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers

Procedia PDF Downloads 107
25061 Data Mining Practices: Practical Studies on the Telecommunication Companies in Jordan

Authors: Dina Ahmad Alkhodary

Abstract:

This study aimed to investigate the practices of data mining in the telecommunication companies in Jordan, from the viewpoint of the respondents. In order to achieve the goal of the study and test the validity of the hypotheses, the researcher designed a questionnaire to collect data from managers and staff members of the main departments in the researched companies. The results show the stages of improvement of the telecommunications companies toward data mining.

Keywords: data, mining, development, business

Procedia PDF Downloads 478
25060 Voltage Stabilization of Hybrid PV and Battery Systems by Considering Temperature and Irradiance Changes in Standalone Operation

Authors: S. Jalilzadeh, S. M. Mohseni Bonab

Abstract:

Solar and battery energy storage systems are very useful for consumers who live in deprived areas and do not have access to electricity distribution networks. Nowadays, one of the problems of photovoltaic (PV) systems is the change of output power under temperature and irradiance variations, which directly affects the load connected to them. In this paper, considering that the solar array output varies with changes in temperature and solar radiation, a voltage stabilizer system for a load connected to a photovoltaic array is designed to stabilize the load voltage and to transfer surplus power to the battery. Also, in the proposed hybrid system, the required load power is supplied while maintaining voltage stabilization in standalone operation for an unbalanced AC load. The electrical energy storage system, used for voltage control and improvement of PV performance, is connected to the DC bus through a DC/DC converter. The load is fed through a DC/AC converter. When the DC bus voltage rises above its reference limit, the battery is charged by the photovoltaic array, and when it falls below its defined limit, power is injected into the DC bus by the battery. Keeping the DC bus voltage constant reduces the harmonics generated by the inverter. In addition, a series of filters is provided at the inverter output to reduce harmonics. The inverter control circuit is designed so that the voltage and frequency of the load remain almost constant under different load conditions. This paper focuses on the control strategies of the converters to improve their performance.
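A minimal sketch of the charge/discharge decision described above: when the DC bus voltage is above its upper reference, the surplus PV power charges the battery, and when it drops below the lower reference, the battery injects power back into the bus. The thresholds, time step, and lumped plant model are illustrative placeholders, not the paper's converter control design.

```python
# Simple threshold-based energy management for a PV/battery DC bus (illustrative only)
V_REF_HIGH, V_REF_LOW = 410.0, 390.0     # V, upper/lower DC bus voltage limits (placeholders)
BUS_CAPACITANCE = 0.01                   # F, lumped DC-link capacitance (placeholder)
DT = 0.001                               # s, control time step

def battery_command(v_bus, pv_power, load_power):
    """Decide battery power: >0 charging (absorb surplus), <0 discharging (support the bus)."""
    surplus = pv_power - load_power
    if v_bus > V_REF_HIGH and surplus > 0:
        return surplus                   # charge the battery with the PV surplus
    if v_bus < V_REF_LOW:
        return surplus if surplus < 0 else 0.0   # discharge to cover the deficit
    return 0.0

v_bus, soc = 400.0, 0.5
for step in range(5000):
    pv_power = 2000.0 if step < 2500 else 500.0      # irradiance drop halfway through
    load_power = 1500.0
    p_batt = battery_command(v_bus, pv_power, load_power)
    # net power into the DC-link capacitor changes the bus voltage
    p_net = pv_power - load_power - p_batt
    v_bus += (p_net / max(v_bus, 1.0)) * DT / BUS_CAPACITANCE
    soc += p_batt * DT / (3600.0 * 2000.0)           # 2 kWh battery (placeholder)
print(f"final bus voltage {v_bus:.1f} V, state of charge {soc:.2f}")
```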

Keywords: photovoltaic array (PV), DC/DC Boost converter, battery converter, inverters control

Procedia PDF Downloads 471
25059 Quantification of E-Waste: A Case Study in Federal University of Espírito Santo, Brazil

Authors: Andressa S. T. Gomes, Luiza A. Souza, Luciana H. Yamane, Renato R. Siman

Abstract:

The segregation of waste electrical and electronic equipment (WEEE) at the generating source, its quali-quantitative characterization and the identification of its origin, besides being integral parts of classification reports, are crucial steps for the success of its integrated management. The aim of this paper was to quantify WEEE generation at the Federal University of Espírito Santo (UFES), Brazil, as well as to identify sources, temporary storage sites, main transportation routes and destinations, the most generated types of WEEE and their recycling potential. Quantification of the WEEE generated at the University between 2010 and 2015 was performed using data provided by UFES's assets management sector. Information on EEE and WEEE flows on the campuses was obtained through questionnaires applied to University workers. A total of 6,028 units of data processing equipment WEEE disposed of by the university between 2010 and 2015 was recorded. Among this waste, the most generated items were CRT screens, desktops, keyboards and printers. Furthermore, it was observed that these WEEE are temporarily stored in inappropriate places on the University campuses. In general, these WEEE units are donated to NGOs of the city or sold through auctions (2010 and 2013). As for recycling potential, from the primary processing and subsequent sale of printed circuit boards (PCBs) from the computers, the amount collected could reach US$ 27,839.23. The results highlight the importance of a WEEE management policy at the University.

Keywords: solid waste, waste of electrical and electronic equipment, waste management, institutional solid waste generation

Procedia PDF Downloads 245
25058 The Impact of System and Data Quality on Organizational Success in the Kingdom of Bahrain

Authors: Amal M. Alrayes

Abstract:

Data and system quality play a central role in organizational success, and the quality of any existing information system has a major influence on the effectiveness of overall system performance. Given the importance of system and data quality to an organization, it is relevant to highlight their importance for organizational performance in the Kingdom of Bahrain. This research aims to discover whether system quality and data quality are related, and to study the impact of system and data quality on organizational success. A theoretical model based on previous research is used to show the relationship between data and system quality and organizational impact. We hypothesize, first, that system quality is positively associated with organizational impact, secondly that system quality is positively associated with data quality, and finally that data quality is positively associated with organizational impact. A questionnaire was conducted among public and private organizations in the Kingdom of Bahrain. The results show that there is a strong association between data and system quality, which affects organizational success.

Keywords: data quality, performance, system quality, Kingdom of Bahrain

Procedia PDF Downloads 475
25057 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization and planning due to a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology whose objective is to develop an SME-appropriate methodology for efficient, temporarily feasible data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth low energy (BLE)-based transmitters, so-called beacons, and smart mobile devices (SMD), e.g. smartphones, as receivers, between which distance data can be measured and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), which is a measure of the signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for the interpretation of relative movements of transmitters and receivers based on distance data. The main research topics are the selection and implementation of pattern recognition methods for automatic process recognition as well as methods for the visualization of relative distance data. Because the database is already categorized with respect to process types, classification methods (e.g. Support Vector Machines) from the field of supervised learning are used. The necessary data quality requires the selection of suitable methods as well as filters for smoothing the occurring signal variations of the RSSI, the integration of methods for the determination of correction factors depending on possible sources of signal interference (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models and methods used for visualizing the position profiles. The accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; this has already been proven in studies. Similar potential can be observed for parameter variation of the methods and filters for signal smoothing. Thus, there is increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connection: the automated assignment of defined process classes to distance data using selected classification algorithms, and the visualization as well as reporting by means of a graphical user interface (GUI).
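As background for the distance step described above, a common way to turn RSSI into a distance estimate is the log-distance path-loss model; the sketch below shows that conversion plus a simple moving-average smoothing of the RSSI, with placeholder calibration values (reference RSSI at 1 m and path-loss exponent) that in practice would come from the configuration measurements mentioned in the abstract.

```python
import numpy as np

RSSI_AT_1M = -59.0      # dBm measured at 1 m (placeholder calibration value)
PATH_LOSS_N = 2.2       # path-loss exponent (placeholder, environment dependent)

def smooth(rssi_series, window=5):
    """Simple moving-average filter to damp RSSI fluctuations."""
    kernel = np.ones(window) / window
    return np.convolve(rssi_series, kernel, mode="valid")

def rssi_to_distance(rssi):
    """Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 * n))."""
    return 10 ** ((RSSI_AT_1M - rssi) / (10.0 * PATH_LOSS_N))

# noisy RSSI readings from one beacon/smartphone pair
rng = np.random.default_rng(0)
true_distance = 4.0
true_rssi = RSSI_AT_1M - 10 * PATH_LOSS_N * np.log10(true_distance)
readings = true_rssi + rng.normal(0, 2.0, size=50)

estimates = rssi_to_distance(smooth(readings))
print(f"estimated distance: {estimates.mean():.2f} m (true {true_distance} m)")
```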

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 116
25056 Cloud Computing in Data Mining: A Technical Survey

Authors: Ghaemi Reza, Abdollahi Hamid, Dashti Elham

Abstract:

Cloud computing poses a diversity of challenges in data mining operations arising from the dynamic structure of data distribution, as opposed to the typical database scenarios used in conventional architectures. Due to the immense number of users seeking data on a daily basis, there are serious security concerns for cloud providers as well as for data providers who put their data in the cloud computing environment. Big data analytics uses compute-intensive data mining algorithms (Hidden Markov models, MapReduce parallel programming, the Mahout project, the Hadoop distributed file system, K-Means and K-Medoids, Apriori) that require efficient high-performance processors to produce timely results and to solve for or optimize the model parameters. The challenges this operation has to overcome are establishing successful transactions with the existing virtual machine environment and keeping the databases under control. Several factors have led to the shift from normal or centralized mining to distributed data mining. The approach is offered as SaaS, which uses multi-agent systems for implementing the different tasks of the system. There are still some problems in data mining based on cloud computing, including the design and selection of data mining algorithms.

Keywords: cloud computing, data mining, computing models, cloud services

Procedia PDF Downloads 460
25055 Cross-border Data Transfers to and from South Africa

Authors: Amy Gooden, Meshandren Naidoo

Abstract:

Genetic research and transfers of big data are not confined to a particular jurisdiction, but there is a lack of clarity regarding the legal requirements for importing and exporting such data. Using direct-to-consumer genetic testing (DTC-GT) as an example, this research assesses the status of data sharing into and out of South Africa (SA). While SA laws cover the sending of genetic data out of SA, prohibiting such transfer unless a legal ground exists, the position where genetic data comes into the country depends on the laws of the country from where it is sent – making the legal position less clear.

Keywords: cross-border, data, genetic testing, law, regulation, research, sharing, South Africa

Procedia PDF Downloads 115
25054 The Study of Security Techniques on Information System for Decision Making

Authors: Tejinder Singh

Abstract:

An information system (IS) is the flow of data from different levels in different directions for decision making and data operations. Data can be compromised in different ways, such as manual or technical errors, data tampering or loss of integrity. The security system of an IS, such as a firewall, is affected by such violations. The flow of data among the various levels of an information system is handled by the networking system. The data flows over the network in the form of packets or frames. To protect these packets from unauthorized access and virus attacks, and to maintain their integrity, network security is an important factor. To protect the data from being pirated, various security techniques are used. This paper presents the various security techniques and characterizes different harmful attacks with the help of detailed data analysis. This paper will be beneficial for organizations seeking to make their systems more secure, effective, and beneficial for future decision making.

Keywords: information systems, data integrity, TCP/IP network, vulnerability, decision, data

Procedia PDF Downloads 288
25053 Data Integration with Geographic Information System Tools for Rural Environmental Monitoring

Authors: Tamas Jancso, Andrea Podor, Eva Nagyne Hajnal, Peter Udvardy, Gabor Nagy, Attila Varga, Meng Qingyan

Abstract:

The paper deals with the conditions and circumstances of the integration of remotely sensed data for rural environmental monitoring purposes. The main task is to make decisions during the integration process when we have data sources with different resolutions, locations, spectral channels, and dimensions. In order to have exact knowledge about the integration and data fusion possibilities, it is necessary to know the properties (metadata) that characterize the data. The paper explains the joining of these data sources using their attribute data through a sample project. The resulting product will be used for rural environmental analysis.

Keywords: remote sensing, GIS, metadata, integration, environmental analysis

Procedia PDF Downloads 106
25052 Comparative Study of Stability of Crude and Purified Red Pigments of Pokeberry (Phytolacca Americana L.) Fruits

Authors: Nani Mchedlishvili, Nino Omiadze, Marine Abutidze, Jose Neptuno Rodriguez-Lopez, Tinatin Sadunishvili, Nikoloz Pruidze, Giorgi Kvesitadze

Abstract:

Recently, there has been increased interest in the development of natural food colorants as alternatives to synthetic dyes because of both legislative action and consumer concern. Betalains are widely used in the food industry as an alternative to synthetic colorants. The interest in betalains is due not only to their coloring effect but also to their beneficial properties. The aim of the work was to study the stability of the crude and purified red pigments of pokeberry (Phytolacca americana L.). The pokeberry fruit juice was filtered and concentrated by rotary vacuum evaporator up to 25%, and the concentrated juice was passed through a Sephadex G-25 (fine) column (20×1.1 cm). The pigment elution rate from the column was 18 ml/hr, and 1.5 ml fractions of pigment were collected. The coloring substances in the fractions were determined using CuSO₄·7H₂O as a standard. From the Sephadex G-25 column only one fraction of the betalain red pigment was eluted, with an absorption maximum at 538 nm. The degree of pigment purification was 1.6, and the pigment yield from the column was 15%. It was shown that the thermostability of the pokeberry fruit red pigment was significantly decreased after purification. For example, during incubation at 100°C for 10 min the crude pigment retained 98% of its color, while under the same conditions only 72% of the color of the purified pigment was retained. The purified pigment was also found to have lower storage stability. Storage of the initial crude juice and of the pigment fraction obtained after gel filtration for 10 days at 4°C showed color losses of 29% and 74%, respectively. From the results obtained, it can be concluded that during gel filtration the pokeberry fruit red pigment is separated from the substances that stabilize it in the crude juice.

Keywords: betalains, gelfiltration, pokeberry fruit, stability

Procedia PDF Downloads 278
25051 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year by 2020. The growth rate of the produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for example storing the data, searching for information, and finding hidden information. It is necessary to develop an analysis platform for genomics big data. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and forms the core of Big Data as a Service (BDaaS). Although many services, e.g. Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g. sequencing data. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential for the computational analysis of genomics big data, e.g. de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 285
25050 Forthcoming Big Data on Smart Buildings and Cities: An Experimental Study on Correlations among Urban Data

Authors: Yu-Mi Song, Sung-Ah Kim, Dongyoun Shin

Abstract:

Cities are complex systems of diverse and intertwined activities. These activities and their complex interrelationships create diverse urban phenomena, and such urban phenomena have a considerable influence on the lives of citizens. This research aimed to develop a method to reveal the causes and effects among diverse urban elements in order to enable a better understanding of urban activities and, from that understanding, to make better urban planning strategies. Specifically, this study was conducted to solve a data-recommendation problem found on a Korean public data homepage. First, a correlation analysis was conducted to find the correlations among random urban data. Then, based on the results of that correlation analysis, a weighted data network for each urban dataset was provided to users. It is expected that the weights of urban data thereby obtained will provide insights into cities and show how diverse urban activities influence each other and induce feedback.
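A minimal sketch of the first step described above: computing pairwise correlations among urban datasets and turning them into edge weights for a data network. The sample indicators and values below are made-up placeholders, not the Korean public-data records used in the study.

```python
import numpy as np
import pandas as pd

# made-up monthly observations for four urban indicators
rng = np.random.default_rng(0)
months = 24
traffic = rng.normal(100, 10, months)
air_pollution = 0.6 * traffic + rng.normal(0, 5, months)
energy_use = 0.4 * traffic + rng.normal(50, 8, months)
park_visits = rng.normal(30, 6, months)

data = pd.DataFrame({
    "traffic_volume": traffic,
    "air_pollution": air_pollution,
    "energy_use": energy_use,
    "park_visits": park_visits,
})

corr = data.corr()                     # Pearson correlation matrix
print(corr.round(2))

# edges of the weighted data network: pairs with |r| above a threshold
threshold = 0.5
pairs = [(a, b, round(corr.loc[a, b], 2))
         for i, a in enumerate(corr.columns)
         for b in corr.columns[i + 1:]
         if abs(corr.loc[a, b]) >= threshold]
print(pairs)
```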

Keywords: big data, machine learning, ontology model, urban data model

Procedia PDF Downloads 397
25049 Low Impact Development Strategies Applied in the Water System Planning in the Coastal Eco-Green Campus

Authors: Ying Li, Zaisheng Hong, Weihong Wang

Abstract:

With the rapid enlargement of Chinese universities, newly built campuses have been springing up everywhere in recent years. Building eco-green campuses is urgent because the role of higher education institutions in the transition to a more sustainable society has been highlighted for almost three decades. Since a new campus is usually built on an undeveloped site, where the basic infrastructure is not complete, finding proper strategies for the planning and design of the campus becomes a primary concern. Low Impact Development (LID) options have been proposed as an alternative approach to make better use of rainwater in the planning and design of an undeveloped site. On the basis of analyzing the natural circumstances, geographic conditions, and other relevant information, four main LID approaches are coordinated in this study of Hebei Union University: 'Storage', 'Retaining', 'Infiltration' and 'Purification'. 'Storage' refers to a big central lake on the campus for rainwater harvesting. 'Retaining' means rainwater gardens scattered across the campus, also known as bioretention areas, which mimic naturally created pools of water to decrease surface runoff. 'Infiltration' is realized with grassed swales, which also serve as floodway channels. 'Purification' refers to either natural or artificial wetlands that reduce pollutants such as nitrogen and phosphorus in the water body. With the above-mentioned measures for the integrated use of rainwater in the acid and alkali area of the coastal district, an eco-green campus and ecological sustainability will be realized, which will provide further enlightenment and reference.

Keywords: newly built campus, low impact development, planning design, rainwater reuse

Procedia PDF Downloads 235
25048 Data-driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical of established businesses than of early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. Here, we develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in its form, encompassing startup data analytics enablers and metrics that align with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship

Procedia PDF Downloads 304
25047 Low-Cost Reusable Thermal Energy Storage Particle for Concentrating Solar Power

Authors: Kyu Bum Han, Eunjin Jeon, Kimberly Watts, Brenda Payan Medina

Abstract:

Gen3 Concentrating Solar Power (CSP) high-temperature thermal systems have the potential to lower the cost of a CSP system. Compared to the other systems (chloride salt blends and supercritical fluids), the particle transport system can avoid many of the issues associated with high-temperature fluid systems because of its ability to operate at ambient pressure with limited corrosion or thermal stability risk. Furthermore, identifying and demonstrating low-cost particles that have excellent optical properties and durability can significantly reduce the levelized cost of electricity (LCOE) of particle receivers. The thermal transfer particle currently available in the literature and on the market oxidizes at about 700°C, which reduces its durability, generates particle loss through high friction loads, and causes color change. To meet the CSP SunShot goal, the durability of particles must be improved by identifying particles that are less abrasive to other structural materials. Furthermore, the particles must be economically affordable, and their solar absorptance must be increased while minimizing thermal emittance. We are studying a novel thermal transfer particle which has low cost, high durability, and high solar absorptance at high temperatures. The particle minimizes thermal emittance and will be less abrasive to other structural materials. Additionally, the particle demonstrates reusability, which significantly lowers the LCOE. This study will contribute to two principal disciplines of energy science: materials synthesis and manufacturing. Developing this particle for thermal transfer will have a positive impact on ceramic research and industry as well as on society.

Keywords: concentrating solar power, thermal energy storage, particle, reusability, economics

Procedia PDF Downloads 212