Search results for: data protection laws
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27076


25636 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Now, twenty-first-century science in the form of Big Data is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearances and, in doing so, makes context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 415
25635 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data which barely reduce the confidence intervals of the estimated parameters and can thus be neglected. Based on sensitivity analysis, we solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem stating that the integrated data approach is less precise than the full data case is proven; i.e., the data set represented by the FRAP recovery curve leads to larger confidence intervals than the spatio-temporal (full) data.
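
As a rough illustration of the sensitivity-based reasoning described above, the sketch below compares asymptotic confidence-interval half-widths for a full and a reduced data set. It uses a hypothetical single-exponential recovery curve as a stand-in for the paper's FRAP model; the model, parameter names, and noise level are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch: sensitivity-based confidence intervals for a toy recovery model.
import numpy as np

def model(t, p):
    f0, f_inf, k = p                        # pre-bleach offset, plateau, recovery rate
    return f_inf - (f_inf - f0) * np.exp(-k * t)

def confidence_halfwidths(t, p, sigma, eps=1e-6):
    """Asymptotic 95% CI half-widths from the sensitivity (Jacobian) matrix."""
    S = np.empty((t.size, len(p)))
    for j in range(len(p)):                 # finite-difference sensitivities dy/dp_j
        dp = np.array(p, float)
        dp[j] += eps
        S[:, j] = (model(t, dp) - model(t, p)) / eps
    cov = sigma**2 * np.linalg.inv(S.T @ S)     # Fisher-information-based covariance
    return 1.96 * np.sqrt(np.diag(cov))

p_true = (0.2, 0.9, 0.8)
t_full = np.linspace(0.1, 20.0, 200)        # full data set
t_sub = t_full[::10]                        # candidate reduced ("relevant") data set
print(confidence_halfwidths(t_full, p_true, sigma=0.02))
print(confidence_halfwidths(t_sub, p_true, sigma=0.02))    # wider intervals expected
```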

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 273
25634 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management

Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang

Abstract:

Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the magnitude and complexity of this data. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.

Keywords: construction supply chain management, BIM, data exchange, artificial intelligence

Procedia PDF Downloads 15
25633 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is believed to be an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques in dealing with uncertain data, specifically those which handle the uncertain data condition by minimizing the loss of compression properties.

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 423
25632 Chemical Warfare Agent Simulant by Photocatalytic Filtering Reactor: Effect of Operating Parameters

Authors: Youcef Serhane, Abdelkrim Bouzaza, Dominique Wolbert, Aymen Amin Assadi

Abstract:

Throughout history, the use of chemical weapons has not been limited to combat between army corps; such weapons are also found in highly targeted intelligence operations (political assassinations), organized crime, and terrorist organizations. To improve the speed of response, important technological devices have been developed in recent years, in particular in the field of protection and decontamination techniques, to better protect against and neutralize a chemical threat. In order to assess protective or decontaminating technologies, or to improve medical countermeasures, tests must be conducted. In view of the great toxicity of real chemical warfare agents, simulants can be used, chosen according to the desired application. Here, we present an investigation into the use of a photocatalytic filtering reactor (PFR) for highly contaminated environments containing diethyl sulfide (DES). This target pollutant is used as a simulant of a CWA, namely Yperite (mustard gas). The influence of the inlet concentration, up to high concentrations of DES (1200 ppmv, i.e., 5 g/m³ of air), has been studied. The conversion rate was also monitored under different relative humidities and different flow rates (respiratory flow; standards ISO/DIS 8996 and NF EN 14387 + A1). In order to understand the efficacy of pollutant neutralization by the PFR, a kinetic model based on the Langmuir-Hinshelwood (L-H) approach and taking into account the mass transfer step was developed. This allows us to determine the adsorption and kinetic degradation constants free of the influence of mass transfer. The obtained results confirm that this compact reactor configuration is an extremely promising way of using photocatalysis to treat highly contaminated environments containing real chemical warfare agents. They could also lead to an individual protection device (an autonomous cartridge for a gas mask).
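
The Langmuir-Hinshelwood rate law mentioned above can be illustrated with a short fitting sketch. The concentrations and rates below are synthetic, and the mass-transfer step included in the paper's model is omitted; only the form r = kKC/(1+KC) is shown.

```python
# Minimal sketch of fitting a Langmuir-Hinshelwood rate law to degradation data
# (synthetic values; the paper's model also includes a mass-transfer step, omitted here).
import numpy as np
from scipy.optimize import curve_fit

def lh_rate(C, k, K):
    """L-H rate r = k*K*C / (1 + K*C): kinetic constant k, adsorption constant K."""
    return k * K * C / (1.0 + K * C)

C = np.array([50, 100, 200, 400, 800, 1200], dtype=float)   # inlet DES (ppmv), synthetic
r = np.array([4.1, 7.2, 11.0, 14.6, 17.1, 18.0])            # degradation rate, synthetic
(k, K), _ = curve_fit(lh_rate, C, r, p0=(20.0, 0.005))
print(f"k = {k:.2f}, K = {K:.4f}")
```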

Keywords: photocatalysis, photocatalytic filtering reactor, diethylsulfide, chemical warfare agents

Procedia PDF Downloads 101
25631 Study and Modeling of Flood Watershed in Arid and Semi Arid Regions of Algeria

Authors: Belagoune Fares, Boutoutaou Djamel

Abstract:

The study on floods in Algeria established by the National Agency of Water Resources (ANRH) shows that the country is confronted with the phenomenon of very destructive flash floods, especially in arid and semi-arid regions. Flooding of rivers in these areas is less well known. These floods are characterized by their suddenness (rain showers, thunderstorms); the duration of the flood is of the order of minutes to hours. The human and material damage caused by these floods is still high. The study area encompasses three watersheds in the semi-arid and arid southeast of Algeria: the Chott-Melghir basin (68,751 km²), the Constantine highlands basin 07 (9,578 km²), and the El Hodna basin 05 (25,843 km²). The total area of this zone is about 104,500 km². Studies of protection against floods and design studies of hydraulic structures (spillways, storm basins, etc.) require raw data that are often unavailable in many places, particularly for the ungauged wadis of these areas. This creates great difficulty for planners and managers working in the field of hydraulic studies. The objective of this study is to propose a methodology for determining flood flows in the absence of observations for ungauged wadis in the semi-arid and arid southeast of Algeria.

Keywords: flood, watershed, specific flow, coefficient of variation, arid

Procedia PDF Downloads 503
25630 Simultaneous Extraction and Estimation of Steroidal Glycosides and Aglycone of Solanum

Authors: Karishma Chester, Sarvesh Paliwal, Sayeed Ahmad

Abstract:

Solanum nigrum L. (family Solanaceae) is an important Indian medicinal plant and has been used in various traditional formulations for hepato-protection. It has been reported to contain significant amounts of steroidal glycosides such as solamargine and solasonine as well as their aglycone, solasodine. Being important pharmacologically active metabolites of several members of the Solanaceae, these markers have been the subject of several extraction and quantification attempts, but separately for the glycoside and aglycone parts because of their opposite polarities. Here, we propose for the first time the simultaneous extraction and quantification of the aglycone (solasodine) and the glycosides (solamargine and solasonine) in leaves and berries of S. nigrum using solvent extraction followed by HPTLC analysis. Simultaneous extraction was carried out by sonication in a mixture of chloroform and methanol as solvent. The quantification was done using silica gel 60 F254 HPTLC plates as the stationary phase and chloroform:methanol:acetone:0.5% ammonia (7:2.5:1:0.4 v/v/v/v) as the mobile phase, with detection at 400 nm after derivatization with anisaldehyde-sulfuric acid reagent. The method was validated as per ICH guidelines for calibration, linearity, precision, recovery, robustness, specificity, LOD, and LOQ. The statistical data obtained for validation showed that the method can be used routinely for quality control of the various solanaceous drugs reported to contain these markers, as well as of traditional formulations containing these plants as an ingredient.

Keywords: solanum nigrum, solasodine, solamargine, solasonine, quantification

Procedia PDF Downloads 328
25629 Data Mining As A Tool For Knowledge Management: A Review

Authors: Maram Saleh

Abstract:

Knowledge has become an essential resource in today's economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through the knowledge management stages of knowledge creation, storage, sharing, and use. Research on data mining has continued to grow over recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown, but useful knowledge, and is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management, and then focus on introducing some applications of data mining techniques in knowledge management for real-life domains.

Keywords: data mining, knowledge management, knowledge discovery, knowledge creation

Procedia PDF Downloads 202
25628 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies. It is the process of grouping data into clusters such that objects within a cluster are as similar as possible, while objects in different clusters are as dissimilar as possible. Many of the traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used, with its attendant advantages. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of soft clustering in which each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets using the FKM clustering algorithm. A significant feature of the study is that, in contrast to many anomaly detection algorithms, the FKM clustering algorithm determines anomalies together with their degree of abnormality. According to the results, the FKM clustering algorithm showed good performance in detecting anomalies, in data containing both a single anomaly and multiple anomalies.
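
A minimal sketch of the fuzzy k-modes idea applied to anomaly scoring is given below; it is an illustrative implementation on a toy categorical data set, not the authors' code, and the abnormality score (one minus the largest membership) is one simple choice among several.

```python
# Minimal sketch of fuzzy k-modes anomaly scoring on toy categorical data.
import numpy as np

def hamming(x, mode):
    """Simple matching dissimilarity between a record and a cluster mode."""
    return np.sum(x != mode)

def memberships(X, modes, m=1.5):
    """Fuzzy membership of each record in each cluster (standard FKM update)."""
    U = np.zeros((len(X), len(modes)))
    for i, x in enumerate(X):
        d = np.array([hamming(x, z) for z in modes], dtype=float)
        if np.any(d == 0):                      # record equals a mode: crisp membership
            U[i, d == 0] = 1.0 / np.sum(d == 0)
        else:
            w = d ** (-1.0 / (m - 1))
            U[i] = w / w.sum()
    return U

def update_modes(X, U, m=1.5):
    """New mode = per-attribute category with the largest membership-weighted count."""
    k, n_attr = U.shape[1], X.shape[1]
    modes = np.empty((k, n_attr), dtype=X.dtype)
    for l in range(k):
        for j in range(n_attr):
            cats = np.unique(X[:, j])
            w = [np.sum(U[X[:, j] == c, l] ** m) for c in cats]
            modes[l, j] = cats[int(np.argmax(w))]
    return modes

# Toy categorical data: two clear groups plus one odd record (the anomaly).
X = np.array([list("aab"), list("aab"), list("abb"),
              list("ccd"), list("ccd"), list("cdd"),
              list("xyz")])
modes = X[[0, 3]].copy()                        # crude initialization with k = 2
for _ in range(10):                             # alternate membership / mode updates
    U = memberships(X, modes)
    modes = update_modes(X, U)
U = memberships(X, modes)                       # memberships for the final modes

score = 1.0 - U.max(axis=1)                     # low max-membership = high abnormality
print(np.round(score, 3))                       # the last record scores highest
```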

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 48
25627 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud, computing without decrypting the encrypted data. This meets the aspiration for a computational encryption model that can enhance the security of big data with respect to privacy, confidentiality, availability, and integrity of the data, as well as user security. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, that can easily be integrated and leveraged in a Cloud computing interface, together with detailed theoretical mathematical concepts for fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographic security algorithm.
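
To illustrate the idea of computing on ciphertexts without decryption, the sketch below uses a toy additively homomorphic (Paillier-style) scheme with small primes; it is not the fully homomorphic scheme discussed in the paper and is in no way secure, but it shows how an operation on ciphertexts maps to addition of plaintexts.

```python
# Toy Paillier-style additively homomorphic encryption (illustration only, Python 3.8+).
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

p, q = 293, 433                 # toy primes; real schemes use very large primes
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)     # modular inverse of L(g^lambda mod n^2) mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(12), encrypt(30)
# Multiplying ciphertexts adds the underlying plaintexts: the service never decrypts.
assert decrypt((c1 * c2) % n2) == 42
print("homomorphic sum:", decrypt((c1 * c2) % n2))
```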

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 477
25626 De-Commoditisation of Food: How Organic Farmers from the Madrid Region Reconnect Products and Places through Web Marketing

Authors: Salvatore Pinna

Abstract:

The growth of organic farming practices in the last few decades continues to stimulate the international debate about this alternative food market. As part of a PhD research project on embeddedness in Alternative Food Networks (AFNs), this paper focuses on the promotional aspects of organic farm websites from the Madrid region. As a theoretical tool, some knowledge categories drawn from the geographical studies literature are used to classify the many ideas expressed in the web pages. By analysing the texts and pictures of 30 websites, the study aims to question how and to what extent actors from the organic world communicate to potential customers their personal beliefs about farming practices, product qualities, and ecological and social benefits. Moreover, the paper raises the question of whether organic farming laws and regulations lack completeness regarding the social and cultural aspects of food.

Keywords: alternative food networks, de-commoditisation, organic farming, madrid, reconnection of food

Procedia PDF Downloads 348
25625 Gender Discrimination and Wellbeing in Family Sphere Due to Male Migration and Remittances: A Study of Doaba Region of Punjab

Authors: Atinder Pal Kaur

Abstract:

A central characteristic of people is their movement from one place to another for their sustenance. Human migration has become one of the most challenging issues faced by the world today. Migration represents an important dimension in the worldwide setting, and remittances received by families constitute a major agent in integrating societies all over the world, both economically and socially. This paper is an attempt to explore the impact of male migration and remittances upon the family system. It brings out how women play the role of head of the household and take economic decisions, yet still face discrimination in the family, which brings loneliness and emotional breakdown on the personal front. For the purpose of this study, data were collected using 30 interviews and 10 case studies in the Doaba region of Punjab. The respondents, women whose husbands had migrated abroad, were classified into two age groups: 20-35 years and above 40 years. The findings of this study revealed that even though the women were taking some of the economic decisions, in the majority of cases the patriarchal structure still existed and power remained in the hands of their husbands or in-laws. It was found that women of different age groups reported differently in terms of the authority they have regarding remittances and its consequences for their emotional well-being. The distinction related to their participation in public and private spheres still exists, and public spheres are mostly dominated by male members of the family. It can be concluded that the freedom of women to take decisions on their own is still restricted, and they are expected to follow their husband's or in-laws' opinion in matters related to both public and private spheres. However, women in the older age group enjoyed more independence and freedom to take decisions than younger women. Loneliness and depression were more common among the younger respondents than among the older women.

Keywords: gender discrimination, migration, patriarchal structure, remittances

Procedia PDF Downloads 261
25624 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
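
A minimal sketch of a pixel-value approach is given below, assuming a colour legend that maps rainfall-map pixels to rainfall amounts; the legend, pixel samples, and gauge values are hypothetical, since the paper does not give them.

```python
# Minimal sketch of mapping rainfall-map pixel values to rainfall and checking RMSE.
import numpy as np

# Hypothetical legend: colour (R, G, B) of a rainfall-map pixel -> rainfall (mm).
LEGEND = {(200, 200, 255): 1.0, (100, 100, 255): 10.0,
          (0, 0, 255): 35.0, (255, 0, 0): 90.0}

def pixel_to_rain(rgb):
    """Assign the rainfall of the nearest legend colour to a pixel value."""
    colours = np.array(list(LEGEND.keys()), dtype=float)
    dist = np.linalg.norm(colours - np.array(rgb, dtype=float), axis=1)
    return list(LEGEND.values())[int(np.argmin(dist))]

# Pixels sampled at gauge locations and the gauges' measured rainfall (both made up).
pixels = [(198, 201, 250), (0, 5, 250), (255, 10, 5)]
gauges = np.array([2.1, 33.0, 86.5])
estimates = np.array([pixel_to_rain(p) for p in pixels])
rmse = np.sqrt(np.mean((estimates - gauges) ** 2))
print(estimates, round(rmse, 3))
```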

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 383
25623 Optimal Placement of Phasor Measurement Units Using Gravitational Search Method

Authors: Satyendra Pratap Singh, S. P. Singh

Abstract:

This paper presents a methodology using the Gravitational Search Algorithm for the optimal placement of Phasor Measurement Units (PMUs) in order to achieve complete observability of the power system. The objective of the proposed algorithm is to minimize the total number of PMUs at the power system buses, which in turn minimizes the installation cost of the PMUs. In this algorithm, the searcher agents are a collection of masses which interact with each other according to Newton's laws of gravity and motion. This Gravitational Search Algorithm based method has been applied to the IEEE 14-bus, IEEE 30-bus, and IEEE 118-bus test systems. Case studies reveal that the proposed method obtains the optimal number of PMUs with better observability.
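
The sketch below illustrates the kind of fitness function such a placement metaheuristic minimizes: the PMU count plus a penalty for unobserved buses, where a bus is observable if it hosts a PMU or neighbours one. The 5-bus adjacency matrix is illustrative, not one of the IEEE test systems.

```python
# Minimal sketch of the fitness evaluation used inside a metaheuristic such as GSA.
import numpy as np

# Bus connectivity matrix A: A[i][j] = 1 if buses i and j are connected or i == j.
A = np.array([[1, 1, 0, 0, 1],
              [1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [0, 0, 1, 1, 1],
              [1, 0, 0, 1, 1]])

def fitness(placement, penalty=100):
    """Number of PMUs plus a penalty for every unobserved bus.

    A bus is observed if it hosts a PMU or is adjacent to a PMU-equipped bus;
    the search algorithm looks for the placement minimizing this value.
    """
    observed = A @ placement >= 1
    return placement.sum() + penalty * np.sum(~observed)

print(fitness(np.array([0, 1, 0, 1, 0])))   # PMUs at buses 2 and 4 observe all buses
print(fitness(np.array([1, 0, 0, 0, 0])))   # bus 1 alone leaves buses 3 and 4 unobserved
```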

Keywords: gravitational search algorithm (GSA), law of motion, law of gravity, observability, phasor measurement unit

Procedia PDF Downloads 500
25622 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks using Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
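
The cryptographic hashing for data lineage mentioned above can be sketched as a simple hash chain: each record commits to its payload and to the previous record's hash, so any tampering breaks verification. This is an illustration of the lineage idea only, not the platform's Cosmos/Polkadot implementation.

```python
# Minimal sketch of hash-chained records for tamper-evident data lineage.
import hashlib
import json

def record_hash(payload, prev_hash):
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append(chain, payload):
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"payload": payload, "prev": prev, "hash": record_hash(payload, prev)})

def verify(chain):
    prev = "0" * 64
    for rec in chain:
        if rec["prev"] != prev or rec["hash"] != record_hash(rec["payload"], prev):
            return False
        prev = rec["hash"]
    return True

chain = []
append(chain, {"record": "anon-17", "event": "lab result ingested"})
append(chain, {"record": "anon-17", "event": "result amended"})
print(verify(chain))                         # True: lineage intact
chain[0]["payload"]["event"] = "tampered"
print(verify(chain))                         # False: lineage broken
```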

Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 12
25621 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we considered and applied parametric modeling to some experimental data from a dynamical system. In this study, we investigated different distributions of output measurements from some dynamical systems. Also, by processing the variance of the experimental data, we obtained the region of nonlinearity in the data, and identification of the output section was then applied for different situations and data distributions. Finally, the effect of the spread of the measurements, such as the variance, on identification, and the limitations of this approach, are explained.

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 507
25620 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build Machine Learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around as well as a cloud based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing or R Services on premise or in the cloud, users can leverage the power of R at scale without having to move their data around.

Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R

Procedia PDF Downloads 370
25619 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand shows different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified influencing factors in relation to trust, problematic aspects of the current approach are verified using interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and show potential research areas.

Keywords: trust, data mining, CRISP DM, stakeholder management

Procedia PDF Downloads 90
25618 Factors of Adoption of the International Financial Reporting Standard for Small and Medium Sized Entities

Authors: Uyanga Jadamba

Abstract:

Globalisation of the world economy has necessitated the development and implementation of a comparable and understandable reporting language suitable for use by all reporting entities. The International Accounting Standards Board (IASB) provides an international reporting language that lets all users understand the financial information of their business and potentially allows them to have access to finance at an international level. The study is based on logistic regression analysis to investigate the factors behind the adoption of the International Financial Reporting Standard for Small and Medium-sized Entities (IFRS for SMEs). The study started with a list of 217 countries from World Bank data. Due to the lack of availability of data, the final sample consisted of 136 countries, including 60 countries that have adopted the IFRS for SMEs and 76 countries that have not adopted it yet. The study covered the period from 2010 to 2020 and obtained 1,360 observations. The findings confirm that the adoption of the IFRS for SMEs is significantly related to the existence of national reporting standards, law enforcement quality, a common law legal system, and the extent of disclosure. The likelihood of adoption of the IFRS for SMEs decreases if the country already has a national reporting standard for SMEs, which suggests that the implementation and transitional costs of changing reporting standards are relatively high. The results further suggest that adoption of the new standard is easier in countries with constructive law enforcement and effective application of laws. The findings also show that adoption increases if countries have a common law system, which suggests that efficient reporting regulations are more widespread in these countries. Countries with a high extent of disclosure of their financial information are more likely to adopt the standard than others. The findings lastly show that audit quality and primary education level have no significant impact on adoption. One possible explanation for this could be that accounting professionals in developing countries lacked complete knowledge of the international reporting standards even though there was a requirement to comply with them. The study contributes to the literature by providing factors that impact the adoption of the IFRS for SMEs. It helps policymakers to better understand and apply the standard to improve the transparency of financial statements. The benefit of adopting the IFRS for SMEs is significant due to the relaxed and tailored reporting requirements for SMEs, the reduced burden on professionals to comply with the standard, and the transparent financial information provided to gain access to finance. The results of the study are useful to emerging economies where SMEs dominate the economy, in informing their evaluation of the adoption of the IFRS for SMEs.
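
The logistic regression setup can be sketched as below; the country-level indicators and outcomes are synthetic stand-ins for the study's 136-country panel, generated only so that the code runs end to end.

```python
# Minimal sketch of relating country-level factors to IFRS for SMEs adoption.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 136
df = pd.DataFrame({
    "national_standard": rng.integers(0, 2, n),   # existing national SME standard (0/1)
    "law_enforcement":   rng.normal(0, 1, n),     # law enforcement quality index
    "common_law":        rng.integers(0, 2, n),   # common-law legal system (0/1)
    "disclosure":        rng.normal(0, 1, n),     # extent-of-disclosure index
})
# Synthetic outcome consistent with the reported direction of the effects.
logit = (-1.5 * df.national_standard + 0.8 * df.law_enforcement
         + 0.7 * df.common_law + 0.6 * df.disclosure)
df["adopted"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(df.drop(columns="adopted"), df["adopted"])
print(dict(zip(df.columns[:-1], np.round(model.coef_[0], 2))))
```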

Keywords: IFRS for SMEs, international financial reporting standard, adoption, institutional factors

Procedia PDF Downloads 77
25617 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for secure, reliable, and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay, and destination nodes. Big data has to be transmitted from the source to the relay and from the relay to the destination, with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data-transmitting region, segmenting the selected region, determining the probability ratio for each node (capture, non-capture, and eavesdropper node) in every segment, and evaluating the probability using binary-based evaluation. If the transmission is secure, the two-hop transmission of big data resumes; otherwise, the attackers are prevented by the cooperative jamming scheme and the data is then transmitted over the two hops.

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 483
25616 Towards Accurate Velocity Profile Models in Turbulent Open-Channel Flows: Improved Eddy Viscosity Formulation

Authors: W. Meron Mebrahtu, R. Absi

Abstract:

Velocity distribution in turbulent open-channel flows is organized in a complex manner. This is due to the large spatial and temporal variability of fluid motion resulting from the free-surface turbulent flow condition. This phenomenon is further complicated by the complex geometry of channels and the presence of transported solids. Thus, several efforts have been made to understand the phenomenon and obtain accurate mathematical models that are suitable for engineering applications. However, predictions are inaccurate because oversimplified assumptions are involved in modeling this complex phenomenon. Therefore, the aim of this work is to study velocity distribution profiles and obtain simple, more accurate, and predictive mathematical models. Particular focus will be placed on acceptable simplifications of the general transport equations and an accurate representation of eddy viscosity. A wide rectangular open channel seems suitable to begin the study; other assumptions are smooth-wall and sediment-free flow under steady and uniform flow conditions. These assumptions allow examining the effect of the bottom wall and the free surface only, which is a necessary step before dealing with more complex flow scenarios. For this flow condition, two ordinary differential equations are obtained for the velocity profile: one from the Reynolds-averaged Navier-Stokes (RANS) equation and one from the equilibrium between turbulent kinetic energy (TKE) production and dissipation. Then different analytic models for eddy viscosity, TKE, and mixing length were assessed. Computed velocity profiles were compared to experimental data for different flow conditions and to the well-known linear, log, and log-wake laws. Results show that the model based on the RANS equation provides more accurate velocity profiles. In the viscous sublayer and buffer layer, the method based on Prandtl's eddy viscosity model and the Van Driest mixing length gives a more precise result. For the log layer and outer region, a mixing length equation derived from Von Karman's similarity hypothesis provides the best agreement with measured data, except near the free surface where an additional correction based on a damping function for eddy viscosity is used. This method yields more accurate velocity profiles with the same value of the damping coefficient, which is valid under different flow conditions. This work continues with the investigation of narrow channels, complex geometries, and the effect of solids transported in sewers.
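
A minimal sketch of the RANS-based approach with a Van Driest damped mixing length is shown below; it uses a generic textbook formulation with illustrative constants and a simple Euler integration, not the paper's calibrated model or its free-surface correction.

```python
# Minimal sketch: integrate du/dy from tau = (nu + nu_t) du/dy with a damped mixing length.
import numpy as np

kappa, A_plus = 0.41, 26.0          # Von Karman constant, Van Driest damping constant
u_tau, h, nu = 0.01, 0.1, 1e-6      # friction velocity (m/s), depth (m), viscosity (m^2/s)

ny = 20000
y = np.linspace(1e-6, h, ny)
dy = y[1] - y[0]
u = np.zeros(ny)

for i in range(1, ny):
    yi = y[i - 1]
    tau = u_tau**2 * (1.0 - yi / h)                      # linear total-shear distribution
    y_plus = yi * u_tau / nu
    lm = kappa * yi * (1.0 - np.exp(-y_plus / A_plus))   # Van Driest damped mixing length
    if lm > 0.0:
        # Solve tau = (nu + lm^2 |du/dy|) du/dy for du/dy (positive root of the quadratic).
        dudy = (-nu + np.sqrt(nu**2 + 4.0 * lm**2 * tau)) / (2.0 * lm**2)
    else:
        dudy = tau / nu
    u[i] = u[i - 1] + dudy * dy                          # explicit Euler integration

print(u[-1] / u_tau)        # dimensionless velocity u/u_tau at the free surface
```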

Keywords: accuracy, eddy viscosity, sewers, velocity profile

Procedia PDF Downloads 108
25615 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today's modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but is mostly in an unreadable format which needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical data. The data is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers make use of various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, this keeps the database busy and locked for the period that the processing takes place. Because of this, it decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU, storage, and processing-time performance.
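
A minimal sketch of the three-step pull-process-push technique is shown below, using an in-memory SQLite table as a stand-in for the database server and a trivial hex decode as the processing step; the point is only that the slow work happens in memory between a short read and a short batched write.

```python
# Minimal sketch contrasting single-step processing with pull-process-push in three steps.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE telemetry (id INTEGER PRIMARY KEY, raw TEXT, decoded TEXT)")
con.executemany("INSERT INTO telemetry (raw) VALUES (?)",
                [("4C41543A2D32362E32",), ("4C4F4E3A32382E31",)])   # hex-encoded payloads
con.commit()

def decode(raw_hex):
    return bytes.fromhex(raw_hex).decode("ascii")    # the (cheap) stand-in processing step

# Step 1 (pull): read all rows into an in-memory list, then let go of the cursor.
rows = list(con.execute("SELECT id, raw FROM telemetry"))

# Step 2 (process): do the slow work entirely in memory, keeping the database free.
processed = [(decode(raw), row_id) for row_id, raw in rows]

# Step 3 (push): write the results back in one short batched transaction.
con.executemany("UPDATE telemetry SET decoded = ? WHERE id = ?", processed)
con.commit()

print(con.execute("SELECT decoded FROM telemetry").fetchall())
```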

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 238
25614 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). By considering two sets of data (raw data and simulated data) and two models of the GEV distribution (stationary and non-stationary), return level analysis is carried out, and it was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show a sign of increase in the future, there is a maximum temperature that is not exceeded. The results of this paper are vital for agricultural and environmental research.
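
A minimal sketch of block-maxima GEV fitting and return-level estimation is given below using SciPy; the annual maxima are synthetic, not the CDC station data, and only the stationary model is shown.

```python
# Minimal sketch of stationary GEV fitting and return levels from block maxima.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
annual_maxima = 33 + rng.gumbel(0.0, 1.2, size=40)     # 40 years of synthetic block maxima

# Fit the stationary GEV; SciPy's shape parameter c corresponds to -xi in the usual notation.
c, loc, scale = stats.genextreme.fit(annual_maxima)

# The T-year return level is the (1 - 1/T) quantile of the fitted distribution.
for T in (10, 50, 100):
    z = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:3d}-year return level: {z:.2f} °C")
```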

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 475
25613 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data based on its location in memory has received much attention in recent years due to its different properties, which offer important aspects for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One of the important aspects of stack data is that it has high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate in different caches. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The result shows that, on average, a stack cache hit rate of more than 99% is achieved using 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The result shows that the overall hit rate of the unified cache design with an added 1 KB stack cache is improved by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 299
25612 Analyzing Strategic Alliances of Museums: The Case of Girona (Spain)

Authors: Raquel Camprubí

Abstract:

Cultural tourism has been postulated as a relevant motivation for tourists all over the world during recent decades. In this context, museums are the main attraction for cultural tourists who are seeking to connect with the history and culture of the visited place. From the point of view of an urban destination, museums and other cultural resources are essential to a strong tourist supply at the destination, in order to be capable of catching the attention and interest of cultural tourists. In particular, museums' challenge is to be prepared to offer the best experience to their visitors without forgetting their mission, based mainly on the protection of their collections and other social goals. Thus, museums individually want to be competitive and well positioned to achieve their strategic goals. The life cycle of the destination and the level of maturity of its tourism product influence the need of tourism agents to cooperate and collaborate among themselves, in order to rejuvenate their product and become more competitive as a destination. Additionally, prior studies have considered different models of public-private partnership and the collaborative and cooperative relations developed among the agents of a tourism destination. However, there are no studies that pay special attention to museums and the strategic alliances developed to obtain mutual benefits. Considering this background, the purpose of this study is to analyze to what extent the museums of a given urban destination have established strategic links and relations among themselves, in order to improve their competitive position at both the individual and the destination level. In order to achieve the aim of this study, the city of Girona (Spain) and the museums located in this city are taken as a case study. Data collection was conducted using in-depth interviews, in order to collect all the qualitative data related to the nature, strength, and purpose of the relational ties established among the museums of the city and other relevant tourism agents of the city. To conduct the data analysis, a Social Network Analysis (SNA) approach was taken using UCINET software. The position of the agents in the network and the structure of the network were analyzed, and qualitative data from the interviews were used to interpret the SNA results. Findings reveal the existence of strong ties among some of the museums of the city, particularly to create and promote joint products. Nevertheless, outsiders were detected who follow an individual strategy, without collaboration or cooperation with other museums or agents of the city. Results also show that some relational ties have an institutional origin, while others are the result of a long process of cooperation through common projects. Conclusions show that collaboration and cooperation among museums have been positive in increasing the attractiveness of both the museums and the city as a cultural destination. Future research and managerial implications are also mentioned.
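
The network measures used in such an analysis can be sketched as follows; the study used UCINET, so networkx is substituted here purely for illustration, and the ties between institutions are invented.

```python
# Minimal sketch of SNA measures on an invented museum-collaboration network.
import networkx as nx

ties = [("History Museum", "Art Museum"), ("History Museum", "Cathedral Treasury"),
        ("Art Museum", "Cathedral Treasury"), ("Art Museum", "Cinema Museum"),
        ("Jewish History Museum", "History Museum")]
G = nx.Graph()
G.add_edges_from(ties)
G.add_node("Archaeology Museum")          # an "outsider" with no collaborative ties

print("density:", round(nx.density(G), 3))
print("degree centrality:", {n: round(c, 2) for n, c in nx.degree_centrality(G).items()})
print("isolates (outsiders):", list(nx.isolates(G)))
```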

Keywords: cultural tourism, competitiveness, museums, Social Network analysis

Procedia PDF Downloads 114
25611 Experimental Research on Ductility of Regional Confined Concrete Beam

Authors: Qinggui Wu, Xinming Cao, Guyue Guo, Jiajun Ding

Abstract:

In an effort to study the shear ductility of regionally confined concrete beams, five reinforced concrete beams were tested to examine their shear performance. These beams have the same shear span ratio and concrete strength, but different tension reinforcement ratios and stirrup shapes. The purpose of the test is to study the effects of stirrup shape and tension reinforcement ratio on failure mode and shear ductility. The tests show that the regionally confined part can act as an independent component while working well with the rest of the beam, so that the ductility of the beam is more than double that of the normally confined concrete beam. The laws relating the tension reinforcement ratio and stirrup shape to the beam's shear ductility are established.

Keywords: ratio of tension reinforcement, stirrup shapes, shear ductility, failure mode

Procedia PDF Downloads 330
25610 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are the key components of software systems. Due to the exponential growth of data, there is a concern that data should be accurate and available. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive data, as in medical or military applications. Whenever data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction affects the system, heals the system in an isolated mode in such a way that system availability is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 502
25609 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts of search engines into useful information. This represents a new method of data gathering compared with traditional methods. When queries must be submitted for multiple keywords, this takes considerable time and effort; hence, we develop a user interface program that searches automatically by taking multiple keywords at the same time and leaving the program to collect the wanted data automatically. The collected raw data is processed using mathematical and statistical theories to eliminate unwanted data and convert it into usable data.

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 423
25608 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today's era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques that are used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it; steganography, however, hides the data in a cover file so that the presence of the communication itself is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography and the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated up to a certain extent by using these techniques. These techniques could be used in banks, intelligence agencies such as RAW, etc., where highly confidential data is transferred. Finally, comparisons of the two techniques are also given in tabular form.
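
The image steganography side can be illustrated with a short least-significant-bit (LSB) embedding sketch; the paper's implementation is in MATLAB, and the "ciphertext" below is plain bytes rather than real DES or RSA output.

```python
# Minimal sketch of LSB image steganography: hide bytes in the low bits of pixel values.
import numpy as np

def embed(cover, payload):
    """Hide payload bytes in the LSBs of a flattened 8-bit image array."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits      # overwrite only the LSB
    return flat.reshape(cover.shape)

def extract(stego, n_bytes):
    bits = (stego.flatten()[:n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
secret = b"cipher text produced by DES or RSA"
stego = embed(cover, secret)
assert extract(stego, len(secret)) == secret
print(np.max(np.abs(stego.astype(int) - cover.astype(int))))   # at most 1: visually identical
```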

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 284
25607 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
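
A minimal sketch of the weakly-supervised setup is given below: labels are corrupted at a 40% error rate, a linear SVM is trained on the noisy labels, and performance is evaluated against the true labels. The data are synthetic, not the clinical delirium corpus.

```python
# Minimal sketch of training on imperfect labels and evaluating against true labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y_true = make_classification(n_samples=4000, n_features=50, n_informative=10,
                                random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y_true, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
flip = rng.random(y_tr.size) < 0.40                # 40% label error in the training set
y_noisy = np.where(flip, 1 - y_tr, y_tr)

clf = SVC(kernel="linear", probability=True).fit(X_tr, y_noisy)
scores = clf.predict_proba(X_te)[:, 1]
print("training-label accuracy:", 1 - flip.mean())
print("AUC against true labels:", round(roc_auc_score(y_te, scores), 3))
```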

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 195