Search results for: big data markets
25137 Seismic Data Scaling: Uncertainties, Potential and Applications in Workstation Interpretation
Authors: Ankur Mundhra, Shubhadeep Chakraborty, Y. R. Singh, Vishal Das
Abstract:
Seismic data scaling affects the dynamic range of a dataset. With present-day low storage costs and the high reliability of hard-disk storage, scaling is generally not recommended. However, when dealing with data of different vintages, which may have been processed in 16 bits or even 8 bits and need to be combined with 32-bit data, scaling is performed. Scaling also amplifies low-amplitude events in the deeper region that would otherwise disappear because high-amplitude shallow events saturate the amplitude scale. We have focused on the significance of scaling data to aid interpretation. This study elucidates a proper seismic loading procedure in workstations that does not rely on the default preset parameters available in most software suites. Differences and distributions of amplitude values at different depths are probed in this exercise. Proper loading parameters are identified, and the associated steps that need to be taken care of while loading data are explained. Finally, the exercise interprets the uncertainties which might arise when correlating scaled and unscaled versions of seismic data with synthetics. Since a seismic well tie correlates seismic reflection events with well markers, it is used in this study to identify regions that are enhanced and/or affected by the scaling parameter(s).
Keywords: clipping, compression, resolution, seismic scaling
Procedia PDF Downloads 470
25136 Creative Self-efficacy and Innovation Speed of New Ventures: The Mediating Role of Entrepreneurial Bricolage
Authors: Yi-Wen Chen, Hsueh-Liang Fan
Abstract:
Evidence shows that start-up success is positively correlated with innovation speed. However, new ventures are seldom able to acquire abundant resources for new product development (NPD), which means that entrepreneurs may depend on personal creativity instead of physical investments to achieve and accelerate the speed of first product launch. This study accentuates the role of entrepreneurial bricolage, defined as making do by applying combinations of the resources at hand to new problems and opportunities, in the relation between creative self-efficacy and innovation speed. The study uses structural equation modeling to test the hypotheses in a sample of 203 start-ups operating in various creative markets. Results reveal that creative self-efficacy is positively and directly associated with innovation speed, whereas entrepreneurial bricolage acts as a full mediator. These findings offer important theoretical and practical implications.
Keywords: creative self-efficacy, innovation speed, entrepreneurial bricolage, new ventures
Procedia PDF Downloads 530
25135 Association of Social Data as a Tool to Support Government Decision Making
Authors: Diego Rodrigues, Marcelo Lisboa, Elismar Batista, Marcos Dias
Abstract:
Based on data on child labor, this work raises questions about how to understand and locate the factors that make up child labor rates, and which properties are important for analyzing these cases. Using data mining techniques to discover valid patterns in Brazilian social databases, data on child labor in the State of Tocantins (located in the north of Brazil, with a territory of 277,000 km² comprising 139 counties) were evaluated. This work aims to detect factors that are deterministic for the practice of child labor and their relationships with financial, educational, regional, and social indicators, generating information that is not explicit in the government database and thus enabling better monitoring and updating of policies for this purpose.
Keywords: social data, government decision making, association of social data, data mining
Procedia PDF Downloads 369
25134 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation
Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang
Abstract:
Data assimilation is a model- and data-driven hybrid method that dynamically fuses new observation data with a numerical model to iteratively approach the real system state. It is widely used in state prediction and parameter inference for continuous systems. Because of the discrete event system's non-linearity and non-Gaussianity, the traditional Kalman filter, which is based on linear and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become the technical approach for discrete event simulation data assimilation. Hence, we propose a particle filter-based discrete event simulation data assimilation method and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results showed that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent simulations.
Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven
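The bootstrap particle filter the abstract relies on can be sketched in a few lines. This is an illustrative toy (a 1-D random-walk state observed with Gaussian noise, with all parameter values invented for the demo), not the authors' UAV maintenance service model:

```python
import math
import random

def particle_filter(observations, n_particles=500, proc_std=1.0, obs_std=2.0):
    """Bootstrap particle filter: predict, weight, estimate, resample."""
    rng = random.Random(0)
    particles = [rng.gauss(observations[0], obs_std) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the motion model (random walk here).
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        # Update: weight each particle by the Gaussian likelihood of the observation.
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: weighted posterior mean of the particle cloud.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample: multinomial resampling avoids weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Synthetic ground truth and noisy observations for the demo.
sim = random.Random(1)
truth, state = [], 0.0
for _ in range(100):
    state += sim.gauss(0.0, 1.0)
    truth.append(state)
obs = [x + sim.gauss(0.0, 2.0) for x in truth]

est = particle_filter(obs)
mse_raw = sum((o - t) ** 2 for o, t in zip(obs, truth)) / len(truth)
mse_filtered = sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)
```

As in the abstract's experiments, the filtered estimates track the hidden state more closely than the raw observations do (`mse_filtered` comes out well below `mse_raw`).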
Procedia PDF Downloads 14
25133 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform
Authors: Sadam Alwadi
Abstract:
Outlier values are a problem that frequently occurs in the data observation or recording process, so data imputation becomes an essential matter. In this work, the methods described in prior work are used to detect outlier values in a collection of stock market data. To implement the detection and find solutions that may be helpful for investors, real closing-price data were obtained from the Amman Stock Exchange (ASE). The Tukey method and the Maximum Overlapping Discrete Wavelet Transform (MODWT) are used to detect and impute the outlier values.
Keywords: outlier values, imputation, stock market data, detecting, estimation
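Tukey's fences, one of the two detection methods named in the abstract, are simple to sketch. The price series below is invented for illustration, and median imputation is just one possible replacement strategy; the MODWT step is not shown:

```python
def tukey_outliers(values, k=1.5):
    """Indices of values outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    s = sorted(values)
    n = len(s)
    def quantile(q):
        idx = q * (n - 1)
        lo = int(idx)
        hi = min(lo + 1, n - 1)
        return s[lo] + (idx - lo) * (s[hi] - s[lo])
    q1, q3 = quantile(0.25), quantile(0.75)
    fence_lo, fence_hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return [i for i, v in enumerate(values) if v < fence_lo or v > fence_hi]

def impute_median(values, outlier_idx):
    """Replace flagged outliers with the median of the remaining values."""
    bad = set(outlier_idx)
    keep = sorted(v for i, v in enumerate(values) if i not in bad)
    mid = len(keep) // 2
    med = keep[mid] if len(keep) % 2 else 0.5 * (keep[mid - 1] + keep[mid])
    return [med if i in bad else v for i, v in enumerate(values)]

# Hypothetical daily closing prices with two spurious records.
prices = [2.1, 2.2, 2.0, 2.3, 9.8, 2.1, 2.2, 0.1, 2.3]
flagged = tukey_outliers(prices)          # -> [4, 7]
cleaned = impute_median(prices, flagged)
```

Both spikes (9.8 and 0.1) fall outside the fences and are replaced by the median of the remaining prices (2.2).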
Procedia PDF Downloads 81
25132 Shopping Tourism for Emerging Markets: Examining Shopping Tourism in the UK as an Attraction Tool for Wealthy Tourists
Authors: Ali Abdallah, Shaima Al Mohannadi
Abstract:
This study explores shopping tourism in the UK and examines it as a tool for attracting wealthy tourists to the UK's capital city, London. The study aims to identify the scope of shopping tourism used by countries such as the UK as a tool for attracting wealthy tourists. It adopts a quantitative research approach, using surveys to attain the required results. Results demonstrate how the UK tourism market is an experience-based market that has recently become an attraction for luxury-brand shoppers. The term Trexit is introduced as a new form of tourism generated by Brexit; if addressed appropriately, Trexit can help offset any negative economic repercussions of Brexit. The study concludes that shopping tourism is set to grow further in the years to come; however, government support and cooperative planning with the retail industry are required to further strengthen this developing sector.
Keywords: Brexit tourism, luxury shopping, UK tourism, wealthy tourists
Procedia PDF Downloads 163
25131 PEINS: A Generic Compression Scheme Using Probabilistic Encoding and Irrational Number Storage
Authors: P. Jayashree, S. Rajkumar
Abstract:
With social networks and smart devices generating a multitude of data, effective data management is the need of the hour for networks and cloud applications. Some applications need effective storage while others need effective communication over networks, and data reduction is a handy solution that meets both requirements. Most data compression techniques are based on data statistics and result in either lossy or lossless reductions. Though lossy reductions produce better compression ratios than lossless methods, many applications require data accuracy and minute details to be preserved. A variety of data compression algorithms exist in the literature for different forms of data, such as text, image, and multimedia data. In the proposed work, a generic progressive compression algorithm based on probabilistic encoding, called PEINS, is projected as an enhancement over the irrational-number-storage coding technique, catering to the storage issues of increasing data volumes as a cost-effective solution that also offers data security as a secondary outcome to some extent. The proposed work reveals cost effectiveness in terms of a better compression ratio with no deterioration in compression time.
Keywords: compression ratio, generic compression, irrational number storage, probabilistic encoding
Procedia PDF Downloads 295
25130 IoT Device Cost Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework
Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe
Abstract:
This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a framework for a data privacy model and for real-time data analytics using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, the data enjoy stronger privacy protection, and that a fog storage gateway has several advantages over traditional cloud storage: our results show that fog storage reduces latency/delay, bandwidth consumption, and energy usage compared with cloud storage, and will therefore help to lessen excessive cost. The paper dwells on the system descriptions, with a focus on the research design and the framework design for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications. Lastly, the overall research system architecture is shown, together with its structure and interrelationships.
Keywords: IoT, fog, cloud, data analysis, data privacy
Procedia PDF Downloads 99
25129 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data
Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif
Abstract:
Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is compared here with field data. A wide range of field data, covering both live-bed and clear-water scour, was used. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method, and the best-performing scour equation was identified using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when tested against the large amount of laboratory and field data.
Keywords: field data, local scour, scour equation, wide piers
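The two performance metrics used in the comparison, the discrepancy ratio and the RMSE, are straightforward to compute. The scour depths below are hypothetical placeholders, not the paper's field data, and the two "equations" are stand-ins for any pair of predictive methods:

```python
import math

def rmse(observed, predicted):
    """Root-mean-square error between observed and predicted scour depths."""
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

def mean_discrepancy_ratio(observed, predicted):
    """Mean of predicted/observed; 1.0 indicates an unbiased equation on average."""
    return sum(p / o for o, p in zip(observed, predicted)) / len(observed)

# Hypothetical observed field scour depths (m) and two equations' predictions.
observed = [1.2, 2.5, 0.8, 3.1]
eq_a = [1.4, 2.3, 1.0, 2.9]   # mildly over- and under-predicting
eq_b = [2.0, 3.5, 1.6, 4.0]   # systematically over-predicting

rmse_a, dr_a = rmse(observed, eq_a), mean_discrepancy_ratio(observed, eq_a)
rmse_b, dr_b = rmse(observed, eq_b), mean_discrepancy_ratio(observed, eq_b)
```

Equation A's smaller RMSE and near-unity discrepancy ratio would mark it as the better performer, which is the kind of ranking the study reports.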
Procedia PDF Downloads 414
25128 Cultural Heritage Resources for Tourism, Two Countries – Two Approaches: A Comparative Analysis of Cultural Tourism Products in Turkey and Austria
Authors: Irfan Arikan, George Christian Steckenbauer
Abstract:
Turkey and Austria are examples of highly developed tourism destinations, where tourism providers use cultural heritage and regional natural resources to develop modern tourism products in order to succeed in increasingly competitive international tourism markets. The use and exploitation of these resources follow, on the one hand, international standards of tourism marketing (such as 'sustainability'); therefore, we find highly comparable internationalized products in these destinations (such as hotel products, museums, spas, etc.). On the other hand, development standards and processes strongly depend on local, regional, and national cultures, which influence the way people work, cooperate, think, and create. Thus, cultural factors also influence the attitude towards cultural heritage and natural resources and the way these resources are used for the creation of tourism products. This leads to differences in the development of tourism products on several levels:
1. In the selection of cultural heritage and natural resources for the product development process
2. In the processes by which tourism products are created
3. In the way providers and marketing organisations work with tourism products based on cultural heritage or natural resources
The aim of this paper is to discover differences in these dimensions by analysing and comparing examples of tourism products in Turkey and Austria, both countries with a highly developed, highly professional tourism industry whose stakeholders have rich experience in product development and marketing. The cases are selected from the following fields:
+ Cultural tourism / heritage tourism
+ City tourism
+ Industrial heritage tourism
+ Nature and outdoor tourism
+ Health tourism
The cases are analysed based on available secondary data (as several cases are scientifically described) and on expert interviews with local and regional tourism-industry stakeholders and tourism experts.
The available primary and secondary data will be analysed and displayed in a comparative structure that allows answers to the above-stated research question to be derived. The result of the project will therefore be a more precise picture of the influence of cultural differences on the use and exploitation of resources in the field of tourism, which allows recommendations to be developed for the tourism industry that must be taken into consideration to ensure that cultural and natural resources are treated in a sustainable and responsible way. The authors will edit these cross-cultural recommendations in the form of a 'check-list' that can be used as a guideline for tourism professionals in the field of product development and marketing, thereby connecting theoretical research to practical application and closing the gap between academic research and tourism practice.
Keywords: cultural heritage, natural resources, Austria, Turkey
Procedia PDF Downloads 492
25127 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol
Authors: Inkyu Kim, SangMan Moon
Abstract:
The IEEE 802.11b protocol provides up to an 11 Mbps data rate, whereas the aerospace industry seeks a higher-data-rate COTS data link system for the UAV. The Total Maximum Throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in formation flight will be bounded by the data link protocol's performance limitations.
Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application
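For readers unfamiliar with why 11 Mbps is never reached in practice, a single-sender theoretical maximum throughput can be estimated from per-frame overheads. The timing constants below are the commonly cited 802.11b DSSS values (long preamble, no RTS/CTS, no collisions); treat the result as a rough back-of-the-envelope sketch, not the paper's TMT derivation:

```python
# Commonly cited 802.11b DSSS timing parameters (long preamble); times in microseconds.
SLOT, SIFS, DIFS = 20.0, 10.0, 50.0
PLCP = 192.0             # preamble + PLCP header, always sent at 1 Mbps
CW_MIN = 31              # minimum contention window (slots)
MAC_HEADER_B, FCS_B, ACK_B = 24, 4, 14

def tmt_bps(payload_bytes=1500, data_rate_mbps=11.0, ack_rate_mbps=2.0):
    """Theoretical maximum throughput for one 802.11b sender (no collisions)."""
    # bits / Mbps gives microseconds directly.
    data_time = PLCP + (MAC_HEADER_B + FCS_B + payload_bytes) * 8 / data_rate_mbps
    ack_time = PLCP + ACK_B * 8 / ack_rate_mbps
    backoff = CW_MIN / 2 * SLOT          # average backoff before each transmission
    cycle_us = DIFS + backoff + data_time + SIFS + ack_time
    return payload_bytes * 8 / cycle_us * 1e6

throughput = tmt_bps()   # roughly 6 Mbps for 1500-byte payloads
```

Even with maximum-size payloads, the fixed DIFS/backoff/preamble/ACK overheads per frame cap the usable rate at roughly 6 Mbps, which is why formation size quickly runs into the protocol's limits when the channel is shared.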
Procedia PDF Downloads 392
25126 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning comprises a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns which are used to classify it into groups. The result of the analysis is a pattern which can be used to identify a data set without needing the input data used to create the pattern. An important requirement in this process is careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the standpoint of genetic diversity. Where pedigree information is missing, other methods can be used to trace an animal's origin. The genetic diversity encoded in genetic data holds relatively useful information for identifying animals originating from individual countries. We conclude that applying data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.
Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning
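A minimal supervised-learning sketch of the idea: hypothetical SNP allele-count vectors (0/1/2) labeled by population, with a nearest-centroid toy classifier standing in for whatever model the study actually trained. Both the marker vectors and the population labels are invented for illustration:

```python
def train_centroids(samples, labels):
    """Fit a nearest-centroid classifier: average the vectors of each class."""
    sums, counts = {}, {}
    for vec, lab in zip(samples, labels):
        acc = sums.setdefault(lab, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(centroids, vec):
    """Assign vec to the class with the nearest centroid (squared distance)."""
    def dist2(lab):
        return sum((c - x) ** 2 for c, x in zip(centroids[lab], vec))
    return min(centroids, key=dist2)

# Hypothetical allele counts at four SNPs for two populations.
samples = [[2, 0, 1, 2], [2, 1, 1, 2], [1, 0, 2, 2],
           [0, 2, 0, 0], [0, 2, 1, 0], [1, 2, 0, 1]]
labels = ["Pinzgau", "Pinzgau", "Pinzgau", "Other", "Other", "Other"]
model = train_centroids(samples, labels)
origin = classify(model, [2, 0, 2, 2])   # -> "Pinzgau"
```

The learned pattern (the per-class centroids) is all that is needed to assign a new animal; the training genotypes themselves can be discarded, which mirrors the abstract's point about identifying new data without the original input data.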
Procedia PDF Downloads 550
25125 Router 1X3 - RTL Design and Verification
Authors: Nidhi Gopal
Abstract:
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly focuses on the study of the router device and its top-level architecture, and on how the various sub-modules of the router, i.e. Register, FIFO, FSM, and Synchronizer, are synthesized, simulated, and finally connected to the top module.
Keywords: data packets, networking, router, routing
Procedia PDF Downloads 814
25124 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information related to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page and propose noise web data reduction tools which mainly focus on eliminating noise from the content and layout of web data. This paper argues that not all data that form part of the main web page are of interest to a user, and not all noise data are actually noise to a given user. Therefore, learning of noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile, but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 265
25123 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study
Authors: Zeba Mahmood
Abstract:
Modern business organizations are adopting technological advancement to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way of conducting business today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations are difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketing and customer relationship departments of firms use data mining techniques to make relevant decisions; this paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues of executing these techniques are also discussed and critically analyzed in this paper.
Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining
Procedia PDF Downloads 538
25122 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of data sets so large and complex that it becomes difficult to process using database management tools. Operations like search, analysis, and visualization on big data are performed using data mining, the process of extracting patterns or knowledge from large data sets. In recent years, data mining results become stale and obsolete over time, and incremental processing is a promising approach to refreshing them: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value-pair-level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation that is widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To evaluate the mining results, i2MapReduce is tested using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
Keywords: big data, map reduce, incremental processing, iterative computation
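The key-value-pair-level idea can be illustrated with a toy incremental word count: preserving each document's map output lets an update re-reduce only the delta instead of recomputing all counts from scratch. This is a single-process sketch of the concept only, not i2MapReduce's actual API:

```python
from collections import Counter

def map_words(text):
    """Map step: emit (word, count) pairs for one document."""
    return Counter(text.lower().split())

class IncrementalWordCount:
    """Key-value-level incremental refresh in the spirit of i2MapReduce:
    only the delta between a document's old and new map output is re-reduced."""
    def __init__(self):
        self.per_doc = {}        # preserved fine-grain map outputs, keyed by doc id
        self.totals = Counter()  # current reduce result

    def update(self, doc_id, text):
        old = self.per_doc.get(doc_id, Counter())
        new = map_words(text)
        self.totals += new       # apply the new contribution...
        self.totals -= old       # ...and retract the old one (drops zero counts)
        self.per_doc[doc_id] = new
        return self.totals

wc = IncrementalWordCount()
wc.update(1, "big data is big")
wc.update(2, "data mining")
totals = wc.update(1, "big data")   # doc 1 changed: only its delta is reprocessed
```

After the refresh, the dropped word "is" disappears and the shared counts stay consistent, without rescanning document 2.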
Procedia PDF Downloads 351
25121 Exchange Rate Variation and Balance of Payments: The Nigerian Experience (1970-2012)
Authors: Vitus Onyebuchim Onyemailu, Olive Obianuju Okalibe
Abstract:
The study examines the relationship between exchange rate variation and the balance of payments in Nigeria from 1970 to 2012. Using time-series econometric methods such as Granger causality and ordinary least squares (OLS), the study found that exchange rate movements, especially the depreciation of the naira, have not contributed significantly to the balance of payments over the years of the study. The Granger results conform to the Marshall-Lerner short- and long-run propositions that exchange rate devaluation enhances the balance of payments. On disaggregation, the exchange rate Granger-causes the current and capital account balances, given the Nigerian data from 1970 to 2012. Overall, in the long-run OLS regression analysis with the exchange rate in semi-log functional form, exchange rate variation did not record a significant effect in the balance of payments equation. The same held in the current (trade) balance equation, which does not match the Marshall-Lerner propositions. The capital account balance, in contrast, reported a significant impact of exchange rate variability. Finally, in the exchange rate determination equation, where many fundamentals were considered, including the lagged exchange rate, the lagged exchange rate recorded a positive and significant influence on the present exchange rate. This means that players in the financial markets usually outplay the authorities' policy stances through their speculative tendencies. The work therefore recommends that the authorities provide an enabling environment, including infrastructure and the provision of science and technology, for the production of goods and services to flourish, in order to take advantage of the steady devaluation of the currency. When this is done, Nigeria will have competitive power against the rest of the world.
Keywords: exchange rate variation, balance of payments, current account, capital account, Marshall-Lerner hypothesis
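The flavor of the OLS step can be shown with a toy lagged regression: a synthetic series in which this period's balance of payments responds to last period's exchange rate with a known coefficient, which the least-squares slope then recovers. All numbers here are fabricated for illustration; this is neither the study's data nor its full semi-log specification:

```python
import random

def ols_fit(x, y):
    """Least-squares intercept and slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Synthetic series: BOP_t = 5 - 0.3 * EXR_{t-1} + noise.
rng = random.Random(42)
exr = [100 + t + rng.gauss(0.0, 1.0) for t in range(40)]   # drifting exchange rate
bop = [5.0 - 0.3 * exr[t - 1] + rng.gauss(0.0, 0.5) for t in range(1, 40)]

a, b = ols_fit(exr[:-1], bop)   # regress BOP_t on the lagged exchange rate
```

The fitted slope `b` lands close to the true -0.3, showing how a lagged regressor of this kind enters both the OLS and the Granger-style equations the abstract describes.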
Procedia PDF Downloads 397
25120 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
In analyzing large-scale recurrent event data, there are many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator is obtained from each individual data set. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method, and the approach is applied to a large real dataset of repeated heart failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
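The combination step can be sketched with the simplest possible estimator, the sample mean, for which the subset-size-weighted combination reproduces the full-data estimate exactly. The paper's frailty-model MLEs are combined in the same spirit, though there the equivalence is only asymptotic:

```python
def divide_and_conquer_estimate(data, n_subsets=10):
    """Split the data, estimate on each subset, recombine with size weights."""
    subsets = [data[i::n_subsets] for i in range(n_subsets)]   # random split in practice
    estimates = [sum(s) / len(s) for s in subsets]             # per-subset estimator
    weights = [len(s) / len(data) for s in subsets]            # proportional weights
    return sum(w * e for w, e in zip(weights, estimates))

data = list(range(1, 101))
combined = divide_and_conquer_estimate(data)   # matches the full-data mean of 50.5
```

Because each subset is processed independently, the per-subset estimates can be computed on machines that could never hold the full dataset, which is exactly the memory and runtime motivation given in the abstract.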
Procedia PDF Downloads 166
25119 An Analysis of Eco-efficiency and GHG Emission of Olive Oil Production in Northeast of Portugal
Authors: M. Feliciano, F. Maia, A. Gonçalves
Abstract:
The olive oil production sector plays an important role in the Portuguese economy. It has seen major growth over the last decade, increasing its weight in overall national exports. International market penetration for Mediterranean traditional products is increasingly demanding, especially in the Northern European markets, where consumers are looking for more sustainable products. To support this growing demand, this study addresses olive oil production from the environmental and eco-efficiency perspectives. The analysis considers two consecutive product life cycle stages: olive tree farming and olive oil extraction in mills. For olive farming, data collection covered two different organizations: a middle-size farm (~12 ha) (F1) and a large-size farm (~100 ha) (F2). Results from both farms show that olive collection activities are responsible for the largest share of greenhouse gas (GHG) emissions. For these activities, the estimated carbon footprint was higher in F2 (188 g CO2e/kg olives) than in F1 (148 g CO2e/kg olives). For olive oil extraction, two different mills were considered: one using a two-phase system (2P) and another with a three-phase system (3P). Results from the study of the two mills show a much higher use of water in 3P, while energy intensity (EI) is similar in both mills. The GHG generated was evaluated under two conditions: a biomass-neutral condition, resulting in a carbon footprint higher in 3P (184 g CO2e/L olive oil) than in 2P (92 g CO2e/L olive oil), and a non-neutral biomass condition, in which 2P increases its carbon footprint to 273 g CO2e/L olive oil. When addressing the carbon footprint of possible combinations of the studied subsystems, the results suggest that olive harvesting is the major source of GHG.
Keywords: carbon footprint, environmental indicators, farming subsystem, industrial subsystem, olive oil
Procedia PDF Downloads 287
25118 Neoliberalism and Environmental Justice: A Critical Examination of Corporate Greenwashing
Authors: Arnav M. Raval
Abstract:
This paper critically examines the neoliberal economic model and its role in enabling corporate greenwashing, a practice in which corporations deceptively market themselves as environmentally responsible while continuing harmful environmental practices. Through a rigorous focus on the neoliberal emphasis on free markets, deregulation, and minimal government intervention, this paper explores how these policies have set the stage for corporations to externalize environmental costs and engage in superficial sustainability initiatives. Within this framework, companies often bypass meaningful environmental reform, opting for strategies that enhance their public image without addressing their actual environmental impacts. The paper also draws on the works of the critical theorists Theodor Adorno, Max Horkheimer, and Herbert Marcuse, particularly their critiques of capitalist society and its tendency to commodify social values. This paper argues that neoliberal capitalism has commodified environmentalism, transforming genuine ecological responsibility into a marketable product. Through corporate social responsibility initiatives, corporations have created the illusion of sustainability while masking deeper environmental harm. Under neoliberalism, these initiatives often serve as public relations tools rather than genuine commitments to environmental justice and sustainability. This commodification has become particularly dangerous because it manipulates consumer perceptions and diverts attention away from the structural causes of environmental degradation. The analysis also examines how greenwashing practices have disproportionately affected marginalized communities, particularly in the global South, where environmental costs are often externalized. As these corporations promote their "sustainability" in wealthier markets, these marginalized communities bear the brunt of their pollution, resource depletion, and other forms of environmental degradation.
This dynamic underscores the inherent injustice within neoliberal environmental policies: those most vulnerable to environmental risks are often neglected, while companies reap the benefits of corporate sustainability efforts at their expense. Finally, this paper calls for a fundamental transition away from neoliberal market-driven solutions, which prioritize corporate profit over genuine ecological reform. It advocates for stronger regulatory frameworks, transparent third-party certifications, and a more collective approach to environmental governance. To ensure genuine corporate accountability, governments and institutions must move beyond superficial green initiatives and market-based solutions, shifting toward policies that enforce real environmental responsibility and prioritize environmental justice for all communities. By critiquing the neoliberal system and its commodification of environmentalism, this paper highlights the urgent need to rethink how environmental responsibility is defined and enacted in the corporate world. Without systemic change, greenwashing will continue to undermine both ecological sustainability and social justice, leaving the most vulnerable populations to suffer the consequences.
Keywords: critical theory, environmental justice, greenwashing, neoliberalism
Procedia PDF Downloads 17
25117 Adoption of Big Data by Global Chemical Industries
Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta
Abstract:
The new era of big data (BD) is influencing the chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Given the availability of free software and the large amount of real-time data generated and stored in process plants, the chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the professional competencies and data science factors that influence BD in the chemical industries, to help the industry move towards intelligent manufacturing quickly and reliably. The article draws on a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in the chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.
Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science
Procedia PDF Downloads 8625116 The Impact of Information Communication Technology on Promoting Travel Trade Industry in a Developing Economy, Case Study Nigeria
Authors: Murtala Mohammed Alamai, Abdullahi Marshal Idris, Adama Idris
Abstract:
Today, marketing involves not only selecting target markets but also communicating with customers through various means to put across a selling point. Modern marketing drives new product development based on customer needs by gathering feedback from them, and utilizing the latest technology for better communication with customers is the latest advancement in marketing in the 21st century. A survey approach was used: a random sample of tourist destinations across the six geographical zones of the country was examined to ascertain the use of information communication systems in promoting their products and/or services. The findings revealed that only a few destinations utilize these modern means of marketing and promoting their products, and the development of effective, up-to-date online marketing services was proffered as a solution.Keywords: information, communication, travel, trade, promotion
Procedia PDF Downloads 32325115 Evaluating the Cost of Quality: A Case Study of a South African Foundry Business
Authors: Chipo Mugova, Zuko Mjobo
Abstract:
The aim of this study was to evaluate the cost of quality (COQ) at a local foundry business, in order to identify the contribution of its units and processes to quality costs within the foundry’s operations. The foundry selected for the detailed case study is one of the major businesses targeted by the government to produce components for building and refurbishing wagons and trains. The study aimed at identifying areas in the foundry’s processes in which investment is needed to reduce quality costs. This is in alignment with the government’s vision of promoting local business to support local markets, leading to job creation and hence a reduction of the unemployment rate in South Africa. The methodology adopted used cost of quality models. Results from the study indicated that internal failure costs were significantly higher than all other cost of quality categories, consuming more than 60% of the business’s income.Keywords: appraisal costs, cost of quality, failure costs, local content, prevention costs
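The PAF (prevention-appraisal-failure) categorisation behind such cost-of-quality models can be sketched in a few lines; the cost figures and income below are invented for illustration and are not taken from the study.

```python
# Illustrative PAF (prevention-appraisal-failure) cost-of-quality model;
# all figures are hypothetical, not from the foundry case study.
costs = {
    "prevention": 120_000,
    "appraisal": 180_000,
    "internal failure": 1_500_000,
    "external failure": 250_000,
}
income = 2_400_000  # hypothetical annual income

total_coq = sum(costs.values())
for category, value in costs.items():
    print(f"{category}: {value / total_coq:.0%} of total COQ")
print(f"internal failure as share of income: {costs['internal failure'] / income:.0%}")
```

With numbers shaped like these, internal failure dominates both the COQ breakdown and the share of income, mirroring the pattern the study reports.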
Procedia PDF Downloads 34125114 Secure Multiparty Computations for Privacy Preserving Classifiers
Authors: M. Sumana, K. S. Hareesha
Abstract:
Secure computations are essential while performing privacy preserving data mining. Distributed privacy preserving data mining involves two or more sites that cannot pool their data with a third party without violating laws protecting the individual. Hence, in order to model the private data without compromising privacy or losing information, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data
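As a sketch of how the additive homomorphic property enables such computations, the toy Paillier implementation below (tiny primes, for illustration only; the paper's actual protocols are not reproduced here) shows a secure sum and a secure dot product in which one party holds an encrypted vector and the other a plaintext vector.

```python
import math
import random

# Toy Paillier cryptosystem -- tiny primes, NOT secure, illustration only.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse for decryption

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: E(a) * E(b) mod n^2 decrypts to a + b,
# so a party can sum values it cannot read.
a, b = 17, 25
assert decrypt((encrypt(a) * encrypt(b)) % n2) == a + b

# Scalar multiplication: E(a)^k decrypts to k*a -- enough for a secure
# dot product where one party holds the plaintext vector.
x = [1, 2, 3]   # party A's vector (sent encrypted)
y = [4, 5, 6]   # party B's vector (kept in plaintext)
c_dot = 1
for cx, yv in zip([encrypt(v) for v in x], y):
    c_dot = (c_dot * pow(cx, yv, n2)) % n2
assert decrypt(c_dot) == sum(xv * yv for xv, yv in zip(x, y))  # 32
```

In a real deployment the key sizes would be thousands of bits and the protocol would include the blinding steps needed to hide intermediate results from the decrypting party.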
Procedia PDF Downloads 41225113 The Importance of Fire Safety in Egypt
Authors: Omar Shakra
Abstract:
Fire safety and protection offer benefits that apply in many places and situations across the Middle East, especially in Egypt. In Egypt, safety and fire protection are not yet treated with the importance they deserve; yet fire and safety systems are essential in every facility: companies, hospitals, police stations, and even supermarkets must use fire systems, which make a facility safe for its visitors. From my point of view as the owner of a fire safety company called Deluge Egypt, I can say that not all companies use fire protection systems; because of the high cost, many prefer to construct their buildings without protection, which makes those buildings unsafe for visitors and clients. I am therefore looking for new methods and technologies to invest in Egypt, which is why I am attending this conference: to let the audience know more about the services I provide and about the importance of fire safety in Egypt. The objectives of my research are: (1) the system used in my company; (2) the benefits of fire protection systems; (3) the importance of fire systems and safety; (4) the use of new technologies; (5) the hardships I have encountered in closing deals with new clients.Keywords: fire, system, protection, fire hydrants, security, alarms
Procedia PDF Downloads 10925112 Understanding Health Behavior Using Social Network Analysis
Authors: Namrata Mishra
Abstract:
The health of a person plays a vital role in the collective health of his or her community and hence in the well-being of society as a whole. But in today’s fast-paced, technology-driven world, health issues are increasingly associated with human behaviors and lifestyles. Social networks have a tremendous impact on the health behavior of individuals. Many researchers have used social network analysis to understand human behaviors implicated in people's social and economic environments, and a similar analysis can be used to understand human behaviors that have health implications. This paper focuses on concepts of behavioral analyses with health implications using social network analysis, and provides possible algorithmic approaches. The results of these approaches can be used by governing authorities for rolling out health plans and benefits and taking preventive measures, while pharmaceutical companies can target specific markets and health insurance companies can better model their insurance plans.Keywords: breadth first search, directed graph, health behaviors, social network analysis
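The keywords hint at one such algorithmic approach: a breadth-first search over a directed influence graph to find everyone a given person's health behavior can reach. A minimal sketch, with a made-up graph and names:

```python
from collections import deque

# Hypothetical directed influence graph: an edge u -> v means u's health
# behavior can influence v. Names and edges are illustrative only.
graph = {
    "ann": ["bob", "cara"],
    "bob": ["dan"],
    "cara": ["dan", "eve"],
    "dan": [],
    "eve": ["ann"],
}

def reachable(graph, start):
    """Breadth-first search: the set of people `start` can influence."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    seen.discard(start)  # exclude the starting person
    return seen

print(sorted(reachable(graph, "ann")))  # ['bob', 'cara', 'dan', 'eve']
```

The same traversal, run from every node, gives an influence-reach score that a health authority could use to pick seed individuals for a behavioral intervention.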
Procedia PDF Downloads 47125111 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created using the source code, metrics computed from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such companies, training data from other projects is one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can serve as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the cross-project results are comparable to learning from within-company data.Keywords: software metrics, fault prediction, cross project, within project
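A cross-project setup of the kind described, training a Naïve Bayes classifier on one project's design metrics and testing on another's, can be sketched as follows; the data here are synthetic stand-ins for NASA MDP design metrics, not the paper's actual datasets.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Synthetic stand-in for design-level metrics (e.g. fan-in, fan-out,
# design complexity); faulty modules get shifted feature means.
def make_project(n_modules, shift=0.0):
    faulty = rng.integers(0, 2, n_modules)
    X = rng.normal(loc=faulty[:, None] * 2.0 + shift,
                   scale=1.0, size=(n_modules, 3))
    return X, faulty

# Cross-project scenario: train on project A, test on project B,
# which has a slight distribution shift relative to A.
X_train, y_train = make_project(300)
X_test, y_test = make_project(100, shift=0.3)

model = GaussianNB().fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"cross-project accuracy: {accuracy:.2f}")
```

The shift parameter mimics the distribution mismatch that makes cross-project prediction harder than within-project prediction; on real MDP data the feature distributions differ far less tidily than in this toy.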
Procedia PDF Downloads 34425110 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies
Authors: Stefano Fantin
Abstract:
The present analysis stems from research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, the research presents a study of the policy approaches of Japan, the EU, and a number of Member States of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. This research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The result of this analysis is based on semi-structured interviews with EUNITY partners, as well as on the researcher's participation in a recent report on software vulnerabilities from the Centre for European Policy Studies. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of a number of different pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems and the Proposal for a Cybersecurity Act), none with a clear focus on this subject, makes it difficult for both national governments and end-users (software owners, researchers and private citizens) to gain a clear understanding of the Union’s approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the Member State level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia; this research explains what policy approach these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004; to date, two amendments to the framework can be registered (2014 and 2017).
The framework is furthermore complemented by a series of instruments allowing researchers to disclose any new discovery responsibly. However, the policy has started to lose its efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research reveals two asymmetric policy approaches, time-wise and content-wise. The analysis therefore concludes with a series of policy recommendations based on the lessons learned from both regions, towards a common approach to the security of European and Japanese markets, industries and citizens.Keywords: cybersecurity, vulnerability, European Union, Japan
Procedia PDF Downloads 15625109 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features
Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova
Abstract:
Emotion recognition is a challenging problem that remains open from the perspectives of both intelligent systems and psychology. In this paper, both voice features and facial features are used for building an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion of the validity and expressiveness of different emotions is presented. A comparison is made between classifiers built from facial data only, from voice data only, and from the combination of both, and the need for a better combination of the information from facial expressions and voice data is argued.Keywords: emotion recognition, facial recognition, signal processing, machine learning
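The voice-only / facial-only / combined comparison can be illustrated with a small early-fusion sketch using scikit-learn SVMs; everything below is a synthetic stand-in (the real system extracts features from actual video recordings, and the feature dimensions here are invented).

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in: 200 clips, 2 emotion classes, with separate voice
# and facial feature vectors (dimensions are illustrative).
n = 200
labels = rng.integers(0, 2, n)
voice = rng.normal(labels[:, None] * 1.0, 1.0, size=(n, 8))    # e.g. pitch statistics
facial = rng.normal(labels[:, None] * 1.0, 1.0, size=(n, 12))  # e.g. landmark distances

# Early fusion: concatenate both modalities and train a single SVM,
# then compare against each modality alone via cross-validation.
scores = {}
for name, X in [("voice", voice), ("facial", facial),
                ("combined", np.hstack([voice, facial]))]:
    scores[name] = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()
print(scores)
```

Concatenation is only the simplest fusion strategy; the abstract's closing argument, that a better combination of the two modalities is needed, points toward late fusion or learned weighting schemes instead.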
Procedia PDF Downloads 31625108 Developing Digital Skills in Museum Professionals through Digital Education: International Good Practices and Effective Learning Experiences
Authors: Antonella Poce, Deborah Seid Howes, Maria Rosaria Re, Mara Valente
Abstract:
Creative industries education contexts, museum education in particular, generally place little emphasis on the use of new digital technologies and on the development of digital abilities and transversal skills. The spread of the Covid-19 pandemic has underlined the importance of these abilities and skills in cultural heritage education: by gaining digital skills, museum professionals improve their career opportunities, with access to new distribution markets through internet access and e-commerce, new entrepreneurial tools, and new forms of digital expression in their work. The use of web, mobile, social, and analytical tools is becoming more and more essential in the heritage field, and in museums in particular, to face the challenges posed by the current worldwide health emergency. Recent studies highlight the need for stronger partnerships between the cultural and creative sectors, social partners, and education and training providers, in order to equip these sectors with the combination of skills needed for creative entrepreneurship in a rapidly changing environment. Considering the above conditions, the paper presents different examples of digital learning experiences carried out in Italian and US contexts with the aim of promoting digital skills in museum professionals. In particular, a quali-quantitative research study has been conducted on two international postgraduate courses, “Advanced Studies in Museum Education” (2 years) and “Museum Education” (1 year), in order to identify the educational effectiveness of the online learning strategies used (e.g., OBL, digital storytelling, peer evaluation) for the development of digital skills and the acquisition of specific content. More than 50 museum professionals enrolled in these pathways took part in the learning activities, providing evaluation data useful for research purposes.Keywords: digital skills, museum professionals, technology, education
Procedia PDF Downloads 177