Search results for: panel data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25258

24418 The Impact of Corporate Governance Mechanisms on Earnings Management Practices: Evidence from Jordan

Authors: Lara Al-Haddad, Mark Whittington

Abstract:

This paper aims to examine the impact of two influential internal corporate governance mechanisms, namely board characteristics and ownership structure, on the use of real activities-based and accrual-based earnings management by Jordanian public firms. Using panel data from Jordanian public firms after the introduction of the Jordanian Corporate Governance Code (JCGC) in 2009, the study finds that both institutional ownership and managerial ownership constrain the use of real and accrual earnings manipulations. On the other hand, both independent directors and the largest shareholders are found to increase the incidence of real and accrual earnings management. The study also examines the trade-off between real and accrual earnings management and finds that Jordanian firms use a combination of real and accrual-based earnings management to obtain the greatest effect on their earnings reporting strategies. For the purpose of this study, three types of real earnings management are considered: sales manipulation, overproduction, and the abnormal reduction of discretionary expenditures. Abnormal discretionary accruals are considered for accruals management. As for the internal corporate governance mechanisms, board characteristics are examined using board independence, board size, and CEO duality, and ownership structure is examined using managerial ownership, institutional ownership, foreign ownership and largest-shareholder ownership. To the best knowledge of the researchers, this study is the first to examine the relationship between board characteristics and real earnings management in Jordan. Further, it is the first to examine the relationship between corporate governance mechanisms and discretionary accruals after the introduction of the Jordanian Corporate Governance Code in 2009. Thus, the findings of this study have important policy implications for policymakers, regulators, standard setters, audit professionals, and investors in their attempts to constrain the practice of earnings management, whether real or accrual, and to improve financial reporting quality in Jordan.
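As an illustration of the kind of estimation used on firm-year panels like this, a within-transformation fixed-effects regression can be sketched in a few lines; all firm identifiers, regressors and coefficient values below are hypothetical, not the study's data.

```python
import numpy as np

def within_fe(y, X, firm_ids):
    """Entity-demeaned (within) OLS, a standard fixed-effects estimator."""
    y = y.astype(float)
    X = X.astype(float)
    for g in np.unique(firm_ids):
        m = firm_ids == g
        y[m] -= y[m].mean()          # remove each firm's mean (fixed effect)
        X[m] -= X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Hypothetical panel: 3 firms x 4 years, one regressor
# (e.g. institutional ownership) with true slope -0.8
rng = np.random.default_rng(0)
firms = np.repeat([0, 1, 2], 4)
x = rng.normal(size=12)
firm_effect = np.array([1.0, -2.0, 0.5])[firms]
y = firm_effect - 0.8 * x + rng.normal(scale=0.01, size=12)
beta = within_fe(y, x.reshape(-1, 1), firms)
```

The demeaning step absorbs the firm-level heterogeneity, so the slope is recovered even though each firm has a different baseline level of earnings management.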

Keywords: board characteristics, Jordan, ownership structure, real earnings management

Procedia PDF Downloads 338
24417 Protecting Privacy and Data Security in Online Business

Authors: Bilquis Ferdousi

Abstract:

With the exponential growth of online business, the threat to consumers’ privacy and data security has become a serious challenge. This literature-review-based study focuses on better understanding those threats and the legislative measures that have been taken to address them. Research shows that people are increasingly involved in online business using different digital devices and platforms, although this practice varies across age groups. The threat to consumers’ privacy and data security is a serious hindrance to developing consumer trust in online businesses. Some legislative measures have been taken at the federal and state levels to protect consumers’ privacy and data security. The study is based on an extensive review of the current literature on protecting consumers’ privacy and data security and on the legislative measures that have been taken.

Keywords: privacy, data security, legislation, online business

Procedia PDF Downloads 98
24416 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm

Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan

Abstract:

This study presents a new parallel approach to clustering GPS data. The evaluation compares the execution times of various clustering algorithms on GPS data. The paper proposes a neighborhood-based parallel K-means algorithm to make clustering faster. The proposed parallelization assumes that each GPS record represents a vehicle, and allows vehicles close to each other to communicate after they are clustered. The approach was examined on continuously changing GPS data sets of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrated that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
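The assignment step of K-means is independent per point, which is what makes it parallelizable. A minimal sketch, using thread-based chunk parallelism and a naive spread initialization rather than the authors' neighborhood scheme, might look like this (the two synthetic "vehicle" clusters are invented):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def assign(chunk, centers):
    """Nearest-center label for each point in a chunk (the parallel step)."""
    d = np.linalg.norm(chunk[:, None, :] - centers[None, :, :], axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(points, k, iters=20, workers=4):
    # naive spread initialization: take every (n/k)-th point
    centers = points[:: max(1, len(points) // k)][:k].copy()
    for _ in range(iters):
        chunks = np.array_split(points, workers)
        with ThreadPoolExecutor(workers) as ex:
            labels = np.concatenate(
                list(ex.map(assign, chunks, [centers] * workers)))
        for j in range(k):          # serial update of the k centers
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# Two well-separated synthetic vehicle clusters (hypothetical coordinates)
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(5, 0.1, (50, 2))])
centers, labels = parallel_kmeans(pts, k=2)
```

A production variant would partition by spatial neighborhood, as the abstract describes, so that only nearby vehicles exchange messages.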

Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data

Procedia PDF Downloads 217
24415 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as things in the IoT: devices that contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are among the key challenges in IoT technologies. To address the security requirements, challenges and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for message authentication before messages propagate into IoT networks.
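A standard building block for message authentication of this kind is an HMAC tag computed with a pre-shared key. The sketch below (the key and payload are hypothetical, and the paper's specific mechanism is not reproduced) shows the tag-then-verify pattern before a message is accepted from the network:

```python
import hmac
import hashlib

SECRET = b"shared-device-key"  # hypothetical pre-shared key

def tag_message(payload: bytes) -> bytes:
    # sender appends an HMAC-SHA256 tag before propagating into the network
    return payload + hmac.new(SECRET, payload, hashlib.sha256).digest()

def verify_message(frame: bytes):
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    # constant-time comparison resists timing attacks
    return payload if hmac.compare_digest(tag, expected) else None

frame = tag_message(b'{"sensor": 17, "temp": 21.4}')
assert verify_message(frame) == b'{"sensor": 17, "temp": 21.4}'
tampered = frame[:-1] + bytes([frame[-1] ^ 1])  # flip one tag bit
assert verify_message(tampered) is None
```

Any tampering with the payload or the tag causes verification to fail, which is the property a gateway needs before forwarding IoT messages onward.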

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 378
24414 Industrial Practical Training for Mechanical Engineering Students: A Multidisciplinary Approach

Authors: Bashiru Olayinka Adisa, Najeem Lateef

Abstract:

The integrated knowledge in the application of mechanical engineering, microprocessor and electronic sensor technologies is becoming the basic skill of a modern engineer in machinery-based processes. To meet this objective, we have developed a cross-disciplinary industrial training program to teach essential hard technical skills and soft project skills to mechanical engineering students in mid-curriculum. Ten groups of students were selected to participate in a 150-hour program. The students were required to design and build a robot with the ability to follow tracks and pick/place target blocks in specific locations. The students were trained to integrate the knowledge of computer-aided design, electronics, sensor theories and motor technology to fabricate a workable robot as the major outcome of this course. On completion of the project, students competed for top robot honors by demonstrating their robots' movements and pick/place performance to a panel of judges.

Keywords: electronics, sensor theories, motor technology, robot

Procedia PDF Downloads 276
24413 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations toward a common goal. Scheduling data-intensive applications becomes challenging as the data sets are very large. Only two solutions exist to tackle this issue. First, the computation that requires huge data sets can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred, since the servers are storage/data servers with little or no computational capability. Hence, the second scenario is considered for further exploration. During scheduling, transferring huge data sets from one site to another requires more network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing requests and creating the knowledge base. Depending upon the link capacity, a decision is taken on whether to transfer the data sets or to partition them. The agents predict the next request so as to serve the requesting site with data sets in advance. This reduces the data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.
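The transfer-or-partition decision driven by link capacity can be illustrated with a toy rule; the deadline, chunk size and bandwidth figures below are hypothetical and are not taken from the paper:

```python
def plan_transfer(dataset_gb, link_gbps, deadline_s, chunk_gb=64):
    """Hypothetical cognitive-engine rule: ship the whole data set if the
    link can move it within the deadline, otherwise partition it."""
    seconds_needed = dataset_gb * 8 / link_gbps  # GB -> gigabits / Gbps
    if seconds_needed <= deadline_s:
        return ("transfer", 1)
    parts = -(-dataset_gb // chunk_gb)           # ceiling division
    return ("partition", int(parts))

print(plan_transfer(100, 10, 120))   # fast link: move the whole set
print(plan_transfer(10000, 1, 120))  # slow link: split into chunks
```

A real cognitive engine would feed this decision from its knowledge base and request predictions rather than from fixed thresholds.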

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 391
24412 Heritage and Tourism in the Era of Big Data: Analysis of Chinese Cultural Tourism in Catalonia

Authors: Xinge Liao, Francesc Xavier Roige Ventura, Dolores Sanchez Aguilera

Abstract:

With the development of the Internet, the study of tourism behavior has rapidly expanded from the traditional physical market to the online market. Data on the Internet are characterized by dynamic change, with new data appearing all the time. In recent years, a large volume of data has been generated from sources such as forums and blogs, which have expanded over time and space; together they constitute large-scale Internet data, known as Big Data. These data of technological origin, derived from the use of devices and the activity of multiple users, are becoming a source of great importance for the study of geography and tourist behavior. The study will focus on cultural heritage tourism practices in the context of Big Data. The research will explore the characteristics and behavior of Chinese tourists in relation to the cultural heritage of Catalonia. Geographical information, destination image, and perceptions in user-generated content will be studied through analysis of data from Weibo, the largest blog-based social network in China. By analyzing the behavior of heritage tourists in the Big Data environment, this study will examine the practices (activities, motivations, perceptions) of cultural tourists and thereby understand their needs and preferences, in order to better guide the sustainable development of tourism in heritage sites.

Keywords: Barcelona, Big Data, Catalonia, cultural heritage, Chinese tourism market, tourists’ behavior

Procedia PDF Downloads 133
24411 Towards A Framework for Using Open Data for Accountability: A Case Study of A Program to Reduce Corruption

Authors: Darusalam, Jorish Hulstijn, Marijn Janssen

Abstract:

The media have revealed a variety of corruption cases in regional and local governments all over the world. Many governments have pursued anti-corruption reforms and created systems of checks and balances. Citizens face three types of corruption: administrative corruption, collusion and extortion. Accountability is one of the benchmarks for building transparent government. The public sector is required to report the results of the programs that have been implemented so that citizens can judge whether an institution has been working economically, efficiently and effectively. Open data offers solutions for the implementation of good governance in organizations that want to be more transparent. In addition, open data can create transparency and accountability toward the community. The objective of this paper is to build a framework for using open data for accountability in combating corruption. The paper investigates the relationship between open data and accountability as part of anti-corruption initiatives, and the impact of open data implementation on public organizations.

Keywords: open data, accountability, anti-corruption, framework

Procedia PDF Downloads 326
24410 Financial Inclusion and Modernization: Secure Energy Performance in Shanghai Cooperation Organization

Authors: Shama Urooj

Abstract:

The present work investigates the relationship among financial inclusion, modernization, and energy performance in SCO member countries during the years 2011–2021. Principal component analysis (PCA) is used to create composite indexes of financial inclusion, modernization, and energy performance. We use reliable, heteroscedasticity-consistent panel regression models to examine the relationship among the variables. The findings indicate that financial inclusion (FI) and modernization, along with increased FDI, all appear to contribute to energy performance in the SCO member countries. However, per capita GDP has a negative impact on energy performance. These results are unbiased and consistent with the robust results obtained by applying different econometric models. Feasible Generalized Least Squares (FGLS) estimation is also used to check the uniformity of the main model's results. This research concludes that there has been no policy coherence in SCO member countries regarding the coordination of growing financial inclusion and modernization for energy sustainability in recent years. In order to improve energy performance alongside modern development, policies regarding financial inclusion and modernization need to be integrated at both the national and international levels.
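The PCA-based composite index construction amounts to taking the first principal component of the standardized indicators. A minimal sketch, with hypothetical financial-inclusion indicators standing in for the study's variables:

```python
import numpy as np

def pca_index(X):
    """First principal component score as a composite index
    (a common PCA-based construction; variables standardized first)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    w = Vt[0]
    if w.sum() < 0:  # fix the sign so higher raw values mean a higher index
        w = -w
    return Z @ w

# Hypothetical indicators: accounts, branches, ATMs per capita for 30 countries
rng = np.random.default_rng(2)
base = rng.normal(size=30)
X = np.column_stack([base + rng.normal(scale=0.1, size=30) for _ in range(3)])
idx = pca_index(X)
```

Because the three indicators share a common underlying factor, the first component captures most of their variance, which is what justifies collapsing them into one index.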

Keywords: financial inclusion, energy performance, modernization, technological development, SCO

Procedia PDF Downloads 70
24409 Mobile Technology as a Catalyst for Creative Teaching: A Developmental Based Research Study in a Large Public School in Mozambique

Authors: L. O'Sullivan, C. Murphy

Abstract:

This study examined the impact, if any, of mobile technology on the achievement of United Nations Sustainable Development Goal 4: Quality Education for All. It focused specifically on teachers and their practice in a school with large class sizes and limited teaching resources. Teachers in third grade in a large public school in Mozambique were provided with an iPad connected to a projector, powered by a mobile solar panel. Teachers also participated in ten days of professional development workshops over thirteen months. Teacher discussions, micro-teaching sessions and classes in the school were video-recorded, and the data were triangulated using surveys and additional documents, including class plans, digital artifacts created by teachers, workshop notes and researcher field notes. The catalyst for teachers’ creativity development was using the photographic capabilities of the iPad to capture the local context and make lessons relevant to the lived experience of the students. In the transition stage, teachers worked with lesson plans and support from the professional development workshops to make small incremental changes to their practice, which scaffolded their growing competence in the creative use of the technology as a tool for teaching and developing new teaching resources. Over the full period of the study, these small changes in practice resulted in a cultural shift in how teachers approached all lessons, even those in which they were not using the technology. They developed into a community of practice. The digital lessons created were re-used and further developed by other teachers, providing a relevant and valuable bank of content in a context lacking books and other teaching resources. This study demonstrated that mobile technology proved to be a successful catalyst for creative teaching practice in this context, and supports the Quality Education for All Sustainable Development Goal.

Keywords: mobile technology, creative teaching, sub-Saharan Africa, quality education for all

Procedia PDF Downloads 124
24408 Open-Source YOLO CV For Detection of Dust on Solar PV Surface

Authors: Jeewan Rai, Kinzang, Yeshi Jigme Choden

Abstract:

Accumulation of dust on solar panels reduces their overall efficiency and the amount of energy they produce. While various techniques exist for detecting dust in order to schedule cleaning, many of these methods rely on MATLAB image-processing tools and other licensed software, which can be financially burdensome. This study investigates the efficiency of a free, open-source computer vision library using the YOLO algorithm. The proposed approach was tested on images of solar panels with varying dust levels in an experimental setup. The findings illustrate the effectiveness of the YOLO-based image classification method, with the overall dust-detection approach achieving 90% accuracy in distinguishing between clean and dusty panels. This open-source solution provides a cost-effective and accessible alternative to commercial image-processing tools, offering a way to optimize solar panel maintenance and enhance energy production.

Keywords: YOLO, OpenCV, dust detection, solar panels, computer vision, image processing

Procedia PDF Downloads 18
24407 Syndromic Surveillance Framework Using Tweets Data Analytics

Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden

Abstract:

Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance is becoming increasingly popular, aided by open platforms for data collection and by the microblogging text and mobile geographic location features. In this paper, a syndromic surveillance framework with a machine learning kernel is presented using tweet data analytics. Influenza and the three cities of Abu Dhabi, Al Ain and Dubai in the United Arab Emirates are used as the test disease and trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted for supervised classification, and an N-fold cross-validation confusion matrix is given in the simulation results, with an overall system recall of 85.595%.
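As a sketch of an LDA engine on microblog text, scikit-learn's implementation (assumed available) can be fitted on a toy corpus; the tweets below are invented, and the authors' classifier labels and HAAD correlation step are not reproduced:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy tweets (hypothetical); a real pipeline would stream geotagged tweets
tweets = [
    "fever and cough all week, staying home",
    "bad flu season, whole office has a fever",
    "sunny beach day in dubai, great weather",
    "enjoying the beach and sun this weekend",
] * 5

# Bag-of-words counts, then a 2-topic LDA model
X = CountVectorizer(stop_words="english").fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
doc_topics = lda.transform(X)  # per-tweet topic mixture; each row sums to 1
```

The per-tweet topic mixtures would then be fed to a downstream classifier and correlated against hospital case counts, as the abstract describes.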

Keywords: syndromic surveillance, tweets, machine learning, data mining, Latent Dirichlet Allocation (LDA), influenza

Procedia PDF Downloads 107
24406 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia

Authors: Yuyun Wabula, B. J. Dewancker

Abstract:

In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media, attaching the user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize existing places based on the individuals visiting them. Then, we combine the Twitter tracking data with questionnaire data to capture the Twitter user profile. To do so, we use distribution frequency analysis to learn the visitor percentages. To validate the hypothesis, we compare the results with local population statistics and the land use map released by the city planning department of the Makassar local government. The results show a correlation between the Twitter geolocation and questionnaire data. Thus, integrating the Twitter data and survey data can reveal the profile of the social media users.
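At its core, the distribution frequency analysis reduces to counting geotagged check-ins per area and converting the counts to percentages. A minimal sketch (the district names are real Makassar districts, but the check-in records are invented):

```python
from collections import Counter

# Hypothetical geotagged check-ins: (user_id, district) pairs from tweets
checkins = [
    ("u1", "Panakkukang"), ("u2", "Panakkukang"), ("u1", "Tamalate"),
    ("u3", "Panakkukang"), ("u2", "Ujung Pandang"), ("u3", "Tamalate"),
]

by_district = Counter(d for _, d in checkins)        # check-ins per district
total = sum(by_district.values())
share = {d: round(100 * n / total, 1)                # visitor percentage
         for d, n in by_district.most_common()}
print(share)  # {'Panakkukang': 50.0, 'Tamalate': 33.3, 'Ujung Pandang': 16.7}
```

These percentages are what would then be compared against official population statistics and the land use map for validation.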

Keywords: geolocation, Twitter, distribution analysis, human mobility

Procedia PDF Downloads 311
24405 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining

Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser

Abstract:

Coronary artery disease (CAD) is a major cause of disability in adults and a main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and support vector machines (SVMs) are used to analyze CAD data. Data from 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input (predictor) variables are used for classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for evaluating and predicting CAD patients as compared to non-CAD ones. Applying data mining techniques to coronary artery disease data is a good method for investigating the relationships among the variables.
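The three comparison metrics are simple functions of the 2x2 confusion matrix. A short sketch, with hypothetical counts rather than the study's actual results:

```python
def confusion_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)   # true-positive rate among CAD patients
    specificity = tn / (tn + fp)   # true-negative rate among non-CAD patients
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for one classifier on a held-out set
sens, spec, acc = confusion_metrics(tp=420, fp=60, tn=380, fn=80)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # 0.84 0.86 0.85
```

Reporting all three matters in a clinical setting: a classifier can reach high accuracy while missing many CAD patients if the classes are imbalanced, which sensitivity exposes.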

Keywords: classification, coronary artery disease, data mining, knowledge discovery, rule extraction

Procedia PDF Downloads 654
24404 Sensor Data Analysis for a Large Mining Major

Authors: Sudipto Shanker Dasgupta

Abstract:

One of the largest mining companies wanted health analytics for its driverless trucks, which were the key to its supply chain logistics. The automated trucks had multi-level sub-assemblies that sent out sensor information. The use case was to capture the sensor signals from the truck subcomponents and analyze the health of the trucks from a repair-and-replacement perspective. Open-source software was used to stream the data into a clustered Hadoop setup in the Amazon Web Services cloud, and Apache Spark SQL was used to analyze the data. With a 10-node, 32-core, 64 GB RAM Amazon setup, real-time analytics was achieved on 300 million records. To check the scalability of the system, the cluster was increased to a 100-node setup. This talk highlights how open-source software was used to achieve this use case, along with insights on achieving high data throughput in a cloud setup.

Keywords: streaming analytics, data science, big data, Hadoop, high throughput, sensor data

Procedia PDF Downloads 400
24403 Corporate Social Responsibility (CSR) and Energy Efficiency: Empirical Evidence from the Manufacturing Sector of India

Authors: Baikunthanath Sahoo, Santosh Kumar Sahu, Krishna Malakar

Abstract:

With the growing emphasis on global environmental sustainability and green business management, business research has moved towards corporate social responsibility. In addition to international and national treaties, businesses have also started to pursue environmental protection and energy efficiency through CSR as part of their business strategy, in response to climate change. Considering India's ambitious emission reduction targets and rapid economic development, this study explores the effect of CSR on the energy efficiency of manufacturing firms in India. Using firm-level data, a panel fixed-effects model shows that the CSR dummy variable negatively influences energy intensity; technically, CSR firms are energy efficient. The results demonstrate that in the presence of CSR, all the production economic variables are significant. The results also show that environmental expenditure does not improve energy efficiency, possibly because very few firms are motivated to make such expenditures and the practice is not common to all sectors. The interactive-effect model confirms that, without the CSR dummy as an intervening variable, only manufacturers of chemicals and chemical products and manufacturers of pharmaceutical, medicinal chemical, and botanical products have low energy intensity, but after considering CSR in their business practices, firms in all six sub-sectors become energy efficient. The empirical results also validate that firms continuously engaged in CSR activities are highly energy efficient. This is an important motivational factor for firms to become economically and environmentally sustainable in the corporate world. The analysis would help business practitioners manage today's profitability and tomorrow's sustainability to achieve a comparative advantage in the emerging market economy. The paper concludes that reducing energy consumption as part of firms' social responsibility to care for the environment will require the collaborative efforts of business, society and policy bodies.

Keywords: CSR, energy efficiency, Indian manufacturing sector, business strategy

Procedia PDF Downloads 78
24402 An Investigation of Water Atomizer in Ejected Gas of a Vehicle Engine

Authors: Chun-Wei Liu, Feng-Tsai Weng

Abstract:

People face pollution threats in the modern age, although standards for vehicle exhaust gas have been established. The goal of this study is to investigate the effect of a water atomizer in a vehicle emission system. Diluted 20% ammonia water was used in the spraying system. Micro-particles produced by the exhaust gas from the vehicle engine were accumulated through the atomized spray in a self-developed collector. In the experiments, a self-designed atomization model plate and a gas tank, controlled by a microprocessor using pulse width modulation (PWM) logic, were prepared for the exhaust test. The gas from the vehicle's gasoline engine was purified with the model panel collector. Software named ANSYS was utilized to analyze the distribution of the exhaust gas. Micro-substances and the percentages of CO, HC, CO2 and NOx in the exhaust gas were investigated at different engine speeds and atomizer vibration frequencies. Exceptional results were obtained in the vehicle engine emissions measurements: the temperature of the exhaust gas was decreased by 3 °C, PM10 micro-substances were decreased, and the percentage of CO was decreased by more than 55% at 2500 RPM by the proposed system. The values of CO, HC, CO2 and NOx were all decreased when the atomizers were used with water.

Keywords: atomizer, CO, HC, NOx, PM2.5

Procedia PDF Downloads 450
24401 Data-Centric Anomaly Detection with Diffusion Models

Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu

Abstract:

Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that, with 30% of the original normal images, unsupervised modeling with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.

Keywords: diffusion models, anomaly detection, data-centric, generative AI

Procedia PDF Downloads 78
24400 Instrument Development and Validation for Quality Early Childhood Curriculum in the Malaysian Context

Authors: Sadiah Baharom, Che Nidzam Che Ahmad, Saipol Barin Ramli, Asmayati Yahaya, Sopia Md Yassin

Abstract:

Early childhood care and education (ECCE) in Malaysia aspires to develop children who are intellectually, emotionally, physically and spiritually balanced. This aspiration can only materialise if the early childhood program developed is comprehensive and of high quality, comparable to international standards. As such, there is a pressing need to assess the quality of the program in an all-encompassing manner. The overall research project aims at developing a comprehensive and integrated model of high-quality Malaysian ECCE. One of the major objectives of this project is to assess and evaluate the scope and quality of the existing ECCE programs in Malaysia. A specific aspect of this objective is to develop and validate an instrument to assess and evaluate the country's ECCE curriculum. This paper therefore describes the development and validation of an instrument to explore the quality of the early childhood care and education curriculum currently implemented in the country’s ECCE centres. The constructs and items were generated from a set of criteria mapped against existing ECCE practice, document analyses, expert interviews and panel discussions. The items went through expert validation and were field-tested on 597 ECCE teachers. The data obtained went through an exploratory factor analysis to validate the constructs of the instrument, followed by reliability studies of internal consistency based on the Cronbach's alpha values. The final set of items for the ECCE curriculum instrument, earmarked for the main study, consists of four constructs, namely philosophy and core values, curriculum content, curriculum review and unique features. Each construct consists of between 3 and 21 items, with a total of 36 items in all. The reliability coefficients for each construct range from 0.65 to 0.961. These values are within the acceptable limits for a reliable instrument to be used in the main study.
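The internal-consistency measure reported here, Cronbach's alpha, can be computed directly from a respondent-by-item score matrix. A short sketch (the Likert responses below are invented, not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses to a 4-item construct
scores = [[4, 5, 4, 4], [3, 3, 2, 3], [5, 5, 4, 5], [2, 2, 3, 2], [4, 4, 4, 5]]
alpha = cronbach_alpha(scores)
```

Values around 0.7 and above are conventionally read as acceptable internal consistency, which is the benchmark against which coefficients such as the 0.65 to 0.961 range reported here are judged.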

Keywords: early childhood and care education, instrument development, reliability studies, validity studies

Procedia PDF Downloads 197
24399 Regulation on the Protection of Personal Data Versus Quality Data Assurance in the Healthcare System Case Report

Authors: Elizabeta Krstić Vukelja

Abstract:

The digitization of personal data is a consequence of the development of information and communication technologies, which create a new work environment with many advantages and challenges, but also potential threats to privacy and personal data protection. Regulation (EU) 2016/679 of the European Parliament and of the Council is a law and obligation that should address the issues of personal data protection and information security. The existence of the Regulation leads to the conclusion that national legislation on the virtual environment, the protection of the rights of EU citizens and the processing of their personal data is insufficiently effective. In the health system, special emphasis is placed on the processing of special categories of personal data, such as health data. The healthcare industry is recognized as a particularly sensitive area in which a large amount of medical data is processed, whose digitization enables quick access and quick identification of the insured. The protection of the individual requires quality IT solutions that guarantee the technical protection of special categories of data. However, the real problems are of a technical and human nature, alongside the spatial limitations of the Regulation's application. Conclusions will be drawn by analyzing the implementation of the basic principles of the Regulation in the Croatian healthcare system and comparing it with similar activities in other EU member states.

Keywords: regulation, healthcare system, personal data protection, quality data assurance

Procedia PDF Downloads 35
24398 Parallel Vector Processing Using Multi Level Orbital DATA

Authors: Nagi Mekhiel

Abstract:

Many applications use vector operations, applying a single instruction to multiple data items that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which affects the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: we force each location to be available to all processors at a specific time. The data move in different orbits, becoming available to processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications, interleaving it with lower-level vector operations.

Keywords: memory organization, parallel processors, serial code, vector processing

Procedia PDF Downloads 261
24397 Learning at Workplace: Competences and Contexts in Sensory Evaluation

Authors: Ulriikka Savela-Huovinen, Hanni Muukkonen, Auli Toom

Abstract:

The development of the workplace as a learning environment has been emphasized in research on workplace learning. The prior literature on sensory performance emphasizes the individual's competences as an assessor, while the competences involved in collaborative interactional and knowledge-creation practices as workplace learning methods are rarely mentioned. The present study aims to find out what kinds of competences and contexts are central when an assessor conducts food sensory evaluation in an authentic professional context. The aim was to answer the following questions: first, what kinds of competences does sensory evaluation require according to assessors? Second, what kinds of contexts for sensory evaluation do assessors report? Altogether, thirteen assessors from three Finnish food companies were interviewed using semi-structured thematic interviews to map practices and development intentions as well as to explicate already established practices. The qualitative data were analyzed following the principles of abductive and inductive content analysis. The analysis phases were combined and their results considered together as a cross-analysis. For independent evaluation, the required competences were perception, knowledge of specific domains and methods, and cognitive skills such as memory. Altogether, 42% of the analysis units described individual evaluation contexts, 53% described collaborative interactional contexts, and 5% described collaborative knowledge-creation contexts. Regarding collaboration, the analysis revealed learning, sharing and reviewing both external and in-house consumer feedback, developing methods to moderate small-panel evaluation, and developing product vocabulary collectively among the assessors. Knowledge-creation contexts emerged from daily practices, especially in cases where product defects were investigated and discussed.
The study findings contribute to the explanation that sensory assessors learn extensively from one another in collaborative interactional and knowledge-creation contexts. Assessors' learning and their ability to work collaboratively in these contexts need to be ensured when developing their expertise.

Keywords: assessor, collaboration, competences, contexts, learning and practices, sensory evaluation

Procedia PDF Downloads 234
24396 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data; it works exclusively with discrete data. While RA has been used extensively in medical applications and the social sciences, we are introducing it to the spatial sciences through applications such as landslide prediction. Because RA requires discrete data, such as soil classification or bedrock type, continuous data such as porosity must be binned for inclusion in the model. RA constructs models of the data that pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as the primary encoding. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy, with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and the associated probability of a landslide event occurring. In this way, every combination of informative state combinations can be examined.
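The binning step required for continuous layers can be sketched as follows (equal-width bins; the abstract does not state which binning scheme is used, so this is an assumed illustration with invented porosity values):

```python
# Equal-width binning of a continuous layer (e.g. porosity) into
# discrete classes, as RA requires; an illustrative assumption.

def bin_values(values, n_bins):
    """Map continuous values to discrete bin indices 0..n_bins-1."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    bins = []
    for v in values:
        idx = int((v - lo) / width) if width > 0 else 0
        bins.append(min(idx, n_bins - 1))  # clamp the maximum value
    return bins

porosity = [0.05, 0.12, 0.31, 0.48, 0.50]
print(bin_values(porosity, 3))  # [0, 0, 1, 2, 2]
```

Once binned, the layer can be treated like any categorical layer (soil class, bedrock type) in the RA model.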

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 57
24395 Environmental and Socioeconomic Determinants of Climate Change Resilience in Rural Nigeria: Empirical Evidence towards Resilience Building

Authors: Ignatius Madu

Abstract:

The study aims at assessing the environmental and socioeconomic determinants of climate change resilience in rural Nigeria. This is necessary because research and development efforts on building the climate change resilience of rural areas in developing countries are usually made without knowledge of the inherent rural characteristics that determine the resilient capacities of households. This has, in many cases, led to costly mistakes, delayed responses, inaccurate outcomes, and other difficulties. Consequently, this assessment becomes crucial not only to policymakers and people living in risk-prone rural environments but also to filling the research gap. To achieve the aim, secondary data were obtained from the Annual Abstract of Statistics 2017, the LSMS-Integrated Surveys on Agriculture and General Household Survey Panel 2015/2016, and the National Agriculture Sample Survey (NASS) 2010/2011. Resilience was calculated by weighting and adding the adaptive, absorptive and anticipatory measures of household variables aggregated at state level, and then regressed against the rural environmental and socioeconomic characteristics influencing it. From the regression, the coefficients of the variables were used to compute the impacts of the variables using the Stochastic Regression of Impacts on Population, Affluence and Technology (STIRPAT) model. The results showed that the northern states are generally low in resilience indices and are impacted less by the development indicators. The major determining factors are the percentage of non-poor, environmental protection, road transport development, landholding, agricultural input, population density, dependency ratio (inverse), household assets, education, and maternal care. The paper concludes that any effort at successful resilience building in rural areas of the country should first address these key factors that enhance rural development and wellbeing, since it is better to take action before shocks occur.
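The aggregation step described above, weighting and adding the adaptive, absorptive and anticipatory measures, can be sketched as follows (the equal weights and the component scores are illustrative assumptions, not the study's values):

```python
# Weighted resilience index from three component scores per state;
# weights and scores below are invented for illustration.

def resilience_index(adaptive, absorptive, anticipatory,
                     weights=(1 / 3, 1 / 3, 1 / 3)):
    """Combine the three capacity scores into a single index."""
    w_ad, w_ab, w_an = weights
    return w_ad * adaptive + w_ab * absorptive + w_an * anticipatory

# One hypothetical state with component scores on a 0-1 scale.
print(round(resilience_index(0.6, 0.3, 0.9), 3))  # 0.6
```

The resulting state-level indices would then be regressed against the environmental and socioeconomic characteristics, with the coefficients feeding the STIRPAT impact computation.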

Keywords: climate change resilience, spatial impacts, STIRPAT model, Nigeria

Procedia PDF Downloads 145
24394 Data Analytics in Hospitality Industry

Authors: Tammy Wee, Detlev Remy, Arif Perdana

Abstract:

In recent years, data analytics has become a buzzword in the hospitality industry. The hospitality industry is another example of a data-rich industry that has yet to fully benefit from the insights of data analytics. Effective use of data analytics can change how hotels operate, market, and position themselves competitively. At the moment, however, the data obtained by individual hotels remain under-utilized. This research is a preliminary study of data analytics in the hospitality industry, using an in-depth face-to-face interview at one hotel as the start of a multi-level research project. The main case study of this research, hotel A, is a chain brand of an international hotel group that has been systematically gathering and collecting data on its own customers for the past five years. The data collection points begin from the moment a guest books a room until the guest leaves the hotel premises, covering room reservation, spa booking, and catering. Although hotel A has been gathering data intelligence on its customers for some time, it has yet to utilize the data to their fullest potential, and it is aware of this limitation as well as of the potential of data analytics. Currently, the utilization of data analytics in hotel A is limited to the area of customer service improvement, namely enhancing the personalization of service for each individual customer. Hotel A is able to utilize the data to improve and enhance its service, which in turn encourages repeat customers. According to hotel A, 50% of its guests returned to the hotel, and 70% extended their stays because of the personalized service. Apart from using data analytics to enhance customer service, hotel A also uses the data in marketing. Hotel A uses data analytics to predict or forecast changes in consumer behavior and demand by tracking its guests' booking preferences, payment preferences, and demand shifts between properties.
However, hotel A admitted that the data it has been collecting are not fully utilized, owing to two challenges. The first challenge is that the data are not clean: at the moment, the data collected for one guest profile are meaningful for one department in the hotel but meaningless for another. Cleaning up the data and setting standards correctly for usage by different departments are among hotel A's main concerns. The second challenge is the lack of an integrated internal system: the internal systems used by hotel A do not integrate with each other well, limiting the ability to collect data systematically. Hotel A is considering another system to replace the current one for more comprehensive data collection. Hotel proprietors recognize the potential of data analytics, as reported in this research; however, the current challenges of implementing a system to collect data come with a cost. This research has identified the current utilization of data analytics and the challenges faced when implementing it.

Keywords: data analytics, hospitality industry, customer relationship management, hotel marketing

Procedia PDF Downloads 173
24393 Financial Markets Performance: From COVID-19 Crisis to Hopes of Recovery with the Containment Polices

Authors: Engy Eissa, Dina M. Yousri

Abstract:

COVID-19 has massively hit the world economy, financial markets, and even societies' livelihoods. The infectious disease caused by the most recently discovered coronavirus was held responsible for a 4.4% contraction in the global economy in 2020. Shortly after the first case in Wuhan was identified, a quick surge in the number of confirmed cases in China was evident, and a vast spread worldwide was recorded, with cases surpassing 500,000. Irrespective of the disease's trajectory in each country, immediate action and prompt government intervention were needed. Given that there is no one-size-fits-all approach across the world, a number of containment and adaptation policies were embraced, ranging from the complete lockdown enforced in China to even stricter policies targeted at containing the spread of the virus, augmenting the efficiency of health systems, and controlling the economic outcomes arising from the crisis. Hence, the aim of this paper is threefold: first, to examine the impact of containment policies taken by governments on controlling the number of cases and deaths in the given countries; second, to assess the ramifications of COVID-19 on financial markets, measured by stock returns; third, to study the impact of containment policies, measured by the government response index, the stringency index, the containment health index, and the economic support index, on financial market performance. Using a sample of daily data covering the period from 31 January 2020 to 15 April 2021 for the 10 countries hit hardest in wave one of COVID-19, namely Brazil, India, Turkey, Russia, the UK, the USA, France, Germany, Spain, and Italy, the aforementioned relationships were tested using panel VAR regression.
The preliminary results showed that the number of daily deaths had an impact on stock returns; moreover, the health containment policies and the economic support provided by governments had a significant effect in lowering the impact of COVID-19 on stock returns.
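A minimal sketch of the estimation idea, assuming a single country and a one-lag specification (a true panel VAR pools all ten countries, which is omitted here), fitted by ordinary least squares on synthetic data:

```python
import numpy as np

# VAR(1) for two series (stock return, daily deaths) fitted by OLS.
# All numbers are synthetic; this only illustrates the mechanics.
rng = np.random.default_rng(0)
T = 300
y = np.zeros((T, 2))              # columns: stock return, daily deaths
A_true = np.array([[0.2, -0.3],   # true lag-1 coefficient matrix
                   [0.0,  0.5]])
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

Y, X = y[1:], y[:-1]              # regress y_t on y_{t-1}
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 1))         # estimates close to A_true
```

The off-diagonal entry of the first row plays the role of the deaths-to-returns channel the results describe; a panel VAR additionally stacks countries and adds fixed effects.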

Keywords: COVID-19, government policies, stock returns, VAR

Procedia PDF Downloads 178
24392 Realization of a Geographic Information System (GIS) for Drinking Water Supply (DWS) Drilling in the Adrar Region

Authors: Djelloul Benatiallah, Ali Benatiallah, Abdelkader Harouz

Abstract:

Geographic Information Systems (GIS) comprise various methods and computer techniques to model, digitally capture, store, manage, view, and analyze spatial data. Geographic information systems are characterized by their appeal to many scientific and technical fields and many methods. In this article, we present a complete and operational geographic information system that follows the theoretical principles of data management and adapts them to spatial data, in particular data concerning the monitoring of drinking water supply (DWS) wells in the Adrar region. The expected results of this system are, on the one hand, standard features for consulting, updating, and editing beneficiary and geographical data and, on the other hand, specific functionality for contractors' data entry, parameterized calculations, and statistics.

Keywords: GIS, DWS, drilling, Adrar

Procedia PDF Downloads 305
24391 Generic Data Warehousing for Consumer Electronics Retail Industry

Authors: S. Habte, K. Ouazzane, P. Patel, S. Patel

Abstract:

The dynamic and highly competitive nature of the consumer electronics retail industry means that businesses in this industry face various decision-making challenges relating to pricing, inventory control, consumer satisfaction, and product offerings. To overcome the challenges facing retailers and create opportunities, we propose a generic data warehousing solution that can be applied to a wide range of consumer electronics retailers with minimal configuration. The solution includes a dimensional data model, a template SQL script, a high-level architectural description, an ETL tool developed using C#, a set of APIs, and data access tools. It has been successfully applied by ASK Outlets Ltd, UK, resulting in improved productivity and enhanced sales growth.
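A dimensional model of the kind the solution template describes is typically a star schema: a central fact table keyed to surrounding dimension tables. The sketch below (in Python with SQLite for self-containment; the table and column names are illustrative assumptions, not those of the actual product) shows the shape:

```python
import sqlite3

# Minimal star schema for consumer electronics retail: a sales fact
# table referencing product and store dimensions. Names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name TEXT, category TEXT, brand TEXT);
CREATE TABLE dim_store (
    store_key INTEGER PRIMARY KEY,
    name TEXT, region TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    store_key   INTEGER REFERENCES dim_store(store_key),
    sale_date   TEXT,
    quantity    INTEGER,
    revenue     REAL);
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'TV-55', 'TV', 'AcmeVision')")
conn.execute("INSERT INTO dim_store VALUES (1, 'Outlet A', 'London')")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, '2024-01-05', 2, 899.98)")
total = conn.execute("SELECT SUM(revenue) FROM fact_sales").fetchone()[0]
print(total)  # 899.98
```

Pricing, inventory and satisfaction analyses then become aggregations over the fact table sliced by dimension attributes (category, region, date).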

Keywords: consumer electronics, data warehousing, dimensional data model, generic, retail industry

Procedia PDF Downloads 405
24390 Sequential Data Assimilation with High-Frequency (HF) Radar Surface Current

Authors: Lei Ren, Michael Hartnett, Stephen Nash

Abstract:

The abundant surface current measurements from an HF radar system in a coastal area are assimilated into a model to improve its forecasting ability. A simple sequential data assimilation scheme, Direct Insertion (DI), is applied to update the model forecast states. The influence of Direct Insertion data assimilation over time is analyzed at one reference point. Vector maps of surface current from the models are compared with HF radar measurements. The Root-Mean-Square Error (RMSE) between modeling results and HF radar measurements is calculated for the last four days with no data assimilation.
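The Direct Insertion step itself is simple to sketch: the model's forecast field is overwritten with radar observations wherever radar coverage exists. The values below are synthetic illustrations, not the study's data:

```python
import numpy as np

# Direct Insertion: replace forecast values with observations at
# radar-covered points; compute RMSE between forecast and analysis.
def direct_insertion(forecast, obs, obs_mask):
    analysis = forecast.copy()
    analysis[obs_mask] = obs[obs_mask]   # insert observed currents
    return analysis

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

forecast = np.array([0.10, 0.20, 0.30, 0.40])   # modelled current (m/s)
radar    = np.array([0.12, 0.25, 0.00, 0.00])   # radar values (m/s)
mask     = np.array([True, True, False, False]) # radar coverage
analysis = direct_insertion(forecast, radar, mask)
print(analysis)          # forecast with radar inserted at covered points
print(rmse(forecast, analysis))  # field-wide forecast-vs-analysis RMSE
```

In the sequential setting, this replacement is applied at each assimilation time before the model is stepped forward again.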

Keywords: data assimilation, CODAR, HF radar, surface current, direct insertion

Procedia PDF Downloads 567
24389 Measured versus Default Interstate Traffic Data in New Mexico, USA

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

This study investigates how site-specific traffic data differ from the default values in mechanistic-empirical pavement design software. Two Weigh-in-Motion (WIM) stations were installed on Interstate-40 (I-40) and Interstate-25 (I-25) to develop site-specific data. A computer program named WIM Data Analysis Software (WIMDAS) was developed using Microsoft C# (.NET) for quality checking and processing of raw WIM data. A complete year of data, from November 2013 to October 2014, was analyzed using the developed program. From these data, the vehicle class distribution, directional distribution, lane distribution, monthly adjustment factors, hourly distribution, axle load spectra, average number of axles per vehicle, axle spacing, lateral wander distribution, and wheelbase distribution were calculated. A comparative study was then conducted between the measured data and the AASHTOWare default values. It was found that the measured general traffic inputs for I-40 and I-25 differ significantly from the default values.
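One of the traffic inputs listed above, the vehicle class distribution, can be sketched as the share of WIM records per vehicle class (the records below are invented for illustration; real records would carry FHWA classes assigned by the WIM system):

```python
from collections import Counter

# Vehicle class distribution from WIM records: the fraction of
# records falling in each vehicle class.
def class_distribution(vehicle_classes):
    counts = Counter(vehicle_classes)
    total = len(vehicle_classes)
    return {cls: counts[cls] / total for cls in sorted(counts)}

records = [9, 9, 5, 9, 4]           # vehicle class per WIM record
print(class_distribution(records))  # {4: 0.2, 5: 0.2, 9: 0.6}
```

Comparing such measured distributions against the software's default tables is the core of the study's measured-versus-default comparison.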

Keywords: AASHTOWare, traffic, weigh-in-motion, axle load distribution

Procedia PDF Downloads 337