Search results for: business data
25517 Regulating Information Asymmetries at Online Platforms for Short-Term Vacation Rental in the European Union – Legal Conundrum Continues
Authors: Vesna Lukovic
Abstract:
Online platforms, as new business models, play an important role in today’s economy and the functioning of the EU’s internal market. In the travel industry, algorithms used by online platforms for short-stay accommodation provide suggestions and price information to travelers. Those suggestions and recommendations are displayed in search results via recommendation (ranking) systems. There has been a growing consensus that the current legal framework is not sufficient to resolve problems arising from platform practices. In order to enhance the potential of the EU’s Single Market, smaller businesses should be protected and their rights strengthened vis-à-vis large online platforms. Regulation (EU) 2019/1150 of the European Parliament and of the Council on promoting fairness and transparency for business users of online intermediation services aims to level the playing field in that respect. This research looks at Airbnb through the lens of this regulation. The research explores key determinants and finds that although the regulation is an important step in the right direction, it is not enough. It does not impose sufficiently clear obligations that would make online platforms an intermediary service that both accommodation providers and travelers could use with ease.
Keywords: algorithm, online platforms, ranking, consumers, EU regulation
Procedia PDF Downloads 128
25516 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is essential for data-driven enterprises, and Master Data Management (MDM) plays a crucial role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI into processes such as data creation, maintenance, enrichment, and usage (examined here through both quantitative and qualitative analysis), organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making.
These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
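As an illustration of the duplicate-record reduction discussed above, the following is a minimal fuzzy-matching sketch. The customer records, field name, and threshold are invented for illustration and are not from the study; a production MDM pipeline would use dedicated matching and survivorship rules.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1] between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_duplicates(records, threshold=0.9):
    """Pairwise fuzzy match on the 'name' field; returns index pairs
    whose similarity meets the threshold (candidate duplicates)."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i]["name"], records[j]["name"]) >= threshold:
                pairs.append((i, j))
    return pairs

customers = [
    {"name": "Acme Corporation"},
    {"name": "ACME Corp."},
    {"name": "Globex Ltd"},
    {"name": "acme corporation "},
]
# (0, 3) is an exact match after case/whitespace normalization
print(find_duplicates(customers, threshold=0.85))
```

Lowering the threshold trades precision for recall; in practice the candidate pairs would then pass through a human or rule-based review step before merging.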
Procedia PDF Downloads 16
25515 The Effectiveness of E-Training on the Attitude and Skill Competencies of Vocational High School Teachers during Covid-19 Pandemic in Indonesia
Authors: Sabli, Eddy Rismunandar, Akhirudin, Nana Halim, Zulfikar, Nining Dwirosanti, Wila Ningsih, Pipih Siti Sofiah, Danik Dania Asadayanti, Dewi Eka Arini Algozi, Gita Mahardika Pamuji, Ajun, Mangasa Aritonang, Nanang Rukmana, Arief Rachman Wonodhipo, Victor Imanuel Nahumury, Lili Husada, Wawan Saepul Irwan, Al Mukhlas Fikri
Abstract:
The Covid-19 pandemic has had a wide impact on people's lives. An adaptive strategy must be quickly formulated to maintain the quality of education, especially for vocational schools, whose technical skill competencies are highly needed. This study aimed to evaluate the effectiveness of e-training on the attitude and skill competencies of vocational high school teachers in Indonesia. A total of 720 Indonesian vocational high school teachers from various programs, including hospitality, administration, online business and marketing, culinary arts, fashion, cashier, tourism, haircut, and accounting, participated in e-training for a month. The training used an electronic learning management system to provide materials (modules, presentation slides, and tutorial videos), tasks, and evaluations. Tutorial classes were carried out via video conference. Attitude and skill competencies were evaluated before and after the training. The teachers also gave satisfaction feedback on the quality of the organizer and tutors. Data analysis used the paired sample t-test and ANOVA with Tukey's post hoc test. The results showed that e-training significantly increased the attitude and skill competency scores of the teachers (p < 0.05). Moreover, the most remarkable increases were found among hospitality (57.5%), cashier (50.1%), and online business and marketing (48.7%) teachers. However, the effect among fashion, tourism, and haircut teachers was less obvious. In addition, the satisfaction scores on the quality of the organizer and tutors were 88.9 (very good) and 93.5 (excellent), respectively. The study concludes that a well-organized e-training program can increase the attitude and skill competencies of Indonesian vocational high school teachers during the Covid-19 pandemic.
Keywords: E-training, skill, teacher, vocational high school
Procedia PDF Downloads 146
25514 Genetic Data of Deceased People: Solving the Gordian Knot
Authors: Inigo de Miguel Beriain
Abstract:
Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data to be personal data of the deceased person for a set period of time, while others, such as Spain, do not consider the data as such but have introduced specifically focused regulations on this type of data and its access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain the personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them.
As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9(2) and the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. This contribution shows, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.
Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people
Procedia PDF Downloads 154
25513 Court-Annexed Mediation for International Commercial Disputes in Asia: Strengths and Weaknesses
Authors: Thu Thuy Nguyen
Abstract:
In recent years, mediation has gained great attention in many jurisdictions thanks to its advantages. Mediation has a long history of development in Asia, with various forms used to amicably settle civil and commercial disputes. The modern mediation system in several Asian countries and territories comprises three main categories, namely court-annexed mediation, mediation within arbitral proceedings, and institutional mediation. Court-annexed mediation (or in-court mediation) is mediation conducted by the court in the course of judicial procedures. In dealing with cross-border business disputes, in-court mediation has a number of advantages over the two other types of mediation, especially in terms of the enforcement of the final result. However, the confidentiality of the mediation process in subsequent judicial proceedings, the qualifications of court judges, and the issue of recognition and enforcement of foreign judgments are normally seen as drawbacks of court-annexed mediation, since judges are cast in dual roles as both mediator and ultimate adjudicator in the same dispute. This paper will examine the strengths and weaknesses of in-court mediation in settling transnational business disputes in selected Asian jurisdictions, including China, Hong Kong, Japan, Singapore, and Vietnam.
Keywords: court-annexed mediation, international commercial disputes, Asia, strengths and weaknesses
Procedia PDF Downloads 305
25512 Organisational Effectiveness and Its Implications for Seaports
Authors: Shadi Alghaffari, Hong-Oanh Nguyen, Peggy Chen, Hossein Enshaei
Abstract:
The main purpose of this study was to explore the role of organisational effectiveness (OE) in seaports. OE is an important managerial concept, one that is necessary for leaders and directors in any organisation to understand the output of their work. OE has been applied in many organisations; however, it is a vital concept in the port business. This paper examines various approaches and applications of the OE concept to business management, and describes benefits that are important and applicable to seaport management. This research reviews and classifies articles published in relevant journals and books between 1950 and 2016, from the general literature on OE to the narrower field of OE in seaports. Based on the extensive literature review, this study identifies and discusses several issues relevant to both practices and theories of this concept. The review concludes by presenting a gap in the literature, as it found only a limited amount of research that endeavours to clarify OE in the seaport sector. As a result of this gap, seaports suffer from a lack of empirical study and are largely neglected in this subject area. The implementation of OE in this research has led to the maritime sector interfacing with different disciplines in order to acquire the advantage of enhancing managerial knowledge and competing successfully in the international marketplace.
Keywords: literature review, maritime, organisational effectiveness, seaport management
Procedia PDF Downloads 341
25511 Entrepreneurial Leadership in a Startup Context: A Comparative Study on Two Egyptian Startup Businesses
Authors: Nada Basset
Abstract:
Problem Statement: The study examines the important role of leading change inside start-ups and highlights the challenges faced by an entrepreneur during the startup phase of the business. Research Methods/Procedures/Approaches: A qualitative research approach is taken, using the case study analysis method. A comparative study was made between two day-care nurseries in Greater Cairo. Non-probability purposive sampling was used, and a triangulation of semi-structured interviews, document analysis, and participant observation was applied simultaneously. The in-depth case study analysis took place over a longitudinal study of four calendar months. Results/Findings: Findings demonstrated that leading change in an entrepreneurial setup must be initiated by the entrepreneur, who must also be the owner of the change process. Another important finding showed that the culture of change, although created by the entrepreneur, needs the support and engagement of followers, who should share the same value system and vision as the entrepreneur. Conclusions and Implications: An important implication suggests that during the first year of a start-up's lifecycle, special emphasis must be placed on the recruitment and selection of personnel, who play a role in setting the new start-up's culture and helping it grow or shrink. Another conclusion drawn is that the success of the change must be measured in both quantitative and qualitative terms. Increasing revenues and customer attrition rates (as quantitative KPIs) must be aligned with qualitative KPIs like customer satisfaction, employee satisfaction, organizational commitment, and business reputation. Originality of Paper: The paper addresses change management in an entrepreneurial context, with an empirical application to an Egyptian start-up model providing a service to both adults and children.
This distinguishes the research, as the constructs measured combined the satisfaction levels of employees, decision-makers (parents of children), and users (children).
Keywords: leadership, change management, entrepreneurship, startup business
Procedia PDF Downloads 181
25510 Widely Diversified Macroeconomies in the Super-Long Run Cast Doubt on the Path-Independent Equilibrium Growth Model
Authors: Ichiro Takahashi
Abstract:
One of the major assumptions of mainstream macroeconomics is the path independence of capital stock. This paper challenges this assumption by employing an agent-based approach. The simulation results showed the existence of multiple "quasi-steady state" equilibria of the capital stock, which may cast serious doubt on the validity of the assumption. The finding would give a better understanding of many phenomena that involve hysteresis, including the causes of poverty. The "market-clearing view" has been widely shared among major schools of macroeconomics. They understand that the capital stock, the labor force, and technology determine the "full-employment" equilibrium growth path, and that demand/supply shocks can move the economy away from the path only temporarily: the dichotomy between short-run business cycles and the long-run equilibrium path. The view then implicitly assumes the long-run capital stock to be independent of how the economy has evolved. In contrast, "Old Keynesians" have recognized fluctuations in output as arising largely from fluctuations in real aggregate demand. It is then an interesting question to ask whether an agent-based macroeconomic model, which is known to have path dependence, can generate multiple full-employment equilibrium trajectories of the capital stock in the super-long run. If the answer is yes, the equilibrium level of capital stock, an important supply-side factor, would no longer be independent of the business cycle phenomenon. This paper attempts to answer the above question by using the agent-based macroeconomic model developed by Takahashi and Okada (2010). The model serves this purpose well because it has neither population growth nor technological progress. The objective of the paper is twofold: (1) to explore the causes of the long-term business cycle, and (2) to examine the super-long-run behavior of the capital stock of full-employment economies.
(1) The simulated behaviors of the key macroeconomic variables, such as output, employment, and real wages, showed widely diversified macroeconomies. They were often remarkably stable but exhibited both short-term and long-term fluctuations. The long-term fluctuations occur through two adjustments: the quantity and relative cost adjustments of the capital stock. The first is obvious and assumed by many business cycle theorists. Reduced aggregate demand lowers prices, which raises real wages, thereby decreasing the relative cost of capital stock with respect to labor. (2) The long-term business cycles/fluctuations were synthesized with the hysteresis of real wages, interest rates, and investments. In particular, a sequence of simulation runs with a super-long simulation period generated a wide range of perfectly stable paths, many of which achieved full employment: all the macroeconomic trajectories, including capital stock, output, and employment, were perfectly horizontal over 100,000 periods. Moreover, the full-employment level of capital stock was influenced by the history of unemployment, which was itself path-dependent. Thus, an experience of severe unemployment in the past kept the real wage low, which discouraged relatively costly investment in capital stock. Meanwhile, a history of good performance sometimes brought about a low capital stock due to a high interest rate that was consistent with strong investment.
Keywords: agent-based macroeconomic model, business cycle, hysteresis, stability
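The path-dependence argument can be illustrated with a toy bistable adjustment process. This is a deliberately simplified sketch, not the Takahashi-Okada agent-based model: the cubic update rule, the attractor locations, and the shock size are all invented for illustration. Two economies with identical parameters end at different "quasi-steady states" because one experiences a single transient shock.

```python
def step(k, r=1.0):
    """Bistable capital adjustment: stable attractors near 0.2 and 0.8,
    separated by an unstable threshold at 0.5 (illustrative dynamics)."""
    return k + r * -(k - 0.2) * (k - 0.5) * (k - 0.8)

def simulate(k0, shock_at=None, shock=-0.2, periods=200):
    """Iterate the adjustment process, optionally applying one
    transient demand shock at period `shock_at`."""
    k = k0
    for t in range(periods):
        if t == shock_at:
            k += shock  # temporary shock; parameters never change
        k = step(k)
    return k

# Identical economies; one suffers a single temporary shock early on
print(round(simulate(0.6), 3))              # settles near the high attractor
print(round(simulate(0.6, shock_at=5), 3))  # same economy + shock: low attractor
```

The permanent divergence after a purely transient shock is the hysteresis property the abstract describes: the long-run capital stock depends on history, not just on parameters.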
Procedia PDF Downloads 207
25509 Steps towards the Development of National Health Data Standards in Developing Countries
Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray
Abstract:
The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role in, and support for, standardization of health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has overlooked the different steps required to be undertaken by the government towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of a formal national reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; most important is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.
Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia
Procedia PDF Downloads 338
25508 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map
Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo
Abstract:
Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.
Keywords: RDM, multi-source data, big data, U-City
Procedia PDF Downloads 432
25507 Small Businesses as Vehicles for Job Creation in North-West Nigeria
Authors: Mustapha Shitu Suleiman, Francis Neshamba, Nestor Valero-Silva
Abstract:
Small businesses are considered engines of economic growth, contributing to employment generation, wealth creation, poverty alleviation, and food security in both developed and developing countries. Nigeria is facing many socio-economic problems, and it is believed that, by supporting small businesses as propellers of new ideas and more effective users of resources, often driven by individual creativity and innovation, Nigeria would be able to address some of its economic and social challenges, such as unemployment and economic diversification. Using secondary literature, this paper examines the role small businesses can play in the creation of jobs in North-West Nigeria to overcome unemployment, the most devastating economic challenge facing the region. Most studies in this area have focused on Nigeria as a whole, and only a few provide a regional focus; hence, this study contributes to knowledge by filling this gap, concentrating on North-West Nigeria. It is hoped that with the present administration's determination to improve the economy, small businesses will be used as vehicles for diversification of the economy away from crude oil, creating jobs that would lead to a reduction in the country's high unemployment level.
Keywords: job creation, north-west, Nigeria, small business, unemployment
Procedia PDF Downloads 305
25506 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in previous work behave when data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in those papers and analyse the effect of noise. From this, we can assess the robustness and applicability of each model in Korea.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
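The noise-perturbation procedure described above can be sketched as follows. A simple linear deterioration model stands in here for the survival and proportional hazard models actually studied, and all numbers are illustrative: we fit on clean model-generated data, add Gaussian noise, re-fit, and compare the recovered parameters.

```python
import random

random.seed(0)

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical break-rate data generated from a known model (a=0.5, b=2)
ages = list(range(1, 31))
clean = [0.5 * x + 2 for x in ages]
a0, b0 = fit_linear(ages, clean)

# Perturb with Gaussian noise and re-fit to probe robustness
noisy = [y + random.gauss(0, 1.0) for y in clean]
a1, b1 = fit_linear(ages, noisy)

print(round(a0, 3), round(a1, 3))  # clean fit recovers a exactly
```

A model is judged robust when, as here, the re-fitted parameters stay close to the clean-data estimates over many noise realizations and noise levels.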
Procedia PDF Downloads 741
25505 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores
Authors: A. Ashraff
Abstract:
The increasing importance of the web as a medium for electronic and business transactions has served as a catalyst, or rather a driving force, for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make purchase decisions about a product or service. They can also predict whether a particular user would rate a product or service based on the user's behavioral profile. At present, recommender systems are used extensively in every domain known to us; they are said to be ubiquitous. However, in the field of recruitment, they are not being utilized extensively. Recent statistics show an increase in staff turnover, which has negatively impacted both organizations and employees. The reasons include company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed a lack of guidance or support to help job seekers find the company that will suit them best; although information about companies is available, job seekers cannot read all the reviews by themselves and reach an analytical decision. In this paper, we propose an approach that studies the available review data on IT companies (scoring reviews based on user review sentiments), gathers information on job seekers, including their psychometric evaluations, and then presents the job seeker with useful outputs on which company is most suitable for them. The theoretical approach, the algorithmic approach, and the importance of such a system are discussed in this paper.
Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems
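The review-scoring idea can be sketched with a naive lexicon-based sentiment scorer. The company names, reviews, and word lists below are invented; a production system would use a trained sentiment model and would combine the scores with the job seeker's psychometric profile rather than ranking on sentiment alone.

```python
POSITIVE = {"great", "good", "flexible", "supportive", "excellent"}
NEGATIVE = {"bad", "poor", "toxic", "stressful", "low"}

def sentiment_score(review: str) -> float:
    """Naive lexicon score in [-1, 1]: (pos - neg) / matched words."""
    words = review.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

def rank_companies(reviews_by_company):
    """Average review sentiment per company, best first."""
    scored = {c: sum(map(sentiment_score, rs)) / len(rs)
              for c, rs in reviews_by_company.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

reviews = {
    "AlphaSoft": ["great culture and flexible hours", "supportive managers"],
    "BetaWorks": ["low pay and stressful deadlines",
                  "good projects but toxic teams"],
}
print(rank_companies(reviews))
```

In the hybrid design the abstract proposes, these aggregate sentiment scores would form one input alongside psychometric-preference matching when producing the final recommendation.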
Procedia PDF Downloads 105
25504 Automated Testing to Detect Instance Data Loss in Android Applications
Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai
Abstract:
Mobile applications are increasing significantly in number, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. However, it is difficult for the programmer to manually test this issue for all activities. The result is data loss: the data entered by the user is not saved when there is an interruption. This issue can degrade the user experience, because the user needs to re-enter the information after each interruption. Automated testing to detect such data loss is important to improve the user experience. This research proposes a tool, DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with the issue of data loss. This approach proved highly accurate and reliable in finding apps with this defect, and it can be used by Android developers to avoid such errors.
Keywords: Android, automated testing, activity, data loss
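The save/kill/restore check that such a tool automates can be illustrated with a toy lifecycle simulation. This is pure Python, not the DroidDL tool, and uses no real Android APIs; the class and method names only mimic the Activity lifecycle for illustration.

```python
class Activity:
    """Toy model of an Android activity with optional state saving."""

    def __init__(self, saves_state: bool):
        self.saves_state = saves_state
        self.field = ""   # in-memory UI state, lost when the process dies
        self.bundle = {}  # survives process death if populated

    def type_text(self, text):
        self.field = text

    def on_save_instance_state(self):
        # A correct app writes its UI state to the saved-state bundle
        if self.saves_state:
            self.bundle["field"] = self.field

    def kill_and_restore(self):
        self.on_save_instance_state()
        self.field = ""  # process killed: in-memory state is gone
        self.field = self.bundle.get("field", "")  # restore from bundle

def detect_data_loss(activity) -> bool:
    """Enter data, force a kill/restore cycle, and report whether
    the entered data failed to survive."""
    activity.type_text("user input")
    activity.kill_and_restore()
    return activity.field != "user input"

print(detect_data_loss(Activity(saves_state=False)))  # True: data lost
print(detect_data_loss(Activity(saves_state=True)))   # False: data restored
```

A real detector would drive each activity of an installed app through the equivalent lifecycle events (e.g. via instrumentation) and compare the UI state before and after.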
Procedia PDF Downloads 237
25503 Mainstreaming Willingness among Black-Owned Informal Small Medium Micro Enterprises in South Africa
Authors: Harris Maduku, Irrshad Kaseeram
Abstract:
The objective of this paper is to understand the factors behind the formalisation willingness of South African black-owned SMMEs. Cross-sectional data were collected with a questionnaire from 390 informal businesses in Johannesburg and Pretoria, using stratified random sampling and cluster sampling. The study employed multinomial logistic regression to understand, quantitatively, what encourages informal SMMEs to be willing to mainstream their operations. We find government support, corruption, employment compensation, family labour, success perception, education status, age, and financing to be key drivers of SMMEs' willingness to formalize. The findings point government departments towards investing more in both financial and non-financial strategies, such as capacity building and business education for informal SMMEs, to cultivate their willingness to mainstream.
Keywords: mainstreaming, transition, informal, willingness, multinomial logit
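For readers unfamiliar with the model, a multinomial logit maps predictor values to probabilities over the outcome categories via a softmax of per-category linear scores. A minimal prediction sketch follows; the coefficients, predictors, and category labels are invented for illustration and are not the paper's estimates.

```python
import math

def softmax(zs):
    """Numerically stable softmax over a list of scores."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def predict_willingness(x, coefs):
    """Multinomial logit: P(category k) = softmax(beta_k . x).
    The reference category carries all-zero coefficients."""
    return softmax([sum(b * xi for b, xi in zip(beta, x)) for beta in coefs])

# Hypothetical coefficients for [intercept, gov_support, owner_education]
coefs = [
    [0.0, 0.0, 0.0],   # unwilling (reference category)
    [0.2, 0.5, 0.1],   # undecided
    [-0.5, 1.2, 0.8],  # willing
]
x = [1.0, 1.0, 1.0]    # intercept term, has support, educated owner

probs = predict_willingness(x, coefs)
print([round(p, 3) for p in probs])  # probabilities sum to 1
```

In practice the coefficients would be estimated from the survey data (e.g. with statsmodels' MNLogit), and the signs of the estimates identify which factors raise the probability of the "willing" category.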
Procedia PDF Downloads 153
25502 Big Data: Appearance and Disappearance
Authors: James Moir
Abstract:
The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to explain their appearance through causal mechanisms, while twentieth-century science attempted to save the appearance and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.
Keywords: big data, appearance, disappearance, surface, epistemology
Procedia PDF Downloads 419
25501 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images
Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann
Abstract:
FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data which contribute almost nothing to reducing the confidence interval of the estimated parameters and can thus be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem establishing the lower precision of the integrated data approach compared to the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design
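The link between the data used and the resulting confidence interval can be sketched with a one-parameter recovery model: a simplified exponential stand-in for the paper's setup, with invented numbers. Fitting k in F(t) = A(1 - exp(-k t)) by log-linearization makes the standard error of the estimate explicit.

```python
import math

def fit_rate(times, values, A=1.0):
    """Estimate k in F(t) = A*(1 - exp(-k t)) by log-linear least squares.
    Returns (k_hat, standard error of k_hat)."""
    ys = [-math.log(1 - v / A) for v in values]      # should equal k*t
    sxx = sum(t * t for t in times)
    k = sum(t * y for t, y in zip(times, ys)) / sxx  # no-intercept OLS
    resid = [y - k * t for t, y in zip(times, ys)]
    n = len(times)
    s2 = sum(r * r for r in resid) / (n - 1)
    return k, math.sqrt(s2 / sxx)

# Synthetic noiseless recovery curve with true k = 0.3
k_true = 0.3
times = [0.5 * t for t in range(1, 21)]
values = [1 - math.exp(-k_true * t) for t in times]

# With real, noisy data, late-time points on the plateau carry little
# information about k: dropping them barely widens the interval for k
# (the "irrelevant data set" idea)
k_hat, se = fit_rate(times, values)
print(round(k_hat, 4))
```

The standard error term `sqrt(s2 / sxx)` is the quantity optimal experimental design tries to shrink; choosing which (t, F) samples enter the fit is exactly the data selection problem the abstract formulates.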
Procedia PDF Downloads 276
25500 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management
Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang
Abstract:
Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the data's magnitude and complexities. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate the existing framework and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.
Keywords: construction supply chain management, BIM, data exchange, artificial intelligence
Procedia PDF Downloads 24
25499 Representation Data without Lost Compression Properties in Time Series: A Review
Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan
Abstract:
Uncertain data are believed to be an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to the prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertain-data condition by minimizing the loss of compression properties. Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction
Procedia PDF Downloads 425
25498 Internal Product Management: The Key to Achieving Digital Maturity and Business Agility for Manufacturing IT Organizations
Authors: Frederick Johnson
Abstract:
Product management has a long and well-established history within the consumer goods industry, despite being one of the most obscure aspects of brand management. Many global manufacturing organizations are now opting for external cloud-based Manufacturing Execution Systems (MES) to replace costly and outdated monolithic MES solutions. Other global manufacturing leaders are restructuring their organizations to support human-centered values, agile methodologies, and fluid operating principles. Still, industry-leading organizations struggle to apply an appropriate framework for managing evolving external MES solutions as internal "digital products." Product management complements these current trends in technology and in philosophical thinking in the market. This paper discusses the central problems associated with adopting product management processes by analyzing its traditional theories and characteristics. Building on these ideas, the article then constructs a translated internal digital product management framework by combining new and existing approaches and principles. The paper concludes by demonstrating the framework's capabilities and potential effectiveness in achieving digital maturity and business agility within a manufacturing environment. Keywords: internal product management, digital transformation, manufacturing information technology, manufacturing execution systems
Procedia PDF Downloads 133
25497 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data
Authors: Murat Yazici
Abstract:
Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within statistics since the 1800s, and over time a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies: it is the process of grouping data into clusters whose members are as similar as possible while distinct clusters are as dissimilar from each other as possible. Many traditional clustering algorithms have limitations when dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering data sets with categorical values. It is a soft form of clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets using the FKM clustering algorithm. As a significant feature of the study, the FKM clustering algorithm makes it possible to determine anomalies together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM clustering algorithm showed good performance in detecting anomalies in data containing both a single anomaly and multiple anomalies. Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data
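As a sketch of the approach described above, the snippet below implements a minimal fuzzy k-modes clusterer (simple-matching dissimilarity, fuzzifier m, mode update by weighted majority vote) and scores each record's abnormality as one minus its largest membership. The data set, parameter values, and initialization are our own illustrative choices, not those of the paper.

```python
import numpy as np

def fuzzy_k_modes(X, k=2, m=1.5, n_iter=30):
    """Minimal fuzzy k-modes sketch for categorical data.
    Returns the cluster modes and the membership matrix U (n_points x k)."""
    n, p = X.shape
    modes = X[:k].copy()                             # simple deterministic init
    U = np.full((n, k), 1.0 / k)
    for _ in range(n_iter):
        # Simple-matching dissimilarity: number of mismatched attributes.
        D = np.array([[np.sum(x != mode) for mode in modes] for x in X], dtype=float)
        D = np.maximum(D, 1e-9)                      # guard against division by zero
        U = 1.0 / D ** (1.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per point
        for c in range(k):                           # weighted majority vote per attribute
            w = U[:, c] ** m
            for j in range(p):
                cats = np.unique(X[:, j])
                votes = [w[X[:, j] == cat].sum() for cat in cats]
                modes[c, j] = cats[int(np.argmax(votes))]
    return modes, U

# Two tight simulated categorical clusters plus one record matching neither.
X = np.array([["a", "x", "p"], ["a", "x", "q"], ["a", "x", "p"],
              ["b", "y", "r"], ["b", "y", "r"], ["b", "y", "s"],
              ["c", "z", "t"]])
modes, U = fuzzy_k_modes(X, k=2)
abnormality = 1.0 - U.max(axis=1)        # degree of abnormality per record
print(abnormality.round(3))
assert abnormality[-1] == abnormality.max()   # the odd record is the most anomalous
```

This shows the property highlighted in the abstract: because memberships are graded, the anomalous record is not merely flagged but carries a continuous abnormality degree (it belongs strongly to no cluster), while ordinary records score near zero.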
Procedia PDF Downloads 51
25496 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara
Abstract:
This paper describes the problem of building secure computational services for encrypted information in the cloud: computing without decrypting the encrypted data. This meets the aspiration for a computational encryption model that can enhance the security of big data with respect to privacy or confidentiality, availability, and integrity of the data and of the user. The cryptographic model applied for the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute a theoretical presentation of high-level computational processes based on number theory derived from abstract algebra, which can be integrated into and leveraged through the cloud computing interface, together with detailed mathematical foundations of fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographically secure algorithm. Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme
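The homomorphic principle at the heart of such schemes, namely operating on ciphertexts so that the result decrypts to the result of operating on the plaintexts, can be demonstrated with a toy example. The sketch below implements textbook Paillier encryption, which is only additively homomorphic (a fully homomorphic scheme additionally supports multiplication on ciphertexts); the tiny primes are purely illustrative and insecure.

```python
import random
from math import gcd

# Toy textbook Paillier scheme: multiplying two ciphertexts yields a ciphertext
# of the SUM of the plaintexts, i.e., computation without decryption.
# The primes are tiny and purely illustrative; this is not a secure parameter set.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lambda = lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)                           # lambda^{-1} mod n (Python 3.8+)
rng = random.Random(42)                        # fixed seed for reproducibility

def encrypt(m):
    r = rng.randrange(1, n)
    while gcd(r, n) != 1:                      # r must be a unit mod n
        r = rng.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n applied to c^lambda mod n^2, then scaled by mu.
    return ((pow(c, lam, n2) - 1) // n) * mu % n

m1, m2 = 42, 99
c1, c2 = encrypt(m1), encrypt(m2)
total = decrypt((c1 * c2) % n2)                # homomorphic addition on ciphertexts
print(total)
assert total == (m1 + m2) % n                  # 141, recovered without decrypting the operands
```

A cloud service holding only `c1` and `c2` can compute their product and return it; only the key holder can decrypt the sum, which is the confidentiality property the abstract aims to scale up to full big data analytics.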
Procedia PDF Downloads 479
25495 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach
Authors: Sarisa Pinkham, Kanyarat Bussaban
Abstract:
The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department over the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with RMSE = 3.343. Keywords: daily rainfall, image processing, approximation, pixel value data
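A pixel value approach of this kind can be sketched as follows: read the pixel value at each rain gauge location, convert it to millimetres through the map legend, and score the approximation with RMSE. All legend values, pixel samples, and gauge readings below are invented for illustration; the paper's RMSE of 3.343 comes from its own calibration on the Thai rainfall maps.

```python
import numpy as np

# Hypothetical sketch: rainfall maps encode intensity as colors, so a pixel
# value can be mapped to a rainfall amount via the map's legend, and the
# approximation is scored against station measurements with RMSE.
legend_pixels = np.array([0, 50, 100, 150, 200, 250])       # pixel gray values (invented)
legend_rain_mm = np.array([0.0, 5.0, 15.0, 35.0, 70.0, 120.0])

def pixel_to_rain(pixels):
    # Interpolate between legend entries to estimate rainfall per pixel.
    return np.interp(pixels, legend_pixels, legend_rain_mm)

# Pixels sampled at station locations vs. the stations' gauge readings (mm).
station_pixels = np.array([40, 120, 180, 60, 220])
gauge_mm = np.array([4.5, 20.0, 50.0, 7.0, 90.0])

estimate_mm = pixel_to_rain(station_pixels)
rmse = float(np.sqrt(np.mean((estimate_mm - gauge_mm) ** 2)))
print(f"RMSE = {rmse:.3f} mm")
```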
Procedia PDF Downloads 386
25494 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management
Authors: Kenneth Harper
Abstract:
The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks built with the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and zero-knowledge proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings from this exercise demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency. Keywords: blockchain, Cosmos SDK, decentralized data platform, IPFS, ZK-Rollups
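One of the listed features, cryptographic hashing for data lineage, can be illustrated with a short hash-chain sketch: each ingested record commits to the hash of its predecessor, so tampering anywhere in the history invalidates every later link. The field names and records below are hypothetical, not from the platform itself.

```python
import hashlib
import json

# Sketch of hash-chained data lineage: every link stores the hash of the
# previous link, so modifying any historical record breaks verification.
def record_hash(record: dict, prev_hash: str) -> str:
    # Canonical serialization (sorted keys) so equal records hash equally.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_lineage(records):
    chain, prev = [], "0" * 64                   # all-zero genesis hash
    for rec in records:
        h = record_hash(rec, prev)
        chain.append({"record": rec, "prev_hash": prev, "hash": h})
        prev = h
    return chain

def verify_lineage(chain):
    prev = "0" * 64
    for link in chain:
        if link["prev_hash"] != prev or record_hash(link["record"], prev) != link["hash"]:
            return False                         # chain broken: tampering detected
        prev = link["hash"]
    return True

# Hypothetical healthcare-style records being ingested.
chain = build_lineage([{"patient": 1, "hb": 13.5}, {"patient": 2, "hb": 11.2}])
assert verify_lineage(chain)
chain[0]["record"]["hb"] = 9.9                   # tamper with an early record
assert not verify_lineage(chain)                 # detected: later hashes no longer match
```

On the actual platform the same commitment would be anchored on-chain rather than checked in memory, but the detection logic is the same idea.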
Procedia PDF Downloads 24
25493 Native Plants Marketing by Entrepreneurs in the Landscaping Industry in Japan
Authors: Yuki Hara
Abstract:
Entrepreneurs are welcome in the landscaping industry, which conserves biological diversity both practically and theoretically in landscaping construction; however, there are limited reports on cooperative trials that create a market with a new logistics system for native plants (NP) between landscaping companies and nurserymen. This paper explores the entrepreneurial process of a landscaping company, "5byMidori," for NP marketing. The paper employs a case study design. Data were collected through interviews with the manager and designer of 5byMidori, 2 scientists, 1 organization, and 18 nurserymen; fieldwork at two nurseries; observations of marketing activities over three years; and texts from published documents about the business concept and marketing strategy with NP. These data were analyzed by qualitative methods. The results show that NP suits the vision of 5byMidori of improving the urban desertified environment through a closer urban-rural linkage. A professional landscaping team helped a forestry organization become an NP producer while conserving a large mountain nursery. Multifaceted PR based on the entrepreneurial context and personal background of a landscaping venture can foster team members' businesses and help customers and users understand the biodiversity value of the product. Wider partnerships with existing nurserymen at other sites in many regions need socio-economic incentives and environmental reliability. In conclusion, the entrepreneurial marketing of a landscaping company needs to add more meanings and a variety of merits in terms of ecosystem services, as NP tends to be defined academically and independently of cultures such as those of nurserymen and forestry. Keywords: biological diversity, landscaping industry, marketing, native plants
Procedia PDF Downloads 119
25492 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data
Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri
Abstract:
In this paper, we consider and apply parametric modeling to experimental data from a dynamical system. We investigate different distributions of output measurements from several dynamical systems. Using variance processing on the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, we explain the effect of the spread of the measurements (e.g., the variance) on identification, together with the limitations of this approach. Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification
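The idea of using variance processing to locate a region of nonlinearity can be sketched as follows (our own illustrative procedure, not necessarily the authors' exact method): fit a global linear model, then scan the residuals with a sliding window and flag windows whose variance greatly exceeds the typical level, since a linear fit leaves large, fast-changing residuals exactly where the system is nonlinear.

```python
import numpy as np

# Synthetic input-output data: linear up to x = 5, quadratic beyond it.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 200)
y = 0.5 * x + np.where(x > 5.0, 0.3 * (x - 5.0) ** 2, 0.0)
y += rng.normal(0.0, 0.05, x.size)               # measurement noise

# Global linear fit and its residuals.
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)

# Sliding-window variance of the residuals ("variance processing").
win = 20
var = np.array([resid[i:i + win].var() for i in range(x.size - win)])

# Windows whose variance greatly exceeds the typical level mark nonlinearity.
threshold = 5.0 * np.median(var)                 # heuristic threshold
flagged = x[:x.size - win][var > threshold]      # window-start positions flagged
print(f"elevated residual variance for x >= {flagged.min():.1f}")
```

Only the strongly curved tail is flagged here; in practice the threshold would be calibrated against the known noise level rather than the median.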
Procedia PDF Downloads 512
25491 Geographic Information Systems and a Breath of Opportunities for Supply Chain Management: Results from a Systematic Literature Review
Authors: Anastasia Tsakiridi
Abstract:
Geographic information systems (GIS) have been utilized in numerous spatial problems, such as site research, land suitability, and demographic analysis. Besides, GIS has been applied in scientific fields like geography, health, and economics. In business studies, GIS has been used to provide insights and spatial perspectives on demographic trends, spending indicators, and network analysis. To date, information regarding the available uses of GIS in supply chain management (SCM) and how these analyses can benefit businesses is limited. A systematic literature review (SLR) of the last five years of peer-reviewed academic literature was conducted, aiming to explore the existing uses of GIS in SCM. The searches were performed in three databases (Web of Science, ProQuest, and Business Source Premier) and reported using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology. The analysis resulted in 79 papers. The results indicate that the existing GIS applications used in SCM fall into the following domains: a) network/transportation analysis (in 53 of the papers), b) location-allocation site search/selection (multiple-criteria decision analysis) (in 45 papers), c) spatial analysis (demographic or physical) (in 34 papers), d) combinations of GIS and supply chain/network optimization tools (in 32 papers), and e) visualization/monitoring or building information modeling applications (in 8 papers). An additional categorization of the literature was conducted by examining the usage of GIS in the supply chain (SC) by business sector, as indicated by the volume of papers. The results showed that GIS is mainly being applied in the SC of the biomass biofuel/wood industry (33 papers). Other industries currently utilizing GIS in their SC were the logistics industry (22 papers), the humanitarian/emergency/health care sector (10 papers), the food/agro-industry sector (5 papers), the petroleum/coal/shale gas sector (3 papers), the faecal sludge sector (2 papers), the recycling and product footprint industry (2 papers), and the construction sector (2 papers). The results were also presented by the geography of the included studies and the GIS software used, to provide critical business insights and suggestions for future research. The results showed that research case studies of GIS in SCM were conducted in 26 countries (mainly in the USA) and that the most prominent GIS software provider was the Environmental Systems Research Institute's ArcGIS (in 51 of the papers). This study is a systematic literature review of the usage of GIS in SCM. The results showed that GIS capabilities can offer substantial benefits in SCM decision-making by providing key insights into cost minimization, supplier selection, facility location, SC network configuration, and asset management. However, as presented in the results, only eight industries/sectors are currently using GIS in their SCM activities. These findings may offer essential tools to SC managers who seek to optimize SC activities and/or minimize logistics costs, and to consultants and business owners who want to make strategic SC decisions. Furthermore, the findings may be of interest to researchers aiming to investigate unexplored research areas where GIS may improve SCM. Keywords: supply chain management, logistics, systematic literature review, GIS
Procedia PDF Downloads 141
25490 Aligning the Sustainability Policy Areas for Decarbonisation and Value Addition at an Organisational Level
Authors: Bishal Baniya
Abstract:
This paper proposes sustainability-related policy areas for decarbonisation and value addition at an organizational level. General and public sector organizations around the world are usually significant consumers of resources and producers of waste, driven by their massive procurement capacity. However, these organizations also possess huge potential to cut resource use and emissions, as many of them control the supply chains of goods/services. They can therefore be trend setters and can lead other major economic sectors, such as manufacturing, construction and mining, and transportation, in the pursuit of a paradigm shift toward sustainability. While environmental and social awareness has improved in recent years and organizations have identified policy areas to improve their environmental performance, value addition to the core business of the organization has not been well understood or interpreted. This paper therefore investigates ways to align sustainability policy measures so that they create a better value proposition relative to a benchmark, by accounting for both eco-efficiency and social efficiency. Preliminary analysis shows that co-benefits beyond resource and cost savings strengthen the business case for organizations, and this can be achieved by better aligning the policy measures and engaging stakeholders. Keywords: policy measures, environmental performance, value proposition, organisational level
Procedia PDF Downloads 149
25489 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R
Authors: Jaya Mathew
Abstract:
Many organizations face the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premises or in the cloud, users can leverage the power of R at scale without having to move their data. Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R
Procedia PDF Downloads 377
25488 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders
Authors: Sven Gehrke, Johannes Ruhland
Abstract:
Data are the oil of our time; without them, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand presents different aspects of the concept of trust and describes the information asymmetry among the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified factors influencing trust, problematic aspects of the current approach are examined through interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and point to potential research areas. Keywords: trust, data mining, CRISP-DM, stakeholder management
Procedia PDF Downloads 93