Search results for: data marketplace
25297 Analyzing the Impact of Unilever's Corporate Social Responsibility (CSR) Strategies on Consumer Attitudes and Loyalty in International Markets: A Focus on Sustainable Marketing Practices
Authors: Lydia Nkechi Philip
Abstract:
Due to its well-documented commitment to sustainability across diverse global markets, Unilever, a multinational consumer goods powerhouse, serves as a compelling case study. The study's goal is to critically examine Unilever's CSR initiatives, assessing their alignment with international standards and their impact on consumer perceptions and loyalty. The study investigates how Unilever's CSR practices resonate with consumers in various regions using a mixed-methods approach that includes surveys and interviews. The conceptual framework considers the role of sustainable marketing practices as a bridge builder in the CSR-consumer relationship. The findings are expected to provide valuable insights for businesses seeking to navigate the complex terrain of global markets while remaining ethical and sustainable. As consumers place a higher value on socially responsible brands, this study examines Unilever's CSR impact on consumer behavior. The abstract captures the essence of the study, previewing the methodology, key objectives, and anticipated contributions to our understanding of CSR's role in shaping consumer attitudes and loyalty in the global marketplace.
Keywords: Unilever, consumer loyalty, sustainable marketing practices, consumer loyalties
Procedia PDF Downloads 85
25296 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity
Authors: Hoda A. Abdel Hafez
Abstract:
Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a major opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.
Keywords: mining big data, big data, machine learning, telecommunication
Procedia PDF Downloads 412
25295 Organisational Effectiveness and Its Implications for Seaports
Authors: Shadi Alghaffari, Hong-Oanh Nguyen, Peggy Chen, Hossein Enshaei
Abstract:
The main purpose of this study was to explore the role of organisational effectiveness (OE) in seaports. OE is an important managerial concept, one that is necessary for leaders and directors in any organisation to understand the output of their work. OE has been applied in many organisations; however, it is a vital concept in the port business. This paper examines various approaches and applications of the OE concept to business management, and describes benefits that are important and applicable to seaport management. This research reviews and classifies articles published in relevant journals and books between 1950 and 2016, from the general literature on OE to the narrower field of OE in seaports. Based on the extensive literature review, this study identifies and discusses several issues relevant to both practices and theories of this concept. The review concludes by presenting a gap in the literature, as it found only a limited amount of research that endeavours to clarify OE in the seaport sector. As a result of this gap, seaports suffer from a lack of empirical study and are largely neglected in this subject area. The implementation of OE in this research has led to the maritime sector interfacing with different disciplines in order to acquire the advantage of enhancing managerial knowledge and competing successfully in the international marketplace.
Keywords: literature review, maritime, organisational effectiveness, seaport management
Procedia PDF Downloads 346
25294 Empirical Evaluation of Game Components Based on Learning Theory: A Preliminary Study
Authors: Seoi Lee, Dongjoo Chin, Heewon Kim
Abstract:
Gamification refers to a technique that applies game elements to non-game contexts, such as education and exercise, to make people more engaged in these behaviors. The purpose of this study was to identify effective elements of gamification for changing human behaviors. To accomplish this purpose, a survey based on learning theory was developed, especially for assessing the antecedents and consequences of behaviors, and 8 popular and 8 unpopular games were selected for comparison. A total of 407 adult males and females were recruited via a crowdsourcing Internet marketplace and completed the survey, which consisted of 19 questions on antecedents and 14 questions on consequences. Results showed no significant differences between popular and unpopular games on the consequence questions. On the antecedent questions, popular games were superior to unpopular games in character customization, play type selection, a sense of belonging, patch update cycle, and influence or dominance. This study is significant in that it reveals the elements of gamification based on learning theory. Future studies need to empirically validate whether these factors affect behavioral change.
Keywords: gamification, learning theory, antecedent, consequence, behavior change, behaviorism
Procedia PDF Downloads 225
25293 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach
Authors: Theertha Chandroth
Abstract:
This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its own syntax and structure. The objective is to explore various techniques and methodologies for validating the comparison and integration of JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
Keywords: XML, JSON, data comparison, integration testing, Python, SQL
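The abstract does not give its concrete test procedure, but one common way to check JSON against XML is to normalize both into the same in-memory structure and compare. A minimal sketch, using only the Python standard library; the `customer` document and its fields are illustrative, not taken from the paper:

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(element):
    """Recursively convert an XML element into a plain value:
    leaf elements become their text, nested elements become a dict."""
    children = list(element)
    if not children:
        return element.text
    return {child.tag: xml_to_dict(child) for child in children}

def json_matches_xml(json_text, xml_text):
    """True if the JSON document and the XML document describe the
    same nested structure and values (XML root tag included)."""
    root = ET.fromstring(xml_text)
    return json.loads(json_text) == {root.tag: xml_to_dict(root)}

json_doc = '{"customer": {"id": "42", "name": "Ada"}}'
xml_doc = '<customer><id>42</id><name>Ada</name></customer>'
print(json_matches_xml(json_doc, xml_doc))  # True
```

A real test suite would extend this to handle XML attributes, repeated elements, and type coercion (XML text is always a string), which is where most JSON/XML mismatches arise.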
Procedia PDF Downloads 144
25292 Using Machine Learning Techniques to Extract Useful Information from Dark Data
Authors: Nigar Hussain
Abstract:
Dark data is a subset of big data: data that organizations collect but fail to use for future decisions. Existing work leaves many issues open, and powerful tools are needed to utilize dark data, along with techniques that let users exploit its strengths: adaptability, speed, reduced time consumption, performance, and accessibility. Another issue is how to utilize dark data to extract helpful information for making better choices. In this paper, we propose strategies to remove the dark side from dark data. Using a supervised model and machine learning techniques, we utilized dark data and achieved an F1 score of 89.48%.
Keywords: big data, dark data, machine learning, heatmap, random forest
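The paper reports its result as an F1 score, the harmonic mean of precision and recall for the positive class. As a reminder of what that metric measures, here is a plain-Python sketch of its computation (the label vectors below are made up for illustration; this is not the paper's data or classifier):

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative ground truth vs. classifier predictions.
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1, 0, 1]
print(round(f1_score(y_true, y_pred), 4))  # 0.8
```

An F1 of 89.48% therefore implies both precision and recall were high on the paper's dark-data classification task, not just overall accuracy.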
Procedia PDF Downloads 33
25291 Multi-Source Data Fusion for Urban Comprehensive Management
Authors: Bolin Hua
Abstract:
In city governance, various data are involved, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form and source because they may come from different sectors. To reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues that need to be resolved. Problems in data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. Metadata must be available to be referred to and read when an application needs to access, manipulate, and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the process of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search, and data delivery.
Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data
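One of the duplicate-data problems described above is merging records about the same entity from different sectoral systems into the uniform central database. A minimal sketch of such a fusion step, assuming records share a common id and carry an update timestamp (the "housing" and "census" sources and their field names are hypothetical):

```python
from datetime import date

def fuse(records):
    """Merge records from multiple source systems into one central view.
    Records sharing an id are duplicates: fields from the most recently
    updated copy win, while gaps (None) are filled from older copies."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        current = merged.setdefault(rec["id"], {})
        # Later (fresher) records overwrite; None values never overwrite.
        current.update({k: v for k, v in rec.items() if v is not None})
    return merged

housing = [{"id": "P1", "name": "Li Wei", "address": None,
            "updated": date(2020, 1, 5)}]
census = [{"id": "P1", "name": "Li Wei", "address": "12 Elm St",
           "updated": date(2020, 3, 1)},
          {"id": "P2", "name": "Maria Cruz", "address": "9 Oak Ave",
           "updated": date(2020, 2, 2)}]
central = fuse(housing + census)
print(len(central), central["P1"]["address"])  # 2 12 Elm St
```

In practice the hard part is entity resolution when sources do not share an id, which is why the abstract lists duplicate comparison and resource catalogue construction as separate problems.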
Procedia PDF Downloads 397
25290 Reviewing Privacy Preserving Distributed Data Mining
Authors: Sajjad Baghernezhad, Saeideh Baghernezhad
Abstract:
Nowadays, given the ever-growing volume of data produced by human activity, methods such as data mining for extracting knowledge are unavoidable. One concern in data mining is the inherently distributed nature of the data: the parties creating or receiving such data are usually corporate or private persons who do not give their information freely to others. Yet there is no guarantee that particular data can be mined without intruding on the owner's privacy. Sending data and then gathering it through vertically or horizontally partitioned methods depends on the chosen preservation scheme and is executed to improve data privacy. This study attempts a comprehensive comparison of privacy-preserving data methods; general methods such as data randomization and encoding are examined, along with the strong and weak points of each.
Keywords: data mining, distributed data mining, privacy protection, privacy preserving
Procedia PDF Downloads 527
25289 The Right to Data Portability and Its Influence on the Development of Digital Services
Authors: Roman Bieda
Abstract:
The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used, and machine-readable format, and to transmit this data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing service providers (e.g., changing a bank or a cloud computing service provider). Therefore, it will contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.
Keywords: data portability, digital market, GDPR, personal data
Procedia PDF Downloads 478
25288 Recent Advances in Data Warehouse
Authors: Fahad Hanash Alzahrani
Abstract:
This paper describes some recent advances in the quickly developing area of data storage and processing based on data warehouses and data mining techniques. These advances concern the software, hardware, data mining algorithms, and visualisation techniques that share common features across the specific problems and tasks of their implementation.
Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing
Procedia PDF Downloads 405
25287 How to Use Big Data in Logistics Issues
Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy
Abstract:
Big data stands at today's technological cutting edge. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Among the many areas of big data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what big data is and how it is used in both military and commercial logistics.
Keywords: big data, logistics, operational efficiency, risk management
Procedia PDF Downloads 644
25286 Merchants’ Attitudes towards Tourism Development in Mahane Yehuda Market: A Case Study
Authors: Rotem Mashkov, Noam Shoval
Abstract:
In an age when the tourist's gaze is increasingly focused on the daily lives of locals, it is evident that local food markets are being rediscovered. Traditional urban markets succeed in reinventing themselves as spaces for consumption, recreation, and culture, enabling authentic experiences and interpersonal interactions with the local culture. Alongside this, the pressure of tourism development may result in commercialization and retail gentrification to the point of losing the sense of local identity. The issue of finding a balance between tourism development and the preservation of unique local features is at the heart of this study and is tested using the case of the Mahane Yehuda market in Jerusalem. The research question—how merchants respond to tourism development in the Mahane Yehuda food market—focuses on local traders, a group of players who are usually absent from research arenas, although they influence tourism development as well as being influenced by it. Three main research methods were integrated into this study. The first two, a survey of articles and comparative mapping of the business mix, were used to characterize the changes in the Mahane Yehuda market, both in perception and in physical form. The third research method, in-depth interviews with merchants, was used to examine the traders' attitudes and responses to tourism development. The findings indicate that there has been a turnaround in the market's image over the past decade and a half. Additionally, there has been a significant physical change in the business mix, reflected in a decline of 15% in the number of stalls selling food products and delicacies. The data from the interviews on the traders' attitudes towards tourism development were inconclusive; there were disagreements among the traders about the economic contribution of tourism development relative to their dependence on the tourism industry. However, there was a consensus on the need for authentic elements in the marketplace. The findings also indicate a strong link between a merchant's response to tourism development and their stall ownership status, as merchants could exercise their position in various ways depending on the type of possession.
Keywords: business mix, Jerusalem, local food markets, Mahane Yehuda market, merchants' attitude, ownership status, retail gentrification, tourism development, traditional urban markets
Procedia PDF Downloads 139
25285 New Insights for Soft Skills Development in Vietnamese Business Schools: Defining Essential Soft Skills for Maximizing Graduates’ Career Success
Authors: Hang T. T. Truong, Ronald S. Laura, Kylie Shaw
Abstract:
Within Vietnam's system of higher education, its schools of business play a vital role in supporting the country's economic objectives. However, the crucial contribution of soft skills to maximal success within the business sector has to date not been adequately recognized by its business schools. This being so, the development of the business school curriculum in Vietnam has not been able to 'catch up', so to speak, with the burgeoning need of students for a comprehensive soft skills program designed to meet the national and global business objectives of their potential employers. The burden of the present paper is first to reveal the results of our survey in Vietnam, which make explicit the extent to which major Vietnamese industrial employers value the potential role that soft skill competencies can play in maximizing business success. Our final task is to determine which soft skills employers discern as best serving to maximize the economic interests of Vietnam within the global marketplace. Semi-structured telephone interviews were conducted with 15 representative head employers of Vietnam's reputedly largest and most successful business enterprises across the country. The findings of the study indicate that all respondents highly value the increasing importance of soft skills in business success. Our critical analysis of respondent data reveals that 19 essential soft skills are deemed by employers as integral to workplace efficacy and should thus be integrated into the formal business curriculum. We are confident that our study represents the first comprehensive and specific survey yet undertaken within the business sector in Vietnam that accesses and analyses the opinions of representative employers from major companies across the country regarding the growing importance of these 19 soft skills for maximizing overall business success. Our research findings also reveal that the integration of the soft skills we have identified into business school curriculums nationwide is of paramount importance to advancing the national and global economic interests of Vietnam.
Keywords: business curriculum, business graduates, employers' perception, soft skills
Procedia PDF Downloads 325
25284 Predictive Analysis of the Stock Price Market Trends with Deep Learning
Authors: Suraj Mehrotra
Abstract:
The stock market is a volatile, bustling marketplace that is a cornerstone of economics. It helps define whether companies are successful or in a downward spiral. A thorough understanding of it is important: many companies have whole divisions dedicated to the analysis of both their own stock and that of rival companies. Linking the world of finance with artificial intelligence (AI), especially in the stock market, is a relatively recent development. Predicting how stocks will perform given all external factors and previous data has always been a human task. With the help of AI, however, machine learning models can help us make more complete predictions of financial trends. Looking at the stock market specifically, predicting the open, close, high, and low prices for the next day is very hard to do, and machine learning makes this task much easier. A model that builds upon itself and takes in external factors as weights can predict trends far into the future. Used effectively, it can open new doors in the business and finance world, and companies can make better and more complete decisions. This paper explores the various techniques used in the prediction of stock prices, from traditional statistical methods to deep-learning and neural-network-based approaches, among others. It provides a detailed analysis of the techniques and also explores the challenges in predictive analysis. Comparing the accuracy of four different models (linear regression, neural network, decision tree, and naïve Bayes) on several stocks (Apple, Google, Tesla, Amazon, UnitedHealth, Exxon Mobil, JPMorgan Chase, and Johnson & Johnson), the naïve Bayes and linear regression models worked best. On the testing set, the naïve Bayes model had the highest accuracy along with the linear regression model, followed by the neural network model and then the decision tree model. The training set showed similar results, except that the decision tree model was perfect, with complete accuracy in its predictions. This suggests that the decision tree model overfitted the training set.
Keywords: machine learning, testing set, artificial intelligence, stock analysis
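Of the four models compared above, linear regression is the simplest to illustrate. A minimal sketch of next-day close prediction via closed-form least squares, using only the standard library; the price series is invented for illustration and is not the paper's data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form, one feature)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Illustrative closing prices; regress each day's close on the previous close.
closes = [100.0, 101.5, 103.0, 102.0, 104.5, 106.0, 105.0, 107.5]
xs, ys = closes[:-1], closes[1:]
a, b = fit_line(xs, ys)
next_close = a * closes[-1] + b  # predicted close for the following day
print(round(next_close, 2))
```

A lagged-price regression like this captures only the autoregressive part of a trend; the models in the paper additionally weight external factors, and naïve Bayes or decision trees would instead classify discretized movements (e.g., up vs. down).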
Procedia PDF Downloads 97
25283 Implementation of an IoT Sensor Data Collection and Analysis Library
Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee
Abstract:
Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to users. However, when the data are accumulated and analyzed, much more information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and sensor data can be collected directly using database tools such as MySQL. These directly collected data can be used for various research purposes and are useful as input for data mining. However, there are many difficulties in using such boards to collect data, especially when the user is not a computer programmer or is using them for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we construct a library for sensor data collection and analysis to overcome these problems.
Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data
Procedia PDF Downloads 381
25282 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver (big) data-related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services, such as data storage and hosting, to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of actors, roles, and their relationships in the government (big) data ecosystem, and we discuss our research findings. We found few published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we borrowed ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.
Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review
Procedia PDF Downloads 165
25281 Government Big Data Ecosystem: A Systematic Literature Review
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Data that are high in volume, velocity, and veracity and come from a variety of sources are generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data in public administration and gaps in the government data ecosystems literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas such as humanitarian data, open government data, scientific research data, and industry data.
Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, egovernment, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review
Procedia PDF Downloads 234
25280 A Machine Learning Decision Support Framework for Industrial Engineering Purposes
Authors: Anli Du Preez, James Bekker
Abstract:
Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap where data is not explored due to the lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application.
Keywords: data analytics, industrial engineering, machine learning, value creation
Procedia PDF Downloads 171
25279 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm
Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima
Abstract:
In our present world, we are generating a lot of data, and we need specific devices to store all these data. Generally, we store data on pen drives, hard drives, etc. Sometimes we may lose the data due to the corruption of these devices. To overcome these issues, we implemented a cloud space for storing the data, which provides more security for the data. The data can be accessed from anywhere in the world using only the Internet. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, they do not have any rights to change it. Users' uploaded files are stored in the cloud with the system time as the file name, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB.
Keywords: cloud space, AES, FTP, NetBeans IDE
Procedia PDF Downloads 208
25278 Exploring the Role of Extracurricular Activities (ECAs) in Fostering University Students’ Soft Skills
Authors: Hanae Ait Hattani, Nohaila Ait Hattani
Abstract:
Globalization, together with rapid technological progress, is affecting every aspect of life. Twenty-first-century higher education faces a major challenge in preparing well-rounded and competent graduates to compete in the global marketplace. Worldwide, educational policies work to develop the quality of instruction at all educational levels by promoting students' qualifications and skills, considering both academic activities and non-academic attributes. In fact, extracurricular activities (ECAs) complement the academic curriculum and enhance the student experience by improving their interpersonal skills and attitudes. This study examines the potential of extracurricular activities as a vital tool for soft skills development. Using empirical research, the study aims to measure and evaluate the extent to which university students' engagement in extracurricular activities contributes to positively changing their learning experience, fostering their soft skills, and shaping their behaviors and attitudes. Findings emanating from a questionnaire and semi-structured interviews add a number of contributions to the literature. They support the assumption that ECAs can be considered a valuable way to acquire, develop, and demonstrate the soft skills that students today need to evidence in a variety of contexts, such as communication, teamwork, leadership, and problem-solving, to name but a few.
Keywords: extracurricular activities (ECAs), soft skills, education, university, attitude
Procedia PDF Downloads 74
25277 Business Intelligence for Profiling of Telecommunication Customer
Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro
Abstract:
Business intelligence is a methodology that systematically exploits data to produce information and knowledge; it can support the decision-making process. Two methods in business intelligence are data warehousing and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply Kimball's dimensional modelling. Data mining is used to extract patterns from the data and gain insight from it. Data mining has many techniques, one of which is segmentation. For profiling telecommunication customers, we use customer segmentation according to each customer's usage of services, invoices, and payments. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the k-means clustering algorithm for segmentation, with the RFM (recency, frequency, monetary) model as its input variables. For the whole data mining process, we use IBM SPSS Modeler.
Keywords: business intelligence, customer segmentation, data warehouse, data mining
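The paper uses IBM SPSS Modeler, but the underlying k-means-on-RFM step can be sketched in a few dozen lines of plain Python. The customer vectors below are invented for illustration (recency in days, purchase frequency, monetary spend); a real pipeline would standardize the three features first so that monetary values do not dominate the distance:

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Minimal k-means: returns (centroids, labels)."""
    rnd = random.Random(seed)
    centroids = rnd.sample(points, k)  # initialize from random data points
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        labels = [min(range(k),
                      key=lambda c: sum((p[d] - centroids[c][d]) ** 2
                                        for d in range(len(p))))
                  for p in points]
        # Update step: each centroid moves to the mean of its members.
        new = []
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                new.append(tuple(sum(m[d] for m in members) / len(members)
                                 for d in range(len(members[0]))))
            else:
                new.append(centroids[c])  # keep an empty cluster's centroid
        if new == centroids:
            break
        centroids = new
    return centroids, labels

# Illustrative RFM vectors: (recency in days, frequency, monetary spend).
customers = [(5, 20, 900), (7, 18, 850), (6, 22, 950),   # active, high spend
             (60, 2, 40), (75, 1, 30), (90, 3, 55)]      # lapsed, low spend
_, labels = kmeans(customers, k=2)
print(labels)  # the active and lapsed customers receive different labels
```

Once clustered, each segment's centroid (average recency, frequency, and spend) gives the profile used to identify the profitable customers.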
Procedia PDF Downloads 487
25276 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analysing DNA microarray data sets is a great challenge facing bioinformaticians due to the complexity of applying statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection. This process finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics
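The paper's own imputation technique is not described in the abstract, but the general shape of the pipeline — fill missing expression values, then rank genes by a filter criterion — can be sketched as follows. This uses simple per-gene mean imputation and a variance filter as stand-ins for the paper's method, with a tiny made-up expression matrix (rows are samples, columns are genes, `None` marks a missing measurement):

```python
def impute_mean(matrix):
    """Replace missing values (None) in each column with that column's mean."""
    cols = len(matrix[0])
    means = []
    for j in range(cols):
        observed = [row[j] for row in matrix if row[j] is not None]
        means.append(sum(observed) / len(observed))
    return [[v if v is not None else means[j] for j, v in enumerate(row)]
            for row in matrix]

def top_k_by_variance(matrix, k):
    """Filter-style feature selection: indices of the k highest-variance
    columns (genes), on the assumption that flat genes are uninformative."""
    def var(j):
        vals = [row[j] for row in matrix]
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals) / len(vals)
    return sorted(range(len(matrix[0])), key=var, reverse=True)[:k]

expr = [[2.0, None, 7.0],
        [4.0, 5.1, 1.0],
        [None, 4.9, 6.5]]
filled = impute_mean(expr)
print(filled[0][1], top_k_by_variance(filled, 1))
```

Real microarray work typically prefers neighbour-based imputation (e.g., KNN over similar genes) and class-aware selection criteria, since mean imputation flattens exactly the variance a filter then measures.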
Procedia PDF Downloads 575
25275 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework
Authors: Lutful Karim, Mohammed S. Al-kahtani
Abstract:
Sensors are being used in various applications, such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data are significantly important when designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation (PDDA) scheme for the lowest layer, at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay.
Keywords: big data, clustering, tree topology, data aggregation, sensor networks
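The abstract does not spell out the PDDA algorithm itself, but the general idea of priority-based aggregation at the sensor layer can be sketched: urgent readings are forwarded immediately, while routine redundant readings from the same sensor are collapsed into one aggregate before transmission, saving energy. The sensor names, threshold, and averaging rule below are illustrative assumptions, not the paper's scheme:

```python
def aggregate(readings, priority_threshold=2):
    """Priority-based aggregation sketch: readings at or above the
    threshold are forwarded as-is; lower-priority readings from the
    same sensor are collapsed into a single averaged reading."""
    forwarded, buffered = [], {}
    for sensor_id, value, priority in readings:
        if priority >= priority_threshold:
            forwarded.append((sensor_id, value))       # urgent: send at once
        else:
            buffered.setdefault(sensor_id, []).append(value)
    for sensor_id, values in sorted(buffered.items()):
        forwarded.append((sensor_id, sum(values) / len(values)))
    return forwarded

readings = [("temp-1", 21.0, 1), ("temp-1", 21.4, 1), ("temp-1", 21.2, 1),
            ("smoke-1", 0.9, 3),   # high priority: not aggregated
            ("temp-2", 19.0, 1), ("temp-2", 19.4, 1)]
print(aggregate(readings))
```

Here six raw readings shrink to three transmissions, which is the kind of reduction that drives the energy and delay gains the simulations measure.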
Procedia PDF Downloads 348
25274 Employee Happiness: The Influence of Providing Consumers with an Experience versus an Object
Authors: Wilson Bastos, Sigal G. Barsade
Abstract:
Much of what happens in the marketplace revolves around the provision and consumption of goods. Recent research has advanced a useful categorization of these goods—as experiential versus material—and shown that, from the consumers’ perspective, experiences (e.g., a theater performance) are superior to objects (e.g., an electronic gadget) in offering various social and psychological benefits. A common finding in this growing research stream is that consumers gain more happiness from the experiences they have than from the objects they own. By focusing solely on those acquiring the experiential or material goods (the consumers), prior research has remained silent regarding another important group of individuals—those providing the goods (the employees). Do employees whose jobs are primarily focused on offering consumers an experience (vs. object) also gain more happiness from their occupation? We report evidence from four experiments supporting an experiential-employee advantage. Further, we use mediation and moderation tests to unearth the mechanism responsible for this effect. Results reveal that work meaningfulness is the primary driver of the experiential-employee advantage. Overall, our findings suggest that employees find it more meaningful to provide people with an experience than with a material object, which in turn shapes the happiness they derive from their jobs. We expect this finding to have implications for human development and to be of relevance to researchers and practitioners interested in how to advance the human condition in the workplace. Keywords: employee happiness, experiential versus material jobs, work meaningfulness
Procedia PDF Downloads 272
25273 Hybrid Model of an Increasing Unique Consumer Value on Purchases that Influences the Consumer Loyalty and the Pursuit of a Sustainable Competitive Advantage from the Institutions in Jakarta
Authors: Wilhelmus Hary Susilo
Abstract:
The marketplace would have at least some resources that are unique (e.g., good communication, knowledgeable employees, consumer value, effective transactions, efficient production processes and institutional branding). Institutions that have an advantage in such resources can attain positions of competitive advantage. The major challenge addressed here is increasing unique consumer value on reliable purchases, which influences loyalty and the pursuit of a sustainable competitive advantage for institutions in Jakarta. The research was conducted with a quantitative method and a confirmatory strategic research design. The confirmatory factor analyses (first- and second-order CFA) among variables yielded χ²/df (9.30, 4.38, 6.95, 2.76, 2.97, 2.91, 2.32 and 6.90), GFI (0.72, 0.82, 0.82, 0.81, 0.78, 0.84, 0.89 and 0.70) and CFI (0.90, 0.95, 0.93, 0.92, 0.95, 0.91, 0.96 and 0.89), which indicate a good model. Furthermore, the hybrid model is well fit, with χ²/df = 1.84, p-value = 0.00, RMSEA = 0.076, GFI = 0.76, NNFI = 0.95, PNFI = 0.82, IFI = 0.96, RFI = 0.91, AGFI = 0.71 and CFI = 0.96. The hypothesis tests were significant: the communitization marketing 3.0 and price perception variables influenced unique consumer value, with t-values = 4.46 and 5.89, and consumer value in turn influenced purchasing, with t-value = 5.94. Additionally, loyalty, communitization, and character-building marketing 3.0 affect the pursuit of a sustainable competitive advantage for institutions, with t-values = 7.57, -2.12, and 2.04. Finally, the tests among the most superior variable dimensions showed significant correlations between INOV and WDES and between RESPON and ATT (covariance matrix values = 0.72 and 0.71).
Thus, communitization and character-building marketing 3.0, with dimensions of responsibility and technologies, would increase competitive advantage through the dimensions of innovation and job design for institutions. Keywords: consumer loyalty, marketing 3.0, unique consumer value, purchase, sustainable competitive advantage
Procedia PDF Downloads 286
25272 Evaluation of Environmental, Social, and Governance Factors by U.S. Tolling Authorities in Bond Issuance Disclosures
Authors: Nicolas D. Norboge
Abstract:
Purchasers of municipal bonds in primary and secondary markets are increasingly expecting issuers to disclose environmental, social, and governance (ESG) factors in issuance and continuing disclosure documents. U.S. tolling authorities are slowly catching up with other transportation sectors, such as public transit, in integrating ESG factors into their bond disclosure documents. A systematic mixed-methods evaluation of publicly available bond disclosure documents from 2010-2022 suggests that only a small number of U.S. tolling authorities disclosed all ESG factors; however, the pace accelerated significantly from 2020-2022. Because many tolling authorities have a direct financial stake in the growth of passenger vehicle miles traveled on their toll facilities, and in turn the burning of more climate-warming fossil fuels, one crucial question that remains is how bond purchasers will view increased ESG transparency. Recent moves by large institutional investors, credit rating agencies, and regulators suggest an expectation of ESG disclosure is a trend likely to endure. This research suggests tolling authorities will need to proactively consider these emerging trends and carefully adapt their disclosure practices where possible. Building on these findings, this research also provides a basic sketch framework for how issuers can responsibly position themselves within the changing global municipal debt marketplace. Keywords: debt policy, ESG, municipal bonds, public-private partnerships, public tolling authorities, transportation finance and policy
Procedia PDF Downloads 180
25271 Strategy in Practice: Strategy Development, Strategic Error and Project Delivery
Authors: Nipun Agarwal, David Paul, Fareed Un Din
Abstract:
Strategy development and implementation are the keys to an organization’s success in today’s competitive marketplace. Many organizations develop excellent strategy but are unable to implement that strategy in order to succeed. The difference between strategic goals and their implementation is called strategic error. Strategic error occurs when an organization does not have structures in place to implement its strategy. Strategy implementation happens through projects, and having a project management method that provides certainty and agility will help an organization become more competitive in implementing strategy. Numerous project management methods exist in theory and practice. In the past, however, projects mainly used the Waterfall method, which provides certainty in terms of budget, delivery date and resourcing. It is now common practice to utilise Agile-based methods. Agile-based methods do not provide specific deadlines and budgets, but they do provide agility in product design and project delivery, which is useful to companies. In some respects, the Waterfall and Agile methods are opposites of each other. Executive management prefer agility in delivering projects as the competitive landscape changes frequently; however, they also appreciate the certainty of projects with quantifiable budgets, deadlines and resources, which is harder for an Agile-based method to provide. This paper attempts to develop a hybrid project management method that merges the Waterfall and Agile methods to provide the positives of both approaches. Keywords: strategy, project management, strategy implementation, agile
Procedia PDF Downloads 119
25270 Reducing Friction Associated with Commercial Use of Biomimetics While Increasing the Potential for Using Eco Materials and Design in Industry
Authors: John P. Ulhøi
Abstract:
Firms are faced with pressure to stay innovative and entrepreneurial while at the same time leaving lighter ecological footprints. Traditionally, inspiration for new product development (NPD) has come from creative in-house staff and from the marketplace. Often, NPD offered by this approach has proven to be far from optimal for its purpose, or far from resource- and energy-efficient. More recently, a bio-inspired NPD approach has surfaced under the banner of biomimetics. Biomimetics refers to inspiration from, and translations of, designs, systems, processes, and/or specific properties that exist in nature. The principles and structures working in nature have evolved over a long period of time, which enables them to be optimized for their purpose and to be resource- and energy-efficient. These characteristics reflect the raison d'être behind the field of biomimetics. While biological expertise is required to understand and explain such natural and biological principles and structures, engineers are needed to translate biological designs and processes into synthetic applications. It can therefore hardly be surprising that biomimetics has long gained a solid foothold in both biology and engineering. The commercial adoption of biomimetic applications in new product development (NPD) in industry, however, does not reflect similar growth. Put differently, this situation suggests that something is missing in the biomimetic-NPD equation, acting as a brake on the wider commercial application of biomimetics and thus on the use of eco-materials and design in industry. This paper closes some of that gap. Before concluding, avenues for future research and implications for practice are briefly sketched out. Keywords: biomimetics, eco-materials, NPD, commercialization
Procedia PDF Downloads 164
25269 Control the Flow of Big Data
Authors: Shizra Waris, Saleem Akhtar
Abstract:
Big data is a research area receiving attention from academia and the IT community. In the digital world, the amounts of data produced and stored have grown enormously within a short period of time. Consequently, this fast-increasing rate of data has created many challenges. In this paper, we use the functionalism and structuralism paradigms to analyze the genesis of big data applications and their current trends. This paper presents a complete discussion of state-of-the-art big data technologies based on batch and stream data processing. Moreover, the strengths and weaknesses of these technologies are analyzed. The study also covers big data analytics techniques, processing methods, some reported case studies from different vendors, several open research challenges, and the opportunities brought about by big data. The similarities and differences of these techniques and technologies, based on important limitations, are also investigated. Emerging technologies are suggested as a solution for big data problems. Keywords: computer, IT community, industry, big data
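The stream-versus-batch distinction underlying these technologies can be sketched minimally: a batch computation sees the whole data set at once, while a streaming computation keeps only constant state and updates it per incoming record. The hypothetical example below shows both forms of the same statistic (a mean):

```python
def batch_mean(values):
    """Batch style: compute over the full data set in one pass."""
    return sum(values) / len(values)

class StreamingMean:
    """Stream style: constant-memory running mean, updated one record
    at a time, as a stream processor would maintain it."""
    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def update(self, x):
        self.count += 1
        self.mean += (x - self.mean) / self.count  # incremental mean update
        return self.mean
```

The streaming version never stores the data it has seen, which is exactly the trade-off that lets stream-processing systems handle unbounded data at the cost of only supporting incrementally computable analytics.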
Procedia PDF Downloads 196
25268 High Performance Computing and Big Data Analytics
Authors: Branci Sarra, Branci Saadia
Abstract:
Because of the rapid growth of data, many computer science tools have been developed to process and analyze these big data. High-performance computing architectures have been designed to meet the processing needs of big data, from the standpoint of transaction processing as well as strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on recent trends in high-performance computing architectures, especially as they relate to analytics and data mining. Keywords: high performance computing, HPC, big data, data analysis
Procedia PDF Downloads 524