Search results for: decentralized data management
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30719

28499 An Approach to Tackle Start up Problems Using Applied Games

Authors: Aiswarya Gopal, Kamal Bijlani, Vinoth Rengaraj, R. Jayakrishnan

Abstract:

In the business world, the term “startup” is heard with increasing frequency as young ventures multiply. The main dilemma of startups is the unsuccessful management of the unique risks that have to be confronted in the present world of competition and technology. This research work proposes a game-based methodology to build sufficient real-world experience among entrepreneurs as well as management students to handle risks and challenges in the field. The game provides the player with experience in overcoming challenges such as market problems, running out of cash, poor management, and product problems, which can be resolved by a proper strategic approach in the entrepreneurship world. The proposed serious game follows the life cycle of a new software enterprise, in which the entrepreneur moves from the planning stage to a secured financial stage, lays down the basic business structure, and initiates operations, building the player's confidence along the way.

Keywords: business model, game based learning, poor management, start up

Procedia PDF Downloads 470
28498 Optimization of Maintenance of PV Module Arrays Based on Asset Management Strategies: Case Study

Authors: L. Alejandro Cárdenas, Fernando Herrera, David Nova, Juan Ballesteros

Abstract:

This paper presents a methodology to optimize the maintenance of grid-connected photovoltaic systems, considering the cleaning and module replacement periods based on an asset management strategy. The methodology is based on the analysis of the energy production of the PV plant, the energy feed-in tariff, and the cost of cleaning and replacement of the PV modules, with the overall revenue received being the optimization variable. The methodology is evaluated as a case study of a 5.6 kWp solar PV plant located on the Bogotá campus of the Universidad Nacional de Colombia. The asset management strategy implemented consists of assessing the PV modules through visual inspection, energy performance analysis, pollution, and degradation. Within the visual inspection of the plant, the general condition of the modules and the structure is assessed, identifying dust deposition, visible fractures, and water accumulation on the bottom. The energy performance analysis is performed with the energy production reported by the monitoring systems and compared with the values estimated in the simulation. The pollution analysis is performed using the soiling rate due to dust accumulation, which can be modelled as a black box with an exponential function dependent on historical pollution values. The soiling rate is calculated with data collected from the energy generated over two years at a photovoltaic plant on the campus of the Universidad Nacional de Colombia. Additionally, the alternative of assessing the temperature degradation of the PV modules is evaluated by estimating the cell temperature from parameters such as ambient temperature and wind speed. The medium-term energy decrease of the PV modules is assessed within the asset management strategy by calculating a health index to determine the replacement period of the modules due to degradation. This study proposes a decision-making tool for the maintenance of photovoltaic systems, which is particularly relevant given the projected increase in the installation of solar photovoltaic systems in power systems associated with the commitments made in the Paris Agreement for the reduction of CO2 emissions. In the Colombian context, it is estimated that by 2030, 12% of the installed power capacity will be solar PV.
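To make the trade-off behind the optimization concrete, the Python sketch below compares soiling losses against cleaning cost to pick a cleaning interval. It assumes an exponential soiling model and uses illustrative figures for yield, tariff, cleaning cost, and soiling rate; none of these values comes from the case study.

```python
# Minimal sketch: choosing a cleaning interval for a PV array by trading off
# soiling losses against cleaning cost. All numbers are illustrative assumptions.
import numpy as np

E_CLEAN_KWH_DAY = 22.0      # assumed daily yield of a 5.6 kWp plant right after cleaning
TARIFF = 0.12               # assumed feed-in tariff, $/kWh
CLEANING_COST = 40.0        # assumed cost of one cleaning, $
SOILING_RATE = 0.004        # assumed exponential soiling rate per day ("black-box" parameter)

def net_revenue_per_day(interval_days: int) -> float:
    """Average daily revenue minus cleaning cost over one cleaning cycle."""
    days = np.arange(interval_days)
    energy = E_CLEAN_KWH_DAY * np.exp(-SOILING_RATE * days)   # exponential soiling loss
    revenue = TARIFF * energy.sum() - CLEANING_COST
    return revenue / interval_days

best = max(range(7, 366), key=net_revenue_per_day)
print(f"Best cleaning interval under these assumptions: every {best} days")
```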

Keywords: asset management, PV module, optimization, maintenance

Procedia PDF Downloads 38
28497 Parallel Vector Processing Using Multi Level Orbital DATA

Authors: Nagi Mekhiel

Abstract:

Many applications use vector operations by applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which affects the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: we force each location to be available to all processors at a specific time. The data move in different orbits and become available to processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one data element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications and to interleave it with lower-level vector operations.

Keywords: memory organization, parallel processors, serial code, vector processing

Procedia PDF Downloads 261
28496 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of states can be examined.
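As a minimal illustration of the data preparation step described above, the Python sketch below bins a continuous layer, keeps a discrete layer as-is, and tabulates landslide frequency for every combination of states; the column names, bin edges, and simulated values are hypothetical, not taken from the study.

```python
# Sketch of RA-style data preparation: discrete layers stay as-is, continuous
# layers are binned, and landslide probability is tabulated per state combination.
import pandas as pd
import numpy as np

rng = np.random.default_rng(0)
cells = pd.DataFrame({
    "bedrock":   rng.choice(["basalt", "sandstone", "shale"], 1000),  # discrete IV
    "porosity":  rng.uniform(0.05, 0.45, 1000),                       # continuous IV
    "landslide": rng.binomial(1, 0.1, 1000),                          # DV
})

# Bin the continuous variable so every IV is discrete, as RA requires.
cells["porosity_bin"] = pd.cut(cells["porosity"], bins=[0, 0.15, 0.30, 0.50],
                               labels=["low", "medium", "high"])

# Landslide probability for every combination of IV states.
table = (cells.groupby(["bedrock", "porosity_bin"], observed=True)["landslide"]
              .agg(n="size", p_landslide="mean"))
print(table)
```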

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 58
28495 Urban Logistics Dynamics: A User-Centric Approach to Traffic Modelling and Kinetic Parameter Analysis

Authors: Emilienne Lardy, Eric Ballot, Mariam Lafkihi

Abstract:

Efficient urban logistics requires a comprehensive understanding of traffic dynamics, particularly as it pertains to the kinetic parameters influencing energy consumption and trip duration estimations. While real-time traffic information is increasingly accessible, current high-precision forecasting services embedded in route planning often function as opaque 'black boxes' for users. These services, typically relying on AI-processed counting data, fall short in accommodating open design parameters essential for management studies, notably within Supply Chain Management. This work revisits the modelling of traffic conditions in the context of city logistics, emphasizing its significance from the user’s point of view, with a twofold focus. Firstly, the focus is not on the vehicle flow but on the vehicles themselves and the impact of traffic conditions on their driving behaviour. This means opening the range of studied indicators beyond vehicle speed, to describe extensively the kinetic and dynamic aspects of driving behaviour. To achieve this, we leverage the Art.Kinema parameters, which are designed to characterize driving cycles. Secondly, this study examines how the driving context (i.e., factors exogenous to the traffic flow) determines the mentioned driving behaviour. Specifically, we explore how accurately the kinetic behaviour of a vehicle can be predicted based on a limited set of exogenous factors, such as time, day, road type, orientation, slope, and weather conditions. To answer this question, statistical analysis was conducted on real-world driving data, which includes high-frequency measurements of vehicle speed. A Factor Analysis and a Generalized Linear Model have been established to link kinetic parameters with independent categorical contextual variables. The results include an assessment of the adjustment quality and the robustness of the models, as well as an overview of the models' outputs.
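The sketch below shows, in Python with statsmodels, the second step in miniature: a Gaussian GLM linking one kinetic parameter (mean trip speed) to categorical context variables. The data frame, column names, and effect sizes are simulated placeholders, not the study's data.

```python
# Minimal sketch: a generalized linear model relating a kinetic parameter to
# exogenous categorical context factors. All data below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
trips = pd.DataFrame({
    "mean_speed": rng.normal(30, 8, 500),                        # km/h, kinetic parameter
    "road_type":  rng.choice(["urban", "arterial", "highway"], 500),
    "weekday":    rng.choice(["mon_fri", "weekend"], 500),
    "weather":    rng.choice(["dry", "rain"], 500),
})

# Gaussian GLM with categorical predictors only (context factors exogenous to the flow).
model = smf.glm("mean_speed ~ C(road_type) + C(weekday) + C(weather)",
                data=trips, family=sm.families.Gaussian()).fit()
print(model.summary())
```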

Keywords: factor analysis, generalised linear model, real world driving data, traffic congestion, urban logistics, vehicle kinematics

Procedia PDF Downloads 61
28494 Business Process Management Maturity in Croatian Companies

Authors: V. Bosilj Vuksic

Abstract:

This paper aims to investigate business process management (BPM) maturity in Croatian companies. First, a brief literature review of the research field is given. Next, the results of empirical research are presented, analyzed and discussed. The results reveal that Croatian companies have achieved an intermediate level of BPM maturity. The empirical evidence supports the proposed theoretical background. Furthermore, a case study approach was used to illustrate BPM adoption in a Croatian company at the highest stage of BPM maturity. In practical terms, this case study identifies the BPM maturity success factors that need to exist in order for a company to effectively adopt BPM.

Keywords: business process management, case study, Croatian companies, maturity, process performance index, questionnaire

Procedia PDF Downloads 226
28493 An Exploratory Study to Appraise the Current Challenges and Limitations Faced in Applying and Integrating the Historic Building Information Modelling Concept for the Management of Historic Buildings

Authors: Oluwatosin Adewale

Abstract:

The sustainability of built heritage has become a relevant issue in recent years due to the social and economic values associated with these buildings. Heritage buildings provide a means for human perception of culture and represent a legacy of long-existing history; they define the local character of the social world and provide a vital connection to the past, with associated aesthetic and communal benefits. The identified values of heritage buildings have increased the importance of conservation and the lifecycle management of these buildings. Recent developments in digital design technology in engineering and the built environment have led to the adoption of Building Information Modelling (BIM) by the Architecture, Engineering, Construction, and Operations (AECO) industry. BIM provides a platform for the lifecycle management of a construction project through effective collaboration among stakeholders and the analysis of a digital information model. This growth in digital design technology has also made its way into the field of architectural heritage management in the form of Historic Building Information Modelling (HBIM), a reverse engineering process for the digital documentation of heritage assets that draws upon similar information management processes as the BIM process. However, despite the several scientific and technical contributions made to the development of the HBIM process, it remains difficult to integrate at the most practical level of heritage asset management. The main objective identified under the scope of the study is to review the limitations and challenges faced by heritage management professionals in adopting an HBIM-based asset management procedure for historic building projects. This paper uses an exploratory study in the form of semi-structured interviews to investigate the research problem. A purposive sample of heritage industry experts and professionals was selected to take part in semi-structured interviews to appraise some of the limitations and challenges they have faced with the integration of HBIM into their project workflows. The findings from this study will present the challenges and limitations faced in applying and integrating the HBIM concept for the management of historic buildings.

Keywords: building information modelling, built heritage, heritage asset management, historic building information modelling, lifecycle management

Procedia PDF Downloads 88
28492 Developing a Grading System for Restaurants

Authors: Joseph Roberson, Carina Kleynhans, Willie Coetzee

Abstract:

The low entry barriers of the restaurant industry lead to an extremely competitive business environment. In this volatile business sector, it is of the utmost importance to implement a strategy of quality differentiation. Vital aspects of a quality differentiation strategy are total quality management, benchmarking and service quality management. Ultimately, restaurant success depends on the continuous support of customers. Customers select restaurants based on their expectations of quality. If the customers' expectations are met, they perceive quality service and will re-patronize the restaurant. The restaurateur can manage perceptions of quality by influencing expectations while ensuring that those expectations are not inflated. The management of expectations can be done by communicating service quality to customers. The aim of this research paper is to describe the development of a grading process for restaurants. An assessment of the extensive body of literature on grading was conducted through content analysis. A standardized method for developing a grading system would assist in building successful grading systems that could inform both customers and restaurateurs of restaurant quality.

Keywords: benchmarking, restaurants, grading, service quality, total quality management

Procedia PDF Downloads 325
28491 Generic Data Warehousing for Consumer Electronics Retail Industry

Authors: S. Habte, K. Ouazzane, P. Patel, S. Patel

Abstract:

The dynamic and highly competitive nature of the consumer electronics retail industry means that businesses in this industry are experiencing different decision-making challenges in relation to pricing, inventory control, consumer satisfaction and product offerings. To overcome the challenges facing retailers and create opportunities, we propose a generic data warehousing solution which can be applied to a wide range of consumer electronics retailers with minimum configuration. The solution includes a dimensional data model, a template SQL script, a high-level architectural description, an ETL tool developed using C#, a set of APIs, and data access tools. It has been successfully applied by ASK Outlets Ltd UK, resulting in improved productivity and enhanced sales growth.
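As an illustration of what a dimensional data model for this sector can look like, the Python/sqlite3 sketch below creates a small star schema and runs a typical analytical query. The table and column names are our own simplification, not the dimensional model shipped with the solution described above.

```python
# Illustrative star schema for a consumer-electronics retailer (fact table plus
# product, store and date dimensions), sketched with the standard sqlite3 module.
import sqlite3

ddl = """
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, brand TEXT, category TEXT);
CREATE TABLE dim_store   (store_key   INTEGER PRIMARY KEY, store_name TEXT, region TEXT);
CREATE TABLE dim_date    (date_key    INTEGER PRIMARY KEY, full_date TEXT, month INTEGER, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    store_key   INTEGER REFERENCES dim_store(store_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units_sold  INTEGER,
    revenue     REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# Typical analytical query against the schema: monthly revenue per product category.
query = """
SELECT d.year, d.month, p.category, SUM(f.revenue) AS revenue
FROM fact_sales f
JOIN dim_product p ON f.product_key = p.product_key
JOIN dim_date d    ON f.date_key = d.date_key
GROUP BY d.year, d.month, p.category;
"""
print(conn.execute(query).fetchall())
```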

Keywords: consumer electronics, data warehousing, dimensional data model, generic, retail industry

Procedia PDF Downloads 405
28490 The Effect of Emotional Intelligence on Performance and Motivation of Staff: A Case Study of East Azerbaijan Red Crescent

Authors: Bahram Asghari Aghdam, Ali Mahjoub

Abstract:

The purpose of this study is to evaluate the effect of emotional intelligence on the motivation and performance of East Azarbaijan Red Crescent staff. In this study, EI is treated as the independent variable, comprising the components of self-awareness, self-management, social awareness, and relations management, with motivation and performance as the dependent variables. The research method is descriptive-survey. Simple random sampling was used: the population consists of 130 East Azarbaijan Red Crescent staff, of whom 100 were selected using Cochran's formula and completed the questionnaires. Three questionnaires were used in this study: for emotional intelligence, the standard questionnaire of Travis Bradberry and Jean Greaves; and for motivation and performance, a questionnaire developed by the researcher with the help of professionals and experts in the field, consisting of 33 questions on motivation and 15 questions on performance, with content validity used to establish its credibility. Reliability was confirmed with a Cronbach's alpha coefficient of 0.948. Spearman correlation coefficients, linear regression, and structural equation modeling were used to test the hypotheses and determine the fit of the variables. The results show that emotional intelligence, with a coefficient of 0.865, has a positive effect on the motivation and performance of East Azerbaijan Red Crescent employees. Based on the Friedman test ranking, the components with the greatest influence on staff motivation and performance, in the respondents' opinion, are, in order, self-awareness, relations management, social awareness, and self-management.
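For readers unfamiliar with the statistics cited above, the Python sketch below computes Cronbach's alpha for a set of Likert items and a Spearman correlation between an EI score and a performance score; the data are simulated placeholders, not the study's survey responses.

```python
# Sketch of two statistics used above: Cronbach's alpha (reliability) and a
# Spearman rank correlation (association). Data below are simulated placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
items = rng.integers(1, 6, size=(100, 15))            # 100 respondents, 15 Likert items

def cronbach_alpha(scores: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

ei_score = items.sum(axis=1)
performance = ei_score * 0.5 + rng.normal(0, 5, 100)   # fabricated positive relation
rho, p_value = spearmanr(ei_score, performance)
print(f"alpha={cronbach_alpha(items):.3f}, spearman rho={rho:.3f}, p={p_value:.4f}")
```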

Keywords: emotional intelligence, self-awareness, self-management, social awareness, relations management, motivation, performance

Procedia PDF Downloads 468
28489 Design Criteria for an Internal Information Technology Cost Allocation to Support Business Information Technology Alignment

Authors: Andrea Schnabl, Mario Bernhart

Abstract:

The controlling instrument of an internal cost allocation (IT chargeback) is commonly used to make IT costs transparent and controllable. Information Technology (IT) has become, especially for information industries, a central competitive factor. Consequently, the focus is not on minimizing IT costs but on the strategically aligned application of IT. Hence, an internal IT cost allocation should be designed to enhance business-IT alignment (the strategic alignment of IT) in order to support the effective application of IT from a company’s point of view. To identify design criteria for an internal cost allocation that supports business alignment, a case study analysis at a typical medium-sized firm in the information industry is performed. Documents, key performance indicators, and cost accounting data over a period of 10 years are analyzed, and interviews are performed. The derived design criteria are evaluated by six heads of IT departments from six different companies that have an internal IT cost allocation in use. By applying these design criteria, an internal cost allocation serves not only for cost controlling but also as an instrument in strategic IT management.

Keywords: accounting for IT services, Business IT Alignment, internal cost allocation, IT controlling, IT governance, strategic IT management

Procedia PDF Downloads 154
28488 Sequential Data Assimilation with High-Frequency (HF) Radar Surface Current

Authors: Lei Ren, Michael Hartnett, Stephen Nash

Abstract:

Abundant surface current measurements from an HF radar system in a coastal area are assimilated into a model to improve its forecasting ability. A simple sequential data assimilation scheme, Direct Insertion (DI), is applied to update the model forecast states. The influence of Direct Insertion data assimilation over time is analyzed at one reference point. Vector maps of surface currents from the models are compared with HF radar measurements. The Root-Mean-Square Error (RMSE) between modelling results and HF radar measurements is calculated over the last four days, during which no data assimilation is performed.
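A minimal sketch of the Direct Insertion idea: wherever a radar observation exists, it simply overwrites the model forecast before the next forecast step, and RMSE against the radar field is tracked. The arrays below are toy data standing in for the model and HF radar surface-current fields.

```python
# Direct Insertion (DI) in miniature: observed cells replace forecast values,
# and RMSE against the observation field is computed before and after the update.
import numpy as np

rng = np.random.default_rng(3)
model_field = rng.normal(0.3, 0.1, (50, 50))           # forecast surface current (m/s)
radar_field = model_field + rng.normal(0, 0.05, (50, 50))
radar_mask = rng.random((50, 50)) < 0.6                # cells covered by HF radar

def direct_insertion(forecast, observation, mask):
    analysis = forecast.copy()
    analysis[mask] = observation[mask]                 # overwrite forecast with observations
    return analysis

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

analysis = direct_insertion(model_field, radar_field, radar_mask)
print("RMSE before DI:", rmse(model_field, radar_field))
print("RMSE after DI: ", rmse(analysis, radar_field))
```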

Keywords: data assimilation, CODAR, HF radar, surface current, direct insertion

Procedia PDF Downloads 567
28487 Measured versus Default Interstate Traffic Data in New Mexico, USA

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

This study investigates how site-specific traffic data differ from the Mechanistic-Empirical pavement design software default values. Two Weigh-in-Motion (WIM) stations were installed on Interstate-40 (I-40) and Interstate-25 (I-25) to develop site-specific data. A computer program named WIM Data Analysis Software (WIMDAS) was developed using Microsoft C-Sharp (.NET) for quality checking and processing of raw WIM data. A complete year of data, from November 2013 to October 2014, was analyzed using the developed program. After that, the vehicle class distribution, directional distribution, lane distribution, monthly adjustment factors, hourly distribution, axle load spectra, average number of axles per vehicle, axle spacing, lateral wander distribution, and wheelbase distribution were calculated. A comparative study was then done between the measured data and the AASHTOWare default values. It was found that the measured general traffic inputs for I-40 and I-25 differ significantly from the default values.
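The fragment below illustrates, in Python/pandas, two of the traffic inputs listed above (vehicle class distribution and monthly adjustment factors) computed from cleaned WIM records; the paper's WIMDAS tool itself was written in C#, and the column names and simulated records here are assumptions for illustration only.

```python
# Sketch of WIM post-processing: class distribution and monthly adjustment factors.
# Records are simulated; a real run would use cleaned station data instead.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
wim = pd.DataFrame({
    "timestamp": pd.to_datetime("2013-11-01")
                 + pd.to_timedelta(rng.integers(0, 365, 20000), unit="D"),
    "vehicle_class": rng.integers(4, 14, 20000),       # FHWA classes 4-13 (assumed column)
})

# Vehicle class distribution: share of each class among the recorded trucks.
class_dist = wim["vehicle_class"].value_counts(normalize=True).sort_index()

# Monthly adjustment factor: monthly mean daily truck count / annual mean daily count.
daily = wim.set_index("timestamp").resample("D").size()
maf = daily.resample("MS").mean() / daily.mean()

print(class_dist)
print(maf)
```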

Keywords: AASHTOWare, traffic, weigh-in-motion, axle load distribution

Procedia PDF Downloads 337
28486 Management Control Systems in Post-Incubation: An Investigation of Closed Down High-Technology Start-Ups

Authors: Jochen Edmund Kerschenbauer, Roman Salinger, Daniel Strametz

Abstract:

Insufficient informal communication systems can lead to the first crisis ('Crisis of Leadership') for start-ups. Management Control Systems (MCS) are one way for high-technology start-ups to successfully overcome these problems. So far, the literature has investigated the incubation of start-ups but has focused less on the post-incubation stage. This paper focuses on the use of MCS in post-incubation and, for the failed start-ups studied, on how MCS were used. We conducted 14 semi-structured interviews for this purpose. The overall conclusion is that the majority of the companies were closed down due to a combination of strategic, operative and financial reasons.

Keywords: closed down, high-technology, incubation, levers of control, management control systems, post-incubation, start-ups

Procedia PDF Downloads 1088
28485 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges

Authors: Jonathan Nash, Allen Hartt, Catherine Plante

Abstract:

In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and 4-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions including: course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student which can help instructors identify areas where transfer students’ experiences might differ from their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs. Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply.

Keywords: accounting curricula, community college, descriptive study, introductory accounting

Procedia PDF Downloads 99
28484 Partisan Agenda Setting in Digital Media World

Authors: Hai L. Tran

Abstract:

Previous research on agenda setting effects has often focused on the top-down influence of the media at the aggregate level, while overlooking the capacity of audience members to select media and content to fit their individual dispositions. The decentralized characteristics of online communication and digital news create more choices and greater user control, thereby enabling each audience member to seek out a unique blend of media sources, issues, and elements of messages and to mix them into a coherent individual picture of the world. This study examines how audiences use media differently depending on their prior dispositions, thereby making sense of the world in ways that are congruent with their preferences and cognitions. The current undertaking is informed by theoretical frameworks from two distinct lines of scholarship. According to the ideological migration hypothesis, individuals choose to live in communities with ideologies like their own to satisfy their need to belong. One tends to move away from Zip codes that are incongruent and toward those that are more aligned with one’s ideological orientation. This geographical division along ideological lines has been documented in social psychology research. As an extension of agenda setting, the agendamelding hypothesis argues that audiences seek out information in attractive media and blend it into a coherent narrative that fits with a common agenda shared by others who think as they do and with whom they communicate about issues of public concern. In other words, individuals, through their media use, identify themselves with a group or community that they want to join. Accordingly, the present study hypothesizes that because ideology plays a role in pushing people toward a physical community that fits their need to belong, it also leads individuals to receive an idiosyncratic blend of media and to be influenced by such selective exposure in deciding what issues are more relevant. Consequently, the individualized focus of media choices affects how audiences perceive political news coverage and what they know about political issues. The research project utilizes recent data from The American Trends Panel survey conducted by Pew Research Center to explore the nuanced nature of agenda setting at the individual level and amid heightened polarization. Hypothesis testing is performed with both nonparametric and parametric procedures, including regression and path analysis. This research attempts to explore the media-public relationship from a bottom-up approach, considering the ability of active audience members to select among media in a larger process that entails agenda setting. It helps encourage agenda-setting scholars to further examine effects at the individual, rather than aggregate, level. In addition to theoretical contributions, the study’s findings are useful for media professionals in building and maintaining relationships with the audience, considering changes in market share due to the spread of digital and social media.

Keywords: agenda setting, agendamelding, audience fragmentation, ideological migration, partisanship, polarization

Procedia PDF Downloads 54
28483 Study on Safety Management of Deep Foundation Pit Construction Site Based on Building Information Modeling

Authors: Xuewei Li, Jingfeng Yuan, Jianliang Zhou

Abstract:

The 21st century has been called the century of human exploitation of underground space. Due to the large quantities, tight schedules, low safety reserves and high uncertainty involved, accidents frequently occur in deep foundation pit engineering, causing huge economic losses and casualties. With the successful application of information technology in the construction industry, building information modeling (BIM) has become a research hotspot in the field of architectural engineering. Therefore, the application of BIM and other information and communication technologies (ICTs) to construction safety management is of great significance for improving the level of safety management. This research summarized the mechanisms of deep foundation pit accidents through fault tree analysis in order to identify the control factors of deep foundation pit safety management and the deficiencies of traditional construction site safety management. According to the accident causation mechanism and the specific process of deep foundation pit construction, the hazard information of the deep foundation pit construction site was identified and a hazard list, including early warning information, was obtained. After that, the system framework was constructed by analyzing the early warning information requirements and early warning function requirements of a safety management system for deep foundation pits. Finally, a BIM-based safety management system for deep foundation pit construction sites was developed by combining a database with Web-BIM technology, realizing three functions: real-time positioning of construction site personnel, automatic warning when entering a dangerous area, and real-time monitoring of deep foundation pit structural deformation with automatic warning. This study can initially improve the current situation of safety management on deep foundation pit construction sites. Additionally, active control before the occurrence of deep foundation pit accidents and dynamic control throughout the construction process can be realized, so as to prevent and control safety accidents in deep foundation pit engineering construction.

Keywords: Web-BIM, safety management, deep foundation pit, construction

Procedia PDF Downloads 149
28482 Smart Services for Easy and Retrofittable Machine Data Collection

Authors: Till Gramberg, Erwin Gross, Christoph Birenbaum

Abstract:

This paper presents the approach of the Easy2IoT research project. Easy2IoT aims to enable companies in the prefabrication sheet metal and sheet metal processing industry to enter the Industrial Internet of Things (IIoT) with a low-threshold and cost-effective approach. It focuses on the development of physical hardware and software to easily capture machine activities from a sawing machine, benefiting various stakeholders in the SME value chain, including machine operators, tool manufacturers and service providers. The methodological approach of Easy2IoT includes an in-depth requirements analysis and customer interviews with stakeholders along the value chain. Based on these insights, actions, requirements and potential solutions for smart services are derived. The focus is on providing actionable recommendations, competencies and easy integration through no-/low-code applications to facilitate implementation and connectivity within production networks. At the core of the project is a novel, non-invasive measurement and analysis system that can be easily deployed and made IIoT-ready. This system collects machine data without interfering with the machines themselves: it non-invasively measures the tension on a sawing machine. The collected data is then connected and analyzed using artificial intelligence (AI) to provide smart services through a platform-based application. Three smart services are being developed within Easy2IoT to provide immediate benefits to users. First, wear-part and material condition monitoring and predictive maintenance for sawing processes: the non-invasive measurement system enables the monitoring of tool wear, such as saw blades, and of the quality of consumables and materials, so that service providers and machine operators can use this data to optimize maintenance and reduce downtime and material waste. Second, optimization of Overall Equipment Effectiveness (OEE) through machine activity monitoring: the non-invasive system tracks machining times, setup times and downtime to identify opportunities for OEE improvement and to reduce unplanned machine downtime. Third, estimation of CO2 emissions for connected machines: CO2 emissions are calculated for the entire life of the machine and for individual production steps based on captured power consumption data, supporting energy management and product development decisions. The key to Easy2IoT is its modular and easy-to-use design. The non-invasive measurement system is universally applicable and does not require specialized knowledge to install. The platform application allows easy integration of various smart services and provides a self-service portal for activation and management. Innovative business models will also be developed to promote the sustainable use of the collected machine activity data. The project addresses the digitalization gap between large enterprises and SMEs: Easy2IoT provides SMEs with a concrete toolkit for IIoT adoption, facilitating the digital transformation of smaller companies, e.g. through the retrofitting of existing machines.
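As a toy illustration of the OEE figure mentioned above, the Python sketch below combines availability, performance and quality into a single score from the kind of activity data the non-invasive sensor provides; all numbers are illustrative assumptions, not measurements from the project.

```python
# OEE = availability x performance x quality, computed from assumed shift data.
planned_time_min = 480            # one shift
downtime_min = 70                 # unplanned stops detected from the tension signal
ideal_cut_time_min_per_part = 4.0 # assumed ideal cycle time per sawing operation
parts_cut = 80
good_parts = 76

availability = (planned_time_min - downtime_min) / planned_time_min
performance = (ideal_cut_time_min_per_part * parts_cut) / (planned_time_min - downtime_min)
quality = good_parts / parts_cut

oee = availability * performance * quality
print(f"Availability {availability:.2f}, Performance {performance:.2f}, "
      f"Quality {quality:.2f}, OEE {oee:.2f}")
```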

Keywords: smart services, IIoT, IIoT-platform, industrie 4.0, big data

Procedia PDF Downloads 64
28481 Traditional Management Systems and the Conservation of Cultural and Natural Heritage: Multiple Case Studies in Zimbabwe

Authors: Nyasha Agnes Gurira, Petronella Katekwe

Abstract:

Traditional management systems (TMSs) are a vital source of knowledge for conserving cultural and natural heritage. TMSs are renowned for their ability to preserve both tangible and intangible manifestations of heritage. They are a construct of the intricate relationship that exists between heritage and host communities, where communities are recognized as owners of heritage and therefore set up management mechanisms to ensure its adequate conservation. Multiple heritage condition surveys were conducted to assess the effectiveness of using TMSs in the conservation of both natural and cultural heritage. Surveys were done at Nharira Hills, Mahwemasimike, Dzimbahwe, the Manjowe rock art sites and Norumedzo forest, which are heritage places in Zimbabwe. The study assessed the state of conservation of the five case studies and examined the role that host communities play in the management of these heritage places. It was revealed that TMSs are effective in the conservation of natural heritage; however, in relation to heritage forms with cultural manifestations, there are major disparities. These range from differences in the appreciation and perception of value within communities, leading to vandalism, to an over-emphasis on the conservation of the intangible element as opposed to the tangible, which leaves the tangible element at risk. Despite these issues, TMSs are a reliable knowledge base that enables more holistic conservation approaches for cultural and natural heritage.

Keywords: communities, cultural heritage, natural heritage, intangible heritage, tangible heritage, traditional management systems

Procedia PDF Downloads 545
28480 Possible Approach for Interlinking of Ponds to Mitigate Drought in Sivaganga Villages at Micro Level

Authors: Manikandan Sathianarayanan, Pernaidu Pasala

Abstract:

This paper presents the results of our studies concerning the implementation and exploitation of a Geographical Information System (GIS) dedicated to supporting the decisions required for drought management. In this study, the diversion of surplus water through canals, ponds and check dams in the study area was examined. Remote sensing and GIS data were used to identify the drought-prone villages in Sivaganga taluk and to generate present land use, drainage pattern, slope and contour maps. This analysis was carried out for diverting surplus water through the proposed canals and ponds. The results of the study indicate that if the surplus water from the ponds and streams is diverted to the drought-affected villages in Sivaganga taluk, it will definitely improve agricultural production due to the availability of water in the ponds. The improvements in agricultural production will help to improve the economic condition of the farmers in the region.

Keywords: interlinking, spatial analysis, remote sensing, GIS

Procedia PDF Downloads 248
28479 Insight-Based Evaluation of a Map-Based Dashboard

Authors: Anna Fredriksson Häägg, Charlotte Weil, Niklas Rönnberg

Abstract:

Map-based dashboards are used for data exploration every day. The present study used an insight-based methodology for evaluating a map-based dashboard that presents research findings of water management and ecosystem services in the Amazon. In addition to analyzing the insights gained from using the dashboard, the evaluation method was compared to standardized questionnaires and task-based evaluations. The result suggests that the dashboard enabled the participants to gain domain-relevant, complex insights regarding the topic presented. Furthermore, the insight-based analysis highlighted unexpected insights and hypotheses regarding causes and potential adaptation strategies for remediation. Although time- and resource-consuming, the insight-based methodology was shown to have the potential of thoroughly analyzing how end users can utilize map-based dashboards for data exploration and decision making. Finally, the insight-based methodology is argued to evaluate tools in scenarios more similar to real-life usage compared to task-based evaluation methods.

Keywords: visual analytics, dashboard, insight-based evaluation, geographic visualization

Procedia PDF Downloads 112
28478 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making

Authors: Ayham Fattoum, Simos Chari, Duncan Shaw

Abstract:

Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten systems viability with emerging and novel events requiring immediate and localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes the need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy in VUCA requires confidence and knowledge to use intuition and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. This paper offers three contributions to VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.

Keywords: intuition, complexity management, decision-making, viable system model

Procedia PDF Downloads 64
28477 The Road Ahead: Merging Human Cyber Security Expertise with Generative AI

Authors: Brennan Lodge

Abstract:

Amidst a complex regulatory landscape, Retrieval Augmented Generation (RAG) emerges as a transformative tool for Governance, Risk and Compliance (GRC) officers. This paper details the application of RAG in synthesizing Large Language Models (LLMs) with external knowledge bases, offering GRC professionals an advanced means to adapt to rapid changes in compliance requirements. While the development of standalone LLMs is exciting, such models have their downsides: LLMs cannot easily expand or revise their memory, cannot straightforwardly provide insight into their predictions, and may produce “hallucinations.” Leveraging a pre-trained seq2seq transformer and a dense vector index of domain-specific data, this approach integrates real-time data retrieval into the generative process, enabling gap analysis and the dynamic generation of compliance and risk management content. We delve into the mechanics of RAG, focusing on its dual structure that pairs parametric knowledge contained within the transformer model with non-parametric data extracted from an updatable corpus. This hybrid model enhances decision-making through context-rich insights, drawing from the most current and relevant information, thereby enabling GRC officers to maintain a proactive compliance stance. Our methodology aligns with the latest advances in neural network fine-tuning, providing a granular, token-level application of retrieved information to inform and generate compliance narratives. By employing RAG, we exhibit a scalable solution that can adapt to novel regulatory challenges and cybersecurity threats, offering GRC officers a robust, predictive tool that augments their expertise. The granular application of RAG’s dual structure not only improves compliance and risk management protocols but also informs the development of compliance narratives with pinpoint accuracy. It underscores AI’s emerging role in strategic risk mitigation and proactive policy formation, positioning GRC officers to anticipate and navigate the complexities of regulatory evolution confidently.
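The retrieve-then-generate loop can be sketched in a few lines of Python. The fragment below embeds a query, retrieves the nearest policy passages from a small in-memory index, and assembles the augmented prompt; the bag-of-words embed() and the final generate() call are stand-ins for whatever dense encoder and seq2seq/LLM backend a GRC team actually deploys.

```python
# Minimal RAG loop: embed query, retrieve nearest passages, build the augmented prompt.
# The toy corpus, the bag-of-words embedding and generate() are placeholders.
import re
import numpy as np

corpus = [
    "Access reviews must be completed quarterly for all privileged accounts.",
    "Encryption of personal data in transit is mandatory under the policy.",
    "Vendors must report security incidents to the regulator within 72 hours.",
]

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z]+", text.lower())

vocab = sorted({tok for doc in corpus for tok in tokenize(doc)})

def embed(text: str) -> np.ndarray:
    """Bag-of-words vector over the corpus vocabulary; a stand-in for a neural encoder."""
    counts = np.array([tokenize(text).count(tok) for tok in vocab], dtype=float)
    norm = np.linalg.norm(counts)
    return counts / norm if norm else counts

index = np.stack([embed(doc) for doc in corpus])       # the "dense vector index"

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = index @ embed(query)                      # cosine similarity of unit vectors
    return [corpus[i] for i in np.argsort(scores)[::-1][:k]]

query = "How quickly must a vendor report a security incident?"
context = "\n".join(retrieve(query))
prompt = f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context above."
# generate(prompt) would call the seq2seq transformer / LLM backend.
print(prompt)
```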

Keywords: cybersecurity, gen AI, retrieval augmented generation, cybersecurity defense strategies

Procedia PDF Downloads 88
28476 The Role of Social and Technical Lean Implementation in Improving Operational Performance: Insights from the Pharmaceutical Industry

Authors: Bernasconi Matteo, Grothkopp Mark, Friedli Thomas

Abstract:

The objective of this paper is to examine the relationships between technical and social lean bundles as well as operational performance in the context of the pharmaceutical industry. We investigate the direct and mediating effects of the lean bundles total productive maintenance (TPM), total quality management (TQM), Just-In-Time (JIT), and human resource management (HRM) on operational performance. Our analysis relies on 113 manufacturing facilities from the St.Gallen OPEX benchmarking database. The results show that HRM has a positive indirect effect on operational performance mediated by the technical lean bundles.
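To make the mediating-effect claim concrete, the Python sketch below estimates the HRM → technical lean → operational performance paths on simulated data with two OLS regressions and reports the indirect (a×b) and direct effects; it illustrates the method only and is not the paper's actual estimation.

```python
# Mediation sketch on simulated data: HRM -> technical lean bundles -> performance.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 113                                   # same sample size as the study, data simulated
hrm = rng.normal(size=n)
technical_lean = 0.6 * hrm + rng.normal(scale=0.8, size=n)                 # path a
performance = 0.5 * technical_lean + 0.05 * hrm + rng.normal(scale=0.8, size=n)

path_a = sm.OLS(technical_lean, sm.add_constant(hrm)).fit()
path_b = sm.OLS(performance, sm.add_constant(np.column_stack([technical_lean, hrm]))).fit()

indirect = path_a.params[1] * path_b.params[1]    # a * b: HRM's effect via technical lean
direct = path_b.params[2]                         # HRM's remaining direct effect
print(f"indirect effect = {indirect:.2f}, direct effect = {direct:.2f}")
```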

Keywords: human resource management, operational performance, pharmaceutical industry, technical lean practices

Procedia PDF Downloads 123
28473 Educational Tours as a Learning Tool for Third-Year Tourism Students of De La Salle University, Dasmarinas

Authors: Jackqueline Uy, Hannah Miriam Verano, Crysler Luis Verbo, Irene Gueco

Abstract:

Educational tours are part of the curriculum of the College of Tourism and Hospitality Management, De La Salle University-Dasmarinas, and are highly significant to students, especially Tourism students. The purpose of this study was to determine how effective educational tours are as a learning tool, using the Experiential Learning Theory of David Kolb. The study determined the demographic profile of the third-year tourism students in terms of gender, section, educational tours joined, and monthly family income, and it examined whether there is a significant difference between the demographic profile of the respondents and their assessment of educational tours as a learning tool. The researchers used a historical research design with the third-year students of the Bachelor of Science in Tourism Management as the population and applied a random sampling method. The researchers constructed a survey questionnaire and utilized statistical tools such as weighted mean, frequency distribution, percentage, standard deviation, t-test, and ANOVA. The results of the study describe the profile of the respondents in terms of gender, section, educational tours joined, and family monthly income. The findings show that the third-year tourism management students strongly agree that educational tours are a highly effective learning tool in terms of active experimentation, concrete experience, reflective observation, and abstract conceptualisation, based on the data gathered from the respondents.

Keywords: CTHM, educational tours, experiential learning theory, De La Salle University Dasmarinas, tourism

Procedia PDF Downloads 165
28474 Management of Blood Exposure Risk: Knowledge and Attitudes of Caregivers in Pediatric Departments

Authors: Hela Ghali, Oumayma Ben Amor, Salwa Khefacha, Mohamed Ben Rejeb, Sirine Frigui, Meriam Tourki Dhidah, Lamine Dhidah, Houyem Said Laatiri

Abstract:

Background: Blood exposure accidents are the most common problem in hospitals threatening healthcare professionals with a high risk of infectious complications, which weighs heavily on health systems worldwide. Paramedical staff are the highest-risk group due to the nature of their daily activities. We aimed to determine knowledge and attitudes about the management of blood exposure accidents among nurses and technicians in two pediatric departments. Materials/Methods: This is a cross-sectional descriptive study conducted in March 2017 among the care staff of the pediatric ward of the Farhat Hached Teaching Hospital of Sousse and the pediatric surgery ward of the Fattouma Bourguiba University Hospital in Monastir, using a pre-tested, self-administered questionnaire. Data entry and analysis were performed using Excel software. Results: The response rate was 85.1%. A female predominance (82.5%) was reported among respondents, with a sex ratio of 0.21, and 80% of the participants were under 35 years old. Seniority of less than 10 years was found in 77.5% of respondents. Only 22.5% knew the definition of a blood exposure accident. The risk of transmission of hepatitis and AIDS viruses was recognized by 100% and 95% of participants, respectively. However, only 15% recognized the severity factors of a blood exposure accident. Hygiene compliance was the most important dimension of prevention for almost the entire population. On the other hand, only 12.5% knew the meaning of 'standard precautions', and a quarter considered them necessary only for at-risk patients. 40% reported being exposed at least once; among them, 87.5% used betadine, and 77.5% said that anti-infectious chemoprophylaxis is necessary regardless of the patient's serological status. However, 52.5% did not know the official reporting procedure for the management of blood exposure accidents in their institutions. Conclusion: For better risk management in hospitals and improved safety of care, caregivers' awareness of the risks of blood exposure accidents should be reinforced, while developing their knowledge so that they can act safely.

Keywords: attitudes, blood-exposure accident, knowledge, pediatric department

Procedia PDF Downloads 191
28473 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, which are grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the aforementioned techniques.

Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing

Procedia PDF Downloads 254
28472 The Human Resource Management Systems and Practices of Multinational Companies in Their Nigerian Subsidiaries

Authors: Suwaiba Sabiu Bako, Yaw Debrah

Abstract:

In spite of the extensive literature available on the human resource management (HRM) systems and practices of multinational companies (MNCs) from developed countries, there are gaps concerning the HRM systems and practices of emerging countries' multinational companies (EMNCs). This study examines the transfer of HRM practices to the Nigerian subsidiaries of MNCs from South Africa. It reveals that South African MNCs hybridise their recruitment and selection processes and localise their compensation and employee relations. It also shows that performance appraisal, talent management and code of conduct practices are largely transferred to subsidiaries with minimal adaptation.

Keywords: EMNCs, HRM practices, HRM systems, Nigeria, South Africa

Procedia PDF Downloads 104
28471 Solid Waste Pollution and the Importance of Environmental Planning in Managing and Preserving the Public Environment in Benghazi City and Its Surrounding Areas

Authors: Abdelsalam Omran Gebril

Abstract:

Pollution and solid waste are the most important environmental problems plaguing the city of Benghazi as well as other cities and towns in Libya. These problems are caused by the lack of environmental planning and sound environmental management. Environmental planning is currently very important for the development of projects that preserve the environment; therefore, the planning process should be prioritized over the management process. Pollution caused by poor planning and environmental management exists not only in Benghazi but also in all other Libyan cities. This study was conducted through various field visits to several neighborhoods and areas within Benghazi as well as its neighboring regions. Follow-ups in these areas were conducted from March 2013 to October 2013 and documented with photographs. The existing methods of waste collection and means of transportation were investigated. Interviews were conducted with the relevant authorities, including the Environment Public Authority in Benghazi and the Public Service Company of Benghazi. The objective of this study is to determine the causes of solid waste pollution in Benghazi City and its surrounding areas. Results show that solid waste pollution in Benghazi and its surrounding areas is the result of poor planning and environmental management, population growth, and the lack of hardware and equipment for the collection and transport of waste from the city to the landfill site. One of the most important recommendations of this study is the development of a complete and comprehensive plan that incorporates environmental planning and environmental management to reduce solid waste pollution.

Keywords: solid waste, pollution, environmental planning, management, Benghazi, Libya

Procedia PDF Downloads 308
28470 Seismic Preparedness Challenge in Ionian Islands (Greece) through 'Telemachus' Project

Authors: A. Kourou, M. Panoutsopoulou

Abstract:

Nowadays, disaster risk reduction requires innovative ways of working collaboratively, as well as monitoring tools, management methods, risk communication, and knowledge, as key factors for decision-making actors. Experience has shown that the assessment of seismic risk and its effective management is still an important challenge. In Greece, the Ionian Islands region is characterized as the most seismically active area of the country and one of the most active worldwide. It is well known that, in the case of a disastrous earthquake, the local authorities need to assess the situation in the affected area and coordinate the disaster response. The main outcomes of the 'Telemachus' project are the development of an innovative operational system that hosts the data needed for seismic risk management in the Ionian Islands and the implementation of educational actions for the target groups involved. The project is funded under the Priority Axis 'Environmental Protection and Sustainable Development' of the Operational Plan 'Ionian Islands 2014-2020'. EPPO is one of the partners of the project and is responsible, among other things, for the development of the training material. This paper presents the training material of 'Telemachus' and its use as a helpful management tool in case of an earthquake emergency. The material is addressed to different target groups, such as civil protection staff, people involved in the tourism industry, and educators of disabled people. A very positive aspect of the project is the involvement of end-users, who should evaluate the training products; test standards; clarify the personnel's roles and responsibilities; improve interagency coordination; identify gaps in resources; improve individual performance; and identify opportunities for improvement. It is worth mentioning that, even though the material developed is intended for the training of specific target groups on emergency management issues within the Ionian Islands region, it could be used throughout Greece and in other countries as well.

Keywords: education of civil protection staff, Ionian Islands Region of Greece, seismic risk, training material

Procedia PDF Downloads 121