Search results for: data management applications
35291 Development of Intervention Policy Options for Sustainable Fisheries Management of Lake Hawassa, Ethiopia
Authors: Mekonen Hailu, Gashaw Tesfaye, Adamneh Dagne, Hiwot Teshome
Abstract:
Lake Hawassa is one of the most important lakes for Ethiopian fishery. It serves as a source of food and nutrition, income and livelihood for many inhabitants. However, the fishery in Lake Hawassa shows a declining trend, especially for the most valuable species, such as the Nile tilapia (Oreochromis niloticus L.), indicating that the existing management systems are either not fully enforced or inadequate. The aim of this study was therefore to develop management policy options for the sustainable utilization and management of fishery resources in Lake Hawassa. A blend of primary and secondary data was used for the study. Primary data were collected using Participatory Rural Appraisal (PRA) techniques such as focus group discussions with members of fishing co-operatives and co-operative leaders, as well as key informant discussions, to understand the current state of the fisheries resources. Literature was then reviewed to obtain secondary data and develop alternative management policy options. It has been realized that Lake Hawassa is not very species-rich in terms of fish diversity. It contains only six species belonging to four families, of which only three are commercially important: the Nile tilapia (90% of catches), the African catfish Clarias gariepinus B. (7% of catches) and the African large barb Labeobarbus intermedius R. (only 3% of catches). Production has been declining since 2007. The top six challenges that could be responsible for this decline, identified by about two-thirds of respondents and supported by the literature review, are directly linked to fisheries and fisheries management, with overfishing, an irregular monitoring, control, and surveillance (MCS) system and the lack of a fishing licensing system ranking first, second and third, respectively. It is, therefore, important to address these and other problems identified in the study. Of the management options analyzed, we suggest adapting the management approach to sustain the fishery in Lake Hawassa and its socio-economic benefits. We also present important conditions for successfully implementing co-management in this and other lakes in Ethiopia.
Keywords: comanagement, community-based management, fishery, overfishing, participatory approach, top-down management
Procedia PDF Downloads 12
35290 Phase II Monitoring of First-Order Autocorrelated General Linear Profiles
Authors: Yihua Wang, Yunru Lai
Abstract:
Statistical process control has been successfully applied in a variety of industries. In some applications, the quality of a process or product is better characterized and summarized by a functional relationship between a response variable and one or more explanatory variables. A collection of this type of data is called a profile. Profile monitoring is used to understand and check the stability of this relationship or curve over time. The independence assumption for the error term is commonly used in existing profile monitoring studies. However, in many applications, the profile data show correlations over time. Therefore, in this study we focus on a general linear regression model with first-order autocorrelation between profiles. We propose an exponentially weighted moving average charting scheme to monitor this type of profile. The simulation study shows that our proposed methods outperform the existing schemes based on the average run length criterion.
Keywords: autocorrelation, EWMA control chart, general linear regression model, profile monitoring
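To make the charting idea concrete, the following minimal sketch (not the authors' scheme) fits a simple linear profile to each incoming sample and tracks an exponentially weighted moving average of the estimated coefficients. The in-control model, the AR(1) dependence between profiles, the smoothing constant and the control limit are all illustrative assumptions.

```python
# Illustrative EWMA monitoring of per-profile regression coefficients.
# The in-control model, AR(1) noise, lambda and the limit are assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)                      # fixed design points within a profile
X = np.column_stack([np.ones_like(x), x])      # design matrix of a simple linear profile
target = np.array([1.0, 2.0])                  # assumed in-control intercept and slope

lam, limit = 0.2, 0.3                          # EWMA smoothing constant and crude limit
ewma, prev_noise = target.copy(), np.zeros_like(x)

for t in range(50):
    # Simulate a profile whose errors carry AR(1) dependence across profiles
    prev_noise = 0.5 * prev_noise + rng.normal(0, 0.1, size=x.size)
    y = target[0] + target[1] * x + prev_noise
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # per-profile least squares
    ewma = lam * beta_hat + (1 - lam) * ewma           # vector EWMA of the coefficients
    if np.any(np.abs(ewma - target) > limit):
        print(f"profile {t}: potential out-of-control signal, EWMA = {ewma.round(3)}")
```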
Procedia PDF Downloads 460
35289 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design becomes very complex due to the continuous integration of millions of gates in one chip based on Moore’s law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing/noise data. When we apply this data mining tool in real applications, running speed is important. The software employs table look-up techniques to achieve reasonable running speed, based on performance testing results. We added several advanced features for the application in one industry chip design.
Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
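As a hedged illustration of the report-mining step, the sketch below parses worst-slack entries from a hypothetical timing-report format with a regular expression and renders the violating endpoints as an HTML table. The report syntax, field names and paths are invented for the example and do not come from the paper or from any specific EDA tool.

```python
# Hedged sketch: mining slack entries from an assumed timing-report format
# and rendering the violations as an HTML table, in the spirit of the tool.
import re
from html import escape

REPORT = """\
Endpoint: u_core/reg_12/D   Slack: -0.125 ns
Endpoint: u_io/reg_3/D      Slack:  0.042 ns
Endpoint: u_core/reg_99/D   Slack: -0.310 ns
"""

LINE_RE = re.compile(r"Endpoint:\s+(\S+)\s+Slack:\s+(-?\d+\.\d+)\s*ns")

# Parse, then keep only violating (negative-slack) endpoints, worst first.
rows = [(ep, float(slack)) for ep, slack in LINE_RE.findall(REPORT)]
violations = sorted((r for r in rows if r[1] < 0), key=lambda r: r[1])

cells = "\n".join(
    f"<tr><td>{escape(ep)}</td><td>{slack:.3f}</td></tr>" for ep, slack in violations
)
html = f"<table><tr><th>Endpoint</th><th>Slack (ns)</th></tr>\n{cells}\n</table>"
print(html)
```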
Procedia PDF Downloads 254
35288 Human Resource Practices and Organization Knowledge Capability: An Exploratory Study Applied to Private Organization
Authors: Mamoona Rasheed, Salman Iqbal, Muhammad Abdullah
Abstract:
Organizational capability, in terms of employees’ knowledge, is valuable and difficult to reproduce, and helps to build sustainable competitive advantages. Knowledge capability is linked with the human resource (HR) practices of an organization. This paper investigates the relationship between HR practices, knowledge management and organization capability. In an organization, employees play a key role in effective organizational performance by sharing their knowledge with management and co-workers, which contributes towards organization capability. Pakistan, being a developing country, has different HR practices and culture. The business opportunities give rise to the discussion about the effect of HR practices on knowledge management and organization capability as innovation performance. An empirical study is conducted through questionnaires from employees in private banks of Lahore, Pakistan. The data are collected via a structured questionnaire with a sample of 120 cases. Data are analyzed using Structural Equation Modeling (SEM), and results are depicted using AMOS software. Results of this study are tabulated, interpreted and cross-checked with other studies. Findings suggest that there is a positive relationship of training and development, along with incentives, with knowledge management. On the other hand, employees’ participation has an insignificant association with knowledge management. In addition, knowledge management also has a positive association with organization capability. In line with previous research, it is suggested that knowledge management is important for improving organizational capability, such as the innovation performance and knowledge capacity of a firm. Organization capability may improve significantly once specific HR practices are properly established and implemented by HR managers. This study has key implications for the knowledge management and innovation fields, both theoretically and practically.
Keywords: employee participation, incentives, knowledge management, organization capability, training and development
Procedia PDF Downloads 160
35287 A Relationship between Transformational Leadership, Internal Audit and Risk Management Implementation in the Indonesian Public Sector
Authors: Tio Novita Efriani
Abstract:
Public sector organizations work in a complex and risky environment. Since the beginning of the 2000s, the public sector has paid attention to the need for effective risk management. The Indonesian public sector has also been concerned about this issue, and in 2008 it enacted a Government Regulation that mandates the implementation of risk management in government organizations. This paper investigates risk management implementation in Indonesian public sector organizations and the role of transformational leadership and internal audit activities. Data were collected via survey. A total of 202 effective responses (30% response rate) from employees in 34 government ministries were statistically analyzed using partial least squares structural equation modelling (PLS-SEM) with the SmartPLS 3.0 software. All the constructs were lower order, except for the risk management implementation construct, which was treated as a second-order construct. A two-stage approach was employed in the analysis of the higher-order component. The findings revealed that transformational leadership positively influences risk management implementation. The findings also show that the core and legitimate roles of internal audit in risk management positively affect the implementation of risk management. The final finding showed that internal auditing mediates the relationship between transformational leadership and risk management implementation. These results suggest that the implementation of risk management in the Indonesian public sector was significantly supported by internal auditors and leadership. The findings confirm the importance of transformational leadership and internal audit in public sector risk management strategies.
Keywords: Indonesian public sector, internal audit, risk management, transformational leadership
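The mediation claim can be illustrated with a generic bootstrap test of an indirect effect. The sketch below uses simulated data and ordinary least squares rather than the study's PLS-SEM procedure, and every variable name and effect size is an assumption made only for the example.

```python
# Hedged sketch of a bootstrap mediation test on simulated data: does internal
# audit (M) mediate the effect of transformational leadership (X) on risk
# management implementation (Y)? Effect sizes are invented, not the study's.
import numpy as np

rng = np.random.default_rng(42)
n = 202                                                  # mirrors the reported sample size
X = rng.normal(size=n)                                   # transformational leadership
M = 0.5 * X + rng.normal(scale=0.8, size=n)              # internal audit (mediator)
Y = 0.4 * M + 0.2 * X + rng.normal(scale=0.8, size=n)    # risk management implementation

def indirect_effect(X, M, Y):
    """a * b: (X -> M slope) times (M -> Y slope controlling for X)."""
    a = np.linalg.lstsq(np.column_stack([np.ones_like(X), X]), M, rcond=None)[0][1]
    coef = np.linalg.lstsq(np.column_stack([np.ones_like(X), X, M]), Y, rcond=None)[0]
    return a * coef[2]

boot = []
for _ in range(2000):                                    # nonparametric bootstrap
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(X[idx], M[idx], Y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")  # excludes 0 => mediation
```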
Procedia PDF Downloads 204
35286 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models, especially machine learning and natural language processing, are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 19
35285 Creation and Validation of a Measurement Scale of E-Management: An Exploratory and Confirmatory Study
Authors: Hamadi Khlif
Abstract:
This paper deals with the understanding of the concept of e-management and the development of a measuring instrument adapted to the new problems encountered during the application of this new practice within the modern enterprise. Two principal e-management factors were isolated in an exploratory study carried out among 260 participants. A confirmatory study was then conducted on a second sample of 270 participants to cross-validate the measurement scale. The study presents the literature review specifically dedicated to e-management and the results of the exploratory and confirmatory phases of the development of this scale, which demonstrates satisfactory psychometric qualities. E-management has two dimensions: a managerial dimension and a technological dimension.
Keywords: e-management, management, ICT deployment, mode of management
Procedia PDF Downloads 325
35284 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis
Authors: C. B. Le, V. N. Pham
Abstract:
In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide information about different aspects of the data. Therefore, multi-source data linking is essential to improve clustering performance. However, in practice, multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge of multi-source data. Ensembles are versatile machine learning models in which learning techniques can work in parallel on big data. Clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis. The fuzzy optimized multi-objective clustering ensemble method is called FOMOCE. Firstly, a clustering ensemble mathematical model based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. The experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering
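For readers unfamiliar with the ensemble idea, the hedged sketch below builds a basic (non-fuzzy, single-objective) clustering ensemble for two toy data sources: each source is clustered several times, the base partitions are combined into a co-association matrix, and a consensus partition is extracted from it. It illustrates only the general clustering-ensemble mechanism, not the FOMOCE algorithm; in a fuzzy variant, soft memberships would replace the hard labels used here.

```python
# Hedged sketch of a basic clustering ensemble over two assumed data sources.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=150, centers=3, random_state=0)
source_a, source_b = X[:, :1], X[:, 1:]          # pretend the two columns are two sources

# Base clusterings: one per source and per random restart
base_labels = [
    KMeans(n_clusters=3, n_init=10, random_state=seed).fit_predict(src)
    for src in (source_a, source_b)
    for seed in range(5)
]

# Co-association matrix: fraction of base clusterings grouping each pair together
n = X.shape[0]
co = np.zeros((n, n))
for labels in base_labels:
    co += (labels[:, None] == labels[None, :])
co /= len(base_labels)

# Consensus clustering on the co-association similarity (1 - co is a distance)
dist = squareform(1.0 - co, checks=False)
consensus = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")
print(np.bincount(consensus))                    # sizes of the consensus clusters
```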
Procedia PDF Downloads 191
35283 Applications of AI, Machine Learning, and Deep Learning in Cyber Security
Authors: Hailyie Tekleselase
Abstract:
Deep learning is increasingly used as a building block of security systems. However, neural networks are hard to interpret and typically opaque to the practitioner. This paper presents a detailed survey of computing methods in cyber security and analyzes the prospects of enhancing cyber security capabilities by accelerating the intelligence of security systems. There are many AI-based applications used in industrial scenarios such as the Internet of Things (IoT), smart grids, and edge computing. Machine learning technologies require a training process, which introduces protection problems in the training data and algorithms. We present machine learning techniques currently applied to the detection of intrusion, malware, and spam. Our conclusions are based on an extensive review of the literature as well as on experiments performed on real enterprise systems and network traffic. We conclude that problems can be solved successfully only when methods of artificial intelligence are used alongside human experts or operators.
Keywords: artificial intelligence, machine learning, deep learning, cyber security, big data
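As a minimal, hedged illustration of the intrusion-detection use case mentioned above, the sketch below trains a supervised classifier on synthetic network-flow features. The feature names, labelling rule, and data are placeholders, not one of the real datasets or systems surveyed in the paper.

```python
# Toy supervised intrusion detector on synthetic flow features (placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
n = 2000
# Hypothetical flow features: duration, bytes sent, bytes received, packet rate
X = rng.normal(size=(n, 4))
# Label "attack" when traffic is unusually bursty (purely illustrative rule)
y = ((X[:, 3] > 1.0) & (X[:, 1] > 0.5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["benign", "attack"]))
```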
Procedia PDF Downloads 127
35282 A Machine Learning Approach for the Leakage Classification in the Hydraulic Final Test
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
The widespread use of machine learning applications in production is significantly accelerated by improved computing power and increasing data availability. Predictive quality enables the assurance of product quality by using machine learning models as a basis for decisions on test results. The use of real Bosch production data based on geometric gauge blocks from machining, mating data from assembly and hydraulic measurement data from final testing of directional valves is a promising approach to classifying the quality characteristics of workpieces.
Keywords: machine learning, classification, predictive quality, hydraulics, supervised learning
Procedia PDF Downloads 214
35281 Enablers and Inhibitors of Effective Waste Management Measures in Informal Settlements in South Africa: A Case of Alaska
Authors: Lynda C. Mbadugha, Bankole Awuzie, Kwanda Khumalo, Lindokuhle Matsebula, Masenoke Kgaditsi
Abstract:
Inadequate waste management remains a fundamental issue in the majority of cities around the globe, but it becomes a threat when it concerns informal settlements. Although studies have evaluated the performance of waste management measures, only a few have addressed it with a focus on South African informal settlements and the reasons for the apparent ineffectiveness of such measures in these locations. However, there may be evidence of variations in the extant problems due to the uniqueness of each location and the factors influencing performance. Thus, there is a knowledge deficit regarding the implementation of waste management measures in South African informal settlements. This study seeks to evaluate the efficacy of waste management measures in the Alaska informal settlement in South Africa and to compare the findings with previously collected data from other areas using the degree of correlation. The research investigated a real-world scenario in the specified location using a case study approach and multiple data sources. The findings described various waste management practices used in Alaska's informal settlements; however, a correlation was found between the performance of these measures and those already in use elsewhere. The observed differences are primarily attributable to the physical characteristics of the locations, the lack of understanding of the environmental and health consequences of careless waste disposal, and the negative attitudes of the residents toward waste management practices. This study elucidates waste management implementation in informal settlements. It contributes to the relevant bodies of knowledge by describing these practices in South Africa. The paper's practical value lies in highlighting the general waste management characteristics of South Africa's informal settlements to facilitate the planning and provision of necessary interventions. The study concludes that the enablers and inhibitors are mainly political, behavioral, and environmental concerns.
Keywords: factors, informal settlement, performance, waste management
Procedia PDF Downloads 96
35280 Regional Flood-Duration-Frequency Models for Norway
Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu
Abstract:
Design flood values give estimates of flood magnitude for a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required at different flood durations. A statistical approach to this problem is the development of a regression model for extremes where some of the parameters are dependent on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the Generalized Extreme Value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in the development of the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV
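To ground the terminology, the hedged sketch below fits a GEV distribution to simulated annual maximum floods for a single site and duration and reads off a 100-year design flood as the corresponding return level. It is an at-site illustration only, not the regional, duration-dependent model discussed in the abstract, and note that scipy's shape parameter c corresponds to minus the usual GEV shape parameter.

```python
# At-site GEV fit and return-level (design flood) estimate on simulated data.
from scipy.stats import genextreme

# Simulated 40 years of annual maximum discharge (m^3/s) for one site and duration
annual_maxima = genextreme.rvs(c=-0.1, loc=100, scale=30, size=40, random_state=7)

# Maximum-likelihood fit of the three GEV parameters
c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)

# Design flood = return level for the chosen return period
return_period = 100
design_flood = genextreme.ppf(1 - 1 / return_period, c_hat, loc=loc_hat, scale=scale_hat)
print(f"estimated {return_period}-year flood: {design_flood:.1f} m^3/s")
```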
Procedia PDF Downloads 72
35279 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand-Side Management: A Systematic Mapping Review
Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring
Abstract:
An electric water heater (EWH) is a powerful appliance that uses electricity in residential, commercial, and industrial settings, and the ability to control these appliances properly will result in cost savings and the prevention of blackouts on the national grid. This article discusses the usage of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilisation of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorising information with regard to publication year and source, kind of methods, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected based on specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, the mixing method is much less common than the other techniques, and the proportions of real-time data (RTD) and non-real-time data (NRTD) are about equal.
Keywords: demand side management, direct load control, electric water heater, indirect load control, non real-time data, real-time data
Procedia PDF Downloads 82
35278 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption
Authors: Koyangbo Guere Monguia Michel Alex Emmanuel
Abstract:
In e-government, and especially recruitment, much research has been conducted to build trustworthy and reliable online or application systems capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management and public key encryption have been used to manage domains, provide the access control and authorization mechanism, and secure data exchange between entities for a reliable file-processing procedure.
Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization
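As a hedged illustration of the public-key encryption building block (not the system's actual implementation), the sketch below uses RSA-OAEP from the Python cryptography package to protect an applicant record in transit. Key distribution, identity management, and the cloud architecture are out of scope here, and the record content is invented; in practice a hybrid scheme would wrap a symmetric key rather than encrypt the file directly.

```python
# Minimal RSA-OAEP sketch of protecting an applicant record with public key encryption.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recruitment agency generates a key pair; the public key is shared with senders
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# A sender encrypts the applicant record with the agency's public key ...
ciphertext = public_key.encrypt(b"applicant-id:1234;diploma:verified", oaep)
# ... and only the holder of the private key can decrypt it
plaintext = private_key.decrypt(ciphertext, oaep)
print(plaintext.decode())
```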
Procedia PDF Downloads 360
35277 The Status of Precision Agricultural Technology Adoption on Row Crop Farms vs. Specialty Crop Farms
Authors: Shirin Ghatrehsamani
Abstract:
Higher efficiency and lower environmental impact are consequences of using advanced technology in farming. Advanced technologies also help to decrease yield variability by diminishing the impact of weather variability, optimizing nutrient and pest management, and reducing competition from weeds. A better understanding of the pros and cons of applying technology, and of the main reasons preventing its utilization, has a significant impact on developing technology adoption among farmers and producers in the digital agriculture era. The results from two surveys carried out in 2019 and 2021 were used to investigate whether crop type had an impact on the willingness to utilize technology on farms. The main focus of the questionnaire was on the utilization of precision agriculture (PA) technologies among farmers in some parts of the United States. Collected data were analyzed to determine the practical application of various technologies. The survey results showed more similarities between the two crop types in the main reasons not to use PA, but the current application of technology in specialty crops is generally five times greater than in row crops. GPS receiver applications were reported to be similar for both types of crops. Lack of knowledge and the high cost of data handling were cited as the main problems. The most significant difference was in the use of variable rate technology, which was 43% for specialty crops while it was reported as 0% for row crops. Pest scouting and mapping were commonly used for specialty crops, while they were rarely applied for row crops. Survey respondents found yield mapping, soil sampling maps, and irrigation scheduling more valuable in management decisions for specialty crops than for row crops. About 50% of the respondents would like to share PA data in both crop types. Almost 50% of respondents obtained their PA information from retailers in both categories, and as a second source, extension agents were more common in specialty crops than in row crops.
Keywords: precision agriculture, smart farming, digital agriculture, technology adoption
Procedia PDF Downloads 116
35276 Mobile Device Applications in Physical Education: Investigating New Pedagogical Possibilities
Authors: Danica Vidotto
Abstract:
Digital technology continues to disrupt and challenge local conventions of teaching and education. As mobile devices continue to make their way into contemporary classrooms, educators need new pedagogies incorporating information and communication technology to help reform the learning environment. In physical education, however, this can seem controversial, as physical inactivity is often related to an excess of screen time. This qualitative research project is an investigation of how physical educators use mobile device applications (apps) in their pedagogy and to what end. A comprehensive literature review is included to examine and engage current academic research on new pedagogies and technology and their relevance to physical activity. Data were collected through five semi-structured interviews, resulting in three overarching themes: i) changing pedagogies in physical education; ii) the perceived benefits and experienced challenges of using apps; and iii) apps, physical activity, and physical education. The study concludes with a discussion of the findings in relation to the literature, the implications of the findings, and recommendations for future research.
Keywords: applications (apps), mobile devices, new pedagogies, physical education
Procedia PDF Downloads 194
35275 A Method to Estimate Wheat Yield Using Landsat Data
Authors: Zama Mahmood
Abstract:
With the increasing demand for food, managing and monitoring crop growth and forecasting yield well before harvest are very important. These days, yield assessment, together with monitoring of crop development and growth, is carried out with the help of satellite and remote sensing images. Studies using remote sensing data along with field survey validation have reported high correlations between vegetation indices and yield. With the development of remote sensing techniques, crop detection and monitoring using remote sensing data on regional or global scales have become popular topics in remote sensing applications. Punjab, especially the southern Punjab region, is extremely favourable for wheat production, but measuring the exact amount of wheat production is a tedious job for farmers and workers using traditional ground-based measurements. Remote sensing, however, can provide timely information. In this study, using the Normalized Difference Vegetation Index (NDVI) derived from Landsat satellite images, the yield of wheat has been estimated for the 2013-2014 season for the agricultural area around Bahawalpur. The average yield of the wheat was found to be 35 kg/acre by analysing field survey data. The field survey data are in fair agreement with the NDVI values extracted from Landsat images. A correlation between wheat production (tons) and the number of wheat pixels has also been calculated, and the two are proportional to each other. A strong correlation between NDVI and wheat area was also found (R² = 0.71), which demonstrates the effectiveness of remote sensing tools for crop monitoring and production estimation.
Keywords: landsat, NDVI, remote sensing, satellite images, yield
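The NDVI-based workflow can be sketched as follows, under stated assumptions: NDVI is computed from the red and near-infrared bands as (NIR - Red) / (NIR + Red), a field-level NDVI summary is formed, and yield is related to it with a simple linear regression. The arrays below are synthetic placeholders for the Landsat rasters and field-survey records used in the study.

```python
# Hedged sketch: NDVI from synthetic band data, then a yield-vs-NDVI regression.
import numpy as np

rng = np.random.default_rng(3)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, guarded against divide-by-zero."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

n_fields = 60
base = rng.uniform(0.3, 0.8, size=(n_fields, 1))                 # per-field vegetation vigour
nir = base + rng.normal(scale=0.02, size=(n_fields, 100))        # synthetic NIR pixels per field
red = np.clip(0.25 * (1 - base) + rng.normal(scale=0.02, size=(n_fields, 100)), 0.01, None)
mean_ndvi = ndvi(nir, red).mean(axis=1)                          # one NDVI summary per field

# Synthetic surveyed yields loosely tied to NDVI (placeholder for field-survey data)
yield_kg_per_acre = 20 + 25 * mean_ndvi + rng.normal(scale=2.0, size=n_fields)

# Least-squares fit: yield = a + b * NDVI
b, a = np.polyfit(mean_ndvi, yield_kg_per_acre, deg=1)
pred = a + b * mean_ndvi
ss_res = np.sum((yield_kg_per_acre - pred) ** 2)
ss_tot = np.sum((yield_kg_per_acre - yield_kg_per_acre.mean()) ** 2)
print(f"yield ~ {a:.1f} + {b:.1f} * NDVI, R^2 = {1 - ss_res / ss_tot:.2f}")
```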
Procedia PDF Downloads 335
35274 Management of Theatre with Social and Culture
Authors: Chitsuphang Ungsvanonda
Abstract:
The objective of this research is to study the government's theater management system with regard to planning and operation, and to study how that management responds to changes in its environment. The aim is to derive an appropriate model for developing a theater management system, especially regarding show performances. The research follows a qualitative approach, with interviews of 35 people from both specified and unplanned groups.
Keywords: management, theatre, social, culture
Procedia PDF Downloads 471
35273 Branding Tourism Destinations; The Trending Initiatives for Edifice Image Choices of Foreign Policy
Authors: Mehtab Alam, Mudiarasan Kuppusamy, Puvaneswaran Kunaserkaran
Abstract:
The purpose of this paper is to bridge the gap and establish the relationship between tourism destinations and image branding as a choice of edifice foreign policy. Such options have become a crucial component for individuals interested in leisure and travel activities. The destination management factors have been evaluated and analyzed using primary and secondary data in a mixed-methods approach (a quantitative sample of 384 and 8 qualitative semi-structured interviews at saturation point). The study chose the Environmental Management Accounting (EMA) and Image Restoration (IR) theories, along with a schematic diagram and an analytical framework supported by NVivo software 12, for two locations in Abbottabad, KPK, Pakistan: Shimla Hill and Thandiani. This incorporates the use of the PLS-SEM model for assessing data validity and SPSS for data screening and descriptive statistics. The results show that destination management's promotion of tourism has significantly improved Pakistan's state image. The use of institutional setup, environmental drivers, immigration, security, hospitality, and recreational initiatives in destination management is encouraged. The practical ramifications direct the heads of tourism projects, diplomats, directors, and policymakers to complete destination projects before inviting people to Pakistan. The paper provides a body of knowledge for academic tourism circles on using tourism destinations as brand ambassadors.
Keywords: tourism, management, state image, foreign policy, image branding
Procedia PDF Downloads 69
35272 Management Information System to Help Managers for Providing Decision Making in an Organization
Authors: Ajayi Oluwasola Felix
Abstract:
A management information system (MIS) provides information for managerial activities in an organization. The main purpose of this research is to show that an MIS provides the accurate and timely information necessary to facilitate the decision-making process and enable the organization's planning, control, and operational functions to be carried out effectively. A management information system is basically concerned with processing data into information, which is then communicated to the various departments in an organization for appropriate decision-making. MIS is a subset of the overall planning and control activities, covering the application of humans, technologies, and procedures of the organization. The information system is the mechanism that ensures that information is available to managers in the form they want it and when they need it.
Keywords: Management Information Systems (MIS), information technology, decision-making, MIS in Organizations
Procedia PDF Downloads 557
35271 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore
Authors: Ronal Muresano, Andrea Pagano
Abstract:
Nowadays, mathematical/statistical applications are developed with ever greater complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to be executed faster. In this sense, multicore environments play an important role in improving and optimizing the execution time of these applications. These environments allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communication, data locality, memory sizes (cache and RAM), synchronization, data dependencies in the model, etc. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool developed by the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore environment. All these improvements are done in an automatic and transparent manner with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we have achieved a considerable improvement in execution time: the time has been reduced by around 96% in the best case tested, comparing the original serial version with the automatic parallel version.
Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool
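The SYMBOL tool itself is parallelized with OpenMP; purely as an analogue of the same idea of splitting independent simulation work across cores, the hedged sketch below distributes a toy Monte Carlo loss simulation over processes with Python's multiprocessing. The loss model, chunk size, and worker count are assumptions for illustration and have nothing to do with the SYMBOL methodology.

```python
# Toy parallel Monte Carlo: each worker simulates an independent chunk of scenarios.
import multiprocessing as mp
import numpy as np

def simulate_losses(args):
    """Simulate a chunk of loss scenarios and return their mean (placeholder model)."""
    n_scenarios, seed = args
    rng = np.random.default_rng(seed)
    losses = rng.lognormal(mean=0.0, sigma=1.0, size=n_scenarios)
    return losses.mean()

if __name__ == "__main__":
    n_workers, chunk = 4, 250_000
    with mp.Pool(processes=n_workers) as pool:
        chunk_means = pool.map(simulate_losses, [(chunk, seed) for seed in range(n_workers)])
    print(f"estimated expected loss: {np.mean(chunk_means):.4f}")
```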
Procedia PDF Downloads 370
35270 Geographic Information System for Simulating Air Traffic By Applying Different Multi-Radar Positioning Techniques
Authors: Amara Rafik, Mostefa Belhadj Aissa
Abstract:
Radar data are one of the many data sources used by Air Traffic Management (ATM) systems. These data come from air navigation radar antennas, which intercept signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit them to the ATM system. For greater reliability, these radars are positioned in such a way that their coverage areas overlap, so an aircraft will be detected by at least one of them. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. Therefore, the ATM system must calculate a single position (radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e., a geographical database on the one hand and geographical processing on the other. The objective of this work is to propose a GIS for traffic simulation that reconstructs the evolution of aircraft positions over time from a multi-source radar data set by applying these different techniques.
Keywords: ATM, GIS, radar data, simulation
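One simple multi-radar positioning technique of the kind the abstract alludes to is inverse-variance weighting of the overlapping radars' reports for the same aircraft. The sketch below is a hedged illustration with made-up positions and accuracies, not necessarily one of the techniques evaluated in the paper.

```python
# Fusing three radars' position reports for one aircraft into a single radar track.
import numpy as np

# Position reports (x, y in km) from three overlapping radars, with assumed
# measurement standard deviations (km) describing each radar's accuracy.
reports = np.array([[102.4, 58.1],
                    [102.9, 57.6],
                    [101.8, 58.4]])
sigmas = np.array([0.3, 0.5, 0.4])

weights = 1.0 / sigmas**2                      # more accurate radars weigh more
weights /= weights.sum()
track = weights @ reports                      # fused radar track position
print(f"fused track position: x={track[0]:.2f} km, y={track[1]:.2f} km")
```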
Procedia PDF Downloads 119
35269 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings
Authors: Jude K. Safo
Abstract:
Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use-cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we observe in Large Language Models. Notable attempts by TransE, TransR and other prominent industry standards have shown a peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited in scope to next-node/link predictions. Traditional linear methods like Tucker, CP, PARAFAC and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mappings between concepts in KG space and GE space that preserve cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures. We demonstrate the performance this model achieves on the WN18 benchmark. This model does not rely on Large Language Models (LLMs), though the applications are certainly relevant here as well.
Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics
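For context on the baselines named above, TransE scores a triple (h, r, t) by how close h + r lies to t in the embedding space. The hedged sketch below shows only that scoring idea on random, untrained placeholder embeddings with invented entity and relation names; it is not the paper's topological framework.

```python
# TransE-style scoring on random placeholder embeddings (untrained, illustrative only).
import numpy as np

rng = np.random.default_rng(5)
dim = 16
entities = {name: rng.normal(size=dim) for name in ["supplier_a", "part_x", "warehouse_1"]}
relations = {name: rng.normal(size=dim) for name in ["produces", "stored_in"]}

def transe_score(head, relation, tail):
    """Lower score = more plausible triple under the TransE assumption h + r ≈ t."""
    return np.linalg.norm(entities[head] + relations[relation] - entities[tail])

# Rank candidate tails for (supplier_a, produces, ?)
candidates = ["part_x", "warehouse_1"]
ranked = sorted(candidates, key=lambda t: transe_score("supplier_a", "produces", t))
print(ranked)
```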
Procedia PDF Downloads 68
35268 Towards an Environmental Knowledge System in Water Management
Authors: Mareike Dornhoefer, Madjid Fathi
Abstract:
Water supply and water quality are key problems of mankind at the moment and, due to increasing population, in the future. Management disciplines like water, environment and quality management therefore need to interact closely to establish a high level of water quality and to guarantee water supply in all parts of the world. Groundwater remediation is one aspect of this process. From a knowledge management perspective, it is only possible to solve complex ecological or environmental problems if different factors, the expert knowledge of various stakeholders and formal regulations regarding water, waste or chemical management are interconnected in the form of a knowledge base. In general, knowledge management focuses on the processes of gathering and representing existing and new knowledge in a way that allows for inference or deduction of knowledge, e.g., in a situation where a problem solution or decision support is required. A knowledge base is not a mere data repository, but a key element in a knowledge-based system, providing inference mechanisms to deduce further knowledge from existing facts. In consequence, this knowledge provides decision support. The given paper introduces an environmental knowledge system in water management. The proposed environmental knowledge system is part of a research concept called Green Knowledge Management. It applies semantic technologies or concepts such as ontologies or linked open data to interconnect different data and information sources about environmental aspects, in this case water quality, as well as background material enriching an established knowledge base. Examples of the aforementioned ecological or environmental factors threatening water quality are, among others, industrial pollution (e.g., leakage of chemicals), environmental changes (e.g., rise in temperature) or floods, where all kinds of waste are merged and transferred into natural water environments. Water quality is usually determined by measuring different indicators (e.g., chemical or biological), which are gathered with the help of laboratory testing, continuous monitoring equipment or other measuring processes. During all of these processes, data are gathered and stored in different databases. Meanwhile, the knowledge base needs to be established by interconnecting data from these different data sources and enriching their semantics. Experts may add their knowledge or experiences of previous incidents or influencing factors. Querying or inference mechanisms are then applied to deduce coherence between indicators, predictive developments or environmental threats. Relevant processes or steps of action may be modeled in the form of a rule-based approach. Overall, the environmental knowledge system supports the interconnection of information and the addition of semantics to create environmental knowledge about the water environment, supply chain and quality. The proposed concept itself is a holistic approach that links to associated disciplines like environmental and quality management. Quality indicators and quality management steps need to be considered, e.g., for the process and inference layers of the environmental knowledge system, thus integrating the aforementioned management disciplines in one water management application.
Keywords: water quality, environmental knowledge system, green knowledge management, semantic technologies, quality management
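A hedged sketch of the semantic-technology building block: a few water-quality facts are stored as RDF triples with rdflib and queried with SPARQL to flag samples exceeding a threshold. The namespace, property names, and the 50 mg/l nitrate limit are assumptions invented for the example, not part of the proposed system.

```python
# Tiny RDF graph of water-quality measurements plus a SPARQL threshold query.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/water#")   # hypothetical namespace
g = Graph()
g.bind("ex", EX)
g.add((EX.station_1, RDF.type, EX.MonitoringStation))
g.add((EX.sample_42, EX.measuredAt, EX.station_1))
g.add((EX.sample_42, EX.nitrateMgPerL, Literal(58.0, datatype=XSD.double)))
g.add((EX.sample_43, EX.measuredAt, EX.station_1))
g.add((EX.sample_43, EX.nitrateMgPerL, Literal(12.5, datatype=XSD.double)))

# Flag samples above an assumed 50 mg/l nitrate threshold
results = g.query("""
PREFIX ex: <http://example.org/water#>
SELECT ?sample ?value WHERE {
  ?sample ex:nitrateMgPerL ?value .
  FILTER (?value > 50)
}
""")
for row in results:
    print(row.sample, row.value)
```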
Procedia PDF Downloads 221
35267 The Impact of other Comprehensive Income Disclosure and Corporate Governance on Earnings Management and Firm Performance
Authors: Yan Wang, Yuan George Shan
Abstract:
This study examines whether earnings management reduces firm performance and how other comprehensive income (OCI) disclosure and strong corporate governance restrain earnings management. Using a data set comprising 6,260 firm-year observations from companies listed on the Shanghai and Shenzhen Stock Exchanges during 2009–2015, the results indicate that OCI disclosure generally improves firm performance, while earnings management lowers it. The study also finds that OCI disclosure and corporate governance are complementary in restraining earnings manipulation and promote firm performance. The findings are relevant to policy-makers and regulators, assisting them in evaluating the consequences of the convergence of Chinese Accounting Standards with International Financial Reporting Standards.
Keywords: other comprehensive income, corporate governance, earnings management, firm performance, China
Procedia PDF Downloads 232
35266 The Affect of Total Quality Management on Firm's Innovation Performance: A Literature Review
Authors: Omer Akkaya, Nurullah Ekmekcı, Muammer Zerenler
Abstract:
For businesses, innovation means a new product or service and sometimes a new implementation. Total Quality Management is a management philosophy which focuses on customer, process and system. There is a certain relationship between the principles of Total Quality Management and innovation performance. The main aim of this study is to show how the implementation and principles of Total Quality Management (TQM) affect a firm's innovation performance. The paper also discusses the positive and negative effects of Total Quality Management on innovation performance and provides some examples.
Keywords: innovation, innovation types, total quality management, principles of total quality management
Procedia PDF Downloads 631
35265 Analyzing Tools and Techniques for Classification In Educational Data Mining: A Survey
Authors: D. I. George Amalarethinam, A. Emima
Abstract:
Educational Data Mining (EDM) is one of the newest topics to emerge in recent years, and it is concerned with developing methods for analyzing various types of data gathered in educational settings. EDM methods and techniques, together with machine learning algorithms, are used to extract meaningful and usable information from huge databases. For scientists and researchers, realistic applications of machine learning in the EDM sector offer new frontiers and present new problems. One of the most important research areas in EDM is predicting student success. Prediction algorithms and techniques must be developed to forecast students' performance, which helps tutors and institutions boost the level of student performance. This paper examines various classification techniques used in prediction methods and data mining tools used in EDM.
Keywords: classification technique, data mining, EDM methods, prediction methods
Procedia PDF Downloads 118
35264 New Technologies in Corporate Finance Management in the Digital Economy: Case of Kyrgyzstan
Authors: Marat Kozhomberdiev
Abstract:
The research will investigate the modern corporate finance management technologies currently used in the era of digitalization of the global economy and the degree to which financial institutions in Kyrgyzstan are utilizing these new technologies in the field of corporate finance management. The main purpose of the research is to reveal the role of financial management technologies such as joint service centers, intercompany banks, and specialized payment centers in this developing country. In particular, the applicability of automated corporate finance management systems such as enterprise resource planning (ERP) and treasury management systems (TMS) will be analyzed. Moreover, the research will investigate the role of cloud accounting systems in corporate finance management in Kyrgyz banks and whether they have any impact on improving corporate finance management. The study will collect data by surveying three banks in Kyrgyzstan, namely Mol-Bulak, RSK, and KICB. The banks were chosen based on their ownership types: state banks, private banks with local authorized capital, and private banks with international capital. Regression analysis will be used to reveal the correlation between bank ownership and the use of new financial management technologies. The research will provide policy recommendations to both private and state banks on developing strategies for switching to and utilizing modern corporate finance management technologies in their daily operations.
Keywords: digital economy, corporate finance, digital environment, digital technologies, cloud technologies, financial management
Procedia PDF Downloads 71
35263 Graph-Oriented Summary for Optimized Resource Description Framework Graphs Streams Processing
Authors: Amadou Fall Dia, Maurras Ulbricht Togbe, Aliou Boly, Zakia Kazi Aoul, Elisabeth Metais
Abstract:
Existing RDF (Resource Description Framework) Stream Processing (RSP) systems allow continuous processing of RDF data issued from different application domains such as weather stations measuring phenomena, geolocation, IoT applications, drinking water distribution management, and so on. However, the processing window often expires before the entire session finishes, and RSP systems immediately delete data streams after each processed window. Such a mechanism does not allow optimized exploitation of the RDF data streams, as the most relevant and pertinent information is often not used in due time and is almost impossible to exploit for further analyses. It would be better to keep the most informative part of the data within streams while minimizing the memory storage space. In this work, we propose an RDF graph summarization system based on explicitly and implicitly expressed needs through three main approaches: (1) an approach for user queries (SPARQL) that extracts their needs and groups them into a more global query, (2) an extension of the closeness centrality measure from Social Network Analysis (SNA) to determine the most informative parts of the graph, and (3) an RDF graph summarization technique combining the extracted user query needs and the extended centrality measure. Experiments and evaluations show efficient results in terms of memory storage space and the most expected approximate query results on summarized graphs compared to the source ones.
Keywords: centrality measures, RDF graphs summary, RDF graphs stream, SPARQL query
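A hedged sketch of the centrality-based selection step described above: treat the RDF triples as a graph, compute closeness centrality with networkx, and keep the most central nodes as the "most informative" part of the summary. The tiny graph below is an invented placeholder, and this uses plain closeness centrality, not the paper's extended measure.

```python
# Closeness-centrality-based node selection on a toy RDF-like graph.
import networkx as nx

triples = [
    ("station_1", "locatedIn", "district_A"),
    ("station_1", "reports", "measure_42"),
    ("measure_42", "hasParameter", "nitrate"),
    ("measure_42", "hasValue", "58.0"),
    ("station_2", "locatedIn", "district_A"),
]

G = nx.Graph()
for s, p, o in triples:
    G.add_edge(s, o, predicate=p)          # undirected view of the RDF graph

centrality = nx.closeness_centrality(G)
top_nodes = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("nodes kept in the summary:", top_nodes)
```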
Procedia PDF Downloads 203
35262 myITLab as an Implementation Instance of Distance Education Technologies
Authors: Leila Goosen
Abstract:
The research problem reported on in this paper relates to improving success in Computer Science and Information Technology subjects where students are learning applications, especially when teaching occurs in a distance education context. An investigation was launched in order to address students’ struggles with applications, and improve their assessment in such subjects. Some of the main arguments presented centre on formulating and situating significant concepts within an appropriate conceptual framework. The paper explores the experiences and perceptions of computing instructors, teaching assistants, students and higher education institutions on how they are empowered by using technologies such as myITLab. They also share how they are working with the available features to successfully teach applications to their students. The data collection methodology used is then described. The paper includes discussions on how myITLab empowers instructors, teaching assistants, students and higher education institutions. Conclusions are presented on the way in which this paper could make an original and significant contribution to the promotion and development of knowledge in fields related to successfully teaching applications for student learning, including in a distance education context. The paper thus provides a forum for practitioners to highlight and discuss insights and successes, as well as identify new technical and organisational challenges, lessons and concerns regarding practical activities related to myITLab as an implementation instance of distance education technologies.
Keywords: distance, education, myITLab, technologies
Procedia PDF Downloads 360