Search results for: activity-based benefit assessment approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19255

14665 Regulatory Frameworks and Bank Failure Prevention in South Africa: Assessing Effectiveness and Enhancing Resilience

Authors: Princess Ncube

Abstract:

In the context of South Africa's banking sector, the prevention of bank failures is of paramount importance to ensure financial stability and economic growth. This paper focuses on the role of regulatory frameworks in safeguarding the resilience of South African banks and mitigating the risks of failures. It aims to assess the effectiveness of existing regulatory measures and proposes strategies to enhance the resilience of financial institutions in the country. The paper begins by examining the specific regulatory frameworks in place in South Africa, including capital adequacy requirements, stress testing methodologies, risk management guidelines, and supervisory practices. It delves into the evolution of these measures in response to lessons learned from past financial crises and their relevance in the unique South African banking landscape. Drawing on empirical evidence and case studies specific to South Africa, this paper evaluates the effectiveness of regulatory frameworks in preventing bank failures within the country. It analyses the impact of these frameworks on crucial aspects such as early detection of distress signals, improvements in risk management practices, and advancements in corporate governance within South African financial institutions. Additionally, it explores the interplay between regulatory frameworks and the specific economic environment of South Africa, including the role of macroprudential policies in preventing systemic risks. Based on the assessment, this paper proposes recommendations to strengthen regulatory frameworks and enhance their effectiveness in bank failure prevention in South Africa. It explores avenues for refining existing regulations to align capital requirements with the risk profiles of South African banks, enhancing stress testing methodologies to capture specific vulnerabilities, and fostering better coordination among regulatory authorities within the country. Furthermore, it examines the potential benefits of adopting innovative approaches, such as leveraging technology and data analytics, to improve risk assessment and supervision in the South African banking sector.

Keywords: banks, resolution, liquidity, regulation

Procedia PDF Downloads 75
14664 Augmenting History: Case Study Measuring Motivation of Students Using Augmented Reality Apps in History Classes

Authors: Kevin. S. Badni

Abstract:

Due to the rapid advances in the use of information technology and students’ familiarity with technology, learning styles in higher education are being reshaped. One of the technology developments that has gained considerable attention in recent years is Augmented Reality (AR), where technology is used to combine overlays of digital data on physical real-world settings. While AR is being heavily promoted for entertainment by mobile phone manufacturers, it has had little adoption in higher education due to the required upfront investment that an instructor needs to undertake in creating relevant AR applications. This paper discusses a case study that uses a low upfront development approach and examines the impact on generation-Z students’ motivation whilst studying design history over a four-semester period. Even though the upfront investment in creating the AR support was minimal, the results showed a noticeable increase in student motivation. The approach used in this paper can be easily transferred to other disciplines and other areas of design education.

Keywords: augmented reality, history, motivation, technology

Procedia PDF Downloads 157
14663 Competitiveness and Pricing Policy Assessment for Resilience Surface Access System at Airports

Authors: Dimitrios J. Dimitriou

Abstract:

Air transport is growing very fast worldwide, and many changes have taken place in planning, management and decision-making processes. Given the complexity of airport operation, the best use of existing capacity is the key driver of efficiency and productivity. This paper presents an evaluation framework for ground access at airports, using a set of mode choice indicators that provide key messages about an airport’s ground access performance. The application presents results for a sample of 12 European airports, illustrating recommendations to define policy and improve service for the air transport access chain.

Keywords: airport ground access, air transport chain, airport access performance, airport policy

Procedia PDF Downloads 361
14662 An Interdisciplinary Maturity Model for Accompanying Sustainable Digital Transformation Processes in a Smart Residential Quarter

Authors: Wesley Preßler, Lucie Schmidt

Abstract:

Digital transformation is playing an increasingly important role in the development of smart residential quarters. In order to accompany and steer this process and ultimately make the success of the transformation efforts measurable, it is helpful to use an appropriate maturity model. However, conventional maturity models for digital transformation focus primarily on the evaluation of processes and neglect the information and power imbalances between the stakeholders, which affects the validity of the results. The Multi-Generation Smart Community (mGeSCo) research project is developing an interdisciplinary maturity model that integrates the dimensions of digital literacy, interpretive patterns, and technology acceptance to address this gap. As part of the mGeSCo project, the technological development of selected dimensions in the Smart Quarter Jena-Lobeda (Germany) is being investigated. A specific maturity model, based on Cohen's Smart Cities Wheel, evaluates the central dimensions Working, Living, Housing and Caring. To improve the reliability and relevance of the maturity assessment, the factors Digital Literacy, Interpretive Patterns and Technology Acceptance are integrated into the developed model. The digital literacy dimension examines stakeholders' skills in using digital technologies, which influence their perception and assessment of technological maturity. Digital literacy is measured by means of surveys, interviews, and participant observation, using the European Commission's Digital Competence Framework (DigComp) as a basis. Interpretations of digital technologies provide information about how individuals perceive technologies and ascribe meaning to them. However, these are not mere assessments, prejudices, or stereotyped perceptions but collective patterns, rules, attributions of meaning and the cultural repertoire that leads to these opinions and attitudes. Understanding these interpretations helps in assessing the overarching readiness of stakeholders to digitally transform their neighborhood. This involves examining people's attitudes, beliefs, and values about technology adoption, as well as their perceptions of the benefits and risks associated with digital tools. These insights provide important data for a holistic view and inform the steps needed to prepare individuals in the neighborhood for a digital transformation. Technology acceptance, another crucial factor for successful digital transformation, examines the willingness of individuals to adopt and use new technologies. Surveys or questionnaires based on Davis' Technology Acceptance Model can be used to complement interpretive patterns to measure neighborhood acceptance of digital technologies. Integrating the dimensions of digital literacy, interpretive patterns and technology acceptance enables the development of a roadmap with clear prerequisites for initiating a digital transformation process in the neighborhood. During the process, maturity is measured at different points in time and compared with changes in the aforementioned dimensions to ensure sustainable transformation. Participation, co-creation, and co-production are essential concepts for a successful and inclusive digital transformation in the neighborhood context. This interdisciplinary maturity model helps to improve the assessment and monitoring of sustainable digital transformation processes in smart residential quarters.
It enables a more comprehensive recording of the factors that influence the success of such processes and supports the development of targeted measures to promote digital transformation in the neighborhood context.

Keywords: digital transformation, interdisciplinary, maturity model, neighborhood

Procedia PDF Downloads 68
14661 Pharmacogenetic Analysis of Inter-Ethnic Variability in the Uptake Transporter SLCO1B1 Gene in Colombian, Mozambican, and Portuguese Populations

Authors: Mulata Haile Nega, Derebew Fikadu Berhe, Vera Ribeiro Marques

Abstract:

There are no epidemiologic data on this gene polymorphism in several countries. Therefore, this study aimed to assess the genotype and allele frequencies of the gene variant in three countries. The study involved healthy individuals from Colombia, Mozambique, and Portugal. Genomic DNA was isolated from blood samples using the QIAamp DNA Extraction Kit (Qiagen). The isolated DNA was genotyped using Polymerase Chain Reaction (PCR) - Restriction Fragment Length Polymorphism. Microstat and GraphPad QuickCalcs software were used for the Chi-square test and the evaluation of Hardy-Weinberg equilibrium, respectively. A total of 181 individuals’ blood samples were analyzed. Overall, the TT genotype was the most frequent (74.0%) and CC the least frequent (7.8%). Country-wise, the genotypic frequencies were: Colombia 47 (70.2%) TT, 12 (17.9%) TC and 8 (11.9%) CC; Mozambique 47 (88.7%) TT, 5 (9.4%) TC and 1 (1.9%) CC; and Portugal 40 (65.6%) TT, 16 (26.2%) TC and 5 (8.2%) CC. The reference (T) allele was most frequent among Mozambicans (93.4%) compared with Colombians (79.1%) and Portuguese (78.7%). Mozambicans showed statistically significant genotypic and allelic frequency differences compared to Colombians (p<0.01) and Portuguese (p<0.01). Overall and in each country, the CC genotype was the least frequent, though relatively more common in the Colombian and Portuguese populations. This finding may imply variability in the risk-benefit profile of statins associated with the CC genotype among these populations, which needs further investigation.
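
For illustration, a minimal sketch of the kind of chi-square comparison reported above, using the genotype counts quoted in the abstract; scipy here stands in for the Microstat/GraphPad software actually used by the authors.

```python
# Chi-square comparison of SLCO1B1 c.521T>C genotype counts between populations.
# Counts are taken from the abstract; scipy is only a stand-in for the software
# used in the study (Microstat / GraphPad). Small cell counts (e.g. 1 CC in
# Mozambique) mean the p-values should be read with care.
from scipy.stats import chi2_contingency

genotype_counts = {            # rows: [TT, TC, CC]
    "Colombia":   [47, 12, 8],
    "Mozambique": [47, 5, 1],
    "Portugal":   [40, 16, 5],
}

def compare(pop_a, pop_b):
    table = [genotype_counts[pop_a], genotype_counts[pop_b]]
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{pop_a} vs {pop_b}: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")

compare("Mozambique", "Colombia")
compare("Mozambique", "Portugal")
compare("Colombia", "Portugal")

# Allele counts derived from genotypes: T = 2*TT + TC, C = 2*CC + TC
for pop, (tt, tc, cc) in genotype_counts.items():
    t, c = 2 * tt + tc, 2 * cc + tc
    print(f"{pop}: T allele frequency = {t / (t + c):.3f}")
```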

Keywords: c.521T>C, polymorphism, SLCO1B1, SNP, statins

Procedia PDF Downloads 117
14660 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes and atmospheric effects that affect people's lives and properties. Generally, the severity of a fire is quantified with the Normalized Burn Ratio (NBR) index. Conventionally, this is done manually by comparing pre-fire and post-fire images: the bitemporal difference of the preprocessed satellite images gives the dNBR, and the area is then classified as either unburnt (dNBR<0.1) or burnt (dNBR>=0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using classification levels proposed by the USGS, comprising seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to support regular burnt-area severity mapping with a medium-spatial-resolution sensor (10-20 m). The tool uses machine learning classification techniques to identify burnt areas using the NBR and to classify their severity over a user-selected extent and period automatically. Cloud coverage is one of the biggest concerns when fire severity mapping is performed; in WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool, which includes a Graphical User Interface (GUI) to make it user-friendly. The advantage of this tool is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate its performance. The Blue Mountains National Park forest affected by the Australian fire season between 2019 and 2020 is used to describe the workflow of the WWSAT. At this site, more than 7809 km2 of burnt area was detected using Sentinel-2 data, with an error below 6.5% when compared with the area measured in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km2 had burned, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These spots can also be detected through visual inspection, made possible by the cloud-free images generated by WWSAT. This tool is cost-effective in calculating the burnt area since the satellite images are free and the cost of field surveys is avoided.
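
For illustration, a minimal sketch of the dNBR workflow described above, assuming the Earth Engine Python client; the collection ID, band choices (B8 as NIR, B12 as SWIR), dates, region and cloud filter are illustrative assumptions, not the exact WWSAT implementation.

```python
# Sketch of the NBR / dNBR workflow described above using the Earth Engine
# Python API. Collection ID, bands, dates and region are assumptions, not
# the exact WWSAT code.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([150.0, -34.0, 150.8, -33.3])   # hypothetical extent

def cloud_free_composite(start, end):
    """Median composite of low-cloud Sentinel-2 SR scenes over the region."""
    return (ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
            .filterBounds(region)
            .filterDate(start, end)
            .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
            .median())

def nbr(image):
    # NBR = (NIR - SWIR) / (NIR + SWIR)
    return image.normalizedDifference(["B8", "B12"]).rename("NBR")

pre_fire = nbr(cloud_free_composite("2019-10-01", "2019-10-31"))
post_fire = nbr(cloud_free_composite("2020-02-01", "2020-02-29"))

dnbr = pre_fire.subtract(post_fire).rename("dNBR")

# Burnt / unburnt split at dNBR = 0.1, as in the text; the seven USGS severity
# classes would be added with further thresholds on dnbr.
burnt = dnbr.gte(0.1).selfMask()
```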

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 222
14659 The Economic Impact of Mediation: An Analysis in Time of Crisis

Authors: C. M. Cebola, V. H. Ferreira

Abstract:

In the past decade, mediation has been legally implemented in European legal systems, especially after the publication by the European Union of Directive 2008/52/EC on certain aspects of mediation in civil and commercial matters. Developments in international trade and globalization in this new century have led to an increase in the number of litigations, often cross-border, and the courts have failed to respond adequately. We do not advocate that mediation should be promoted as the solution for all justice problems, but as a means with its own specificities that the parties may choose as the best way to resolve their disputes. Thus, the implementation of mediation should be based on the advantages of its application. From the economic point of view, competitive negotiation can generate negative external effects in social terms. A solution reached in a court of law is not always the most efficient one when all elements of society are considered (economic social benefit). On the other hand, the administration of justice adds, in economic terms, transaction costs that can be mitigated by the application of other forms of conflict resolution, such as mediation. In this paper, the economic benefits of mediation are analysed in the light of various studies on the functioning of justice. Several theoretical arguments are confronted with empirical studies to demonstrate that mediation has significant positive economic effects. The objective is to contribute to the dissemination of mediation among companies and citizens, but also to demonstrate the cost to governments and states of the still limited use of mediation, particularly in the current economic crisis, and to propose actions to develop the application of mediation.

Keywords: economic impact, litigation costs, mediation, solutions

Procedia PDF Downloads 273
14658 Advancing Urban Sustainability through Data-Driven Machine Learning Solutions

Authors: Nasim Eslamirad, Mahdi Rasoulinezhad, Francesco De Luca, Sadok Ben Yahia, Kimmo Sakari Lylykangas, Francesco Pilla

Abstract:

With ongoing urbanization, cities face increasing environmental challenges that impact human well-being. To tackle these issues, data-driven approaches to urban analysis have gained prominence, leveraging urban data to promote sustainability. Integrating machine learning techniques enables researchers to analyze and predict complex environmental phenomena such as Urban Heat Island (UHI) occurrences in urban areas. This paper demonstrates a data-driven approach that applies interpretable machine learning algorithms, together with interpretability techniques, to conduct comprehensive data analyses for sustainable urban design. The developed framework and algorithms are demonstrated for Tallinn, Estonia, to develop sustainable urban strategies to mitigate urban heat waves. Geospatial data, preprocessed and labeled with UHI levels, are used to train various ML models, with Logistic Regression emerging as the best-performing model based on evaluation metrics. The fitted model yields a mathematical equation separating areas with and without UHI effects, providing insights into UHI occurrences based on buildings and urban features. The derived formula highlights the importance of building volume, height, area, and shape length in creating an urban environment with UHI impact. The data-driven approach and derived equation inform mitigation strategies and sustainable urban development in Tallinn and offer valuable guidance for other locations with varying climates.
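
A minimal sketch of the interpretable logistic-regression step described above; the dataset file and column names are hypothetical placeholders, and only the feature set (building volume, height, area, shape length) mirrors the abstract.

```python
# Sketch of the interpretable logistic-regression step described above.
# The CSV file and column names are hypothetical; only the feature set
# (volume, height, area, shape length) follows the abstract.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("tallinn_buildings_uhi.csv")            # hypothetical dataset
features = ["building_volume", "building_height", "building_area", "shape_length"]
X, y = df[features], df["uhi_label"]                     # 1 = UHI, 0 = no UHI

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))

# The fitted coefficients give the kind of "mathematical equation" mentioned
# above: log-odds(UHI) = intercept + sum(coef_i * feature_i)
for name, coef in zip(features, model.coef_[0]):
    print(f"{name}: {coef:+.3f}")
print("intercept:", model.intercept_[0])
```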

Keywords: data-driven approach, machine learning transparent models, interpretable machine learning models, urban heat island effect

Procedia PDF Downloads 22
14657 Customer Churn Analysis in Telecommunication Industry Using Data Mining Approach

Authors: Burcu Oralhan, Zeki Oralhan, Nilsun Sariyer, Kumru Uyar

Abstract:

Data mining has become increasingly important in recent years and has found a wide range of applications. Data mining is the process of finding hidden and unknown patterns in big data. One of its applied fields is Customer Relationship Management (CRM). Understanding the relationships between products and customers is crucial for every business. Customer Relationship Management is an approach that focuses on developing customer relationships, retaining customers, and increasing customer satisfaction. In this study, we apply data mining methods to customer relationship management in the telecommunication industry. The study aims to determine the profile of customers who are likely to leave the system, and to develop marketing strategies and customized campaigns for these customers. The data are clustered and classification techniques are applied to identify churners. As a result of this study, we obtain knowledge from the international telecommunication industry and contribute to the understanding and development of this subject within Customer Relationship Management.
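
A minimal sketch of a churn-analysis pipeline of the kind described above; the dataset, column names, clustering step and classifier are illustrative assumptions, not the authors' actual data or models.

```python
# Sketch of a churn-analysis pipeline: cluster customers into segments, then
# classify churners. Dataset, columns and model choices are illustrative.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("telecom_customers.csv")        # hypothetical CRM extract
features = ["tenure_months", "monthly_charges", "support_calls", "data_usage_gb"]
X, y = df[features], df["churned"]               # churned: 1 = left, 0 = stayed

# Segmentation step: group customers into behavioural clusters
df["segment"] = KMeans(n_clusters=4, random_state=0, n_init=10).fit_predict(X)

X_tr, X_te, y_tr, y_te = train_test_split(
    df[features + ["segment"]], y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```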

Keywords: customer churn analysis, customer relationship management, data mining, telecommunication industry

Procedia PDF Downloads 303
14656 Criteria Analysis of Residential Location Preferences: An Urban Dwellers’ Perspective

Authors: Arati Siddharth Petkar, Joel E. M. Macwan

Abstract:

Preferences for residential location are of a diverse nature. Primarily, they are based on the socio-economic, socio-cultural and socio-demographic characteristics of the household. They also depend on the character and growth potential of different areas in a city. In the present study, various criteria affecting residential location preferences from the urban dwellers’ perspective have been analyzed. A household survey was conducted in two parts: an existing buyers’ survey and a future buyers’ survey. The analysis reveals that workplace location is the most governing criterion in deciding residential location for the majority of urban dwellers. For analyzing the relative importance of the varied criteria, the Analytical Hierarchy Process (AHP) approach has been used. The suggested approach will be helpful for urban planners, decision makers and developers when designating a new residential area or redeveloping an existing one.
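
A minimal sketch of the AHP priority computation used in such an analysis; the three criteria and the pairwise judgements below are hypothetical and serve only to illustrate the principal-eigenvector method and the consistency check.

```python
# AHP priority-vector sketch: criteria and pairwise judgements are hypothetical.
import numpy as np

criteria = ["workplace location", "area character", "growth potential"]
# A[i, j]: how much more important criterion i is than criterion j (Saaty scale)
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check (random index RI = 0.58 for a 3x3 matrix)
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58

for crit, w in zip(criteria, weights):
    print(f"{crit}: {w:.3f}")
print(f"consistency ratio: {cr:.3f} (acceptable if < 0.10)")
```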

Keywords: analytical hierarchy process (AHP), household, preferences, residential location preferences, residential land use, urban dwellers

Procedia PDF Downloads 198
14655 Towards a Measuring Tool to Encourage Knowledge Sharing in Emerging Knowledge Organizations: The Who, the What and the How

Authors: Rachel Barker

Abstract:

The exponential velocity of the truly knowledge-intensive world today has increasingly bombarded organizations with unfathomable challenges. Hence organizations are introduced to strange lexicons of descriptors belonging to a new paradigm of who, what and how knowledge at individual and organizational levels should be managed. Although organizational knowledge has been recognized as a valuable intangible resource that holds the key to competitive advantage, little progress has been made in understanding how knowledge sharing at the individual level could benefit knowledge use at the collective level to ensure added value. The research problem is that little research exists to measure knowledge sharing through a multi-layered structure of ideas with, at its foundation, philosophical assumptions to support presuppositions and commitment, which requires actual findings from measured variables to confirm observed and expected events. The purpose of this paper is to address this problem by presenting a theoretical approach to measure knowledge sharing in emerging knowledge organizations. The research question arises from the fact that, despite the competitive necessity of becoming a knowledge-based organization, leaders have found it difficult to transform their organizations due to a lack of knowledge on who, what and how it should be done. The main premise of this research is based on the challenge for knowledge leaders to develop an organizational culture conducive to the sharing of knowledge and where learning becomes the norm. The theoretical constructs were derived from and based on the three components of knowledge management theory, namely the technical, communication and human components, and it is suggested that this knowledge infrastructure could ensure effective management. While it is realised that it might be somewhat problematic to implement and measure all relevant concepts, this paper presents the effect of eight critical success factors (CSFs), namely: organizational strategy, organizational culture, systems and infrastructure, intellectual capital, knowledge integration, organizational learning, motivation/performance measures and innovation. These CSFs were identified based on a comprehensive literature review of existing research and tested in a new framework adapted from the four perspectives of the balanced scorecard (BSC). Based on these CSFs and their items, an instrument was designed and tested among managers and employees of a purposefully selected engineering company in South Africa that relies on knowledge sharing to ensure its competitive advantage. Rigorous pretesting through personal interviews with executives and a number of academics took place to validate the instrument, improve the quality of items and correct the wording of issues. Through analysis of the surveys collected, this research empirically models and uncovers key aspects of these dimensions based on the CSFs. Reliability of the instrument was calculated using Cronbach's alpha for the two sections of the instrument, at the organizational and individual levels. The construct validity was confirmed using factor analysis. The impact of the results was tested using structural equation modelling and proved to be a basis for implementing and understanding the competitive predisposition of the organization as it enters the process of knowledge management. In addition, the organization realised the importance of consolidating its knowledge assets to create value that is sustainable over time.
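
A minimal sketch of the Cronbach's alpha reliability check mentioned above; the item scores are hypothetical Likert responses for one CSF scale, not the study's survey data.

```python
# Cronbach's alpha for one scale: alpha = k/(k-1) * (1 - sum(item variances) / total variance).
# The responses below are hypothetical 5-point Likert scores (respondents x items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```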

Keywords: innovation, intellectual capital, knowledge sharing, performance measures

Procedia PDF Downloads 183
14654 Hermeneutical Attitudes to Islamic Art

Authors: Mohammad Hasan Kakizadeh

Abstract:

Philosophical hermeneutics, given its characteristics, can be applied to the analysis of Islamic art. Hermeneutical approaches to Islamic art can be of two types. The first regards "Islamic art" as an art of analogies, metaphors and mysteries which, using symbols (Nmvdgarha), tries to express a spiritual or religious idea and thereby demonstrate the truth of Islam. The second holds that "art" is basically inconsistent with interpretations of "sharia" (Islamic law) that do not consider it a way to recognize and praise God and his creation, and that "art" is therefore at best a tool for moral reform or knowledge of the self (nafs). Of these two efforts, the first is modern and was of course preceded by the second. The first attempt, however, sometimes forgets that in the early centuries AH, thinkers reflecting on the nature of the arts, given their hermeneutic commitments, could neither resist the assaults on "art" in general nor fully legitimize particular "art" practices. Religion and art, at that stage of their history, were so united with each other that distinguishing them was hardly possible. With the rise of the monotheistic religions and the abandonment of pagan religions, however, renewing the bond between religion and art became a difficult problem. Much of the effort of Muslim scholars has focused on restoring legitimacy to art, and such attempts cannot succeed without a hermeneutic approach to "art". The finding and conclusion of this study is that a hermeneutic approach to Islamic art, whether Mshrvsazanh, Mnakavanh, Bazsazanh or deconstructive, cannot ignore the fact that Islamic art has been shaped by Mabdaltbyhay.

Keywords: art, Islamic art, hermeneutics, religion

Procedia PDF Downloads 359
14653 Model Based Simulation Approach to a 14-Dof Car Model Using Matlab/Simulink

Authors: Ishit Sheth, Chandrasekhar Jinendran, Chinmaya Ranjan Sahu

Abstract:

A fourteen degree-of-freedom (DOF) ride and handling mathematical model is developed for a car using the generalized Boltzmann-Hamel equation, which creates a basis for the design of a ride and handling controller. The mathematical model yields equations of motion for non-holonomic constrained systems in quasi-coordinates. The governing differential equations integrate the ride and handling control of the car. A model-based systems engineering approach is implemented for simulation using MATLAB/Simulink; the vehicle’s response in the different DOFs is examined and later validated using commercial software (ADAMS). This manuscript involves a detailed derivation of the full car vehicle model, which provides the response in longitudinal, lateral and yaw motion to demonstrate the advantages of the developed model over existing dynamic models. The dynamic behaviour of the developed ride and handling model is simulated for different road conditions.
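
For illustration, a minimal sketch of the model-based simulation idea in Python rather than MATLAB/Simulink, using a simplified 2-DOF quarter-car ride model instead of the full 14-DOF model; all parameter values are illustrative.

```python
# Simplified 2-DOF quarter-car ride model (not the 14-DOF model of the paper),
# excited by a step road input; parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

ms, mu = 300.0, 40.0          # sprung / unsprung mass [kg]
ks, kt = 20_000.0, 180_000.0  # suspension / tyre stiffness [N/m]
cs = 1_500.0                  # suspension damping [N s/m]

def road(t):
    return 0.05 if t > 1.0 else 0.0       # 5 cm step bump at t = 1 s

def rhs(t, x):
    zs, vs, zu, vu = x                    # body and wheel displacement/velocity
    f_susp = ks * (zu - zs) + cs * (vu - vs)
    f_tyre = kt * (road(t) - zu)
    return [vs, f_susp / ms, vu, (f_tyre - f_susp) / mu]

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
print("peak body displacement [m]:", sol.y[0].max())
```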

Keywords: full vehicle model, MBSE, non-holonomic constraints, Boltzmann-Hamel equation

Procedia PDF Downloads 205
14652 A NoSQL Based Approach for Real-Time Managing of Robotics's Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data, out of which new data management solutions have emerged: NoSQL databases. They have spread across several areas such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication and fraud detection. Nowadays, these database management systems are increasingly adopted. They store data very well, and with the trend towards big data, new storage challenges demand new structures and methods for managing enterprise data. New intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. The implementation of NoSQL for robotics wrestles all the data robots acquire into usable form, because with ordinary approaches to robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach is demonstrated by experimental studies and a running example used as a use case.
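
A minimal sketch of storing and querying robot sensor readings in a document-oriented NoSQL store, assuming MongoDB via pymongo and a local server; the database, collection and field names are illustrative, not the system actually built by the authors.

```python
# Store and query robot sensor readings in MongoDB (pymongo). Database,
# collection and field names are illustrative; a local server is assumed.
from datetime import datetime, timezone
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")
readings = client["robotics"]["sensor_readings"]

# Index on (robot_id, timestamp) keeps real-time lookups fast
readings.create_index([("robot_id", ASCENDING), ("timestamp", ASCENDING)])

readings.insert_one({
    "robot_id": "arm-01",
    "timestamp": datetime.now(timezone.utc),
    "sensor": "joint_temperature",
    "value": 41.7,
    "status": {"moving": True, "payload_kg": 2.5},   # schema-less: nested fields are fine
})

# Latest reading for one robot, newest first
for doc in readings.find({"robot_id": "arm-01"}).sort("timestamp", -1).limit(1):
    print(doc["sensor"], doc["value"])
```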

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 340
14651 Assessment of Health Literacy and Awareness of Female Residents of Barangay Dagatan, Sabang, and Marauoy Lipa, Batangas on Polycystic Ovarian Syndrome: A Cross-Sectional Study

Authors: Jean Gray C. Achapero, Mary Margareth P. Ancheta, Patricia Anjelika A. Angeles, Shannon Denzel S. Ao Tai, Carl Brandon C. Barlis, Chrislen Mae B. Benavidez

Abstract:

Health literacy and awareness of polycystic ovarian syndrome (PCOS) is a global issue that is under-addressed in the Philippines. A thorough review of the country's ability to recognize and comprehend the severity of the syndrome should be undertaken, as early treatment is essential to avoid further complications of the disorder. This research aims to assess the health literacy and awareness of the female residents of Barangay Dagatan, Sabang, and Marauoy Lipa, Batangas on PCOS. It followed a cross-sectional study design, and data gathering was done through a pre-assessment using the Single Item Literacy Screener (SILS) and an online population-based survey questionnaire about PCOS awareness. The participants, selected according to the objectives using a purposive sampling method, were females aged 18-45 years old. Data were analyzed statistically using STATA 13.1 software. The study showed that 339 (76%) of the 444 respondents passed the SILS, meaning the residents have proficient health literacy. Among these 339 respondents, 87% (287) had previous knowledge about PCOS. The respondents showed minimal awareness of PCOS symptoms, which could be attributed to the broad spectrum of information on the syndrome. Respondents were shown to be most knowledgeable about PCOS physiology, treatment, beliefs, and remedies. The respondents’ age had no significant association with their health literacy (p=0.31) or PCOS awareness (p=0.60). A significant association was noted, however, between their educational attainment and both their health literacy (p<0.0001) and PCOS awareness (p=0.001). It is suggested that reproductive health education, even in the lower year levels, be optimized and that Local Government Unit (LGU)/Non-Government Organization (NGO)-held seminars be conducted for knowledge reinforcement. Reliable health information should be made more accessible to the public, and clinicians must emphasize the importance of early screening as part of the routine physical examination for women of reproductive age, to increase health literacy and awareness about PCOS and to actively engage them in the management of the disease.

Keywords: age, awareness, educational attainment, health literacy, polycystic ovarian syndrome

Procedia PDF Downloads 211
14650 Biophysical Assessment of the Ecological Condition of Wetlands in the Parkland and Grassland Natural Regions of Alberta, Canada

Authors: Marie-Claude Roy, David Locky, Ermias Azeria, Jim Schieck

Abstract:

It is estimated that up to 70% of the wetlands in the Parkland and Grassland natural regions of Alberta have been lost to various land-use activities, and with them the ecosystem functions and services they once provided. The wetlands that remain are often embedded in a matrix of human-modified habitats, and despite efforts to protect them, the effects of land use on wetland condition and function remain largely unknown. We used biophysical field data and remotely sensed human footprint data collected at 322 open-water wetlands by the Alberta Biodiversity Monitoring Institute (ABMI) to evaluate the impact of surrounding land use on the physico-chemical characteristics and plant functional traits of wetlands. Eight physico-chemical parameters were assessed: wetland water depth, water temperature, pH, salinity, dissolved oxygen, total phosphorus, total nitrogen, and dissolved organic carbon. Three plant functional traits were evaluated: 1) origin (native and non-native), 2) life history (annual, biennial, and perennial), and 3) habitat requirements (obligate-wetland and obligate-upland). Land-use intensity was quantified within a 250-meter buffer around each wetland. Ninety-nine percent of wetlands in the Grassland and Parkland regions of Alberta have land-use activities in their surroundings, most of them agriculture-related. Total phosphorus in wetlands increased with the cover of surrounding agriculture, while salinity, total nitrogen, and dissolved organic carbon were positively associated with the degree of soft-linear (e.g. pipelines, trails) land uses. The abundance of non-native and annual/biennial plants increased with the amount of agriculture, while urban-industrial land use lowered the abundance of native, perennial, and obligate-wetland plants. Our study suggests that the land-use types surrounding wetlands affect their physico-chemical and biological condition, and that reducing human disturbance through the reclamation of wetland buffers may enhance the condition and function of wetlands in agricultural landscapes.

Keywords: wetlands, biophysical assessment, land use, grassland and parkland natural regions

Procedia PDF Downloads 323
14649 Evaluation of Information Technology Governance Frameworks for Better Governance in South Africa

Authors: Memory Ranga, Phillip Pretorious

Abstract:

The South African Government has invested a lot of money in Information Technology Governance (ITG) within government departments. The ITG framework was spearheaded by the Department of Public Service and Administration (DPSA). This led to the development of a governing ITG DPSA framework and later the Government Wide Enterprise Architecture (GWEA) Framework for assisting departments to implement ITG. In addition, government departments have adopted the Information Systems Audit and Control Association (ISACA) Control Objectives for Information and Related Technology (COBIT) for ITG processes. Despite all these available frameworks, departments fail to fully capitalise on and improve their ITG processes, mainly because the frameworks are too generic and difficult to apply to specific governance needs. Little research has been done to evaluate the progress of ITG initiatives within the government departments. This paper aims to evaluate the existing ITG frameworks within selected government departments in South Africa. A quantitative research approach was used, with data collected through an online questionnaire targeting ICT managers and directors from government departments. The study was undertaken as a case study, with only the Eastern Cape Province selected for the research. Document review, mainly of ITG frameworks and best practices, was also used. Data were analysed using Google Analytics tools and SPSS. A one-sample Chi-Squared test was used to verify the evaluation findings. Findings show that the current guiding national governance framework (DPSA) is outdated and does not accommodate the changes introduced in other governance frameworks. The Eastern Cape government departments have spent huge amounts of money on ITG but have not yet been able to identify the benefits of their ITG initiatives. The guiding framework is rigid and does not address some departmental needs, making it difficult to apply the DPSA framework flexibly. Furthermore, despite the large budget for ITG, the departments still face many challenges and are unable to improve some of their processes and services. All the engaged Eastern Cape departments have adopted the COBIT framework, but none has been conducting COBIT maturity assessments, which are a core functionality of COBIT. There is evidence of too many ITG frameworks and of their underutilisation. The study provides a comprehensive evaluation of the ITG frameworks that have been adopted by South African government departments in the Eastern Cape Province. The evaluation guides and recommends that government departments rethink and adopt ITG frameworks that can be customised to accommodate their needs. The adoption and application of ITG by government departments should assist in better governance and service delivery to citizens.

Keywords: information technology governance, COBIT, evaluate, framework, governance, DPSA framework

Procedia PDF Downloads 112
14648 Predicting Relative Performance of Sector Exchange Traded Funds Using Machine Learning

Authors: Jun Wang, Ge Zhang

Abstract:

Machine learning has been used in many areas today. It thrives at reviewing large volumes of data and identifying patterns and trends that might not be apparent to a human. Given the huge potential benefit and the amount of data available in the financial market, it is not surprising to see machine learning applied to various financial products. While future prices of financial securities are extremely difficult to forecast, we study them from a different angle. Instead of trying to forecast future prices, we apply machine learning algorithms to predict the direction of future price movement, in particular, whether a sector Exchange Traded Fund (ETF) would outperform or underperform the market in the next week or in the next month. We apply several machine learning algorithms for this prediction. The algorithms are Linear Discriminant Analysis (LDA), k-Nearest Neighbors (KNN), Decision Tree (DT), Gaussian Naive Bayes (GNB), and Neural Networks (NN). We show that these machine learning algorithms, most notably GNB and NN, have some predictive power in forecasting out-performance and under-performance out of sample. We also try to explore whether it is possible to utilize the predictions from these algorithms to outperform the buy-and-hold strategy of the S&P 500 index. The trading strategy to explore out-performance predictions does not perform very well, but the trading strategy to explore under-performance predictions can earn higher returns than simply holding the S&P 500 index out of sample.
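
A minimal sketch of the direction-of-relative-performance classification described above; the price series is synthetic and the features (lagged weekly returns) are illustrative, not the authors' dataset or feature set.

```python
# Predict whether a sector ETF beats the index, using lagged weekly returns.
# Prices are synthetic and features illustrative; LDA, KNN and GNB are three
# of the algorithms named in the abstract.
import numpy as np
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 520                                           # ~10 years of weekly data
prices = pd.DataFrame({
    "etf": 100 * np.exp(np.cumsum(rng.normal(0.001, 0.020, n))),
    "spx": 1000 * np.exp(np.cumsum(rng.normal(0.001, 0.015, n))),
})
rets = prices.pct_change().dropna()

# Features: the four previous weekly returns; label: ETF beats index this week
X = pd.concat([rets.shift(k).add_suffix(f"_lag{k}") for k in range(1, 5)], axis=1).dropna()
y = (rets["etf"] > rets["spx"]).astype(int).loc[X.index]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.3)
for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("KNN", KNeighborsClassifier()),
                    ("GNB", GaussianNB())]:
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: out-of-sample accuracy = {acc:.2f}")
```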

Keywords: machine learning, ETF prediction, dynamic trading, asset allocation

Procedia PDF Downloads 82
14647 3D Guidance of Unmanned Aerial Vehicles Using Sliding Mode Approach

Authors: M. Zamurad Shah, M. Kemal Ozgoren, Raza Samar

Abstract:

This paper presents a 3D guidance scheme for Unmanned Aerial Vehicles (UAVs). The proposed guidance scheme is based on the sliding mode approach using nonlinear sliding manifolds. Generalized 3D kinematic equations are considered during the design process to cater for the coupling between longitudinal and lateral motions. A sliding mode based guidance scheme is then derived for the multiple-input multiple-output (MIMO) system using the proposed nonlinear manifolds. Instead of traditional sliding surfaces, nonlinear sliding surfaces are proposed for performance and stability in all flight conditions. In the reaching-phase control inputs, the bang-bang terms with signum functions are accompanied by proportional terms in order to reduce the chattering amplitudes. The proposed 3D guidance scheme is implemented on a 6-degrees-of-freedom (6-dof) simulation of a UAV, and simulation results are presented for different 3D trajectories with and without disturbances.
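
A minimal one-dimensional sketch of the reaching law described above (proportional term plus a smoothed signum term); the dynamics, gains and disturbance are illustrative and do not represent the full MIMO nonlinear-manifold UAV design.

```python
# 1-D reaching-law illustration: u = -k_p*s - k_s*sat(s), driving the sliding
# variable s to a small boundary layer despite a bounded disturbance.
import numpy as np

def simulate(k_p=2.0, k_s=1.0, dt=0.01, t_end=5.0):
    s = 1.5                                        # initial sliding-variable value
    history = []
    for t in np.arange(0.0, t_end, dt):
        u = -k_p * s - k_s * np.tanh(s / 0.05)     # tanh smooths sign() to reduce chattering
        d = 0.3 * np.sin(10.0 * t)                 # bounded matched disturbance, |d| < k_s
        s += (u + d) * dt                          # simplified dynamics: s_dot = u + d
        history.append(s)
    return np.array(history)

s_traj = simulate()
print(f"final |s| = {abs(s_traj[-1]):.4f}")
```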

Keywords: unmanned aerial vehicles, sliding mode control, 3D guidance, nonlinear sliding manifolds

Procedia PDF Downloads 439
14646 An Experimental Approach to the Influence of Tipping Points and Scientific Uncertainties in the Success of International Fisheries Management

Authors: Jules Selles

Abstract:

The Atlantic and Mediterranean bluefin tuna fishery has been considered the archetype of an overfished and mismanaged fishery. This crisis has demonstrated the role of public awareness and the importance of the interactions between science and management regarding scientific uncertainties. This work aims at investigating the policy-making process associated with a regional fisheries management organization. We propose a contextualized, computer-based experimental approach in order to explore the effects of key factors on the cooperation process in a complex straddling-stock management setting. Namely, we analyze the effects of the introduction of a socio-economic tipping point and of the uncertainty surrounding the estimation of the resource level. Our approach is based on a Gordon-Schaefer bio-economic model which explicitly represents the decision-making process. Each participant plays the role of a stakeholder of ICCAT, represents a coalition of fishing nations involved in the fishery, and unilaterally decides on a harvest policy for the coming year. The context of the experiment induces the incentives for exploitation and collaboration needed to achieve common sustainable harvest plans at the scale of the Atlantic bluefin tuna stock. Our rigorous framework allows testing how stakeholders who plan the exploitation of a fish stock (a common pool resource) respond to two kinds of effects: i) the inclusion of a drastic shift in the management constraints (beyond a socio-economic tipping point) and ii) an increasing uncertainty in the scientific estimation of the resource level.
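
A minimal sketch of the Gordon-Schaefer bio-economic dynamics underlying the experiment: logistic stock growth, a harvest driven by the players' effort choices, and the resulting profit; parameter values are illustrative, not calibrated to Atlantic bluefin tuna.

```python
# Gordon-Schaefer dynamics sketch: logistic growth, effort-driven harvest,
# and profit per season. Parameters are illustrative, not calibrated data.
r, K = 0.3, 1.0e6                 # intrinsic growth rate, carrying capacity [t]
q, price, cost = 0.2, 8.0, 1.0    # catchability, price, unit effort cost (arbitrary units)

def step(biomass, effort):
    """One season: returns (next biomass, harvest, profit)."""
    harvest = min(q * effort * biomass, biomass)
    growth = r * biomass * (1.0 - biomass / K)
    profit = price * harvest - cost * effort
    return biomass + growth - harvest, harvest, profit

B = 0.4 * K                       # depleted initial stock
for season in range(1, 11):
    B, H, pi = step(B, effort=1.2)          # joint effort chosen by the players
    print(f"season {season:2d}: biomass = {B:9.0f} t, harvest = {H:8.0f} t, profit = {pi:9.0f}")
```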

Keywords: economic experiment, fisheries management, game theory, policy making, Atlantic Bluefin tuna

Procedia PDF Downloads 244
14645 The Current Ways of Thinking Mild Traumatic Brain Injury and Clinical Practice in a Trauma Hospital: A Pilot Study

Authors: P. Donnelly, G. Mitchell

Abstract:

Traumatic Brain Injury (TBI) is a major contributor to the global burden of disease; despite its ubiquity, there is significant variation in diagnosis, prognosis, and treatment between clinicians. This study aims to examine the spectrum of approaches that currently exist at a Level 1 Trauma Centre in Australasia by surveying emergency physicians and neurosurgeons on those aspects of mTBI. A pilot survey of 17 clinicians (neurosurgeons, emergency physicians, and others who manage patients with mTBI) at a Level 1 Trauma Centre in Brisbane, Australia, was conducted. The objective of this study was to examine the importance these clinicians place on various elements in their approach to the diagnosis, prognostication, and treatment of mTBI. The data were summarised and the descriptive statistics reported. Loss of consciousness and post-traumatic amnesia were rated as the most important signs or symptoms in diagnosing mTBI (median importance of 8). MRI was the most important imaging modality in diagnosing mTBI (median importance of 7). 'Number of previous TBIs' and 'intracranial injury on imaging' were rated as the most important elements for prognostication (median importance of 9). Education and reassurance were rated as the most important modality for treating mTBI (median importance of 7). There was a statistically insignificant variation between the specialties in the importance they place on each of these components. In this Australian tertiary trauma centre, there appears to be variation in how clinicians approach mTBI. This study is underpowered to state whether the variation lies between clinicians within a specialty or is a trend between specialties. The variation is worth investigating as a step toward a unified approach to diagnosing, prognosticating, and treating this common pathology.

Keywords: mild traumatic brain injury, adult, clinician, survey

Procedia PDF Downloads 123
14644 Recursive Doubly Complementary Filter Design Using Particle Swarm Optimization

Authors: Ju-Hong Lee, Ding-Chen Chung

Abstract:

This paper deals with the optimal design of recursive doubly complementary (DC) digital filters using a metaheuristic optimization technique. Based on the theory of DC digital filters using two recursive digital all-pass filters (DAFs), the design problem is formulated so that the objective function is a weighted sum of the phase response errors of the designed DAFs. To deal with the stability of the recursive DC filters during the design process, necessary constraints are imposed on the phases of the recursive DAFs. Through a frequency sampling and weighted least squares approach, the optimization of the objective function is solved by utilizing a population-based stochastic optimization approach. The resulting DC digital filters possess satisfactory frequency responses. Simulation results are presented for illustration and comparison.
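
A compact, generic sketch of the population-based (particle swarm) search described above; the objective below is a simple stand-in for the weighted sum of DAF phase-response errors that would be minimized in the actual filter design.

```python
# Generic particle swarm optimization; the sphere objective is a placeholder
# for the weighted least-squares phase error of the two all-pass filters.
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    return np.sum(x ** 2, axis=1)     # placeholder cost, one value per particle

def pso(dim=6, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bound=1.0):
    pos = rng.uniform(-bound, bound, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), objective(pos)
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -bound, bound)   # bounded coefficients (stability proxy)
        val = objective(pos)
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

best, best_cost = pso()
print("best cost:", best_cost)
```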

Keywords: doubly complementary, digital all-pass filter, weighted least squares algorithm, particle swarm optimization

Procedia PDF Downloads 674
14643 Numerical Solutions of an Option Pricing Rainfall Derivatives Model

Authors: Clarinda Vitorino Nhangumbe, Ercília Sousa

Abstract:

Weather derivatives are financial products used to cover non-catastrophic weather events, with a weather index as the underlying asset. The rainfall weather derivative pricing model is built on the assumption that the rainfall dynamics follow an Ornstein-Uhlenbeck process, and the partial differential equation approach is used to derive a two-dimensional, time-dependent convection-diffusion partial differential equation whose spatial variables are the rainfall index and rainfall depth. To compute approximate solutions of the partial differential equation, appropriate boundary conditions are suggested, and an explicit numerical method is proposed in order to deal efficiently with the different choices of the coefficients involved in the equation. Being explicit, the method is conditionally stable, so the stability region of the numerical method and the order of convergence are discussed. The model is tested on real precipitation data.
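
A one-dimensional sketch of the explicit finite-difference idea (the pricing equation in the paper is two-dimensional, in rainfall index and depth); coefficients, grid and initial data here are illustrative only.

```python
# Explicit upwind/centred scheme for the 1-D convection-diffusion equation
# u_t + a u_x = b u_xx, with a time step respecting the stability limits.
import numpy as np

a, b = 1.0, 0.1                                  # convection and diffusion coefficients
nx, dx = 101, 0.01
dt = 0.4 * min(dx / abs(a), dx**2 / (2 * b))     # conditional stability of the explicit scheme
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-200 * (x - 0.3) ** 2)                # illustrative initial profile

for _ in range(500):
    un = u.copy()
    u[1:-1] = (un[1:-1]
               - a * dt / dx * (un[1:-1] - un[:-2])                   # upwind convection
               + b * dt / dx**2 * (un[2:] - 2 * un[1:-1] + un[:-2]))  # centred diffusion
    u[0], u[-1] = 0.0, 0.0                        # simple Dirichlet boundary conditions

print("max value after 500 steps:", u.max())
```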

Keywords: finite differences method, Ornstein-Uhlenbeck process, partial differential equations approach, rainfall derivatives

Procedia PDF Downloads 88
14642 A Framework for Successful TQM Implementation and Its Effect on the Organizational Sustainability Development

Authors: Redha Elhuni, M. Munir Ahmad

Abstract:

The main purpose of this research is to construct a generic model for the successful implementation of Total Quality Management (TQM) in the oil sector, and to find out the effects of this model on the organizational sustainability development (OSD) performance of Libyan oil and gas companies using the structural equation modeling (SEM) approach. The research approach covers both quantitative and qualitative methods. A questionnaire was developed in order to identify the quality factors that are seen by Libyan oil and gas companies as critical to the success of TQM implementation. Hypotheses were developed to evaluate the impact of TQM implementation on OSD. Data analysis reveals that there is a significant positive effect of TQM implementation on OSD. Twenty-four quality factors are found to be critical and absolutely essential for successful TQM implementation. The results generated a structure for the TQMSD implementation framework based on four major road-map constructs: top management commitment, employee involvement and participation, customer-driven processes, and a continuous improvement culture.

Keywords: total quality management, critical success factors, oil and gas, organizational sustainability development (SD), Libya

Procedia PDF Downloads 268
14641 Operation System for Aluminium-Air Cell: A Strategy to Harvest the Energy from Secondary Aluminium

Authors: Binbin Chen, Dennis Y. C. Leung

Abstract:

The aluminium (Al)-air cell holds a high volumetric capacity density of 8.05 Ah cm-3, benefiting from the trivalence of Al ions. Additional benefits of the Al-air cell are its low price and environmental friendliness, and the Al energy conversion process is in theory 100% recyclable. Along with a large base of raw material reserves, Al therefore attracts considerable attention as a promising material to be integrated within the global energy system. However, despite early successful applications in military services, several problems prevent Al-air cells from widespread civilian use. The most serious issue is the parasitic corrosion of Al when it contacts the electrolyte. To overcome this problem, super-pure Al alloyed with traces of various metal elements is used to increase the corrosion resistance. Nevertheless, high-purity Al alloys are costly and require high energy consumption during production. An alternative approach is to add inexpensive inhibitors directly into the electrolyte; however, such additives increase the internal ohmic resistance and hamper the cell performance. So far these methods have not provided satisfactory solutions to the problem within Al-air cells. The operation of alkaline Al-air cells also faces other, minor problems. One of them is the formation of aluminium hydroxide in the electrolyte, which decreases the ionic conductivity of the electrolyte. Another is the carbonation process within the gas diffusion layer of the cathode, which blocks the pores for gas diffusion. Both of these hinder the performance of the cells. The present work addresses the above problems by building an Al-air cell operation system consisting of four components. A top tank containing fresh electrolyte is located at a high level, so that it can drive the electrolyte flow by gravity. A mechanically rechargeable Al-air cell is fabricated with low-cost materials, including low-grade Al, carbon paper, and PMMA plates. An electrolyte waste tank with an elaborate channel is designed to separate the hydrogen generated by corrosion, which is collected by a gas collection device. In the first part of the work, we investigated the performance of the mechanically rechargeable Al-air cell with a constant electrolyte flow rate to ensure the repeatability of the experiments. Then the whole system was assembled and the feasibility of its operation was demonstrated. During the experiments, pure hydrogen was collected by the collection device, which holds potential for various applications. By collecting this by-product, a high utilization efficiency of aluminium is achieved: considering both the electricity and the hydrogen generated, an overall utilization efficiency of around 90% or even higher is achieved under different working voltages. The flowing electrolyte removes the aluminium hydroxide precipitate and solves the electrolyte deterioration problem. This operation system provides a low-cost strategy for harvesting energy from abundant secondary Al. The system could also be applied to other metal-air cells and is suitable for emergency power supplies, power plants and other applications. The low-cost feature implies great potential for commercialization. Further work, such as scaling up and optimization of fabrication, will help to refine the technology into practical market offerings.
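
As a quick check, the 8.05 Ah cm-3 volumetric capacity quoted above follows from Faraday's law with standard constants for aluminium:

```python
# Volumetric capacity of Al from Faraday's law: z*F/M gives the charge per
# gram, multiplied by the density to get Ah per cm^3.
F = 96485.0     # Faraday constant [C/mol]
z = 3           # electrons per Al atom (trivalent)
M = 26.98       # molar mass of Al [g/mol]
rho = 2.70      # density of Al [g/cm^3]

specific_capacity = z * F / M / 3600.0          # ~2.98 Ah/g
volumetric_capacity = specific_capacity * rho   # ~8.05 Ah/cm^3
print(f"specific capacity:   {specific_capacity:.2f} Ah/g")
print(f"volumetric capacity: {volumetric_capacity:.2f} Ah/cm^3")
```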

Keywords: aluminium-air cell, high efficiency, hydrogen, mechanical recharge

Procedia PDF Downloads 273
14640 Immiscible Polymer Blends with Controlled Nanoparticle Location for Excellent Microwave Absorption: A Compartmentalized Approach

Authors: Sourav Biswas, Goutam Prasanna Kar, Suryasarathi Bose

Abstract:

In order to obtain better materials, control over the precise location of nanoparticles is indispensable. It is shown here that an ordered arrangement of nanoparticles possessing different characteristics (electrical/magnetic dipoles) in the blend structure can result in excellent microwave absorption. This is manifested in a high reflection loss of ca. -67 dB for the best blend structure designed here. To attenuate electromagnetic radiation, the key parameters, i.e. high electrical conductivity and large dielectric/magnetic loss, are targeted here using a conducting inclusion [multiwall carbon nanotubes, MWNTs]; a ferroelectric nanostructured material with associated relaxations in the GHz frequency range [barium titanate, BT]; and lossy ferromagnetic nanoparticles [nickel ferrite, NF]. In this study, bi-continuous structures were designed using 50/50 (by wt) blends of polycarbonate (PC) and polyvinylidene fluoride (PVDF). The MWNTs were modified using an electron-acceptor molecule, a derivative of perylenediimide, which facilitates π-π stacking with the nanotubes and stimulates efficient charge transport in the blends. The nanoscopic materials have a specific affinity towards the PVDF phase; hence, by introducing surface-active groups, their ordered arrangement can be tailored. To accomplish this, both BT and NF were first hydroxylated, followed by the introduction of amine-terminal groups on the surface. The latter facilitated a nucleophilic substitution reaction with PC and resulted in their precise location. In this study, we have shown for the first time that superior EM attenuation can be achieved by a compartmentalized approach. For instance, when the nanoparticles were localized exclusively in the PVDF phase or in both phases, the minimum reflection loss was ca. -18 dB (for the MWNT/BT mixture) and -29 dB (for the MWNT/NF mixture), and the shielding was primarily through reflection. Interestingly, by adopting the compartmentalized approach wherein the lossy materials were in the PC phase and the conducting inclusion (MWNT) in PVDF, an outstanding reflection loss of ca. -57 dB (for the BT and MWNT combination) and -67 dB (for the NF and MWNT combination) was noted, and the shielding was primarily through absorption. Thus, the approach demonstrates that nanoscopic structuring in the blends can be achieved under macroscopic processing conditions, and this strategy can be further explored to design microwave absorbers.
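
For context, a minimal sketch of the standard single-layer, metal-backed transmission-line calculation commonly used to report reflection loss; the complex permittivity and permeability values below are illustrative placeholders, not measured data for these blends.

```python
# Reflection loss of a metal-backed single layer:
# Z_in = sqrt(mu_r/eps_r) * tanh(j*2*pi*f*d*sqrt(mu_r*eps_r)/c), normalized to Z_0;
# RL(dB) = 20*log10(|(Z_in - 1)/(Z_in + 1)|). Material values are illustrative.
import numpy as np

c = 3e8                                    # speed of light [m/s]

def reflection_loss_db(freq_hz, eps_r, mu_r, thickness_m):
    z_in = np.sqrt(mu_r / eps_r) * np.tanh(
        1j * 2 * np.pi * freq_hz * thickness_m * np.sqrt(mu_r * eps_r) / c)
    return 20 * np.log10(np.abs((z_in - 1) / (z_in + 1)))

freqs = np.linspace(8e9, 18e9, 201)        # X- and Ku-band sweep
eps_r = 9.0 - 2.5j                         # illustrative complex permittivity
mu_r = 1.2 - 0.4j                          # illustrative complex permeability
rl = reflection_loss_db(freqs, eps_r, mu_r, thickness_m=2e-3)

best = rl.argmin()
print(f"minimum RL: {rl[best]:.1f} dB at {freqs[best] / 1e9:.1f} GHz")
```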

Keywords: barium titanate, EMI shielding, MWNTs, nickel ferrite

Procedia PDF Downloads 435
14639 An Abductive Approach to Policy Analysis: Policy Analysis as Informed Guessing

Authors: Adrian W. Chew

Abstract:

This paper argues that education policy analysis tends to be steered towards empiricist oriented approaches, which place emphasis on objective and measurable data. However, this paper argues that empiricist oriented approaches are generally based on inductive and/or deductive reasoning, which are unable to generate new ideas/knowledge. This paper will outline the logical structure of induction, deduction, and abduction, and argues that only abduction provides possibilities for the creation of new ideas/knowledge. This paper proposes the neologism of ‘informed guessing’ as a reformulation of abduction, and also as an approach to education policy analysis. On one side, the signifier ‘informed’ encapsulates the idea that abductive policy analysis needs to be informed by descriptive conceptualization theory to be able to make relations and connections between, and within, observed phenomenon and unobservable general structures. On the other side, the signifier ‘guessing’ captures the cyclical and unsystematic process of abduction. This paper will end with a brief example of utilising ‘informed guessing’ for a policy analysis of school choice lotteries in the United States.

Keywords: abductive reasoning, empiricism, informed guessing, policy analysis

Procedia PDF Downloads 341
14638 Flow Boiling Heat Transfer at Low Mass and Heat Fluxes: Heat Transfer Coefficient, Flow Pattern Analysis and Correlation Assessment

Authors: Ernest Gyan Bediako, Petra Dancova, Tomas Vit

Abstract:

Flow boiling heat transfer remains an important area of research due to its relevance in thermal management systems and other applications. Despite the enormous work done over the years to understand how flow parameters such as mass flux, heat flux, saturation conditions and tube geometry influence the characteristics of flow boiling heat transfer, there are still many contradictions and a lack of agreement on the actual mechanisms controlling the heat transfer and on how flow parameters impact it. This work therefore experimentally investigates the heat transfer characteristics and flow patterns at low mass fluxes, low heat fluxes and low saturation pressures, conditions which have received less attention in the literature but are prevalent in refrigeration, air-conditioning and heat pump applications. A flow boiling experiment was conducted with the working fluid R134a in a 5 mm internal diameter, smooth, horizontal stainless steel tube, with mass flux ranging from 80-100 kg/m2s, heat flux ranging from 3.55 kW/m2 to 25.23 kW/m2, and a saturation pressure of 460 kPa. Vapor quality ranged from 0 to 1. The well-known flow pattern map of Wojtan et al. was used to predict the flow patterns observed during the study, and the experimental results were compared with well-known flow boiling heat transfer correlations from the literature. The findings show that the heat transfer coefficient was influenced by both mass flux and heat flux. For increasing heat flux, nucleate boiling was observed to be the dominant mechanism controlling the heat transfer, especially in the low vapor quality region; for increasing mass flux, convective boiling was the dominant mechanism, especially in the high vapor quality region. The study also observed an unusually high heat transfer coefficient at low vapor qualities, which could be due to periodic wetting of the tube walls by the slug and stratified-wavy flow patterns. The flow patterns predicted by the Wojtan et al. map were a mixture of slug and stratified-wavy, purely stratified-wavy, and dryout. Statistical assessment of the experimental data against various well-known correlations showed that none of the correlations reported in the literature could predict the experimental data with sufficient accuracy.
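
For illustration, a short sketch of how a local heat transfer coefficient is typically obtained from the imposed heat flux and the measured wall and saturation temperatures, h = q'' / (T_wall - T_sat); the temperatures below are hypothetical, not the measured data of this study.

```python
# Local flow-boiling heat transfer coefficient from h = q'' / (T_wall - T_sat).
# Wall temperatures are hypothetical; the heat flux is the upper value used in
# the study and T_sat is roughly that of R134a at ~460 kPa.
heat_flux = 25.23e3                        # imposed heat flux [W/m^2]
t_sat = 13.2                               # approximate saturation temperature [deg C]
wall_temps = [16.8, 17.4, 18.1, 19.0]      # hypothetical local wall temperatures [deg C]

for i, t_wall in enumerate(wall_temps, start=1):
    h = heat_flux / (t_wall - t_sat)       # [W/m^2 K]
    print(f"station {i}: h = {h / 1000:.1f} kW/m^2K")
```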

Keywords: flow boiling, heat transfer coefficient, mass flux, heat flux

Procedia PDF Downloads 106
14637 Ductility Spectrum Method for the Design and Verification of Structures

Authors: B. Chikh, L. Moussa, H. Bechtoula, Y. Mehani, A. Zerzour

Abstract:

This study presents a new method, applicable to the evaluation and design of structures, which is developed and illustrated by comparison with the capacity spectrum method (CSM, ATC-40). The method uses inelastic spectra and gives peak responses consistent with those obtained from nonlinear time-history analysis. Hereafter, this seismic demand assessment method is called the Ductility Spectrum Method (DSM). It is used to estimate the seismic deformation of Single-Degree-Of-Freedom (SDOF) systems based on the DDRS, the Ductility Demand Response Spectrum, developed by the author.
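
A minimal sketch of the kind of SDOF inelastic response calculation that underlies a ductility demand spectrum: peak displacement of an elastic-perfectly-plastic oscillator under a pulse-type ground motion; the excitation and parameters are illustrative, not a real record or the author's DDRS procedure.

```python
# Elastic-perfectly-plastic SDOF response under an illustrative acceleration
# pulse; the ductility demand is the peak displacement over the yield displacement.
import numpy as np

m, T, zeta = 1.0, 0.5, 0.05                 # mass, period [s], damping ratio
k = m * (2 * np.pi / T) ** 2
c = 2 * zeta * np.sqrt(k * m)
fy = 0.15 * m * 9.81                        # yield strength (15% of the weight)

dt = 0.001
t = np.arange(0.0, 10.0, dt)
ag = 3.0 * np.sin(2 * np.pi * 1.5 * t) * (t < 2.0)   # illustrative pulse [m/s^2]

u = v = fs = 0.0
u_max = 0.0
for a_g in ag:
    acc = (-m * a_g - c * v - fs) / m       # equation of motion with restoring force fs
    v += acc * dt
    u += v * dt
    fs = np.clip(fs + k * v * dt, -fy, fy)  # incremental force, capped at the yield strength
    u_max = max(u_max, abs(u))

uy = fy / k
print(f"peak displacement = {u_max:.4f} m, ductility demand = {u_max / uy:.2f}")
```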

Keywords: seismic demand, capacity, inelastic spectra, design and structure

Procedia PDF Downloads 388
14636 Analyses and Optimization of Physical and Mechanical Properties of Direct Recycled Aluminium Alloy (AA6061) Wastes by ANOVA Approach

Authors: Mohammed H. Rady, Mohd Sukri Mustapa, S Shamsudin, M. A. Lajis, A. Wagiman

Abstract:

The present study investigates the microhardness and density of aluminium alloy chips subjected to various settings of preheating temperature and preheating time. Three values of preheating temperature were taken: 450 °C, 500 °C, and 550 °C. Likewise, three values of preheating time were chosen: 1, 2, and 3 hours. The influences of the process parameters (preheating temperature and time) were analyzed using a Design of Experiments (DOE) approach, whereby a full factorial design with center-point analysis was adopted. There were 11 runs in total, comprising the two-factor full factorial design with 3 center points. The responses were microhardness and density. The results showed that density and microhardness increased with decreasing preheating temperature. The results also showed that controlling the preheating temperature is more important than controlling the preheating time for microhardness, while both the preheating temperature and the preheating time are important for density. It can be concluded that setting the temperature at 450 °C for 1 hour resulted in the optimum responses.
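
A minimal sketch of the two-factor factorial analysis described above, using an OLS fit and ANOVA table; the layout is a simplified 3x3 design rather than the exact 11-run plan, and the microhardness responses are hypothetical.

```python
# Two-factor ANOVA on a simplified 3x3 factorial layout; responses are
# hypothetical, only the factor levels follow the abstract.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

runs = pd.DataFrame({
    "temperature":   [450, 450, 450, 500, 500, 500, 550, 550, 550],
    "time_h":        [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "microhardness": [62.1, 60.8, 59.9, 58.7, 57.5, 56.8, 55.2, 54.6, 53.9],
})

model = ols("microhardness ~ C(temperature) + C(time_h)", data=runs).fit()
print(sm.stats.anova_lm(model, typ=2))   # which factor contributes more variation?
```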

Keywords: AA6061, density, DOE, hot extrusion, microhardness

Procedia PDF Downloads 343