Search results for: theoretical domains framework
1900 Investigation of Nucleation and Thermal Conductivity of Waxy Crude Oil on Pipe Wall via Particle Dynamics
Authors: Jinchen Cao, Tiantian Du
Abstract:
As waxy crude oil readily crystallizes and deposits on the pipeline wall, it clogs the pipeline and reduces oil and gas gathering and transmission efficiency. In this paper, a mesoscopic-scale dissipative particle dynamics method is employed, and four pipe-wall models are constructed: a smooth wall (SW), a hydroxylated wall (HW), a rough wall (RW), and a single-layer graphene wall (GW). Snapshots of the simulation output trajectories show that paraffin molecules interact with each other to form a network structure that constrains water molecules, which serve as their nucleation sites. Meanwhile, the paraffin molecules on the near-wall side are observed to adsorb horizontally in the inter-lattice gaps of the solid wall. In the pressure range of 0-50 MPa, pressure has little effect on the affinity of the SW, HW, and GW surfaces; for the RW surface, however, the contact angle of the paraffin wax was found to decrease with increasing pressure, while that of the water molecules showed the opposite trend. This behavior is attributed to a pressure-induced transition of the paraffin molecules from an amorphous to a crystalline state. The minimum crystalline phase pressure (MCPP) is proposed to describe the lowest pressure at which crystallization of the paraffin molecules occurs. The maximum number of crystalline clusters formed by paraffin molecules at the respective MCPPs ranked NSW (MCPP 0.52 MPa) > NHW (0.55 MPa) > NRW (0.62 MPa) > NGW (0.75 MPa). The graphene surface, with the highest MCPP and the fewest clusters formed, indicates that the addition of graphene inhibits the crystallization of paraffin deposits on the wall surface. Finally, the thermal conductivity was calculated. On the near-wall side it changes sharply because of the adsorption crystallization of the paraffin waxes, whereas on the fluid side it gradually stabilizes; the average thermal conductivities of the four wall systems are 0.254, 0.249, 0.218, and 0.188 W/(m·K). This study provides a theoretical basis for improving the transport efficiency and heat-transfer characteristics of waxy crude oil in terms of wall type, wall roughness, and MCPP.
Keywords: waxy crude oil, thermal conductivity, crystallization, dissipative particle dynamics, MCPP
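A rough illustration of the cluster statistic used above: given bead coordinates from a trajectory snapshot, crystalline clusters can be counted as connected components of beads linked within a distance cutoff. The sketch below is only an assumed post-processing step in Python; the coordinates, cutoff, and minimum cluster size are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def count_clusters(positions, cutoff=0.8, min_size=5):
    """Count clusters of beads linked by a distance cutoff.

    positions : (N, 3) array of bead coordinates (reduced units assumed)
    cutoff    : linking distance below which two beads belong to one cluster
    min_size  : smallest group of beads counted as a crystalline cluster
    """
    tree = cKDTree(positions)
    pairs = np.array(list(tree.query_pairs(cutoff)))  # bead pairs within the cutoff
    if pairs.size == 0:
        return 0
    n = len(positions)
    # Build a sparse adjacency matrix and label its connected components.
    adj = csr_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
    _, labels = connected_components(adj, directed=False)
    sizes = np.bincount(labels)
    return int(np.sum(sizes >= min_size))

# Toy example: dilute "fluid" beads plus one dense "crystalline" patch.
rng = np.random.default_rng(0)
fluid = rng.uniform(0, 20, size=(500, 3))
patch = rng.normal(loc=10.0, scale=0.3, size=(60, 3))
print(count_clusters(np.vstack([fluid, patch])))   # expected: 1 dense cluster
```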
1899 The Selectivities of Pharmaceutical Spending Containment: Social Profit, Incentivization Games and State Power
Authors: Ben Main, Piotr Ozieranski
Abstract:
State government spending on pharmaceuticals stands at 1 trillion USD globally, prompting criticism of the pharmaceutical industry's monetization of drug efficacy, product cost overvaluation, and health injustice. This paper elucidates the mechanisms behind a state-institutional response to this problem through the sociological lens of the strategic relational approach to state power. To do so, 30 expert interviews and legal and policy documents are drawn on to explain how state elites in New Zealand have successfully contested a 30-year “pharmaceutical spending containment policy”. Proceeding from Jessop's notion of strategic “selectivity”, encompassing analyses of the enabling features of state actors' ability to harness state structures, a theoretical explanation is advanced. First, a strategic context is described that consists of dynamics around pharmaceutical dealmaking between the state bureaucracy, pharmaceutical pricing strategies (and their effects), and the industry. Centrally, the pricing strategy of "bundling" -deals for packages of drugs that combine older and newer patented products- reflects how state managers have instigated an “incentivization game” that is played by state and industry actors, including HTA professionals, over pharmaceutical products (both current and in development). Second, a protective context is described that comprises successive legislative-judicial responses to the strategic context and is characterized by regulation and the societalisation of commercial law. Third, within the policy, the achievement of increased pharmaceutical coverage (pharmaceutical “mix”) alongside contained spending is conceptualized as a state defence of a "social profit". As such, in contrast to scholarly expectations that political and economic cultures of neo-liberalism drive pharmaceutical policy-making processes, New Zealand's state elites' approach is shown to be antipathetic to neo-liberalism within an overall capitalist economy. The paper contributes an analysis of state pricing strategies and how they are embedded in state regulatory structures. Additionally, through an analysis of the interconnections of state power and pharmaceutical value, Abraham's neo-liberal corporate bias model for pharmaceutical policy analysis is problematised.
Keywords: pharmaceutical governance, pharmaceutical bureaucracy, pricing strategies, state power, value theory
1898 An Investigative Study into Good Governance in the Non-Profit Sector in South Africa: A Systems Approach Perspective
Authors: Frederick M. Dumisani Xaba, Nokuthula G. Khanyile
Abstract:
There is a growing demand for greater accountability, transparency and ethical conduct based on sound governance principles in the developing world. Funders, donors and sponsors are increasingly demanding more transparency, better value for money and adherence to good governance standards. The drive towards improved governance measures is largely influenced by the need to ‘plug the leaks’, deal with malfeasance, engender greater levels of accountability and good governance, and ultimately attract further funding or investment. This is the case with Non-Profit Organizations (NPOs) in South Africa in general, and in the province of KwaZulu-Natal in particular. The paper draws from good governance theory, stakeholder theory and systems thinking to critically examine the requirements for good governance in the NPO sector from a theoretical and legislative point of view, and to systematically look at the current contours of governance among NPOs. The paper does this through a rigorous examination of vignettes of governance cases among selected NPOs based in KwaZulu-Natal. The study used qualitative and quantitative research methodologies through document analysis, literature review, semi-structured interviews, focus groups and statistical analysis of various primary and secondary sources. It found some cases of good governance but also frightening levels of poor governance. There was an exponential growth in NPOs registered during the period under review, and equally an increase in cases of non-compliance with good governance practices. NPOs operate in an increasingly complex environment. There is contestation for influence and access to resources. Stakeholder management is poorly conceptualized and executed. Recognizing that the NPO sector operates in an environment characterized by complexity, constant change, unpredictability, contestation, diversity and the divergent views of different stakeholders, there is a need to apply legislative and systems thinking approaches to strengthen governance to withstand this turbulence, through a capacity development model that recognizes these contextual and environmental challenges.
Keywords: good governance, non-profit organizations, stakeholder theory, systems theory
1897 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory
Authors: Roy H. A. Lindelauf
Abstract:
Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied abundantly in the operations research and game theory domains; think, for instance, of resource-limited interdiction actions that maximally delay the completion time of a weapons project. This presentation has investigated both cooperative and non-cooperative game-theoretic covert project models and elucidated their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and there are players each of whom is capable of executing a subset of the tasks. Additionally, task interdependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of his scarce resources (what, where and how to monitor) against an attacker scheduling a CKC. This study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques
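As a rough illustration of the multi-glove idea described above, the Python sketch below computes exact Shapley values for a small hypothetical kill-chain game and allocates a monitoring budget in proportion to each agent's value. The stages, agent capabilities, and characteristic function are assumptions made for illustration only, not the models studied in the presentation.

```python
from itertools import permutations

# Hypothetical kill-chain stages and agent capabilities (illustrative only).
STAGES = ["recon", "delivery", "exploit", "exfiltration"]
AGENTS = {
    "A": {"recon", "delivery"},
    "B": {"exploit"},
    "C": {"exploit", "exfiltration"},
    "D": {"recon", "exfiltration"},
}

def value(coalition):
    """Characteristic function: complete kill chains a coalition can run
    (multi-glove style: limited by its scarcest stage capability)."""
    if not coalition:
        return 0
    return min(sum(1 for a in coalition if s in AGENTS[a]) for s in STAGES)

def shapley(agents):
    """Exact Shapley values by enumerating all orderings (fine for small games)."""
    phi = {a: 0.0 for a in agents}
    orders = list(permutations(agents))
    for order in orders:
        seen = set()
        for a in order:
            phi[a] += value(seen | {a}) - value(seen)
            seen.add(a)
    return {a: v / len(orders) for a, v in phi.items()}

if __name__ == "__main__":
    phi = shapley(list(AGENTS))
    total = sum(phi.values())
    # Monitor each attacker agent in proportion to its criticality.
    for a, v in sorted(phi.items(), key=lambda kv: -kv[1]):
        share = 0.0 if total == 0 else 100 * v / total
        print(f"agent {a}: Shapley value {v:.2f}, monitoring share {share:.0f}%")
```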
1896 Understanding Ambivalent Behaviors of Social Media Users toward the 'Like' Function: A Social Capital Perspective
Abstract:
The 'Like' function in social media platforms represents the immediate responses of social media users to postings and other users. A large number of 'likes' is often attributed to fame, agreement, and support from others, which many users are proud of and happy with. However, what 'like' exactly implies in the social media context is still under discussion. Some argue that it is an accurate parameter of the preferences of social media users, whereas others counter that it is merely an instant reaction that is volatile and vague. To address this gap, this study investigates how social media users perceive the 'like' function and behave differently based on their perceptions. This study posits the following arguments. First, 'like' is interpreted as a quantified form of social capital that resides in social media platforms. This incarnated social capital rationalizes the attraction of people to social media and the belief that social media platforms bring benefits to their relationships with others. This social capital is then conceptualized into cognitive and emotive dimensions, where social capital in the cognitive dimension represents the awareness of the 'likes' quantitatively, whereas social capital in the emotive dimension represents the reception of the 'likes' qualitatively. Finally, the ambivalent perspective of social media users on 'like' (i.e., social capital) is applied. This view rationalizes why social media users appreciate the reception of 'likes' from others but are aware that those 'likes' can distort the actual responses of other users by sending erroneous signals. The rationale for this ambivalence is based on whether users perceive social media as a private or public sphere. When social media is more publicized, the ambivalence is more strongly observed. By combining the ambivalence and the dimensionality of social capital, four types of social media users with different mechanisms of liking behavior are identified. To validate this work, a survey of 300 social media users was conducted. The analysis results support most of the hypotheses and confirm that people have ambivalent perceptions of 'like' as a social capital and that these perceptions influence behavioral patterns. The implications of the study are clear. First, this study explains why social media users exhibit different behaviors toward 'likes' in social media. Although most people believe that the number of 'likes' is the simplest and most frank measure of support from other social media users, this study introduces the users who do not trust 'likes' as a stable and reliable parameter of social media. In addition, this study links the concept of social media openness to explain the different behaviors of social media users. Social media openness has theoretical significance because it defines the psychological boundaries of social media from the perspective of users.
Keywords: ambivalent attitude, like function, social capital, social media
1895 Impact of Foreign Aid on Economic Development
Authors: Saeed Anwar
Abstract:
Foreign aid has long been a prominent tool in the pursuit of economic development in recipient countries. This research paper aims to analyze the impact of foreign aid on economic development and explore the effectiveness of aid in promoting sustainable growth, poverty reduction, and improvements in human development indicators. Drawing upon a comprehensive review of existing literature, both theoretical frameworks and empirical evidence are synthesized to provide insights into the complex relationship between foreign aid and economic development. The paper examines various channels through which foreign aid influences economic development, including infrastructure development, education and healthcare investments, technology transfer, and institutional capacity building. It explores the potential positive effects of aid in stimulating economic growth, reducing poverty, and enhancing human capital formation. Additionally, it investigates the potential challenges and limitations associated with aid, such as aid dependency, governance issues, and the potential crowding out of domestic resources. Furthermore, the study assesses the heterogeneity of aid effectiveness across different types of aid modalities, recipient country characteristics, and aid allocation mechanisms. It considers the role of aid conditionality, aid fragmentation, and aid targeting in influencing the effectiveness of aid in promoting economic development. The findings of this research contribute to the ongoing discourse on foreign aid and economic development by providing a comprehensive analysis of the existing literature. The study highlights the importance of context-specific factors, recipient country policies, and aid effectiveness frameworks in determining the impact of foreign aid on economic development outcomes. The insights derived from this research can inform policymakers, donor agencies, and practitioners in designing and implementing effective aid strategies to maximize the positive impact of foreign aid on economic development.
Keywords: foreign aid, economic development, sustainable growth, poverty reduction, human development indicators, infrastructure development, education, healthcare, technology transfer, institutional capacity building, aid effectiveness, aid dependency, governance, crowding out, aid conditionality, aid fragmentation, aid targeting, recipient country policies, aid strategies, donor agencies, policymaking
1894 Examining Risk Based Approach to Financial Crime in the Charity Sector: The Challenges and Solutions, Evidence from the Regulation of Charities in England and Wales
Authors: Paschal Ohalehi
Abstract:
Purpose - The purpose of this paper, which is part of a PhD thesis, is to examine the role of the risk based approach in minimising financial crime in the charity sector, to offer recommendations for improving the quality of charity regulation whilst still retaining the risk based approach as a regulatory framework, and to make a case for a new regulatory model. The increase in financial crime in the charity sector has put the role of regulation in minimising financial crime up for debate amongst researchers and practitioners. Although previous research has addressed the regulation of charities, research on the role of the risk based approach in minimising financial crime in the charity sector is limited. Financial crime is a concern for all organisations, including charities. Design/methodology/approach - This research adopts a social constructionist epistemological position. It is carried out using semi-structured in-depth interviews with 24 randomly selected charity trustees divided into three classes: 10 small charities, 10 medium charities and 4 large charities. The researcher also interviewed 4 stakeholders in the charity sector (the NFA, the Charity Commission and two police forces differing in size and area of coverage). Findings - The results of this research show that reliance on the risk based approach to financial crime in the sector is weak and fragmented, with the research pointing to clear evidence of a disconnect between the regulator and the regulated, leading to little or no regulation of trustees’ activities, limited monitoring of charities and a lack of training and awareness of financial crime in the sector. Originality - This paper shows how the regulation of charities in general, and the risk based approach in particular, can be improved in order to meet the expectations of the stakeholders, the public, the regulator and the regulated.
Keywords: risk, risk based approach, financial crime, fraud, self-regulation
1893 ADP Approach to Evaluate the Blood Supply Network of Ontario
Authors: Usama Abdulwahab, Mohammed Wahab
Abstract:
This paper presents the application of the uncapacitated facility location problem (UFLP) and the 1-median problem to support decision making in blood supply chain networks. A plethora of factors make blood supply-chain networks a complex yet vital problem for the regional blood bank. These factors include rapidly increasing demand; the criticality of the product; strict storage and handling requirements; and the vastness of the theater of operations. As in the UFLP, facilities can be opened at any of m predefined locations with given fixed costs. Clients have to be allocated to the open facilities. In classical location models, the allocation cost is the distance between a client and an open facility. In this model, the costs are the allocation cost, transportation costs, and inventory costs. In order to address this problem, the median algorithm is used to analyze inventory, evaluate supply chain status, monitor performance metrics at different levels of granularity, and detect potential problems and opportunities for improvement. Euclidean distance data for a number of Ontario cities (demand nodes) are used to test the developed algorithm. SITATION software, a Lagrangian relaxation algorithm, and branch-and-bound heuristics are used to solve this model. Computational experiments confirm the efficiency of the proposed approach. Compared to existing modeling and solution methods, the median algorithm approach not only provides a more general modeling framework but also leads to efficient solution times in general.
Keywords: approximate dynamic programming, facility location, perishable product, inventory model, blood platelet, P-median problem
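The p-median component of the model can be illustrated with a small brute-force example. The sketch below assumes hypothetical demand nodes and weights; the actual study uses Euclidean distances between Ontario cities and solves the model with SITATION, Lagrangian relaxation, and branch-and-bound for realistic problem sizes.

```python
from itertools import combinations
import math

# Hypothetical demand nodes (x, y in km) with demand weights (illustrative only).
nodes = {
    "A": ((0, 0), 40), "B": ((10, 2), 25), "C": ((4, 9), 30),
    "D": ((12, 11), 20), "E": ((3, 15), 35), "F": ((15, 5), 15),
}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def p_median(nodes, p):
    """Brute-force p-median: choose p open sites minimizing demand-weighted distance."""
    best_cost, best_sites = float("inf"), None
    for sites in combinations(nodes, p):
        cost = sum(w * min(dist(xy, nodes[s][0]) for s in sites)
                   for xy, w in nodes.values())
        if cost < best_cost:
            best_cost, best_sites = cost, sites
    return best_sites, best_cost

sites, cost = p_median(nodes, p=2)
print("open sites:", sites, "total weighted distance:", round(cost, 1))
```

Brute-force enumeration is only practical for a handful of candidate sites; the Lagrangian relaxation and branch-and-bound methods mentioned above are what scale to realistic networks.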
1892 Sustainability in Retaining Wall Construction with Geosynthetics
Authors: Sateesh Kumar Pisini, Swetha Priya Darshini, Sanjay Kumar Shukla
Abstract:
This paper presents a research study on sustainability in the construction of retaining walls using geosynthetics. Sustainable construction is a way for the building and infrastructure industry to move towards achieving sustainable development, taking into account environmental, socioeconomic and cultural issues. Geotechnical engineering, being very resource intensive, warrants an environmental sustainability study, but a quantitative framework for assessing the sustainability of geotechnical practices, particularly at the planning and design stages, does not exist. In geotechnical projects, the major economic issues to be addressed are the design and construction of stable slopes and retaining structures within space constraints. In this paper, quantitative indicators for assessing the environmental sustainability of retaining walls with geosynthetics are compared with those of conventional concrete retaining walls through life cycle assessment (LCA). Geosynthetics can make a real difference in sustainable construction techniques and contribute to development, in developing countries in particular. Their imaginative application can result in considerable cost savings over the use of conventional designs and materials. The acceptance of geosynthetics in reinforced retaining wall construction has been triggered by a number of factors, including aesthetics, reliability, simple construction techniques, good seismic performance, and the ability to tolerate large deformations without structural distress. A retaining wall reinforced with geosynthetics is the most cost-effective and eco-friendly solution compared with traditional concrete retaining wall construction. This paper presents an analysis of the theme of sustainability applied to the design and construction of traditional concrete retaining walls and presents a cost-effective and environmentally friendly solution using geosynthetics.
Keywords: sustainability, retaining wall, geosynthetics, life cycle assessment
1891 Unshackled Slaves: An Analysis of the Adjudication of Degrading Conditions of Work by Brazilian Labour Courts
Authors: Aline F. C. Pereira
Abstract:
In recent years, modern slavery has increasingly gathered attention in scholarly discussions and policy debates. Whereas mainstream studies focus on forced labour and trafficking, little attention is paid to other forms of exploitation, such as degrading conditions of work, criminalised in Brazil as an autonomous type of slavery since 2003. This paper aims to bridge this gap. It adopts a mixed method that comprises both qualitative and quantitative analysis to investigate the adjudication of 164 cases of degrading conditions of work by Brazilian labour courts. The research discloses an ungrounded reluctance to apply the domestic legal framework, as in most of the cases degrading conditions of work are not recognised as contemporary slavery, despite the law. In some cases, not even situations described as subhuman and degrading of human dignity were framed as slavery. The analysis also suggests that, as in chattel times, lack of freedom and subjection remain relevant in the legal characterisation of slave labour. The examination has further unraveled a phenomenon absent in previous studies: the normalisation of precarity. By depicting precarity as natural and inevitable in rural areas, labour courts ensure conformity to the status quo and reduce the likelihood of resistance by victims. Moreover, the compensation afforded to urban workers is higher than that granted to rural employees, which seems to place human beings in hierarchical categories, a trace of colonialism. In sum, the findings challenge the widespread assumption that Brazil addresses slavery efficiently. Conversely, the Brazilian Labour Judiciary seems to remain subservient to a colonial perspective of slavery, legitimising and sanctioning abusive practices.
Keywords: adjudication, contemporary slavery, degrading conditions of work, normalisation of precarity
1890 Real-Time Radar Tracking Based on Nonlinear Kalman Filter
Authors: Milca F. Coelho, K. Bousson, Kawser Ahmed
Abstract:
Accurately tracking an aerospace vehicle in a time-critical situation and in a highly nonlinear environment is one of the strongest interests within the aerospace community. Tracking is achieved by accurately estimating the state of a moving target, which is composed of a set of variables that provide a complete status of the system at a given time. One of the main ingredients of good estimation performance is the use of efficient estimation algorithms. A well-known framework is Kalman filtering, designed for prediction and estimation problems. The success of the Kalman Filter (KF) in engineering applications is mostly due to the Extended Kalman Filter (EKF), which is based on local linearization. Despite its popularity, the EKF presents several limitations. To address these limitations, and as a possible solution to tracking problems, this paper proposes the use of the Ensemble Kalman Filter (EnKF). Although the EnKF is extensively used in the context of weather forecasting and is recognized for producing accurate and computationally effective estimates for systems of very high dimension, it is almost unknown to the tracking community. The EnKF was initially proposed as an attempt to improve the error covariance calculation, which is difficult to implement in the classic Kalman Filter. Also, in the EnKF method the prediction and analysis error covariances have ensemble representations. These ensembles have sizes that limit the number of degrees of freedom, in such a way that the filter error covariance calculations remain practical for modest ensemble sizes. In this paper, a realistic radar tracking simulation was performed, in which the EnKF was applied and compared with the Extended Kalman Filter. The results suggest that the EnKF is a promising tool for tracking applications, offering advantages in terms of performance.
Keywords: Kalman filter, nonlinear state estimation, optimal tracking, stochastic environment
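A minimal sketch of the stochastic (perturbed-observation) EnKF applied to a toy radar tracking problem is shown below. The constant-velocity motion model, range-bearing measurement, noise levels, and ensemble size are assumptions chosen for illustration and do not reproduce the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_ens, n_steps = 1.0, 200, 40
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]])  # constant velocity
q, r_range, r_bearing = 0.05, 25.0, 0.02          # assumed noise levels (illustrative)
R = np.diag([r_range**2, r_bearing**2])

def h(x):
    """Radar measurement: range and bearing from the origin (nonlinear)."""
    r = np.hypot(x[0], x[1])
    return np.array([r, np.arctan2(x[1], x[0])])

# Truth and ensemble initialisation
truth = np.array([1000.0, 500.0, -8.0, 6.0])
ens = truth + rng.normal(0, [100, 100, 5, 5], size=(n_ens, 4))

for _ in range(n_steps):
    truth = F @ truth + rng.normal(0, q, 4)
    z = h(truth) + rng.multivariate_normal([0, 0], R)

    # Forecast: propagate every member with its own process noise
    ens = ens @ F.T + rng.normal(0, q, size=ens.shape)

    # Analysis: ensemble-estimated covariances replace the EKF linearisation
    Hx = np.array([h(m) for m in ens]) + rng.multivariate_normal([0, 0], R, n_ens)
    X = ens - ens.mean(axis=0)
    Y = Hx - Hx.mean(axis=0)
    Pxy = X.T @ Y / (n_ens - 1)
    Pyy = Y.T @ Y / (n_ens - 1)
    K = Pxy @ np.linalg.inv(Pyy)
    ens = ens + (z - Hx) @ K.T

print("true position:", truth[:2].round(1), "EnKF estimate:", ens.mean(axis=0)[:2].round(1))
```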
1889 Engaging Students in Learning through Visual Demonstration Models in Engineering Education
Authors: Afsha Shaikh, Mohammed Azizur Rahman, Ibrahim Hassan, Mayur Pal
Abstract:
Student engagement in learning is directly affected by the learning resources available to students, such as videos showing applications of a concept or a practical demonstration. In the engineering discipline specifically, there are many challenging concepts that can be simplified when they are connected to real-world scenarios. For this study, the concept of heat exchangers was used, as it is part of multiple engineering disciplines. To make the learning experience enjoyable and impactful, 3-D printed heat exchanger models were created for students to use while working on in-class activities and assignments. Students were encouraged to use the 3-D printed heat exchanger models to enhance their understanding of the theoretical concepts associated with their applications. To assess the effectiveness of the method, feedback was collected from undergraduate engineering students via an anonymous electronic survey. To make the feedback more realistic, unbiased, and genuine, students spent nearly two to three weeks using the models in their in-class assignments. The impact of these tools on their learning was assessed through their performance in ungraded assignments as well as their interactive discussions with peers. ‘Having to apply the theory learned in class whilst discussing with peers on a class assignment creates a relaxed and stress-free learning environment in classrooms’; this feedback came from more than half of the students who took the survey, who also found the 3-D models of the heat exchanger very easy to use. Amongst the many ways to enhance learning and make students more engaged through interactive models, this study sheds light on the importance of physical tools that help create a lasting mental representation in the minds of students. Moreover, in this technologically enhanced era, the concept of augmented reality was considered in this research. The E-drawings application was recommended to enhance the visualization abilities of engineering students so they can see multiple views of the detailed 3-D models and cut through their different sides and angles to visualize them properly. E-drawings could be the next tool to implement in classrooms to enhance students’ understanding of engineering concepts.
Keywords: student engagement, life-long-learning, visual demonstration, 3-D printed models, engineering education
1888 Analysis of Organizational Hybrid Agile Methods Environments: Frameworks, Benefits, and Challenges
Authors: Majid Alsubaie, Hamed Sarbazhosseini
Abstract:
Many working environments have experienced increased uncertainty due to the fast-moving and unpredictable world. IT systems development projects, in particular, face several challenges because of their rapidly changing environments and emerging technologies. Information technology organizations within these contexts adapt systems development methodologies and new software approaches to address this issue. One of these methodologies is the Agile method, which has gained huge attention in recent years. However, due to failure rates in IT projects, there is an increasing demand for the use of hybrid Agile methods among organizations. The scarce research in the area means that organizations do not have solid evidence-based knowledge for the use of hybrid Agile. This research was designed to provide further insights into the development of hybrid Agile methods within systems development projects, including how frameworks and processes are used and what benefits and challenges are gained and faced as a result of hybrid Agile methods. This paper presents how three organizations (two government and one private) use hybrid Agile methods in their Agile environments. The data was collected through interviews and a review of relevant documents. The results indicate that these organizations do not predominantly use pure Agile. Instead, they are waterfall organizations by virtue of the nature and complexity of their systems, and Agile is used underneath as the delivery model. The PRINCE2 Agile framework, SAFe, Scrum, and Kanban were the models and frameworks identified. This study also found that customer satisfaction and the ability to build quickly are the most frequently perceived benefits of using hybrid Agile methods. In addition, team resistance and scope changes are the common challenges identified by research participants in their working environments. The findings can help in understanding Agile environmental conditions and projects, which can lead to better success rates and customer satisfaction.
Keywords: agile, hybrid, IT systems, management, success rate, technology
1887 Comparison of Serum Protein Fraction between Healthy and Diarrhea Calf by Electrophoretogram
Authors: Jinhee Kang, Kwangman Park, Ruhee Song, Suhee Kim, Do-Hyeon Yu, Kyoungseong Choi, Jinho Park
Abstract:
Statement of the Problem: The components of animal blood maintain homeostasis when animals are healthy, and changes in the chemical composition of the blood and body fluids can be observed when animals have a disease. In particular, newborn calves are susceptible to disease, and therefore hematologic tests and serum chemistry tests can become an important guideline for the diagnosis and treatment of diseases. Diarrhea in newborn calves is the most damaging condition for cattle ranches, whether dairy or fattening operations, and accounts for a large part of calf atrophy and death. However, since electrophoresis studies of calves had not been carried out, a survey analysis was conducted. Methodology and Theoretical Orientation: The calves were divided into healthy calves and diseased (diarrhea) calves and were further classified by age into 1-14 d, 15-28 d, and more than 28 d. Fecal state was classified as solid (value 0), semi-solid (value 1), loose (value 2) and watery (value 3). In solid and semi-solid feces, no pathogens were detected, whereas pathogens were detected in loose and watery feces. Findings: ALB, α-1, α-2, α-SUM, β and γ (gamma) fractions were examined by electrophoresis in healthy and diarrheic calves. The results showed age-related differences between healthy and diarrheic calves. For γ-globulin at 1-14 days of age, healthy calves averaged 16.8% while diarrheic calves averaged 7.7%; for the α-2 fraction at 1-14 days, healthy calves averaged 5.2% and diarrheic calves 8.7%, higher than the healthy animals. For α-1 at 15-28 days and after 28 days, healthy calves averaged 10.4% and 7.5%, whereas diarrheic calves averaged 12.6% and 12.4%, higher than healthy calves. For α-SUM, healthy calves averaged 21.6%, 16.8%, and 14.5% at 1-14 days, 15-28 days and after 28 days, respectively, while diarrheic calves averaged 23.1%, 19.5%, and 19.8%. Conclusion and Significance: In this study, the electrophoresis results of healthy calves and diseased (diarrhea) calves were examined. γ-globulin at 1-14 days of age was lower in diarrheic calves than in healthy calves, indicating that the calves were unable to consume colostrum from the mother as newborns. The increased α-1, α-2, and α-SUM levels in calves with diarrhea may be associated with an acute inflammatory response. Further research is needed to investigate the effects of acute inflammatory responses on additional calf serum proteins. Information on the results of the electrophoresis test will be provided where necessary according to the item.
Keywords: alpha, electrophoretogram, serum protein, γ, gamma
1886 Game-Theory-Based on Downlink Spectrum Allocation in Two-Tier Networks
Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang
Abstract:
The capacity of conventional cellular networks has reached its upper bound, and this can be addressed by introducing femtocells, which are low-cost and easy to deploy. The spectrum interference issue becomes more critical as value-added multimedia services grow rapidly in two-tier cellular networks. Spectrum allocation is one of the effective methods of interference mitigation. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game, wherein the femto base stations are the players and the available frequency channels are the strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects the interference from the standpoint of the channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function of the distance weight ratio, aimed at suppressing co-channel interference within the same network layer. This scenario is well suited to actual network deployment, and the system possesses high robustness. According to the proposed mechanism, interference exists only when players employ the same channel for data communication. This paper focuses on implementing spectrum allocation in a distributed fashion. Numerical results show that the signal-to-interference-and-noise ratio can be noticeably improved through the spectrum allocation scheme and that the users' downlink quality of service can be satisfied. Moreover, the simulation results show that the average spectrum efficiency of the cellular network can be significantly improved.
Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation
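The distributed channel-selection idea can be sketched as a best-response iteration in which each femto base station repeatedly picks the channel that maximizes its utility, and co-channel interference is counted only from stations on the same channel. The utility below is an assumed negative-logarithm function of distance-weighted interference, inspired by, but not identical to, the formulation described in the abstract; the layout, path-loss exponent, and channel count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_fbs, n_channels, d0 = 8, 3, 1.0          # femto base stations, channels, reference distance
pos = rng.uniform(0, 50, size=(n_fbs, 2))  # random FBS layout in a 50 m x 50 m area (illustrative)
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
np.fill_diagonal(dist, np.inf)             # a station does not interfere with itself

def utility(i, channel, assign):
    """Assumed utility: -log(1 + co-channel interference), where interference from
    station j decays with distance as (d0 / d_ij)**alpha and is counted only when
    j uses the same channel as i."""
    alpha = 3.0
    interference = sum((d0 / dist[i, j]) ** alpha
                       for j in range(n_fbs) if j != i and assign[j] == channel)
    return -np.log(1.0 + interference)

assign = rng.integers(0, n_channels, n_fbs)     # random initial channel choices
for _ in range(50):                             # best-response dynamics
    changed = False
    for i in range(n_fbs):
        best = max(range(n_channels), key=lambda c: utility(i, c, assign))
        if best != assign[i]:
            assign[i], changed = best, True
    if not changed:
        break                                   # no station wants to deviate

print("channel assignment at equilibrium:", assign.tolist())
```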
1885 Externalised Migration Controls and the Deportation of Minors and Potential Refugees from Mexico
Authors: Vickie Knox
Abstract:
Since the ‘urgent humanitarian crisis’ of the arrival of tens of thousands of Central American minors at the Mexico-US border in early 2014, the USA has increasingly externalised migration controls to Mexico. Although the resulting policy, ‘Plan Frontera Sur’, claimed to protect migrants’ human rights, it has manifested as harshly delivered in-country controls and an alarming increase in deportations, particularly of minors. This is of particular concern given the ongoing situation of forced migration caused by criminal violence in Central America, because these deportations do not all comply with Mexico’s international obligations or with its own legal framework for international protection, which allows, inter alia, verbal asylum claims and grants minors additional protection against deportation. Notably, the volume of deportations, the speed with which they are carried out and the lack of adequate screening indicate non-compliance with the principle of non-refoulement and the right to claim asylum or other forms of protection. Based on qualitative data gathered in fieldwork in 2015 and quantitative data covering the period 2014-2016, this research details three types of adverse outcome resulting from these externalised controls: human rights violations perpetrated in order to deliver the policy, namely deportations that may not comply with the principle of non-refoulement or the protection of minors; human rights violations perpetrated in the execution of the policy, such as violations by state actors during apprehension and detention; and adverse consequences of the policy, such as increased risk during transit. This research has particular resonance as the Trump era brings tighter enforcement in the region, and it has broader relevance for the study of externalisation tools on a global level.
Keywords: deportation, externalisation, forced migration, non-refoulement
1884 Establishing a Computational Screening Framework to Identify Environmental Exposures Using Untargeted Gas-Chromatography High-Resolution Mass Spectrometry
Authors: Juni C. Kim, Anna R. Robuck, Douglas I. Walker
Abstract:
The human exposome, which includes chemical exposures over the lifetime and their effects, is now recognized as an important measure for understanding human health; however, the complexity of the data makes the identification of environmental chemicals challenging. The goal of our project was to establish a computational workflow for the improved identification of environmental pollutants containing chlorine or bromine. Using the “pattern.search” function available in the R package NonTarget, we wrote a multifunctional script that searches mass spectral clusters from untargeted gas-chromatography high-resolution mass spectrometry (GC-HRMS) for the presence of spectra consistent with chlorine- and bromine-containing organic compounds. The “pattern.search” function was incorporated into a different function that allows the evaluation of clusters containing multiple analyte fragments, has multi-core support, and provides a simplified output listing compounds that contain chlorine and/or bromine. The new function was able to process 46,000 spectral clusters in under 8 seconds and identified over 150 potential halogenated spectra. We next applied our function to a deidentified dataset from patients diagnosed with primary biliary cholangitis (PBC), primary sclerosing cholangitis (PSC), and healthy controls. Twenty-two spectra corresponded to potential halogenated compounds in the PSC and PBC dataset, including six that differed significantly in PBC patients, while four differed in PSC patients. We have developed an improved algorithm for detecting halogenated compounds in GC-HRMS data, providing a strategy for prioritizing exposures in the study of human disease.
Keywords: exposome, metabolome, computational metabolomics, high-resolution mass spectrometry, exposure, pollutants
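The idea behind the screen can be illustrated with a minimal Python sketch that flags mass spectral clusters whose M/M+2 spacing and intensity ratio are consistent with one chlorine or one bromine atom. This is not the pattern.search algorithm from the R package; the tolerances, the single-halogen assumption, and the peak-list format are simplifications assumed for illustration.

```python
# Screen (m/z, intensity) clusters for a Cl- or Br-like M/M+2 isotope signature.
CL_RATIO, BR_RATIO = 0.320, 0.973   # approximate 37Cl/35Cl and 81Br/79Br abundance ratios
M2_DELTA = 1.997                    # approximate m/z spacing of the A+2 isotopologue
MZ_TOL, RATIO_TOL = 0.01, 0.25      # assumed matching tolerances

def flag_halogen(cluster):
    """Return 'Cl', 'Br' or None for a cluster of (mz, intensity) peaks."""
    peaks = sorted(cluster, key=lambda p: -p[1])
    base_mz, base_int = peaks[0]
    for mz, inten in cluster:
        if abs(mz - (base_mz + M2_DELTA)) <= MZ_TOL and base_int > 0:
            ratio = inten / base_int
            if abs(ratio - CL_RATIO) <= RATIO_TOL * CL_RATIO:
                return "Cl"
            if abs(ratio - BR_RATIO) <= RATIO_TOL * BR_RATIO:
                return "Br"
    return None

# Toy clusters: one chlorinated-looking pattern, one plain hydrocarbon-like pattern.
clusters = [
    [(235.008, 1.0e6), (236.011, 1.2e5), (237.005, 3.1e5)],   # M+2 at ~31% -> Cl-like
    [(221.154, 8.0e5), (222.157, 1.1e5)],                     # only a 13C peak -> not flagged
]
for c in clusters:
    print(flag_halogen(c))
```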
1883 Standard Essential Patents for Artificial Intelligence Hardware and the Implications For Intellectual Property Rights
Authors: Wendy de Gomez
Abstract:
Standardization is a critical element in the ability of a society to reduce uncertainty, subjectivity, misrepresentation, and interpretation while simultaneously contributing to innovation. Technological standardization is critical for codifying specific operationalizations through legal instruments that provide rules of development, expectation, and use. In the current emerging-technology landscape, Artificial Intelligence (AI) hardware, as a general-use technology, has seen incredible growth, as evidenced by AI technology patents filed between 2012 and 2018 in the United States Patent and Trademark Office (USPTO) AI dataset. However, as outlined in the 2023 United States Government National Standards Strategy for Critical and Emerging Technology, the codification through standardization of emerging technologies such as AI has not kept pace with their actual technological proliferation. This gap has the potential to cause significantly divergent possibilities for the downstream outcomes of AI in both the short and long term. This original empirical research provides an overview of the standardization efforts around AI in different geographies and provides a background to standardization law. It quantifies the longitudinal trend of Artificial Intelligence hardware patents through the USPTO AI dataset. It seeks evidence of existing Standard Essential Patents (SEPs) among these AI hardware patents through a text analysis of the Statement of patent history and the Field of the invention of these patents in Patent Vector, and it examines their determination as Standard Essential Patents and their inclusion in existing AI technology standards across the four main AI standards bodies: the European Telecommunications Standards Institute (ETSI); the International Telecommunication Union Telecommunication Standardization Sector (ITU-T); the Institute of Electrical and Electronics Engineers (IEEE); and the International Organization for Standardization (ISO). Once the analysis is complete, the paper discusses both the theoretical and operational implications of FRAND licensing agreements for the owners of these Standard Essential Patents in the United States court and administrative system. It concludes with an evaluation of how Standard Setting Organizations (SSOs) can work with SEP owners more effectively through various forms of intellectual property mechanisms, such as patent pools.
Keywords: patents, artificial intelligence, standards, FRAND agreements
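A minimal sketch of the kind of text screen described above, flagging patent records whose text fields mention a standards body or SEP-related language, is given below. The field names, term list, and record format are assumptions for illustration and do not represent Patent Vector's data model or the study's actual coding scheme.

```python
import re

# Standards bodies and phrases that may signal standard-essential subject matter.
SSO_TERMS = ["ETSI", "ITU-T", "IEEE", "ISO", "3GPP", "essential to the standard", "FRAND"]
pattern = re.compile(r"\b(?:" + "|".join(re.escape(t) for t in SSO_TERMS) + r")\b",
                     re.IGNORECASE)

def screen_patent(record):
    """Flag a patent record whose text fields mention a standards body or SEP language.

    `record` is assumed to be a dict with 'statement_of_history' and
    'field_of_invention' text fields (hypothetical column names).
    """
    text = " ".join(record.get(k, "") for k in ("statement_of_history", "field_of_invention"))
    return sorted({m.group(0).upper() for m in pattern.finditer(text)})

example = {
    "statement_of_history": "The claimed accelerator implements channel coding adopted by 3GPP.",
    "field_of_invention": "AI inference hardware for radio access networks (ETSI NR).",
}
print(screen_patent(example))   # -> ['3GPP', 'ETSI']
```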
1882 Latinx Adult ELLs: Exploring English Instructors’ Perceptions of Classroom Diversity and Culturally Diverse Teaching Strategies
Authors: Sharon Diaz Ruiz
Abstract:
This qualitative study addresses college English instructors’ perceptions of classroom diversity and culturally diverse teaching strategies within the adult English language learning classroom environment. Every year, college English instructors face numerous challenges as the adult Latinx population keeps rising. To better understand Latinx adult learners and language classroom dynamics, research should focus on the experiences, pedagogical methods, and teaching insights of full-time and adjunct minority professors at degree-granting postsecondary institutions. Culturally responsive teaching is used as the framework to understand and explore the perceptions of English instructors of the realities and needs of Latinx adult emergent bilinguals enrolled in developmental English courses. Snowball sampling allows the researcher to locate members who meet these specific criteria: adjunct and part-time English instructors of adult Latinx language learners. Participants answered a demographic questionnaire and then contributed to 45-minute in-depth interviews exploring their perceptions of culturally responsive practices in Latinx adult emergent bilinguals’ basic and intermediate developmental English courses. The interviews shed light on topics such as teaching biases, educators’ cultural experiences, and the resources and strategies faculty recommend for effective culturally responsive teaching. The results of this investigation will help fill the gap in the literature documenting the application of culturally responsive pedagogy to Latino adult language learners.
Keywords: Latinx, English language learners, English faculty, adult learners, critical theory, culturally responsive theory
1881 An Introduction to Giulia Annalinda Neglia Viewpoint on Morphology of the Islamic City Using Written Content Analysis Approach
Authors: Mohammad Saber Eslamlou
Abstract:
The morphology of Islamic cities has been extensively studied by researchers of Islamic cities, and different theories can be found about it. In this regard, there are many differences in the methods of analysis, classification, recognition, confrontation and comparison of urban morphology. The present paper aims to examine the previous methods, approaches and insights, and how Dr. Giulia Annalinda Neglia has dealt with the analysis of the morphology of Islamic cities. Neglia is an assistant professor at the University of Bari, Italy (UNIBA), who has published numerous papers and books on Islamic cities. I introduce her works in the field of the morphology of Islamic cities, and then her thoughts, insights and research methodologies are presented and analyzed from a critical perspective. This is a qualitative study of her written works, which have been classified into three major categories. The first category consists mainly of her works on the morphology and physical shape of Islamic cities. The review of her works suggests that she has used Muratorian typology in investigating the morphology of Islamic cities. Moreover, the overall structure of the cities under investigation is often described as linear; however, she is against defining a single framework for the recognition of morphology in Islamic cities. She states that ‘to understand the physical complexity and irregularities in Islamic cities, it is necessary to study the urban fabric by typology method, focusing on transformation processes of the buildings’ form and their surrounding open spaces’, and she believes that the fabric of each region in the city follows the principles of a specific period or urban pattern, in particular Hellenistic and Roman structures. Furthermore, she believes that it is impossible to understand the morphology of a city without taking into account the obvious and hidden developments associated with it, because the form of buildings and their surrounding open spaces is the written history of the city.
Keywords: city, Islamic city, Giulia Annalinda Neglia, morphology
1880 The Effect of Bilingualism on Prospective Memory
Authors: Aslı Yörük, Mevla Yahya, Banu Tavat
Abstract:
It is well established that bilinguals outperform monolinguals on executive function tasks. However, the effects of bilingualism on prospective memory (PM), which also requires executive functions, have not been investigated extensively. This study aimed to compare bilingual and monolingual participants' PM performance on focal and non-focal PM tasks. Considering that bilinguals have greater executive function abilities than monolinguals, we predicted that bilinguals’ PM performance would be higher than monolinguals' on the non-focal PM task, which requires controlled monitoring processes. To investigate these predictions, we administered focal and non-focal PM tasks and measured PM and ongoing task performance. Forty-eight Turkish-English bilinguals residing in North Macedonia and forty-eight Turkish monolinguals living in Turkey, between the ages of 18 and 30, participated in the study. They were instructed to remember to respond to rarely appearing PM cues while engaged in an ongoing task, i.e., a spatial working memory task. The focality of the task was manipulated by giving different instructions for the PM cues. In the focal PM task, participants were asked to remember to press the enter key whenever a particular target stimulus appeared in the working memory task; in the non-focal PM task, instead of responding to a specific target shape, participants were asked to remember to press the enter key whenever the background color of the working memory trials changed to a specific color (yellow). To analyze the data, we performed a 2 × 2 mixed factorial ANOVA with task (focal versus non-focal) as a within-subject variable and language group (bilinguals versus monolinguals) as a between-subject variable. The results showed no direct evidence for a bilingual advantage in PM. That is, the groups' performance did not differ in PM accuracy or ongoing task accuracy. However, bilinguals were overall faster in the ongoing task, although this was not specific to PM cue focality. Moreover, the results showed a reversed effect of PM cue focality on ongoing task performance; that is, both bilinguals and monolinguals showed enhanced performance in the non-focal PM cue task. These findings raise skepticism about the prevalent findings and theoretical explanations in the literature. Future studies should investigate possible alternative explanations.
Keywords: bilingualism, executive functions, focality, prospective memory
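One way such a design can be analyzed in code is sketched below: a linear mixed model with a random intercept per participant, whose Task × Group interaction term addresses the same question as the 2 × 2 mixed factorial ANOVA reported above. The data are simulated with made-up effect sizes and column names purely to make the sketch runnable; they are not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format data standing in for the real sample (values are made up).
rng = np.random.default_rng(3)
subjects = [f"s{i}" for i in range(96)]
groups = ["bilingual"] * 48 + ["monolingual"] * 48
rows = []
for subj, grp in zip(subjects, groups):
    base = rng.normal(0.80, 0.05)                      # participant-level baseline accuracy
    for task in ("focal", "nonfocal"):
        acc = base - (0.10 if task == "nonfocal" else 0.0) + rng.normal(0, 0.03)
        rows.append({"subject": subj, "group": grp, "task": task, "pm_accuracy": acc})
df = pd.DataFrame(rows)

# Random-intercept model: the task:group interaction is the focality-by-language-group test.
model = smf.mixedlm("pm_accuracy ~ task * group", data=df, groups=df["subject"]).fit()
print(model.summary())
```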
1879 The Investigate Relationship between Moral Hazard and Corporate Governance with Earning Forecast Quality in the Tehran Stock Exchange
Authors: Fatemeh Rouhi, Hadi Nassiri
Abstract:
Earnings forecasts are a key element in economic decisions, but situations such as conflicts of interest in financial reporting, complexity, and the lack of direct access to information have led to the phenomenon of information asymmetry between individuals within the organization and external investors and creditors. This gives rise to adverse selection and moral hazard in investors' decisions and makes direct assessment of the underlying data difficult for users. In this regard, the role of corporate governance and its disclosure becomes clear: it comprises controls and procedures designed to ensure that management does not act in its own interests but moves in the direction of maximizing shareholder and company value. Given the importance of earnings forecasts in the capital market and the need to identify the factors influencing them, this study attempts to establish the relationship between moral hazard and corporate governance and the earnings forecast quality of companies operating in the capital market. Drawing on the theoretical basis of the research, two main hypotheses and several sub-hypotheses are presented, which were examined on the basis of available models using the panel data method; conclusions were drawn at the 95% confidence level according to the significance of the model and of each independent variable. In examining the models, the Chow test was first used to determine whether the panel data method or the pooled method should be used. Following that, the Hausman test was applied to choose between random effects and fixed effects. The findings of the study show that most of the variables linking moral hazard to earnings forecast quality are positively associated; that is, with increasing moral hazard, the earnings forecast quality of companies listed on the Tehran Stock Exchange increases. Among the corporate governance variables, board independence has a significant relationship with earnings forecast accuracy and earnings forecast bias, but the relationship between board size and earnings forecast quality is not statistically significant.
Keywords: corporate governance, earning forecast quality, moral hazard, financial sciences
1878 Qualitative Analysis of Current Child Custody Evaluation Practices
Authors: Carolyn J. Ortega, Stephen E. Berger
Abstract:
The role of the custody evaluator is perhaps one of the most controversial and risky endeavors in clinical practice. Complaints regarding a child-custody evaluation constitute the second most common type of complaint filed with licensing boards. Although the evaluator is expected to answer for the family-law court what is in the “best interest of the child,” there is a lack of clarity on how to establish this in any empirically validated manner. Hence, practitioners must contend with a nebulous framework when formulating their methodological procedures, which inherently places them at risk in an already litigious context. This study sought to qualitatively investigate patterns of practice among doctoral practitioners conducting child custody evaluations in Southern California. Ten psychologists who devoted between 25% and 100% of their California private practice to custody work were interviewed. All held Ph.D. degrees and had between eight and 36 years of experience in custody work. Semi-structured interviews were used to investigate assessment practices, adherence to guidelines, risk management, and the qualities of evaluators. Forty-three Specific Themes were identified using Interpretive Phenomenological Analysis (IPA). Seven Higher Order Themes clustered on salient factors such as use of Ethics, Law, and Guidelines; Parent Variables; Child Variables; Psychologist Variables; Testing; Literature; and Trends. Evaluators were aware of the ever-present reality of a licensure complaint and thus presented idiosyncratic descriptions of risk management considerations. Ambiguity about quantifying and validly tapping parenting abilities was also reviewed. Findings from this study suggest a high reliance on unstructured and observational methods in child custody practices.
Keywords: forensic psychology, psychological testing, assessment methodology, child custody
1877 Architectural Adaptation for Road Humps Detection in Adverse Light Scenario
Authors: Padmini S. Navalgund, Manasi Naik, Ujwala Patil
Abstract:
A road hump is a semi-cylindrical elevation built across the road at specific locations. A vehicle needs to maneuver over the hump at reduced speed to avoid damage and to pass over it safely. If road humps are identified in advance, they help maintain the safety and stability of vehicles, especially in adverse visibility conditions, viz. night scenarios. We have proposed a deep learning architectural adaptation, implementing the Mish activation function and developing a new classification loss function called "Effective Focal Loss", for the detection of Indian road humps in adverse light scenarios. We captured images of marked and unmarked road humps with two different types of cameras across South India to build a heterogeneous dataset. The heterogeneous dataset enabled the algorithm to train on varied data and improved detection accuracy. The images were pre-processed and annotated for two classes, viz. marked humps and unmarked humps. The dataset built from these images was used to train a single-stage object detection algorithm. We used an algorithm to synthetically generate reduced-visibility road hump scenarios. We observed that our proposed framework effectively detected marked and unmarked humps in images in both clear and adverse light environments. This architectural adaptation provides an option for the early detection of Indian road humps in reduced visibility conditions, thereby enhancing autonomous driving technology to handle a wider range of real-world scenarios.
Keywords: Indian road hump, reduced visibility condition, low light condition, adverse light condition, marked hump, unmarked hump, YOLOv9
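The two ingredients named above can be sketched directly. The snippet below shows the standard Mish activation and a standard binary focal loss in PyTorch; the paper's "Effective Focal Loss" modifies the focal-loss idea, but its exact form is not given in the abstract, so the alpha and gamma defaults here are assumptions.

```python
import torch
import torch.nn.functional as F

def mish(x):
    """Mish activation: x * tanh(softplus(x))."""
    return x * torch.tanh(F.softplus(x))

def focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Standard binary focal loss (down-weights easy examples); the paper's
    'Effective Focal Loss' builds on this idea with unspecified modifications."""
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    p = torch.sigmoid(logits)
    p_t = p * targets + (1 - p) * (1 - targets)            # probability of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * bce).mean()

# Toy check: hard examples (low probability for the true class) dominate the loss.
logits = torch.tensor([3.0, -2.0, 0.5])
targets = torch.tensor([1.0, 1.0, 0.0])
print(mish(torch.tensor([-1.0, 0.0, 2.0])))
print(focal_loss(logits, targets))
```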
1876 The Ideology of the Jordanian Media Women’s Discourse: Lana Mamkgh as an Example
Authors: Amani Hassan Abu Atieh
Abstract:
This study aims to examine the patterns of ideology reflected in the written discourse of women writers in the Jordanian media, taking Lana Mamkgh as an example. It critically analyzes the discursive, linguistic, and cognitive representations that she employs as an agent in the institutionalized discourse of the media. Grounded in van Dijk’s critical discourse analysis approach to Sociocognitive Discourse Studies, the present study builds a multilayer framework that encompasses van Dijk’s triangle: discourse, society, and cognition. Specifically, the study attempts to analyze, at both the micro and macro levels, the underlying cognitive processes and structures, mainly ideology and discursive strategies, that are functional in the production of women’s discourse in terms of meaning, forms, and functions. The cognitive processes that social actors adopt are underpinned by experience/context and semantic mental models on the one hand and by social cognition on the other. This study is based on qualitative research and adopts purposive sampling, taking as an example an opinion article written by Lana Mamkgh in the Arabic Jordanian daily Al Rai. In her role as an agent in the public sphere, she stresses national and feminist ideologies, demonstrating the use of assertive, evaluative, and expressive linguistic and rhetorical devices that appeal to the logic, ethics, and emotions of the addressee. Highlighting the agency of Jordanian women writers in the media, the study seeks to achieve the macro goal of dispensing political and social justice to the underprivileged. Further, the study seeks to show that the voice of Jordanian women, often viewed as underrepresented and invisible in the public arena, comes through clearly.
Keywords: critical discourse analysis, sociocognitive theory, ideology, women discourse, media
Procedia PDF Downloads 111
1875 Learning Physics Concepts through Language Syntagmatic Paradigmatic Relations
Authors: C. E. Laburu, M. A. Barros, A. F. Zompero, O. H. M. Silva
Abstract:
This work presents a teaching strategy that employs syntagmatic and paradigmatic linguistic relations to monitor physics students' understanding of concepts. Syntagmatic and paradigmatic relations are theoretical elements of semiotic studies, and our research situates and justifies them within the research program of multi-modal representations. Among the multi-modal representations used for learning scientific knowledge, syntagmatic and paradigmatic relations operate within the written discursive form. The purpose of using such relations is to innovate didactic work with discursive representation in written form before it is translated into a different representational form. The research was conducted with a sample of first-year high school students, who were asked to produce syntagmatic and paradigmatic relations for the statement of Newton's first law. The statement was handed out on paper to each student, who wrote the relations individually, and the students' records were collected for analysis. For one student, used here as an example, it was observed that the moneme replacements and rearrangements produced by, respectively, paradigmatic and syntagmatic relations kept the original meaning of the law. In the paradigmatic production, the student specified the relevant significant units of the linguistic signs, the monemes, which constitute the first articulation, and each substituted word kept its equivalence to the meaning of the original moneme. It was also noted that many diverse monemes were chosen, with a balanced combination of grammatical monemes (the smallest linguistic units with grammatical meaning, which change the meaning of a word in certain positions of the syntagma, together with a relatively small number of other monemes) and lexical monemes (those belonging to unlimited inventories and endowed with lexical meaning). In the syntagmatic production, the orderings of monemes were syntactically coherent, with meaning and number preserved. Overall, the results showed that the written representational mode based on paradigmatic and syntagmatic linguistic relations is suitable for classroom use as a potential means of identifying and tracking the meanings students acquire in the process of scientific inquiry. Keywords: semiotics, language, high school, physics teaching
Procedia PDF Downloads 134
1874 Intersectionality and Sensemaking: Advancing the Conversation on Leadership as the Management of Meaning
Authors: Clifford Lewis
Abstract:
This paper aims to advance the conversation on an alternative view of leadership, namely 'leadership as the management of meaning'. Here, leadership is considered a social process of managing meaning within an employment context, as opposed to a psychological trait, a set of behaviours, or a relational consequence, as in mainstream leadership research. Specifically, this study explores the relationship between intersectional identities and the management of meaning. Design: Semi-structured, one-on-one interviews were conducted with women and men of colour working in various leadership positions in South African private-sector organisations. Employing an intersectional approach using gender and race, participants were selected through purposive and snowball sampling used concurrently. Thematic and axial coding were used to identify dominant themes. Findings: The findings suggest that both gender and race shape how leaders manage meaning. They also confirm that intersectionality is an appropriate approach for studying the leadership experiences of groups underrepresented in organisational leadership structures. The findings point to the need for further research into the differential effects of intersecting identities on organisational leadership experiences and show that 'leadership as the management of meaning' is an appropriate approach for addressing this knowledge gap. Theoretical Contribution: There is a large body of literature on the complex challenges faced by women and people of colour in leadership, but relatively little empirical work on how identity influences the management of meaning. This study contributes to the leadership literature by providing insight into how intersectional identities influence the management of meaning at work and how this affects the leadership experiences of largely marginalised groups. Practical Implications: Understanding the leadership experiences of underrepresented groups is important both for meeting legal mandates and for building diverse talent in organisations and societies. Such an understanding helps practitioners avoid simplistic notions of the challenges individuals might face in accessing and practicing leadership in organisations. Advancing the conversation on leadership as the management of meaning allows for a better understanding of the complex challenges faced by women and people of colour and gives organisations an opportunity to systematically remove unfair structural obstacles and develop their diverse leadership capacity. Keywords: intersectionality, diversity, leadership, sensemaking
Procedia PDF Downloads 276
1873 Decision Support System for the Management of the Shandong Peninsula, China
Authors: Natacha Fery, Guilherme L. Dalledonne, Xiangyang Zheng, Cheng Tang, Roberto Mayerle
Abstract:
A Decision Support System (DSS) for supporting decision makers in the management of the Shandong Peninsula has been developed, with emphasis on coastal protection, coastal cage aquaculture, and harbors. The investigations were carried out within the framework of a joint research project funded by the German Ministry of Education and Research (BMBF) and the Chinese Academy of Sciences (CAS). This paper presents a description of the DSS, the development of its components, and results of its application. The system integrates in-situ measurements, process-based models, and a database management system. Numerical models for the simulation of flow, waves, sediment transport, and morphodynamics covering the entire Bohai Sea were set up based on the Delft3D modelling suite (Deltares). Calibration and validation of the models were carried out using measurements from moored Acoustic Doppler Current Profilers (ADCPs) and High-Frequency (HF) radars. To enable cost-effective and scalable applications, a database management system was developed; it enhances information processing and data evaluation and supports the generation of data products. Results of applying the DSS to the management of coastal protection, coastal cage aquaculture, and harbors are presented. Model simulations covering the most severe storms observed during the last decades were carried out, leading to an improved understanding of hydrodynamics and morphodynamics. The results helped identify coastal stretches subjected to higher energy levels and improved the support for coastal protection measures. Keywords: coastal protection, decision support system, in-situ measurements, numerical modelling
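The calibration and validation step described above can be illustrated with a minimal sketch: comparing modelled current speeds against moored ADCP measurements at a station and computing simple skill metrics. The file name, column names, and choice of metrics below are assumptions for illustration, not the project's actual code or data layout.

# Hypothetical sketch of the model-validation step: compare Delft3D output
# with moored ADCP measurements and report basic skill metrics.
import numpy as np
import pandas as pd

def validation_metrics(observed: np.ndarray, modelled: np.ndarray) -> dict:
    """Return bias, RMSE, and correlation between observed and modelled series."""
    bias = float(np.mean(modelled - observed))
    rmse = float(np.sqrt(np.mean((modelled - observed) ** 2)))
    corr = float(np.corrcoef(observed, modelled)[0, 1])
    return {"bias": bias, "rmse": rmse, "correlation": corr}

if __name__ == "__main__":
    # Assumed CSV layout: timestamp, adcp_speed (m/s), model_speed (m/s)
    df = pd.read_csv("adcp_vs_delft3d_station01.csv", parse_dates=["timestamp"])
    metrics = validation_metrics(df["adcp_speed"].to_numpy(),
                                 df["model_speed"].to_numpy())
    print(metrics)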
Procedia PDF Downloads 196
1872 The Identification of Environmentally Friendly People: A Case of South Sumatera Province, Indonesia
Authors: Marpaleni
Abstract:
The Intergovernmental Panel on Climate Change (IPCC) declared in 2007 that global warming and climate change are not just a series of events caused by nature but are also caused by human behaviour. Thus, reducing the impact of human activities on climate change requires information about how people respond to environmental issues and what constraints they face. However, information on these and other phenomena remains largely missing or not fully integrated within existing data systems. The proposed study aims to fill this knowledge gap by focusing on the Environmentally Friendly Behaviour (EFB) of the people of Indonesia, taking the province of South Sumatera as a case study. EFB is defined as any activity in which people engage to improve the condition of natural resources and/or to diminish the impact of their behaviour on the environment. This behaviour is measured in terms of consumption in five areas at the household level, namely housing, energy, water usage, recycling, and transportation. By adopting Indonesia's Environmentally Friendly Behaviour survey conducted by Statistics Indonesia in 2013, this study aims to identify people's orientation towards EFB based on socio-demographic characteristics such as age, income, occupation, location, education, gender, and family size. The results of this research will be useful for identifying what support people require to strengthen their EFB, for identifying the specific constraints that different actors and groups face, and for developing a more holistic understanding of EFB in relation to particular demographic and socio-economic contexts. As the empirical data come from the national sample framework, which will continue to be collected, they can be used to forecast and monitor EFB in the future. Keywords: environmentally friendly behavior, demographic, South Sumatera, Indonesia
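As a purely illustrative sketch of the household-level measurement across the five areas named above, a simple composite score could be tabulated against one socio-demographic characteristic. This is not the Statistics Indonesia methodology; all column names, codings, and values below are made-up assumptions.

# Illustrative sketch only: composite EFB score from the five household areas.
import pandas as pd

# The five household consumption areas named in the abstract.
EFB_AREAS = ["housing", "energy", "water_usage", "recycling", "transportation"]

if __name__ == "__main__":
    # Assumed survey extract: one row per household, each area coded 0-1
    # (all values are fictitious illustration data, not survey results).
    households = pd.DataFrame({
        "education": ["primary", "secondary", "tertiary", "secondary"],
        "housing": [0.4, 0.6, 0.8, 0.5],
        "energy": [0.5, 0.7, 0.9, 0.4],
        "water_usage": [0.3, 0.6, 0.7, 0.5],
        "recycling": [0.2, 0.5, 0.8, 0.3],
        "transportation": [0.6, 0.4, 0.7, 0.5],
    })
    # Simple composite EFB score: mean of the five area scores per household.
    households["efb_score"] = households[EFB_AREAS].mean(axis=1)
    # Tabulate the composite score against one socio-demographic characteristic.
    print(households.groupby("education")["efb_score"].mean())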
Procedia PDF Downloads 287
1871 Raising Forest Voices: A Cross-Country Comparative Study of Indigenous Peoples’ Engagement with Grassroots Climate Change Mitigation Projects in the Initial Pilot Phase of Community-Based Reducing Emissions from Deforestation and forest Degradation
Authors: Karl D. Humm
Abstract:
The United Nations’ Community-based REDD+ (Reducing Emissions from Deforestation and forest Degradation) (CBR+) is a programme that directly finances grassroots climate change mitigation strategies that uplift Indigenous Peoples (IPs) and other marginalised groups. A pilot in six countries was developed in response to criticism that the REDD+ programme excluded IPs from dialogues about climate change mitigation strategies affecting their lands and livelihoods. Despite the pilot's conclusion in 2017, no complete report has yet been produced on the results of CBR+. To fill this gap, this study investigated experiences of involving IPs in the CBR+ programmes and local projects across all six pilot countries. A literature review of official UN reports and academic articles identified challenges and successes of IP participation in REDD+, which became the basis for a framework guiding data collection. A mixed-methods approach was used to collect and analyse qualitative and quantitative data from CBR+ documents and written interviews with the CBR+ National Coordinators in each country for a cross-country comparative analysis. The study found that the most frequent challenges were a lack of organisational capacity, illegal forest activities, and historically rooted contentious relationships in IP and forest-dependent communities. Successful programmes included IPs and incorporated respect for and recognition of IPs as major stakeholders in sustainable forest management. The findings are summarized and shared together with a set of recommendations for improving future projects. Keywords: climate change, forests, indigenous peoples, REDD+
Procedia PDF Downloads 125