Search results for: multi-layers decision engine
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4747


2977 A Negotiation Model for Understanding the Role of International Law in Foreign Policy Crises

Authors: William Casto

Abstract:

Studies that consider the actual impact of international law upon foreign affairs crises are flawed by an unrealistic model of decision making. The common, unexamined assumption is that a nation has a unitary executive or ruler who weighs a wide variety of considerations, including international law, in attempting to resolve a crisis. To the extent that negotiation theory is considered, the focus is on negotiations between or among nations. The unsettling result is a shallow focus that concentrates on each country’s public posturing about international law. The country-to-country model ignores governments’ internal negotiations that lead to their formal position in a crisis. The model for foreign policy crises needs to be supplemented with a model of internal negotiations. Important foreign policy decisions come from groups within a government (committees, advisers, etc.). Within these groups, participants may have differing agendas and resort to international law to bolster their positions. To understand the influence of international law in international crises, these internal negotiations must be considered. They are crucial to creating a foreign policy agenda or recommendations. External negotiations between nations are significant, but the internal negotiations provide a better understanding of the actual influence of international law upon international crises. Discovering the details of specific internal negotiations is quite difficult but not necessarily impossible. The present proposal uses a specific crisis to illustrate the role of international law. In 1861, during the American Civil War, a United States navy captain stopped a British mail ship and removed two ambassadors of the rebelling southern states. The result was what is commonly called the Trent Affair. In the wake of the captain’s unauthorized and rash action, Great Britain seriously considered going to war against the United States.
A detailed analysis of the Trent Affair is possible using the extensive surviving internal British correspondence and memoranda to reach an understanding of the effect of international law upon decision making. This trove of internal British documents is particularly valuable because in 1861 the only effective means of communication was face-to-face or through letters: telephones did not exist, and travel by horse and carriage was tedious. The British documents tell us how individual participants viewed the process, so we can approach an accurate understanding of what actually happened as the British government strove to resolve the crisis. For example, British law officers initially concluded that the American captain’s rash act was permissible under international law; later, the law officers revised their opinion. A model of internal negotiation is particularly valuable because it strips away nations’ public posturing about disputed international law principles. In internal decision making, there is room for meaningful debate over the relevant principles, and this fluid debate shows how international law is used to develop a hard, public bargaining position. The Trent Affair indicates that international law had an actual influence upon the crisis and that law was not mere window dressing for the government’s public position.

Keywords: foreign affairs crises, negotiation, international law, Trent affair

Procedia PDF Downloads 127
2976 Role of Desire in Risk-Perception: A Case Study of Syrian Refugees’ Migration towards Europe

Authors: Lejla Sunagic

Abstract:

The aim of this manuscript is to further the understanding of risky decision-making in the context of forced and irregular migration. The empirical evidence was collected through interviews with Syrian refugees who arrived in Europe via irregular pathways. Analytically, it has been approached through the juxtaposition of risk perception and the notion of desire. Although different frameworks have been developed to address differences in risk perception, their common thread is that individual risk-taking is explained in terms of benefits outweighing risks. This framework, however, cannot explain the large risks individuals take because of an underprivileged position and a lack of positive alternatives, termed risk-taking from vulnerability. The accounts of the study participants who crossed the sea in rubber boats to arrive in Europe fit such a postulate empirically: they report that the risk they took was not a choice but the only coping strategy. However, the vulnerability argument falls short of explaining why the interviewees, thinking retrospectively, find the risky journey they took to be worth it, while they would strongly advise others to refrain from taking such a huge risk. This inconsistency has been addressed by adding the notion of desire to migrate to the elements of risk perception. Desire, as a subjective experience, was what made the risk appear smaller in the cost-benefit analysis at the time of decision-making of those who realized migration. However, when they reflect on others in the context of potential migration via the same pathway, the interviewees pointed to the others’ lack of capacity to avoid the obstacles that they themselves were able to circumvent, while omitting to reflect on others’ desire to migrate.
Thus, in the risk-benefit analysis performed for others, the risk remains unblurred and outweighs the benefits, given the inability to take the desire of others into account. If desire, as the transformative potential of migration, were taken out of the cost-benefit analysis of irregular migration, refugees might not have taken the risky journey. By casting the theoretical argument in the language of configuration, the study fills a gap in knowledge on how migration drivers combine, interact, and produce migration outcomes.

Keywords: refugees, risk perception, desire, irregular migration

Procedia PDF Downloads 96
2975 Developing Indicators in System Mapping Process Through Science-Based Visual Tools

Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski

Abstract:

The system mapping process can be defined as a knowledge service in which a team of facilitators, experts, and practitioners facilitates a guided conversation, enables the exchange of information, and supports an iterative curation process. System mapping processes rely on science-based tools to introduce and simplify a variety of components and concepts of socio-technical systems through metaphors while facilitating an interactive dialogue process to enable the design of co-created maps. System maps then work as “artifacts” that provide information and focus the conversation on specific areas around the defined challenge and the related decision-making process. Knowledge management facilitates the curation of the data gathered during the system mapping sessions through practices of documentation and subsequent knowledge co-production, for which common practices from data science are applied to identify new patterns, hidden insights, recurrent loops, and unexpected elements. This study presents empirical evidence on the application of these techniques to explore the mechanisms by which visual tools provide guiding principles to portray system components, key variables, and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements that allow the analysis of layers of information through affinity and clustering analysis and, therefore, the development of simple indicators to support the decision-making process. This paper addresses methodological and empirical elements of the horizontal learning process that integrates system mapping through visual tools, interpretation, cognitive transformation, and analysis. The process is designed to introduce practitioners to simple, iterative, and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.

Keywords: indicators, knowledge management, system mapping, visual tools

Procedia PDF Downloads 195
2974 Numerical Analysis of Flow in the Gap between a Simplified Tractor-Trailer Model and Cross Vortex Trap Device

Authors: Terrance Charles, Zhiyin Yang, Yiling Lu

Abstract:

Heavy trucks are aerodynamically inefficient due to their un-streamlined body shapes, with more than 60% of engine power required to overcome aerodynamic drag at 60 mph. Many aerodynamic drag reduction devices have been developed, and this paper presents a study of one such device, called the Cross Vortex Trap Device (CVTD), deployed in the gap between the tractor and the trailer of a simplified tractor-trailer model. Numerical simulations have been carried out at a Reynolds number of 0.51×10⁶, based on the inlet flow velocity and the height of the trailer, using the Reynolds-Averaged Navier-Stokes (RANS) approach. Three different configurations of the CVTD have been studied, ranging from a single slab to three slabs equally spaced on the front face of the trailer. The flow field around each of the three configurations has been analysed and is presented. The results show that a maximum of 12.25% drag reduction can be achieved when a triple vortex trap device is used. Detailed flow field analysis along with pressure contours is presented to elucidate the drag reduction mechanisms of the CVTD and why the triple vortex trap configuration produces the maximum drag reduction among the three configurations tested.
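As a rough back-of-the-envelope companion to the quantities quoted above, the relation Re = V·H/ν ties the inlet velocity to the trailer height. The sketch below is illustrative only: the air viscosity and model height are assumptions, not values from the paper.

```python
# Hedged sketch: simple relations behind the quantities quoted above.
# NU_AIR and H_TRAILER are illustrative assumptions, not paper values.
NU_AIR = 1.5e-5    # kinematic viscosity of air, m^2/s (assumed, ~20 deg C)
H_TRAILER = 1.0    # height of the simplified trailer model, m (assumed)

def inlet_velocity(reynolds, height, nu=NU_AIR):
    """Velocity scale implied by Re = V * H / nu."""
    return reynolds * nu / height

def drag_reduction_pct(cd_baseline, cd_with_device):
    """Percent reduction in drag coefficient relative to the baseline."""
    return 100.0 * (cd_baseline - cd_with_device) / cd_baseline

# Inlet velocity implied by the reported Re = 0.51e6 under these assumptions:
v_inlet = inlet_velocity(0.51e6, H_TRAILER)
```

Because the implied velocity scales inversely with the assumed height, fixing the Reynolds number by both inlet velocity and trailer height, as the paper does, pins down the flow regime regardless of model scale.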

Keywords: aerodynamic drag, cross vortex trap device, truck, Reynolds-Averaged Navier-Stokes, RANS

Procedia PDF Downloads 134
2973 Finite Element Analysis of Connecting Rod

Authors: Mohammed Mohsin Ali H., Mohamed Haneef

Abstract:

The connecting rod transmits the piston load to the crank, causing the latter to turn, thus converting the reciprocating motion of the piston into the rotary motion of the crankshaft. Connecting rods are subjected to forces generated by mass and fuel combustion. This study investigates and compares the fatigue behavior of forged steel, powder-forged, and cold-quenched ASTM A514 steel connecting rods. The objective is to suggest a new material with reduced weight and cost and increased fatigue life. This has entailed performing a detailed load analysis. Therefore, this study has dealt with two subjects: first, dynamic load and stress analysis of the connecting rod, and second, optimization for material, weight, and cost. In the first part of the study, the loads acting on the connecting rod as a function of time were obtained. Based on the observations of the dynamic FEA, static FEA, and the load analysis results, the load for the optimization study was selected. It is the conclusion of this study that the connecting rod can be designed and optimized under a load range comprising a tensile load and a compressive load. The tensile load corresponds to a 360° crank angle at the maximum engine speed, and the compressive load corresponds to the peak gas pressure. Furthermore, the existing connecting rod can be replaced with a new connecting rod made of cold-quenched ASTM A514 steel that is 12% lighter and 28% cheaper.

Keywords: connecting rod, ASTM A514 cold-quenched material, static analysis, fatigue analysis, stress life approach

Procedia PDF Downloads 300
2972 Information Technology: Assessing Indian Realities Vis-à-Vis World Trade Organisation Disciplines

Authors: Saloni Khanderia

Abstract:

The World Trade Organisation’s (WTO) Information Technology Agreement (ITA) was concluded at the Singapore Ministerial Conference in 1996. The ITA is considered to be one of the biggest tariff-cutting deals because it eliminates all customs-related duties on the exportation of specific categories of information technology products to the territory of any other signatory to the Agreement. Over time, innovations in the information and communication technology (ICT) sector mandated the consideration of expanding the list of products covered by the ITA, which took place in the form of ITA-II negotiations during the WTO’s Nairobi Ministerial Conference. India, which was an original Member of ITA-I, however, decided to opt out of the negotiations to expand the list of products covered by the agreement. Instead, it preferred to give priority to its national policy initiative, namely the ‘Make-in-India’ programme [the MiI programme], which embarks upon fostering domestic production in, inter alia, the ICT sector. India justified abstaining from the ITA-II negotiations by stating that the zero-tariff regime created by ITA-I debilitated its electronics-manufacturing sector and, on the contrary, resulted in an over-reliance on imported electronic inputs. The author undertakes doctrinal research to examine India’s decision to opt out of the ITA-II negotiations against the backdrop of the MiI Programme, which endeavours to improve productivity across the board. This paper accordingly scrutinises India’s tariff-cutting strategies to weigh the better alternative for the country. In particular, it examines whether initiatives like the MiI programme could plausibly resuscitate the ailing domestic electronics-manufacturing sector. The author opines that the country’s present decision to opt out of the ITA-II negotiations should be perceived as a welcome step.
Thus, market-oriented reforms such as the MiI Programme, which focuses on indigenous innovation to improve domestic manufacturing in the ICT sector, should instead gain priority in the present circumstances. Consequently, the MiI Programme would aid in moulding the country’s current tariff policy in a manner that concurrently assists the promotion and sustenance of domestic manufacturing in the IT sector.

Keywords: electronics-manufacturing sector, information technology agreement, make in india programme, world trade organisation

Procedia PDF Downloads 229
2971 Multi-Objective Optimization of Run-of-River Small-Hydropower Plants Considering Both Investment Cost and Annual Energy Generation

Authors: Amèdédjihundé H. J. Hounnou, Frédéric Dubas, François-Xavier Fifatin, Didier Chamagne, Antoine Vianou

Abstract:

This paper presents a techno-economic evaluation of run-of-river small-hydropower plants. In this regard, a multi-objective optimization procedure is proposed for the optimal sizing of the hydropower plants, and NSGA-II is employed as the optimization algorithm. The annual generated energy and the investment cost are considered as the objective functions, while the number of generator units (n) and the nominal turbine flow rate (QT) constitute the decision variables. The site of Yeripao in Benin is considered as the case study. The river at this site is characterized using its environmental characteristics: the gross head, and the first quartile, median, third quartile, and mean of its flow. The effects of each decision variable on the objective functions are analysed. The results give a Pareto front that represents the trade-off between annual energy generation and the investment cost of the hydropower plants, as well as the recommended optimal solutions. We note that as the annual energy generation increases, the investment cost rises; thus, maximizing energy generation conflicts with minimizing the investment cost. Moreover, the solutions on the Pareto front are grouped according to the number of generator units (n). The results also illustrate that the costs per kWh are grouped according to n and rise with increasing nominal turbine flow rate. The lowest investment costs per kWh are obtained for n equal to one and lie between 0.065 and 0.180 €/kWh. For each value of n (1, 2, 3, or 4), the investment cost and the investment cost per kWh increase almost linearly with increasing nominal turbine flow rate, while the annual generated energy increases logarithmically with it. This study, made for the Yeripao river, can be applied to other rivers with their own characteristics.
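The trade-off described above, where more annual energy always costs more investment, is exactly what a Pareto front captures. As a minimal sketch (a brute-force non-dominance filter rather than NSGA-II, with invented candidate values rather than the Yeripao data):

```python
# Hedged sketch: brute-force extraction of the Pareto front for two
# objectives: maximize annual energy, minimize investment cost.
# All candidate values below are invented for illustration.

def dominates(a, b):
    """True if design a = (energy, cost) is at least as good as b on both
    objectives (higher energy, lower cost) and strictly better on one."""
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

def pareto_front(designs):
    """Keep only non-dominated designs: the trade-off curve."""
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o != d)]

candidates = [
    (900, 1.2e6),   # small single-unit plant (invented numbers)
    (1400, 1.9e6),  # larger nominal turbine flow rate
    (1300, 2.1e6),  # dominated: less energy AND more cost than (1400, 1.9e6)
    (1800, 2.8e6),  # multi-unit configuration
]
front = pareto_front(candidates)
```

NSGA-II adds selection pressure and diversity preservation on top of exactly this non-dominance test, which is why its output groups into trade-off curves like the ones reported.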

Keywords: hydropower plant, investment cost, multi-objective optimization, number of generator units

Procedia PDF Downloads 157
2970 Implications about the Impact of COVID-19 on Business

Authors: Anwar Kashgari

Abstract:

COVID-19 has had severe impacts on business all over the world. The great lockdown requires business owners to deal wisely with this pandemic. This paper seeks to support business leaders with a standpoint on the COVID-19 situation and provides implications for small and medium enterprises (SMEs) and companies. The paper reflects the author's view of the impact of COVID-19 on business activities. We discuss the impact of COVID-19 on three aspects, namely startups, SMEs, and e-commerce. The Kingdom of Saudi Arabia (KSA) is taken as an example of a developing country, for which we present the current situation. Finally, recommendations to policy- and decision-makers are given.

Keywords: COVID-19, business networking, globalization

Procedia PDF Downloads 215
2969 Dynamic Modeling of Energy Systems Adapted to Low Energy Buildings in Lebanon

Authors: Nadine Yehya, Chantal Maatouk

Abstract:

Low energy buildings have been developed to achieve global climate commitments to reduce energy consumption. They comprise energy-efficient buildings, zero energy buildings, positive energy buildings, and passive house buildings. The reduced energy demands of low energy buildings call for advanced building energy modeling that focuses on studying active building systems such as heating, cooling, and ventilation, on improving systems' performance, and on developing control systems. Modeling and building simulation have expanded to cover different modeling approaches, i.e., detailed physical models, dynamic empirical models, and hybrid approaches, which are adopted by various simulation tools. This paper uses DesignBuilder with the EnergyPlus simulation engine in order to, first, study the impact of efficiency measures on building energy behavior by comparing a low energy residential model to a conventional one in Beirut, Lebanon; second, choose the appropriate energy systems for the studied case, which is characterized by an important cooling demand; and third, study dynamic modeling of the Variable Refrigerant Flow (VRF) system in EnergyPlus, chosen for its advantages over other systems and its availability in the Lebanese market. Finally, simulating different energy system models with different modeling approaches is necessary to compare those approaches and to investigate the interaction between energy systems and the building envelope that affects the total energy consumption of low energy buildings.

Keywords: physical model, variable refrigerant flow heat pump, dynamic modeling, EnergyPlus, modeling approach

Procedia PDF Downloads 221
2968 Intergenerational Technology Learning in the Family

Authors: Chih-Chun Wu

Abstract:

Learning information and communication technologies (ICT) helps people survive in current society. For the internet generation, also referred to as digital natives, learning new technology is like breathing; however, for the elder generations, also called digital immigrants, including parents and grandparents, learning new technology can be challenging and frustrating. While the majority of research has focused on the effects of elders' ICT learning, less attention has been paid to the help that elders get from other family members while learning ICT. This study utilized an anonymous questionnaire to survey 3,749 undergraduates and demonstrated that families are great places for intergenerational technology learning. Results from this study confirmed that, in the family, the younger generation both helped set up technology products and taught the elder members the technology knowledge and skills they needed. Family elder members in this study were defined as relatives living under the same roof. Results revealed that 2,331 (62.2%) undergraduates helped their family elder members set up LINE, and 2,656 (70.8%) taught them how to use it. In addition, 1,481 (49.1%) undergraduates helped their family elder members with setup, and 2,222 (59.3%) taught them. When it came to apps, 2,527 (67.4%) helped their family elder members download them, and 2,876 (76.7%) taught them how to use them. As for search engines, 2,317 (61.8%) undergraduates taught their family elders. Furthermore, 3,118 (83.2%), 2,639 (70.4%), and 2,004 (53.7%) undergraduates reported that they taught their family elder members to use smartphones, computers, and tablets, respectively. Meanwhile, only 904 (24.2%) undergraduates taught their family elders how to make a doctor's appointment online.
This study suggests making good use of intergenerational technology learning in the family, since it increases family elders' technology capital and thus strengthens the country's human capital and competitiveness.

Keywords: intergenerational technology learning, adult technology learning, family technology learning, ICT learning

Procedia PDF Downloads 235
2967 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model

Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula

Abstract:

In the verification of high-volume, complex packet-processing IPs, finer control of flow management aspects (for example, rate in bits/sec) per flow class (or virtual channel, or software thread) is needed. When software/Universal Verification Methodology (UVM) thread arbitration is left to the simulator (e.g., Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), the resulting distribution of bandwidth across threads is hard to predict. In many cases, the patterns desired in a test scenario may not be accomplished, as the simulator might give a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with its UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs visible to the UVM sequence/test to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli that further harness the Design Under Test (DUT) with inputs that are asymmetric relative to the bandwidth/Quality of Service (QoS) distributions programmed in the DUT.
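Outside the UVM environment, the effect of such per-flow rate knobs can be illustrated with a deterministic weighted arbiter: each flow accrues credit in proportion to its configured share, and the flow furthest ahead wins each grant. This is a minimal sketch of the idea only; the flow names and weights below are invented and do not reflect the actual Flow Manager VIP interface.

```python
# Hedged sketch: smooth weighted round-robin, a deterministic stand-in for
# the per-flow rate knobs described above. Flow names/weights are invented.

def weighted_schedule(weights, n_grants):
    """Distribute n_grants grants across flows in proportion to weights.
    Each round, every flow accrues credit equal to its share; the flow
    with the most credit wins the grant and pays one grant of credit."""
    total = sum(weights.values())
    credit = {flow: 0.0 for flow in weights}
    grants = {flow: 0 for flow in weights}
    order = []
    for _ in range(n_grants):
        for flow in weights:
            credit[flow] += weights[flow] / total
        winner = max(credit, key=credit.get)
        credit[winner] -= 1.0
        grants[winner] += 1
        order.append(winner)
    return grants, order

# e.g., a 5:3:2 bandwidth split across three virtual channels
grants, order = weighted_schedule({"vc0": 5, "vc1": 3, "vc2": 2}, 10)
```

Unlike simulator-internal thread arbitration, this schedule is reproducible, which is what makes deadlock- and starvation-oriented traffic patterns reachable on demand.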

Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service

Procedia PDF Downloads 99
2966 Manufacturing Facility Location Selection: A Numerical Taxonomy Approach

Authors: Seifoddini Hamid, Mardikoraeem Mahsa, Ghorayshi Roya

Abstract:

Manufacturing facility location selection is an important strategic decision for many industrial corporations. In this paper, a new approach to the manufacturing facility location selection problem is proposed. In this approach, cluster analysis is employed to identify suitable manufacturing locations based on economic, social, environmental, and political factors. These factors are quantified using existing real-world data.
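The cluster-analysis step can be sketched with a tiny k-means pass over quantified factor scores. This is an illustration only: a real study would use a vetted implementation (e.g., scikit-learn) and genuine data, whereas the site scores below are invented.

```python
import math

# Hedged sketch: tiny k-means grouping of candidate sites by quantified
# factor scores (e.g., normalized economic/social/environmental/political
# indices). Site scores below are invented illustrations.

def kmeans(points, k, iters=50):
    """Return a cluster label per point. Centroids are seeded from the
    first k points for determinism, which is fine for a sketch."""
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by Euclidean distance
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: math.dist(p, centroids[c]))
        # update step: move each centroid to the mean of its members
        for c in range(k):
            members = [p for i, p in enumerate(points) if labels[i] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Four candidate sites, two factor scores each (invented data):
sites = [(0.9, 0.8), (0.85, 0.9), (0.2, 0.1), (0.15, 0.2)]
labels = kmeans(sites, 2)
```

Sites landing in the same cluster share a similar factor profile, which is how clustering narrows a long candidate list down to a few location families for closer evaluation.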

Keywords: manufacturing facility, manufacturing sites, real world data

Procedia PDF Downloads 563
2965 Unveiling the Nexus: A Holistic Investigation on the Role of Cultural Beliefs and Family Dynamics in Shaping Maternal Health in Primigravida Women

Authors: Anum Obaid, Bushra Noor, Zoshia Zainab

Abstract:

Among South Asian countries, Pakistan faces significant public health challenges regarding maternal and neonatal health (MNH). Despite global efforts to improve maternal, newborn, and child health (MNCH) outcomes through initiatives like the Millennium Development Goals (MDGs) and Sustainable Development Goals (SDGs), high maternal and neonatal mortality rates persist. In patriarchal societies, cultural norms, family dynamics, and gender roles heavily influence healthcare accessibility and decision-making processes, often leading to delayed and inadequate maternal care. Addressing these socio-cultural barriers and enhancing healthcare resources is crucial to improving maternal health outcomes in areas like Faisalabad. A qualitative study was conducted involving two groups of informants: gynecologists practicing in private clinics and first-time pregnant women receiving care in government hospitals. Data collection included obtaining institutional permission, conducting semi-structured in-depth interviews, and using non-probability sampling techniques. A proactive strategy to overcome maternal health challenges involves using aversion therapy and disseminating knowledge among family members. This approach aims to foster a deep understanding within the family unit of the importance of maternal well-being, thereby creating a supportive environment and facilitating informed decision-making about healthcare access and lifestyle choices. The findings indicate that maternal health is compromised both physiologically and psychologically, with significant implications for the baby's health. Mental well-being is profoundly affected, largely due to familial behavior and entrenched cultural taboos.

Keywords: maternal health, neonatal health, socio-cultural norms, primigravida women, gynecologist, familial conduct, cultural taboos

Procedia PDF Downloads 40
2964 Impact of Ozone Produced by Vehicular Emission on Chronic Obstructive Pulmonary Disease

Authors: Mohd Kamil Vakil

Abstract:

Air pollution is caused by the introduction of chemicals into the biosphere. Primary pollutants, on reaction with components of the atmosphere, produce secondary pollutants like smog. Ozone is the main ingredient of smog. Ground-level ozone is created by chemical reactions between nitrogen oxides (NOx) and volatile organic compounds (VOCs) in the presence of sunlight. This ozone can enter buildings, where it is referred to as indoor ozone. Automobile emissions in both moving and idling conditions contribute to indoor ozone formation. During engine ignition and shutdown, motor vehicles emit ozone-forming pollutants like NOx and VOCs; these phenomena are called cold start and hot soak, respectively. Subjects with chronic respiratory diseases such as chronic obstructive pulmonary disease (COPD) and asthma are susceptible to the harmful effects of indoor ozone. The most common cause of COPD other than smoking is long-term contact with harmful pollutants like ground-level ozone. The WHO estimates that COPD will become the third leading cause of death worldwide by 2030. In this paper, cold-start and hot-soak vehicle emissions are studied in the context of the accumulation of oxides of nitrogen at the outer walls of buildings, which may cause COPD. Titanium-oxide-coated building material is further discussed as an absorber of NOx when applied to walls and roofs.

Keywords: indoor air quality, cold start emission, hot-soak, ozone

Procedia PDF Downloads 204
2963 Human-Automation Interaction in Law: Mapping Legal Decisions and Judgments, Cognitive Processes, and Automation Levels

Authors: Dovile Petkeviciute-Barysiene

Abstract:

Legal technologies not only create new ways of accessing and providing legal services but also transform the role of legal practitioners. Both lawyers and users of legal services expect automated solutions to outperform people in objectivity and impartiality. Although the fairness of automated decisions is crucial, research on assessing the characteristics of automated processes related to perceived fairness has only begun. One of the major obstacles to this research is the lack of a comprehensive understanding of which legal actions are automated or could be meaningfully automated, and to what extent. Oftentimes, neither the public nor legal practitioners can envision technological input without illustrative examples. The aim of this study is to map the decision-making stages and automation levels which are and/or could be achieved in legal actions related to pre-trial and trial processes. Major legal decisions and judgments were identified during consultations with legal practitioners. The dual-process model of information processing is used to describe the cognitive processes taking place while making legal decisions and judgments during pre-trial and trial actions. Some existing legal technologies are incorporated into the analysis as well. Several published automation level taxonomies are considered, because none of them fits well into the legal context, as they were all created for avionics, teleoperation, unmanned aerial vehicles, etc. From the information processing perspective, analysis of legal decisions and judgments exposes the situations that are most sensitive to cognitive bias and, among other things, helps to identify the areas that would benefit most from automation. Automation level analysis, in turn, provides a systematic approach to interaction and cooperation between humans and algorithms.
Moreover, an integrated map of legal decisions and judgments, information processing characteristics, and automation levels together provides groundwork for research on the perceived fairness and acceptance of legal technology. Acknowledgment: This project has received funding from the European Social Fund (project No 09.3.3-LMT-K-712-19-0116) under a grant agreement with the Research Council of Lithuania (LMTLT).

Keywords: automation levels, information processing, legal judgment and decision making, legal technology

Procedia PDF Downloads 142
2962 Determining the Policy Space of the Partido Socialista Obrero Español Government in Managing Spain's Economic and Financial Crisis

Authors: A. Pascual Ramsay

Abstract:

Accounts of the management of the economic and euro crisis in Spain have been dominated by an emphasis on external constraints. However, this approach leaves unanswered important questions about the role of domestic political factors. Using systematic qualitative primary research and employing elite interviewing and process tracing, this paper aims to fill this gap for the period of the Partido Socialista Obrero Español (PSOE) administration. The paper shows that domestic politics played a crucial role in the management of the crisis, most importantly by determining the shape of the measures undertaken. In its three distinct stages – downplaying/inaction, reaction/stimulus, and austerity/reform – the PSOE's response was certainly constrained by external factors, most notably EMU membership and the actions of sovereign-bond investors, the ECB and Germany. Yet while these external constraints forced the government to act, domestic political factors fundamentally shaped the content of key measures: the fiscal stimulus, the labour, financial and pension reforms, the refusal to accept a bailout or the reform of the Constitution. Seven factors were particularly influential: i) electoral and political cost, ii) party and partisanship, iii) organised interests, iv) domestic institutions, v) ideological preferences, vi) ineffective decision-making, and vii) judgement and personal characteristics of decision-makers. In conclusion, domestic politics played an important role in the management of the crisis, a role that has been underestimated by dominant approaches focusing on external constraints and weak domestic policy autonomy. The findings provide empirical evidence to support research agendas that identify significant state discretion in the face of international economic integration and an important role for domestic political factors such as institutions, material interests, partisanship and ideology in shaping economic outcomes.

Keywords: economic crisis, Euro, PSOE, Spain

Procedia PDF Downloads 120
2961 Professional Skills Development of Educational Leaders Through Drama in Education: An Example of Best Practice in Greece

Authors: Christina Zourna, Ioanna Papavassiliou-Alexiou

Abstract:

Drama in Education (DiE) is a dynamic experiential method that can be used in many interdisciplinary contexts. In the Educational and Social Policy Department, University of Macedonia, Thessaloniki, Greece, DiE is used as a core method for developing professional competences in undergraduate and postgraduate courses as well as adult education training programs. This presentation describes an innovative DiE application concerning the development of the skills educational leaders need to meet the unprecedented, unexpected challenges of 21st-century schools. In a non-threatening, risk-taking, no-penalty environment, future educational leaders experience in role the problems, challenges, and dilemmas they will later face in their profession. Through personal involvement, emotional engagement, and reflection, in individual and group activities, they experience the behaviour, dilemmas, decision-making processes, and informed choices of a recognized leader and are able to make connections with their own lives. The pretext is the life of Alexander the Great, the Macedonian king who defeated the vast Persian Empire in the 4th century BC and, by uniting all Greeks, conquered the then-known eastern world thanks to his authentic leadership skills and exceptional personality traits. Educated from an early age by the famous Greek philosopher Aristotle, Alexander proved his unique qualities, providing the world with the example of an undeniably genuine, inspirational, effective, and most recognizable authentic leader.
Through questionnaires and individual interviews, participants in these workshops reported how they developed active listening, empathy, creativity, imagination, critical, strategic, and out-of-the-box thinking, cooperation, communication of their own vision, crisis management skills, self-efficacy, self-awareness, self-exposure, information management, negotiation and inspiration skills, an enhanced sense of responsibility and commitment, and decision-making skills.

Keywords: drama in education method, educational leadership, professional competences, skills’ development

Procedia PDF Downloads 156
2960 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as selecting game lineups and strategies based on analysis of accumulated data, has been widely attempted. In the NBA, where the world's highest-level players gather, teams analyze game data using various statistical techniques in order to win. However, it is difficult to analyze play-by-play data such as ball tracking or player motion, because the situation of a game changes rapidly and the structure of the data is complicated. An analysis method suited to real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task that is difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions for substitution are whether the lineup should be changed at all and whether a Small Ball lineup should be adopted; we therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data accumulated for each play indicates a player's contribution to the game and can be treated as a time series. To compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time-series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model capable of identifying the optimal lineup for a given situation. We collected NBA play data from the 2019-2020 season and applied the method to actual game data to verify the reliability of the proposed model.
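The RNN-plus-NN combination can be illustrated as a single forward pass: a recurrent cell encodes the recent play-by-play scoring sequence, a feedforward layer encodes the on-court situation, and the two encodings are fused into a score prediction. A minimal sketch, with randomly initialised weights standing in for trained parameters and made-up dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 8 recent plays, 5 situation features, 16 hidden units.
T, d_seq, d_sit, d_h = 8, 1, 5, 16

# Untrained stand-in weights (a real model would learn these).
W_xh = rng.normal(0, 0.1, (d_h, d_seq))   # play input -> hidden
W_hh = rng.normal(0, 0.1, (d_h, d_h))     # hidden -> hidden (recurrence)
W_sh = rng.normal(0, 0.1, (d_h, d_sit))   # situation features -> hidden
w_out = rng.normal(0, 0.1, d_h)           # fused hidden -> predicted score

def predict_score(play_scores, situation):
    """RNN over the scoring time series, then fuse with situation features."""
    h = np.zeros(d_h)
    for x_t in play_scores:               # recurrent encoding of the sequence
        h = np.tanh(W_xh @ np.atleast_1d(float(x_t)) + W_hh @ h)
    s = np.tanh(W_sh @ situation)         # feedforward encoding of the situation
    return float(w_out @ (h + s))         # fused score prediction

plays = [2, 0, 3, 2, 0, 0, 2, 3]          # points scored on recent plays
situation = np.array([1.0, 0.0, 0.5, 0.2, 0.8])
print(predict_score(plays, situation))
```

In practice the prediction would be computed for each candidate lineup (e.g. Small Ball vs the current five) and the lineup with the better predicted score retained.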

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 133
2959 A Review of Benefit-Risk Assessment over the Product Lifecycle

Authors: M. Miljkovic, A. Urakpo, M. Simic-Koumoutsaris

Abstract:

Benefit-risk assessment (BRA) is a valuable tool that takes place at multiple stages during a medicine's lifecycle, and this assessment can be conducted in a variety of ways. The aim was to summarize current BRA methods used during approval decisions and in post-approval settings and to identify possible future directions. Relevant reviews, recommendations, and guidelines published in the medical literature and through regulatory agencies over the past five years were examined. BRA involves the review of two dimensions: the dimension of benefits (determined mainly by therapeutic efficacy) and the dimension of risks (comprising the safety profile of a drug). Regulators, industry, and academia have developed various approaches, ranging from descriptive textual (qualitative) to decision-analytic (quantitative) models, to facilitate the BRA of medicines during the product lifecycle (from Phase I trials to the authorization procedure, post-marketing surveillance, and health technology assessment for inclusion in public formularies). These approaches can be classified into the following categories: stepwise structured approaches (frameworks); measures for benefits and risks that are usually endpoint specific (metrics); simulation techniques and meta-analysis (estimation techniques); and utility survey techniques to elicit stakeholders’ preferences (utilities). All these approaches share two common goals, to assist the analysis and to improve the communication of decisions, but each is subject to its own specific strengths and limitations. Before using any method, its utility, complexity, the extent to which it is established, and the ease of interpreting its results should be considered. Despite widespread and long-time use, BRA is subject to debate, suffers from a number of limitations, and is still under development.
The use of formal, systematic structured approaches to BRA for regulatory decision-making and quantitative methods to support BRA during the product lifecycle is a standard practice in medicine that is subject to continuous improvement and modernization, not only in methodology but also in cooperation between organizations.
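Among the endpoint-specific metrics referred to above, the number needed to treat (NNT) and number needed to harm (NNH) are two of the simplest quantitative benefit-risk measures. A minimal sketch, using made-up trial rates rather than figures from any study reviewed here:

```python
def nnt(control_event_rate, treated_event_rate):
    """Number needed to treat: 1 / absolute risk reduction."""
    return 1.0 / (control_event_rate - treated_event_rate)

def nnh(treated_harm_rate, control_harm_rate):
    """Number needed to harm: 1 / absolute risk increase."""
    return 1.0 / (treated_harm_rate - control_harm_rate)

# Hypothetical trial: treatment cuts the event rate from 20% to 12%,
# but raises the adverse-event rate from 2% to 6%.
print(nnt(0.20, 0.12))  # ~12.5: treat ~13 patients to prevent one event
print(nnh(0.06, 0.02))  # ~25: one extra adverse event per ~25 treated
# NNH > NNT here, a crude indication that benefit outweighs harm.
```

Full quantitative BRA methods (e.g. multi-criteria decision analysis) weigh several such endpoints at once, but the same benefit-versus-risk comparison is at their core.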

Keywords: benefit-risk assessment, benefit-risk profile, product lifecycle, quantitative methods, structured approaches

Procedia PDF Downloads 154
2958 A Non-Parametric Analysis of District Disaster Management Authorities in Punjab, Pakistan

Authors: Zahid Hussain

Abstract:

The Provincial Disaster Management Authority (PDMA) Punjab was established under the NDM Act 2010 and now works under the Senior Member Board of Revenue, dealing with the whole spectrum of disasters, including preparedness, mitigation, early warning, response, relief, rescue, recovery, and rehabilitation. The District Disaster Management Authorities (DDMAs) act as implementing arms of PDMA in the districts to respond to any disaster. The DDMAs' role in disaster mitigation, response, and recovery is very important, as they are the first responders and the tier closest to the community. In view of this significant role, the technical and human resource capacity of the DDMAs needs to be checked. To calculate the technical efficiency of the DDMAs in Punjab, three inputs (number of labourers, number of transport vehicles, and number of equipment items), two outputs (relief assistance and number of rescues), and 25 districts as decision-making units were selected. For this purpose, eight years of secondary data, from 2005 to 2012, were used, and the Data Envelopment Analysis (DEA) technique was applied. DEA estimates the relative efficiency of peer entities performing similar tasks. The findings show that all decision-making units (DMUs, i.e., districts) are inefficient on the technological and scale efficiency scales, while technically efficient on the pure and total factor productivity scales. All DMUs were found technically inefficient only in the year 2006. Labour and equipment were not used efficiently in 2005, 2007, 2008, 2009, and 2012, and only in 2006, 2010, and 2011 did districts fail to use transportation efficiently in a disaster situation. This study suggests that districts should curtail their labour, transportation, and equipment inputs to become efficient; similarly, the output targets for the number of rescues and relief assistance should be adjusted downward accordingly.
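In the special case of one input and one output, the constant-returns (input-oriented) DEA efficiency score reduces to each unit's productivity ratio divided by the best ratio observed; the full multi-input, multi-output model used in the study requires solving a linear program per DMU. A minimal single-input, single-output sketch with hypothetical district figures:

```python
# Hypothetical data: one input (labourers) and one output (rescues) per district.
labour  = [120, 80, 200, 150]
rescues = [60, 48, 90, 90]

# With one input and one output, the CRS DEA efficiency of each unit is its
# output/input ratio scaled by the best ratio observed (the efficient frontier).
ratios = [y / x for x, y in zip(labour, rescues)]
best = max(ratios)
efficiency = [r / best for r in ratios]

for district, e in zip("ABCD", efficiency):
    print(district, round(e, 3))
# Units scoring 1.0 define the frontier; the rest could, in principle,
# produce the same output with proportionally less input.
```

A score of, say, 0.75 means the district could serve the same number of rescues with 75% of its current labour if it operated like the best peer.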

Keywords: DEA, DMU, PDMA, DDMA

Procedia PDF Downloads 246
2957 Analytic Hierarchy Process

Authors: Hadia Rafi

Abstract:

Any decision in any work, task, or project involves many factors that need to be considered. The Analytic Hierarchy Process (AHP) derives the required results from the judgments of experts: the technique measures intangibles and, with the help of judgment and software analysis, makes pairwise comparisons that show how strongly one element leads another. AHP also specifies how an inconsistent judgment can be made consistent and how judgments can be improved where possible. Priority scales are obtained by multiplying local priorities by the priority of their parent node and then adding the results.
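The core AHP computation can be sketched in a few lines: derive local priorities from a pairwise comparison matrix (here via the row geometric mean) and check consistency with Saaty's consistency ratio. The 3x3 matrix below is hypothetical:

```python
import math

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale:
# entry A[i][j] says how strongly criterion i leads criterion j.
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
n = len(A)

# Local priorities via the row geometric mean, normalised to sum to 1.
gm = [math.prod(row) ** (1 / n) for row in A]
w = [g / sum(gm) for g in gm]

# Consistency check: estimate lambda_max from A.w, then CI and CR.
# RI = 0.58 is Saaty's random consistency index for n = 3.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / w[i] for i in range(n)) / n
CI = (lam - n) / (n - 1)
CR = CI / 0.58

print([round(x, 3) for x in w])  # priority scale for this node
print(round(CR, 3))              # CR < 0.1: judgments acceptably consistent
```

In a full hierarchy, each child's local priority vector is multiplied by its parent's priority and the results are summed, exactly as the abstract describes.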

Keywords: AHP, priority scales, parent node, software analysis

Procedia PDF Downloads 406
2956 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk is very helpful for reducing the effects of the disease. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011, and were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, and positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential for facilitating the management of patients with a specific disease, so health interventions or lifestyle changes based on these models can improve the health of individuals at risk.
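All of the evaluation measures listed above derive from a model's confusion matrix; a minimal sketch (the counts below are made up for illustration, not taken from the study):

```python
def evaluation_metrics(tp, fp, fn, tn):
    """Standard classification measures from confusion-matrix counts."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "precision":   tp / (tp + fp),   # = positive predictive value
        "sensitivity": tp / (tp + fn),   # recall: share of true cases caught
        "specificity": tn / (tn + fp),   # share of non-cases correctly ruled out
        "npv":         tn / (tn + fn),   # negative predictive value
    }

# Hypothetical test-set results for a myocardial-infarction classifier.
m = evaluation_metrics(tp=80, fp=10, fn=20, tn=240)
for name, value in m.items():
    print(f"{name}: {value:.3f}")
```

A model chosen for diagnosis, as in this study, is typically the one maximising sensitivity, since missing a true infarction case (a false negative) is the costliest error.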

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 429
2955 Automation of AAA Game Development Using AI and Procedural Generation

Authors: Paul Toprac, Branden Heng, Harsheni Siddharthan, Allison Tseng, Sarah Abraham, Etienne Vouga

Abstract:

The goal of this project was to evaluate and document the capabilities and limitations of AI tools for empowering small teams to create high-budget, high-profile (AAA) 3D games typically developed by large studios. Two teams of novice game developers attempted to create two different games using AI and Unreal Engine 5.3. First, the teams evaluated 60 AI art, design, sound, and programming tools by considering their capability, ease of use, cost, and license restrictions. Then, the teams used a shortlist of 13 AI tools for game development. During this process, the following tools were found to be the most productive: (1) ChatGPT 4.0 for both game and narrative concepting and documentation; (2) Dall-E 3 and OpenArt for concept art; (3) Beatoven for music drafting; (4) Epic PCG for level design; and (5) ChatGPT 4.0 and GitHub Copilot for generating simple code and complementing human-made tutorials as an additional learning resource. While current generative AI may appear impressive at first glance, the assets it produces fall short of AAA industry standards. Generative AI tools are helpful when brainstorming ideas such as concept art and basic storylines, but they still cannot replace human input or creativity at this time. Regarding programming, AI can only effectively generate simple code and act as an additional learning resource. Thus, generative AI tools are at best tools to enhance developer productivity rather than a system to replace developers.

Keywords: AAA games, AI, automation tools, game development

Procedia PDF Downloads 23
2954 Powering Connections: Synergizing Sales and Marketing for Electronics Engineering with Web Development

Authors: Muhammad Awais Kiani, Abdul Basit Kiani, Maryam Kiani

Abstract:

"Synergizing Sales and Marketing for Electronics Engineering with Web Development" explores the dynamic relationship between sales, marketing, and web development within the electronics engineering industry. The study matters because digital platforms give companies the power to connect with customers, which increases brand visibility and drives sales. It highlights the need for collaboration between sales and marketing teams, as well as the integration of web development strategies to create seamless user experiences and effective lead generation. It also emphasizes the role of data analytics and customer insights in optimizing sales and marketing efforts in the ever-evolving landscape of electronics engineering. Sales and marketing play a crucial role in driving business growth, and in today's digital landscape, web development has become an integral part of these strategies. Web development enables businesses to create visually appealing and user-friendly websites that effectively showcase their products or services, and it allows for the integration of e-commerce functionalities, enabling seamless online transactions. Furthermore, web development helps businesses optimize their online presence through search engine optimization (SEO) techniques, social media integration, and content management systems. This abstract highlights the symbiotic relationship between sales and marketing in the electronics industry and web development, emphasizing the importance of a strong online presence in achieving business success.

Keywords: electronics industry, web development, sales, marketing

Procedia PDF Downloads 116
2953 Systematic and Meta-Analysis of Navigation in Oral and Maxillofacial Trauma and Impact of Machine Learning and AI in Management

Authors: Shohreh Ghasemi

Abstract:

Introduction: Managing oral and maxillofacial trauma is a multifaceted challenge, as it can have life-threatening consequences and significant functional and aesthetic impact. Navigation techniques have been introduced to improve surgical precision to meet this challenge, and a machine learning algorithm has been developed to support clinical decision-making in the treatment of oral and maxillofacial trauma. Given these advances, this systematic meta-analysis aims to assess the efficacy of navigation techniques in treating oral and maxillofacial trauma and to explore the impact of machine learning on their management. Methods: A detailed and comprehensive analysis of studies published between January 2010 and September 2021 was conducted through a systematic meta-analysis. This included a thorough search of the Web of Science, Embase, and PubMed databases to identify studies evaluating the efficacy of navigation techniques and the impact of machine learning in managing oral and maxillofacial trauma. Studies that did not meet the established entry criteria were excluded, and the overall quality of the included studies was evaluated using the Cochrane risk-of-bias tool and the Newcastle-Ottawa scale. Results: A total of 12 studies, including 869 patients with oral and maxillofacial trauma, met the inclusion criteria. The analysis revealed that navigation techniques effectively improve surgical accuracy and minimize the risk of complications. Additionally, machine learning algorithms have proven effective in predicting treatment outcomes and identifying patients at high risk of complications. Conclusion: The introduction of navigation technology has great potential to improve surgical precision in oral and maxillofacial trauma treatment, and the development of machine learning algorithms offers opportunities to improve clinical decision-making and patient outcomes. Still, further studies are necessary to corroborate these results and to establish the optimal use of these technologies in managing oral and maxillofacial trauma.
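A meta-analysis of this kind typically pools study-level effect estimates by inverse-variance weighting; a minimal fixed-effect sketch (the effect sizes and variances below are made up, not taken from the 12 included studies):

```python
import math

# Hypothetical per-study effect estimates (e.g. log odds ratios) and variances.
effects   = [0.40, 0.25, 0.55, 0.30]
variances = [0.04, 0.09, 0.06, 0.05]

# Fixed-effect (inverse-variance) pooling: precise studies get more weight.
weights = [1 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se = math.sqrt(1 / sum(weights))

# 95% confidence interval for the pooled effect.
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled effect {pooled:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

A random-effects model adds a between-study variance term to the weights; which model is appropriate depends on the heterogeneity observed across studies.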

Keywords: trauma, machine learning, navigation, maxillofacial, management

Procedia PDF Downloads 58
2952 Effects of Inlet Distorted Flows on the Performance of an Axial Compressor

Authors: Asad Islam, Khalid Parvez

Abstract:

Compressor fans in modern aircraft engines are of considerable importance, as they provide the majority of the thrust required by the aircraft. Their challenging environment frequently subjects them to non-uniform inflow conditions, which may arise from flight operating requirements such as take-off and landing, wake interference from the aircraft fuselage, or cross-flow wind conditions; the highly manoeuvrable flight regimes of fighter aircraft therefore affect overall engine performance. Because the flow in an aircraft compressor is highly sensitive to the adverse pressure gradient produced by different flow orientations, it is prone to unstable operation. This paper presents a study of axial compressor response to inlet flow orientations over the range of 0 to 15 degrees. For this purpose, NASA Rotor 37 was taken as the test case and a CFD mesh was developed. The compressor characteristic map was generated for the design conditions of a pressure ratio of 2.106 at a rotational speed of 17,188.7 rpm using the CFD environment of ANSYS-CFX®. A grid study was conducted to assess the effect of the mesh on the computational solution, and the mesh giving the best results, when validated against the available NASA experimental data, was used for the distortion analysis. The flow at the inlet nozzle was given angle orientations ranging from 0 to 15 degrees, and the CFD results are analyzed and discussed with respect to stall margin and flow separation due to the induced distortions.
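The stall margin compared across distortion cases is commonly computed from the pressure ratio and mass flow at the stall point and the operating point; one widely used definition (of several in the literature) is sketched below with entirely hypothetical values, not results from this study:

```python
def stall_margin(pr_stall, m_stall, pr_op, m_op):
    """Stall margin (%) from pressure ratios and mass flows at the stall
    point and the operating point, at constant rotational speed."""
    return ((pr_stall / pr_op) * (m_op / m_stall) - 1.0) * 100.0

# Hypothetical comparison: clean inlet vs a 15-degree distorted inlet.
clean     = stall_margin(pr_stall=2.20, m_stall=19.8, pr_op=2.106, m_op=20.9)
distorted = stall_margin(pr_stall=2.15, m_stall=19.9, pr_op=2.106, m_op=20.9)
print(round(clean, 2), round(distorted, 2))
# Inlet distortion moves the stall line toward the operating point,
# shrinking the available stall margin.
```

Plotting this margin against inlet flow angle is one way to summarise the distortion sweep described in the abstract.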

Keywords: axial compressor, distortions, angle, CFD, ANSYS-CFX®, BladeGen®

Procedia PDF Downloads 456
2951 Machine Learning Approach for Predicting Students’ Academic Performance and Study Strategies Based on Their Motivation

Authors: Fidelia A. Orji, Julita Vassileva

Abstract:

This research aims to develop machine learning models for predicting students' academic performance and study strategies, which could be generalized to all courses in higher education. The key learning attributes (intrinsic, extrinsic, autonomy, relatedness, competence, and self-esteem) used in building the models were chosen based on prior studies, which revealed that these attributes are essential in students’ learning process. Previous studies revealed the individual effects of each of these attributes on students’ learning progress; however, few studies have investigated their combined effect in predicting student study strategy and academic performance with a view to reducing the dropout rate. To bridge this gap, we used Scikit-learn in Python to build five machine learning models (Decision Tree, K-Nearest Neighbours, Random Forest, Linear/Logistic Regression, and Support Vector Machine) for both regression and classification tasks. The models were trained, evaluated, and tested for accuracy using data on 924 university dentistry students collected by Chilean authors through a quantitative research design. A comparative analysis revealed that the tree-based models, such as the random forest (with a prediction accuracy of 94.9%) and the decision tree, give the best results compared to the linear, support vector, and k-nearest neighbour models. The models built in this research can be used to predict student performance and study strategy so that appropriate interventions can be implemented to improve student learning progress. Thus, incorporating strategies that improve diverse student learning attributes in the design of online educational systems may increase the likelihood of students continuing with their learning tasks as required. Moreover, the results show that the attributes can be modelled together and used to adapt and personalize the learning process.
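The comparative evaluation described above amounts to training each candidate model on the same data split and comparing held-out accuracy. A minimal library-free sketch with toy data, using a 1-nearest-neighbour classifier against a majority-class baseline as stand-ins for the five Scikit-learn models (the feature values and labels are invented):

```python
# Toy dataset: (intrinsic-motivation score, competence score) -> pass/fail label.
train = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.7), 1), ((0.6, 0.8), 1),
         ((0.2, 0.3), 0), ((0.3, 0.1), 0), ((0.1, 0.2), 0)]
test  = [((0.85, 0.75), 1), ((0.15, 0.25), 0), ((0.75, 0.8), 1)]

def knn1(x):
    """1-nearest-neighbour prediction by squared Euclidean distance."""
    return min(train, key=lambda p: sum((a - b) ** 2
                                        for a, b in zip(p[0], x)))[1]

def majority(_x):
    """Baseline: always predict the most frequent training label."""
    labels = [y for _, y in train]
    return max(set(labels), key=labels.count)

def accuracy(model):
    """Held-out accuracy on the shared test split."""
    return sum(model(x) == y for x, y in test) / len(test)

print("1-NN:", accuracy(knn1), "baseline:", accuracy(majority))
```

With real models the same pattern applies: fit each estimator on the training split, score all of them on the identical test split, and keep the best performer (the random forest, in the study's case).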

Keywords: classification models, learning strategy, predictive modeling, regression models, student academic performance, student motivation, supervised machine learning

Procedia PDF Downloads 128
2950 Neighborhood-Scape as a Methodology for Enhancing Gulf Region Cities' Quality of Life: Case of Doha, Qatar

Authors: Eman AbdelSabour

Abstract:

Sustainability is increasingly being considered a critical aspect in shaping the urban environment and serves as a basis for innovation in global urban development. Currently, different models and structures affect how the criteria that define a sustainable city are interpreted. There is a collective need to steer urban growth onto a more durable path by advancing multi-scale initiatives. The global rise in urbanization has increased the demand for, and the pressure on, urban planning choices and scenarios for better, more sustainable urban alternatives. The trend toward increasingly sustainable urban development (SUD) has prompted the need for an assessment tool at the urban scale. The neighborhood scale is being addressed by a growing research community, since it is a pertinent scale at which economic, environmental, and social impacts can be examined together. Although neighborhood design is a comparatively old practice, it was only in the early years of the 21st century that environmentalists and planners started developing sustainability assessment at the neighborhood level. At this scale, urban reality can be considered broadly enough to address themes beyond the size of a single building, while remaining small enough that concrete measures can be analyzed. Neighborhood assessment tools, also known as district assessment tools or sustainable community rating tools, play a crucial role in helping neighborhoods pursue sustainability objectives through a set of themes and criteria. The primary focus of research has been on the economic and environmental aspects of sustainability, whereas socio-cultural issues are rarely addressed. This research is therefore based on Doha, Qatar, and discusses the current urban conditions of its neighborhoods.
The research problem focuses on spatial features in relation to socio-cultural aspects. The study is outlined in three parts. The first section comprises a review of the latest well-being assessment methods used to enhance the decision process for retrofitting the physical features of a neighborhood. The second section discusses urban settlement development, regulations, and the decision-making process, including an analysis of urban development policy with reference to neighborhood development and a historical review of the urban growth of the neighborhoods that form the atoms of Doha's city system. The last part involves developing quantified indicators of subjective well-being through a participatory approach. In addition, GIS will be used as a tool for visualizing the quality of life (QoL) across neighborhood areas as an assessment approach. Envisaging the present QoL situation in Doha's neighborhoods is a step toward improving current conditions, since neighborhood function involves many of the residents' day-to-day activities, which make these areas dynamic.
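Quantified indicators of the kind described above are commonly combined into a composite index by normalising each indicator to [0, 1] and taking a weighted sum, which can then be mapped in GIS. A minimal sketch with hypothetical neighborhood indicators and weights:

```python
# Hypothetical QoL indicators per neighborhood: walkability score,
# green-space share, and access-to-services score (raw, on different scales).
raw = {
    "A": [55, 0.10, 7.2],
    "B": [72, 0.25, 8.8],
    "C": [40, 0.05, 5.1],
}
weights = [0.4, 0.3, 0.3]  # hypothetical stakeholder-elicited weights

# Min-max normalise each indicator across neighborhoods, then combine.
cols = list(zip(*raw.values()))
lo = [min(c) for c in cols]
hi = [max(c) for c in cols]

def qol_index(values):
    """Weighted sum of min-max normalised indicators, in [0, 1]."""
    norm = [(v - l) / (h - l) for v, l, h in zip(values, lo, hi)]
    return sum(w * x for w, x in zip(weights, norm))

for name, values in raw.items():
    print(name, round(qol_index(values), 3))
```

In a participatory approach, the weights themselves would be elicited from residents and stakeholders rather than fixed by the analyst.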

Keywords: neighborhood, subjective wellbeing, decision support tools, Doha, retrofitting

Procedia PDF Downloads 138
2949 Signals Monitored During Anaesthesia

Authors: Launcelot McGrath, Xiaoxiao Liu, Colin Flanagan

Abstract:

It is widely recognised that a comprehensive understanding of physiological data is a vital aid to the anaesthesiologist in monitoring and maintaining the well-being of a patient undergoing surgery. Biosignal analysis is one of the most important topics that researchers have worked to develop over the last century in order to understand numerous human diseases. A great many biological signals are available during anaesthesia, and not all of them are important; choosing which to observe is a significant decision. It is important that the anaesthesiologist understands both the signals themselves and the limitations introduced by the processes of acquisition. In this article, we provide a wide-ranging overview of the different types of biological signals as well as the mechanisms applied to acquire them.

Keywords: general biosignals, anaesthesia, biological, electroencephalogram

Procedia PDF Downloads 105
2948 Testing and Validation Stochastic Models in Epidemiology

Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa

Abstract:

This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.
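The abstract's practical examples are in R; purely as an illustration, the same pattern of separating deterministic from stochastic components, isolating the random number generator for reproducibility, and validating statistically with large samples can be sketched in Python (the Bernoulli infection step below is invented, not taken from the study):

```python
import random
import statistics

def infect_step(susceptible, p, rng):
    """Stochastic component: each susceptible is infected with probability p."""
    return sum(rng.random() < p for _ in range(susceptible))

def expected_infections(susceptible, p):
    """Deterministic component: closed-form mean of the step above."""
    return susceptible * p

# 1. Deterministic parts are tested exactly.
assert abs(expected_infections(1000, 0.2) - 200) < 1e-9

# 2. Stochastic parts are made reproducible by isolating the RNG...
assert infect_step(100, 0.2, random.Random(42)) == \
       infect_step(100, 0.2, random.Random(42))

# 3. ...and validated statistically with large sample sizes: the empirical
# mean should sit close to the closed-form expectation.
rng = random.Random(0)
draws = [infect_step(1000, 0.2, rng) for _ in range(500)]
assert abs(statistics.mean(draws) - 200) < 5  # loose tolerance around the mean

print("all checks passed")
```

Defensive additions in the same spirit would reject invalid inputs (negative counts, probabilities outside [0, 1]) with clear error messages, improving the user-friendliness the abstract emphasises.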

Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions

Procedia PDF Downloads 6